Oil IT Journal: Volume 24 Number 4


800lb gorilla crashes OSDU party

Schlumberger to open source data management system and ‘contribute’ code to The Open Group’s Open Subsurface Data Universe.

In our last issue we reported on no fewer than four ‘emergent’ upstream data ecosystems, from Agile, Halliburton, Schlumberger and The Open Group’s OSDU, the Open Subsurface Data Universe. Schlumberger has now confirmed, as we reported earlier this year, that it is to ‘open source’ its Delfi-based data ecosystem. Moreover, Schlumberger is to contribute its data system to OSDU to ‘accelerate’ delivery of the data platform.

According to the Schlumberger release, the teaming will accelerate operators’ digital progress by moving data to the cloud using the OSDU platform, either when deploying their own applications or adapting Delfi applications to their needs. The move is to ‘avoid creating fragmentation of data and workflows that inhibits collaboration or creates barriers to innovation within the industry’.

Whether having an 800lb gorilla pitch into the game at the last minute (TOG earlier told Oil IT Journal that the OSDU code would be publicly available in September) is a help or a hindrance is another matter. An OSDU panel session at the PPDM-backed 2019 Houston Professional Petroleum Data Expo (i.e. before Schlumberger joined the party) addressed ‘industry scepticism’ as to what would be delivered and when, responding that OSDU operates in a ‘very agile’ manner, using ‘modern collaboration tools including Gitlab and Slack’.

The initial Amazon Web Services implementation demo release with ‘limited datasets and scope’ is (was?) targeted for summer 2019. Other ‘reference implementations’ are planned for Azure and the Google cloud. At the PPDM event, OSDU received endorsements from ConocoPhillips, BP and Chevron along with Shell, which conceived the initiative. The group has multiple subcommittees working on the architecture, data definitions, and security, as well as outreach and communications. More from TOG.


Quantum computing in oil and gas

Total HPC guru outlines extensive technology watch program to track quantum technology’s potential. Potential applications in computational chemistry, materials science and (maybe) seismics.

Speaking at an event hosted by BPI France, Total’s seismic and high-performance computing guru Henri Calandra explained how his company was supporting various lines of research into quantum computing. Today’s HPC is looking for new approaches as conventional ones ‘run out of steam’. Moore’s law is to end by 2025 and parallelism is limited by Amdahl’s law. Energy is also becoming a big issue (a subsequent speaker opined that a ‘post exaflop’ conventional HPC computer would need a 900 MW power station to run). Oil and gas needs HPC for seismics, reservoir simulation on heterogeneous, complex reservoirs, uncertainty management and, increasingly, for machine learning and high-performance data analytics. Other fields requiring HPC include MINLP (mixed integer non-linear programming) problems such as refinery blending, scheduling, production, shipping and oil field/reservoir optimization. Possibly closer to the quantum bailiwick we have computational materials science, where the ability to accurately model ground states of fermionic systems would have significant implications for many areas of chemistry and materials science such as catalysis, solvents, lubricants and batteries.

Total is therefore exploring quantum technology as a potential groundbreaking new approach to pave the way to a ‘beyond exascale’ future. Quantum is, however, a ‘very challenging technology’ and is currently limited to a few tens of qubits on a ‘Nisq’, a ‘noisy, intermediate scale quantum device’. Total’s objectives are to understand and track the evolution of quantum computing across initiatives such as D-Wave, IBM Q, Google’s Bristlecone and Rigetti Computing’s 16Q Aspen.

In addition to the novel hardware, quantum algorithmics is a ‘brand new science’. Total is working to build in-house competencies, collaborating with research partners and an ecosystem of hardware providers to develop algorithms for Total business use cases, and to be ready when industrial quantum computers become available and quantum supremacy is demonstrated. Total acquired a 30-qubit Atos QLM 30 in 2018 and is currently upgrading to a QLM 35 system*.

We chatted briefly with Calandra and asked what the most promising short(ish) term applications of QC were. He confirmed that chemistry and combinatorial optimization were the most promising areas. We asked for more specifics on potential geophysical applications. QC has potential application in some geophysical modeling problems. Full waveform inversion? ‘Yes, but easily 5 to 10 years out’. Calandra was circumspect as to the likelihood of QC having oil and gas application any time soon.

* More on Atos’ quantum initiatives.


Editorial

Neil McNaughton ponders the rather loud message from the London EAGE, that salvation for the oil and gas industry mandates carbon capture and storage. Well, that was the message from the plenary sessions. For those on the conference floor it was business as usual. Cognitive dissonance, anyone?

At the London EAGE* earlier this year, cognitive dissonance was the order of the day. Moving from the plenary sessions (which we duly report elsewhere in this issue) into the exhibition area we heard one group of exploration geophysicists (for that is what the EAGE is) discussing the threat of global warming, to the world, and to the oil and gas industry, while another group (the exhibitors) carried on with their usual business of oil and gas exploration. While wondering how to editorialize on the above, I read Malcolm McBarnett’s piece in First Break, and thought that I would just point you in his direction where you can read his take on the ‘uncertain future’ of the industry and the ‘impossible dilemma’ that faces oil companies.

Since the EAGE, industry sentiment appears to be swinging steadily towards the need to address (or at least pay lip service to) the carbon issue. An SPE-backed event in France, the Gaia Summit, asked the question ‘Is the oil and gas industry on the right side of history?’ What did they come up with? A cute diagram and a photo-op laden Twitter feed where SPE president Sami Alnuaim effused that the group was ‘setting the stage for the oil and gas industry to claim the pride and show the responsibility of what we do: energizing the world, improving people’s lives, leading social development and protecting the planet’. The Summit does not appear to have published its findings but fortunately, Gaffney Cline was there in the person of Nigel Jenvey who reported that** …

The oil and gas industry are participants in a global energy system that is already transitioning to be lower carbon, and we need to be considerate and responsive to what society needs and wants. There is no silver bullet to achieving this. It will require carbon management and the scale-up of every low-carbon option, including […] carbon capture, use and storage (CCUS) [and] will be the backbone of our future industry.

In my 2017 editorial, COP23 BECCS, FECCS and the future of fossil fuel, I concluded that infrastructure costs made it unlikely that the world would ever be prepared to pay the price of CCS. Sitting in on Philip Ringrose’s presentation at EAGE, I was puzzled as to how he arrived at such a relatively positive economic take on CCS. In an email exchange, Ringrose kindly responded to my points (see below). Given that CCS is the only way forward for low carbon fossil fuel energy production, it is interesting to imagine what kind of plays would support the extra cost of CCS.

But first, some more bad news for oil and gas from the green energy movement. McKinsey’s Insights has it that ‘by 2030, new build renewables will out-compete existing fossil fuel generation on energy cost in most countries – a key ‘tipping point’ in the energy transition’. Most countries will reach this tipping point before 2025.

So, oil and gas will be squeezed, between cheaper, green energy and the extra cost of climate-saving CCS. This means that only the very cheap resources will survive. What will the cheap fossils be? Well they won’t be the oil sands. Shale/unconventionals will be squeezed even more than they are today. Which leaves us with coal (surprising though that may seem), Middle East oil and other cheap oil and gas. In which context, back to the EAGE where WoodMac’s Neil Anderson twice mentioned the ‘really low cost’ of ExxonMobil’s Guyana discoveries, which are conventional deep-water oil finds. So, there you have it, a glimmer of hope for the geophysicists. Tempered perhaps by the way the seismic business is going ‘asset light’, with Schlumberger selling its vessels last year and now CGG!

* EU Association of Geoscientists and Engineers. Previously the EU Association of Exploration Geophysicists.

** More in the June 21 Gaffney Cline Insights.


Letter to the Editor

Following his presentation at the London EAGE we invited Philip Ringrose (Equinor & NTNU) to respond to our 2017 editorial*, where we doubted the economic likelihood of carbon capture and sequestration. He kindly got back with the following.

Hello Neil,

Your persistence in asking me to reply to your Editorial has paid off – sorry I was so hard to contact.

CCS is generally under attack from the ‘left’ and the ‘right’ of climate politics – so addressing more criticism is not something I look forward to.

It is true CCS has made slow progress. However, your summary tends to exaggerate both the cost of and ambitions for CCS. Below is my summary of the state of play at the moment:

1. According to the IEA (1), CCS is anticipated to support approximately 13% of total cumulative emissions reductions through 2050, requiring around 120,000 million tonnes (Mt) of cumulative CO2 reduction by 2050. Annual storage rates in 2050 are expected to be 6,000-7,000 Mtpa.

2. Currently 17 large-scale CCS facilities are in operation together with a further 4 under construction. These have an installed capture capacity of 37 Mtpa (see GCCI report (2)) so the scale up required is a factor of about 200.

3. The IPCC (3) has argued that emissions reduction costs without CCS deployment could be as much as 29% to 297% higher by 2100. So in the long-term CCS is cost effective, on the assumption that governments/societies actually want significant emissions reductions.

4. The most recent IPCC Report on Global Warming of 1.5°C requires CCS in most scenarios, and tends to focus on Bio-energy and CCS (BECCS) to create negative emissions (but at levels that seem ‘very challenging’).

5. There are a few signs in the EU, the US/Canada and in China that CCS is starting to be in focus again. Angela Merkel went public by stating the need for CCS, which is quite a change for Germany! The US recently passed the 45Q tax incentive for CCUS.

6. Currently Norway is pushing ahead with a new full-scale CCS project, reducing emissions from industry (cement and waste incineration). But there are only a few signs that other nations are serious about CCS.

7. Regarding economics – Equinor and Northern Gas networks (UK) recently published a detailed analysis of developing a hydrogen economy using CCS, including a full economic analysis. Read the report on the H21 North of England project.

So yes, CCS is more costly than not decarbonizing – but if we want to significantly reduce emissions, we need it (in addition to renewable energy and energy efficiency measures).

Best regards

Philip Ringrose

Specialist, Reservoir Geoscience, Equinor ASA

Adjunct Prof. NTNU, Trondheim, Norway

References:

(1) IEA, Carbon Capture and Storage: The solution for deep emissions reductions (International Energy Agency Publications, Paris, 2015).

(2) Global CCS Institute, Global status of CCS: 2018.

(3) Edenhofer, O. et al., (Eds), Mitigation of Climate Change. Working Group III (WG3) of the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change. (Cambridge University Press, 2014).

* COP23 BECCS, FECCS and the future of fossil fuel.


Book review - Linked Data: storing, querying, reasoning

A new book describes the evolution of the web into a web of ‘linked documents’ and the potential that semantic technologies have for making sense of it all. Plethoric, rather impenetrable, research avenues are described, as semantics meets up with the big data movement. Making sense of ‘heterogeneous data from different sources’ appears to remain an elusive goal.

The semantic web, Tim Berners-Lee’s notion to ‘bring structure to the meaningful content of the web’ and allow ‘software agents to roam from page to page and carry out sophisticated tasks for users’ was first mooted around 2000. In the early days, a multitude of books were published elaborating on the merits and magic that was to flow from RDF, the seemingly straightforward Resource Description Framework. For our sins, we were on the semantic bandwagon from the get-go (almost) and have over the years traced the evolution of the movement in its slide down the hype curve and into its current resting place in academia. One thing that characterizes these early publications (and indeed much of today’s writing on technology) is a focus on perceived or anticipated benefits. When we spotted the new publication on Linked Data*, we wondered whether this was going to be more of the same or whether it might provide enlightenment as to the direction the semantic web has taken since it was reborn as ‘linked data’. In what follows we capitalize ‘Linked Data’ when referring to the book. A non-capitalized ‘linked data’ refers to the subject of … linked data.

The 236-page publication begins with an explanation of how the world wide web ‘has evolved from a web of linked documents to a web including linked data’. ‘The adoption of linked data technologies has shifted the web from a space of connecting documents to a global space where pieces of data from different domains are semantically linked and integrated to create a global web of data. Linked data enables operations to deliver integrated results as new data is added to the global space. This opens new opportunities for applications such as search engines, data browsers, and various domain-specific applications. … The web of linked data contains heterogeneous data coming from multiple sources and various contributors, produced using different methods and degrees of authoritativeness, and gathered automatically from independent and potentially unknown sources.’ This clearly makes ‘linking’ heterogeneous data sets tricky. Indeed …

Such data size and heterogeneity bring new challenges for linked data management systems. While small amounts of linked data can be handled in-memory or by standard relational database systems, big linked data graphs, which we nowadays have to deal with, are very hard to manage.

The authors seem to conflate heterogeneity and size and from this point on, Linked Data becomes a discussion of ‘modern’ database technology, big data and graph technology. The critical issue of heterogeneity, and the near impossibility of ‘reasoning’ across inconsistent data sets seems to take a back seat. In fact, the examples given perpetuate the awkward manner in which RDF captures essential metadata such as units of measure. For instance …

Let us revisit two types of time labels for representing the time information of RDF stream elements. An interval-based label is a pair of timestamps, commonly natural numbers representing logical time. A pair of timestamps, [start, end], is used to specify the interval in which the RDF triple is valid. For instance, :John :at :livingroom, [7, 9] means that John was at the “living room” from 7 to 9.

7 to 9 what, one asks… seconds, minutes, days? This may seem trivial, but the treatment given to data in the RDF world is almost always idiosyncratic and error prone, not at all suitable for engineering usage. There is no discussion of how engineering units are managed in a consistent manner in Linked Data. This is unfortunate as one of the purported extensions of linked data is in the field of streaming sensor data. It appears that ‘current linked data query processing engines are not suitable for handling RDF stream data and … the most popular data model used for stream data is the relational model’.
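For the record, nothing in RDF itself prevents explicit units and typed timestamps; it is the book’s examples that leave them out. Below is a minimal, hypothetical sketch using Python’s rdflib (the EX namespace, the observation URI and the property names are our own inventions) in which a sensor reading carries an explicit QUDT-style unit and xsd:dateTime validity interval rather than the bare integers of the example above.

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/")                  # hypothetical namespace
QUDT = Namespace("http://qudt.org/schema/qudt/")
UNIT = Namespace("http://qudt.org/vocab/unit/")

g = Graph()
obs = EX["observation/42"]                             # hypothetical observation URI

g.add((obs, RDF.type, EX.TemperatureReading))
g.add((obs, EX.numericValue, Literal(78.5, datatype=XSD.double)))
g.add((obs, QUDT.unit, UNIT.DEG_C))                    # unit stated, not implied
# typed timestamps instead of the bare [7, 9] interval
g.add((obs, EX.validFrom, Literal("2019-06-03T07:00:00", datatype=XSD.dateTime)))
g.add((obs, EX.validTo, Literal("2019-06-03T09:00:00", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))

Nonetheless ...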

The research trend for RDF stream data processing has been established as the main track in the Semantic Web community. With sensors getting deployed everywhere, the availability of streaming data is increasing. Handling such time-dependent data presents some unique challenges. Along with RDF triples, sometimes provenance information such as source of the data, date of creation, or last modification is also captured. In large RDF graphs, adding provenance information would only make the graph larger. Suitable mechanisms are needed to handle and manage provenance information.

Well, provenance is an interesting field. Linked Data has almost 300 references and a whole Chapter devoted to the subject.

… provenance has been of concern within the linked data community where a major use case is the integration of datasets published by multiple different actors… The unconstrained publication, use, and interlinking of datasets that is encouraged by the linked open data initiative is both a blessing and a curse.

Linked Data covers various provenance models from Dublin Core, the W3C and others. The section is stuffed full of references and a pointer to the W3C’s publication on the subject. Describing provenance introduces new words (‘the monus operator’) and new symbols and structures (the ‘m-semiring’, the ‘seba-structure’). The section on RDF provenance dives in with an explanation of the multiple gotchas of embedding provenance in RDF and querying data sets with Sparql. But all in a rather impenetrable tone that may be understandable to the specialist but not to us!

Other chapters cover ‘storing’, ‘querying’ and ‘reasoning’ linked data at a similar academic level. The conflation with the ‘big data’ movement permeates the book, with plethoric references and discussion of topics revolving around Hadoop, Spark et al. Over 400 references are cited, along with IT-related issues of partitioning, in-memory (or not) processing and caching…

Our conclusion is that Linked Data is a large collection of research avenues and references in the field that are rather impenetrable for the outsider. It is different from those early books on the semantic web in that it does not proselytize (too much), but rather enumerates so many different research avenues that it is hard to see the wood for the trees. Moreover, if the semantic web has not taken off, it is at least partly because there is no straightforward, widely accepted way to unequivocally embed real data in a web page. Until that happens, the academics will go on researching techniques that, when they are confronted with the inconsistent data that makes up the web, are doomed to fail.

* Linked Data: storing, querying, reasoning by Sherif Sakr (King Saud bin Abdulaziz University), Marcin Wylot (TU Berlin), Raghava Mutharaju (GE Global Research), Danh Le Phuoc (TU Berlin), Irini Fundulaki (ICS Greece). Springer 2018 ISBN 978-3-319-73514-6 ISBN 978-3-319-73515-3 (eBook).


PNEC E&P Data and Information Management 2019

Woodside’s seismics to the cloud. ExxonMobil/RedHat on automating data pipelines with OpenShift. BHP on AI-enhanced data management. Equinor moves seismics to Triton’s TerraStor on AWS. ExxonMobil’s digital transformation. ‘Pegasus’, Chesapeake’s Appra-based prospect and play inventory. ConocoPhillips’ ‘citizen data scientist’ program. Total’s DataLab. Kadme Whereoil for YPF. Shell on drone data onslaught. Blue Marble on major geodetic changes.

Woodside’s seismics to the cloud

Jess Kozman (Woodside) and Matthew Holsgrove (Wipro) presented Woodside’s project to ‘accelerate seismic data cloud ingestion through machine learning and automation’. Woodside set out to maximize the benefits from its seismic investment with a move to optimized cloud data storage. The authors report that ‘there is currently no available COTS solution that supports the whole of the optimized workflow’ and so a bespoke solution was implemented with help from Wipro, embedding existing vendor cloud-native technologies and services where available. Woodside wanted to ‘liberate’ data from proprietary application formats and deliver workstation-ready seismic data to geoscientists. A ‘cloud-first’ approach means that all Woodside tenders for acquisition surveys and processing projects now specify that data must be delivered to the cloud. Woodside has implemented a machine-learning driven automated workflow for uploading data to the cloud. Bluware’s Open VDS compressed seismic data format and data streaming capabilities are used to reduce the storage footprint and optimize delivery of data to consumer applications. The Open VDS format has been adopted by the Open Subsurface Data Universe (OSDU) industry consortium. Woodside’s PPDM/Oracle master data repository is linked to the system through an application programming interface.

ExxonMobil/RedHat on automating data pipelines with OpenShift

A joint presentation from ExxonMobil’s Audrey Reznik and John Archer (RedHat*) covered the automation of analytics and data pipelines. Exxon was looking to make the results of its researchers’ efforts more accessible while preserving their stability. For a data scientist, ‘the ability to rapidly deploy code and quickly obtain feedback from a user is extremely valuable’. Enter RedHat’s OpenShift containerized Kubernetes platform. This has allowed Exxon to manage and deploy Jupyter notebooks, the tool of choice for combining python code, a GUI and documentation for sharing research tools with geoscientists and engineers. The RedHat/Exxon proof of concept (PoC) sets out to create a reproducible and interactive data science environment with user interaction from a Google Chrome browser. The PoC trialed the use of a cloud-based GPU cluster to read and analyze log data. ML-derived analytical models developed on the cluster are pushed to Azure for deployment in what are described as ‘emerging data science ops workflows’. The presentation slides are a window into a modern data analytics pipeline with more Kafka/Hadoop/Spark et al brands than you can shake a stick at! Those interested in testing the OpenShift environment should sign up with the OpenShift Commons ML SIG and grab some JupyterHub OpenShift templates. More too from the RedHat OpenDataHub project.

* A wholly-owned IBM unit.

BHP on AI-enhanced data management

Another data pipeline specialist is BHP’s Richard Xu who is developing an AI-based data pipeline, expanding traditional data management. Xu is addressing bulk/batch data issues such as those involving legacy data processes, integrating data from new ventures or prepping data for divestment. The approach is to use an iterative/agile development process involving proofs of concept on increasingly large datasets. The pipeline involves text analysis with natural language processing, taxonomy creation, word-to-vector conversion and machine learning. The Google/TensorFlow Word2Vec tool is used to parse documents prior to cross-validation with a taxonomy-based classification. Scanned documents are also processed and geotagged for spatial classification. Xu reported a 40% time reduction and a 30% cost reduction. On accuracy, Xu told Oil IT Journal, ‘Because there are thousands of data types and a lot of noise, the text classification target is 70%. Our test result is 90%. Because the training sample is small, the approach combines AI and taxonomy. The worst data types give a 70% success rate, the best, 90%. We perform multiple iterations and sprints, moving low accuracy results into the next iteration. The project has been in action for over a year. It has gone through vision, PoC, development and now it is in production mode. Current efforts are more concentrated on fully parallel AI in a big data platform in order to improve performance.’
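By way of illustration only (this is not BHP’s code), the following sketch shows the word-to-vector plus classifier pattern Xu describes, using the gensim Word2Vec implementation and scikit-learn. The toy corpus, labels and parameters are all hypothetical.

from gensim.models import Word2Vec
import numpy as np
from sklearn.linear_model import LogisticRegression

# toy corpus: tokenized document texts and their (hypothetical) data-type labels
docs = [
    ["final", "well", "report", "casing", "cement"],
    ["stacking", "velocity", "seismic", "navigation", "segy"],
    ["core", "analysis", "porosity", "permeability", "plug"],
]
labels = ["well", "seismic", "core"]

# train word embeddings (vector_size is the gensim 4.x parameter name)
w2v = Word2Vec(sentences=docs, vector_size=50, window=5, min_count=1, epochs=50)

def doc_vector(tokens):
    """Average the word vectors of the tokens present in the vocabulary."""
    vecs = [w2v.wv[t] for t in tokens if t in w2v.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(w2v.vector_size)

X = np.vstack([doc_vector(d) for d in docs])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(clf.predict([doc_vector(["cement", "casing", "report"])]))

In a real pipeline, low-confidence predictions would then be cross-checked against a taxonomy, as in Xu’s combined AI-plus-taxonomy approach.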

Equinor moves seismics to Triton’s TerraStor

Equinor North America’s Justin Frost teamed with Andrew Timm (Triton Data Services) to report on a major seismic-to-the-cloud project. Previously, Equinor North America (ENA) operated a revolving three to seven-year procurement cycle for seismic data storage, often involving migration to a new database. To break the data migration cycle, ENA elected to use a cloud-based seismic archive for its petabyte-scale dataset. The contract was awarded to Triton Data Services whose TerraStor is now configured to utilize cloud storage via Amazon AWS. Each trace was scanned, QC’d and its metadata captured to TerraStor. Archive data was encapsulated and transported into archival storage in AWS. Most data was uploaded over the wire but for some super-large marine datasets, Amazon’s ‘Snowball’ 72 terabyte network-attached storage devices were loaded and shipped physically for ingestion. Another interesting facet of the project was the need to comply with certain jurisdictions’ export controls that preclude export of seismic data. The solution here is to geolocate data to an in-country cloud environment. Metadata in TerraStor is also replicated to Equinor’s internal seismic entitlement database, providing an abstraction layer to physical storage and enabling future data transfer to other cloud providers.

ExxonMobil’s digital transformation

Max Gray presented components of ExxonMobil’s ‘digital transformation’ in the form of three projects that set out to leverage Exxon’s massive legacy data archive. Exxon has some 12 million geoscience and engineering paper documents in storage in the Houston area. Following a pilot, a contract was awarded to L&T Technology Services* for metadata capture and enhancement and document scanning. All documents are georeferenced and depth registered as appropriate. As of Q1 2019, some 1.4 million documents have been processed. The next step is to migrate the documents into the ExxonMobil cloud storage account, with access provided via Katalyst’s iGlass data manager. A parallel workflow addresses other physical media (non-paper) such as CDs, DVDs, thumb drives, stored in the hardcopy record boxes. Gray outlined another Exxon digital transformation project involving the capture of ‘orphaned’ data and interpretation results in legacy applications and systems. A proof of concept trial on some 172 projects found that interpretation data could be extracted at a cost of around $4,000/project. Further price reduction is expected as the work is to be tendered to external service providers. Some 18,000 projects have been identified for data harvesting. The endgame is to develop a ‘managed service provider’ factory migration workflow to ‘harvest data at scale’. Finally, Exxon is also looking to remaster its humungous legacy seismic data of almost 2 million tapes, currently stored in an underground salt mine. Following a pilot, Katalyst is now processing some 30,000 tapes/month. The project is expected to take 4-5 years. As of May 1, 2019, Katalyst had processed 108,000 tapes, recovered 750 TB of data, placed it online and made it accessible through iGlass. Retrieval times are down from weeks to hours.

* LTTS is a publicly listed subsidiary of Larsen & Toubro Limited, an ‘$18 billion Indian conglomerate’.

‘Pegasus’, Chesapeake’s Appra-based prospect and play inventory

Kelli Witte and Gina Lawson presented ‘Pegasus’, Chesapeake’s project, play and prospect inventory management application that stores and manages the consolidated efforts of explorationists, managers and reservoir engineers. Pegasus provides technical guidelines and milestones for the delivery of technical products, setting data integrity standards and enabling collaboration, documentation and knowledge transfer. The solution replaced a suite of spreadsheet tools that had become ‘ineffective and obsolete’, limiting the speed at which the relevant information could be collected and managed, and hampering decision making. Pegasus deploys a ‘centralized database platform’ that allows play level/assessment data to be captured from multiple disciplines. Today some ten internal administrators maintain the application. The tool is said to be robust in the face of changes in other interpretation tools. One component of Pegasus, Origin, is based on 3-Gig’s Appra, a packaged solution combining 3-Gig’s Prospect Director with the Rose & Associates software suite.

ConocoPhillips’ ‘citizen data scientist’ program

Andy Flowers presented ConocoPhillips’ citizen data scientist program (CDS), one of several ways that the company is benefiting from data science. The CDS sets out to train subject matter experts in data science with the expectation that this will lead to data-driven solutions to problems that are not amenable to conventional methods. Teaching includes the use of standard workflows for data extraction and conditioning, statistical techniques to describe data features and significance and the derivation and use of computational models from data. If interested, CDS graduates will be offered positions in the advanced analytics team to apply data science to support other business units and functions.

Comment: In the 19th Century gold rush, it was said that more money was made by those selling shovels than the miners. Could it be the same for the data science training business today?

Total’s DataLab

Baptiste Joudet, Elie Maze and Florian Bergamasco teamed to present Total’s DataLab, an ‘innovative approach to data management’. Total created its DataLab in 2018 to address a growing number of incoming documents (+20% year on year), with a new approach to data management and data automation. The vision is to provide Total’s data management community with self-service solutions to automate data loading into business applications and corporate databases. Initially the lab is working on early stage data management of incoming files. This involves sorting and classifying files received, achieved by the automated extraction of key information from file headers, pictures and reports. A first pass classifier sorts files into data family (seismics, well data, geochemistry) and data type. This allows files to be routed to the appropriate workflow. Here a machine learning-derived process extracts metadata (company names can be derived from their logo), performs geotagging, identifies data in tables and extracts images. A 95% accuracy level is claimed for the classifier. The solution is enabling optimized search across the full content of reports, including imagery. The authors conclude that artificial intelligence brings many benefits to the data manager. But AI needs a large volume of clean and organized data. Here good procedures, business knowledge and quality databases are key to successful AI. For more on the early days of Total’s data lab read our report from Octo’s event.
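A minimal sketch of the kind of first-pass classification step described above, assuming a scikit-learn TF-IDF plus linear classifier approach (the snippets, labels and routing are invented for illustration; Total’s actual DataLab implementation is not public).

from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# toy training set: header/report snippets and their (hypothetical) data families
texts = [
    "3D marine survey inline crossline bin size navigation",
    "wireline log run gamma ray resistivity depth reference",
    "source rock pyrolysis TOC vitrinite reflectance sample",
]
families = ["seismic", "well", "geochemistry"]

# first-pass router: TF-IDF features plus a linear classifier
router = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), lowercase=True)),
    ("clf", LinearSVC()),
])
router.fit(texts, families)

incoming = "stacking velocities and navigation data for the 3D survey"
print(router.predict([incoming])[0])   # routes the file to the seismic workflow

Downstream of this routing step, the DataLab’s production workflow adds metadata extraction, logo-based company identification and geotagging, as described above.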

Kadme Whereoil for YPF

Dirk Adams presented Kadme’s work for YPF on the management of physical and digital data in Argentina in an Azure environment. Kadme has tailored its Whereoil application to crawl YPF’s data sources and fulfil the company’s business and operational requirements. The solution was retooled to run in the Microsoft Azure cloud, benefitting from Microsoft’s presence in Buenos Aires and YPF’s Azure contract. The choice of Azure was also dictated by the fact that no other major cloud provider has an Argentinian presence! The project has involved the migration of some 2 million documents and associated metadata. The system also manages check-out and returns for physical assets. Whereoil’s ‘Insight Engine’ adds a search function to the Azure data lake and a geotagging function adds location to document metadata. The system is currently in final test phase before rolling out to ‘over 1,000 users.’ Kadme acknowledged assistance from local partner Innovisión.

Shell on drone data onslaught

Adam Serblowski (Shell) explored the potential for robotics as a source of analytics data. Following early use in deep water operations, robotics is now moving from specialist applications to become ‘an integral part of daily operations’ in many oil and gas contexts. Robots bring new data management challenges as they generate data volumes that are orders of magnitude greater than a human worker’s. Serblowski presented a 2018 proof of concept that used drone technology to monitor Shell’s US unconventional operations. The PoC involved the surveillance of 60+ wellsites using a beyond visual line of sight (Bvlos)-enabled drone. A single wellsite flyby generates over 2GB, i.e. around 120GB per day for a full inspection round. Data management and the image processing pipeline needed some serious planning. The PoC trialed two types of change detection, RGB and state change. The former was effective at highlighting defects such as debris on a wellsite. State change proved capable of tasks such as estimating the volume of liquid inside a tank. Both approaches are viewed as pre-processors for further analysis by an operator. Serblowski concluded that a viable solution involves trade-offs between cost, data volumes and inspection scope, but that ultimately, the approach will become the de-facto way of working.

Blue Marble on major geodetic changes

Blue Marble’s Kris Berglund warned of geodetic changes coming in 2022 in a reprise of a talk given by NOAA’s Dan Martin earlier this year. In 2022, the National Geodetic Survey will be replacing the US horizontal and vertical datums (NAD 83 and NAVD 88) with a new International Terrestrial Reference Frame (ITRF)-based ‘gold standard’ framework. The continent will be divvied up into four plate-tectonic-fixed terrestrial reference frames, each with its own time zero ‘epoch’ and relative motion. For more on the changes and on the history of US geodetics, read Martin’s original presentation.

More on PNEC from the conference home page. Next year’s edition is scheduled for May 19-20, 2020.


Software, hardware short takes…

Safe Software’s FME 2019.1. Stratas Advisors announces TEXIS forecast. 3GiG announces Appra. ATEK’s TankScan. CGM Studio on subscription. DroneDeploy automates drone fleet management. Emerson/Paradigm’s machine learning rock type classification. Geolog 19 roll-out. ESG tests HPHT HotShot. ArcGIS QuickCapture. Geographics GeoCalc SDK 7.5/Geographic Calculator 2019 SP1. Troika launches GEOM 3D seismic dataset analyst. Implico updates SAP S/4HANA downstream. INT J-Geo toolkit on Angular, IVAAP and Elastic Search, OSDU. Lasser’s free Well Info app. Sercel moves to MEMs. Target/Meera’s Hybrid sim engine. MRC Global’s MRCGO digital supply chain. 2019a release of MRST. 2019.2 release of PerGeos. Petrosoft’s information security certification.

The 2019.1 release of Safe Software’s FME sees performance improvements to 35 transformers, a connector for Apache Kafka streaming data, a Mapbox vector tiles reader and a Google cloud pub/sub connector.

Stratas Advisors has announced the TEXIS energy infrastructure analytical service, providing quarterly and periodic forecast updates of future infrastructure investments and capacity requirements from a ‘full-value chain and cross-commodity viewpoint’. The service helps clients anticipate ‘price-crushing’ constraints in local demand or infrastructure in Texas amid ‘burgeoning production’ from the Permian Basin and other shale regions of Texas.

3GiG has announced Appra, a special edition of its Prospect Director E&P prospect inventory, risk and reserves tracking solution integrated with Rose & Associates MMRA/MZM risk analysis tools.

ATEK Access Technologies’ TankScan TSC fuel inventory system connects to existing automatic tank gauging systems and transmits the tank data to the ATEK Intelligence Platform for remote access to tank levels.

Larson’s CGM Studio geoscience montaging tool is now available via subscription. CGM Studio creates large format, high-resolution montages for print and large screen displays.

DroneDeploy has upgraded its software to offer automated drone fleet management, enhanced workflow integrations, a low-altitude inspection mode and advanced analytic capabilities. DroneDeploy now syncs with other software from Autodesk, Procore, Bluebeam and Plangrid.

Emerson’s Paradigm unit has rolled out a machine learning-based solution for rock type classification from seismic data. The new supervised machine learning approach, democratic neural network association (DNNA) reconciles multiple data sets to predict facies away from the wellbore. Paradigm has also rolled out Geolog 19 with new query and graph views for locating data in large well databases, a ‘timeline view’ that tracks operations across a well’s life cycle and a new mud gas analysis module for the early detection of gas kicks and lost circulation. Connectivity has been enhanced with direct streaming of Witsml data into Geolog Geosteer and connectivity with Petrel 2019. A Python package is available for customization.

ESG Solutions has successfully tested its new HotShot downhole microseismic acquisition system in the Eagle Ford shale. HotShot operates in high-temperature, high-pressure operational environments up to 177°C/350°F and 10,000 psi.

Esri has announced ArcGIS QuickCapture for the collection of field observations on a GPS, smartphone or tablet. Attributes can be captured in the field and transmitted back to the office for real-time analysis. QuickCapture is compatible with ArcGIS Online and ArcGIS Enterprise 10.7.1.

The 7.5 release of Blue Marble Geographics’ GeoCalc SDK updates several geoid models, includes the latest version of the EPSG database and enhances WKT, PRJ, and GeoTIFF coordinate system matching. The latest release brings a foretaste of the ‘major overhaul’ of the North American geospatial reference framework scheduled for 2022. The 2019 SP1 release of Blue Marble’s Geographic Calculator adds support for the Open Document spreadsheet (ODS) format, includes a new tool for batch processing of area calculations and supports Esri Projection Engine database files.

Troika has launched GEOM, for ‘superfast’ 3D seismic dataset geometry analysis. GEOM enables quick geometry QC prior to lengthy data loading operations. GEOM extracts coordinates of inline and crossline corner points, survey azimuth, bin dimensions, trace count and an outline surface coverage image. It analyses textual headers and can export the header definitions for integration into other software applications. It can also provide shapefiles for visual display and/or input to GIS applications. In tests, a 137 GB file was analysed in 12 seconds on a 32 GB RAM PC running Windows 10. A 2.2 terabyte volume took 9 minutes. The software can also catch errors in positioning data before ingestion into an organisation’s database.

Implico has announced a new release of its SAP S/4HANA end-to-end solutions for the downstream industry with new integrated application for dispatching and transport planning and a comprehensive service station overview. Both functionalities come as ‘user-friendly Fiori apps’ and were awarded a ‘Fiori Lighthouse Scenario’ badge of approval by SAP.

INT’s J-Geo toolkit is now available with the Angular J-Script IDE. A new release of IVAAP with Elastic Search connects 3-way clouds, PPDM and natural language processing. INT has also been selected as provider of the visualization platform for the Open Subsurface Data Universe (OSDU).

Lasser has announced a free Well Info app for iOS, Android, and web browsers. Well Info offers search on area of interest or by API number, operator name, field or reservoir. Optionally with a subscription, the app shows production summaries. Data can be delivered by email for use in industry standard forecasting programs.

Sercel has published a white paper ‘Moving over to MEMs’ on the transition from analog to digital technology for use in seismic data acquisition. This ‘has taken longer than might have been expected*’. But the low-noise performance of microelectromechanical systems-based sensors and the accuracy of their recordings, in combination with their reduced power consumption and lower price, means that industry is increasingly looking to take advantage of new performance capabilities.

* Our first encounter with MEMs was in the 1980s!

Target/Meera has announced a combined AI/conventional reservoir simulator, the Meera AI hybrid sim engine, which combines machine learning with a sensitivity analysis based on coarse grid dynamic modelling. The model produces an ‘evergreen’ production forecast, updated daily.

MRC Global has introduced MRCGO, a digital supply chain solution for pipe, valve and fitting purchases. Customers can search, shop, track, expedite and connect with live support, all from a single platform. MRCGO integrates with third-party marketplaces and ERP systems.

SINTEF has announced the 2019a release of MRST, the Matlab reservoir simulation toolbox. The new release includes adaptive implicit, explicit and fully-implicit support in all solvers, support for Mex-acceleration and additional features for simulation of petroleum recovery from Eclipse-type decks.

The 2019.2 release of PerGeos, Thermo Fisher Scientific’s 2D/3D digital rock imaging and analytical solution, adds large data handling and image segmentation in the petrography feature, new volume rendering capabilities, and a new colorize-by-measure tool that generates a colormap based on a measure. PerGeos includes machine learning-based segmentation for analyses of porosity, total organic content, grain size distribution, pore classification and permeability.

Petrosoft has achieved ISO 27001:2013 certification in information security management for its cloud-based business solutions for petroleum, convenience and retail.


2019 ECN Oil & Gas Machine Learning Conference, Houston

Verdazo Analytics ML for geosciences. Lux Research’s ‘spotlights’. NETL’s SIMPA data-driven subsurface leak prediction. Machina Automation on office automation trends. BHGE’s digital drilling twin.

Introduction

Energy Conference Network’s Oil & Gas machine learning conference took place in Houston earlier this year and heard from AI/ML aficionados in various contexts. A post-show poll of attendees indicated current areas of interest in this community. Most showed interest in predictive maintenance, with less interest in natural language processing. With regard to obstacles to AI/ML deployment, the main one cited was a ‘lack of a clear ROI/business case’. Having said that, attendees considered themselves to be in the vanguard of innovation and believe AI and automation to be the ‘main accelerator for IoT and digitization’.

Verdazo Analytics ML for geosciences

Brian Emmerson from Pason Systems’ unit Verdazo Analytics presented on the use of machine learning in a geosciences context. A joint study between Verdazo and GLJ Petroleum Consultants investigated the performance of hydraulic fracturing operations in the Spirit River formation of Western Canada. Pason’s analytics platform, which is ‘used by over 120 companies in North America’, was applied to a large data set of production, fracking parameters and geological data. ML tools of choice included ICE and PDP, aka individual conditional expectation and partial dependence plots. The technique investigates a multivariate data space of formation tops, porosity from cores and pressure data, along with copious information on completions and frac technology. The ICE/PDP approach addresses AI model ‘interpretability’ and provides an understandable entry point to the big data analysis, finding, for instance, that for the same proppant intensity, a tighter stage spacing yields a greater impact, i.e. more gas. Emmerson concluded ‘choose your model parameters wisely, more is not necessarily better’. It is also important to include economics and the dollar value of a feature. The ICE/PDP approach shows how and why well performance varies. More on the Spirit River study from the Verdazo website.
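For readers unfamiliar with ICE/PDP, the sketch below shows how such plots can be generated with scikit-learn’s PartialDependenceDisplay on a synthetic, made-up dataset (the feature names and response are invented and bear no relation to the Spirit River study).

import numpy as np
import matplotlib.pyplot as plt
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import PartialDependenceDisplay

# synthetic stand-in for a completions/production dataset (hypothetical features)
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(1.0, 4.0, n),      # 'proppant intensity'
    rng.uniform(30.0, 120.0, n),   # 'stage spacing'
    rng.uniform(0.04, 0.12, n),    # 'porosity'
])
# toy response: production improves with more proppant and tighter spacing
y = 100 * X[:, 0] - 0.8 * X[:, 1] + 400 * X[:, 2] + rng.normal(0, 10, n)

model = GradientBoostingRegressor().fit(X, y)

# kind='both' overlays individual conditional expectation (ICE) curves on the
# averaged partial dependence (PDP) for the first two features
PartialDependenceDisplay.from_estimator(
    model, X, features=[0, 1], kind="both",
    feature_names=["proppant intensity", "stage spacing", "porosity"])
plt.show()

The averaged curve (the PDP) shows the marginal effect of a feature, while the individual curves (ICE) reveal whether that effect varies from well to well.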

Lux Research’s ‘spotlights’

Harshit Sharma from Lux Research described the ‘host of startups’ now offering solutions to three key downstream oil and gas challenges, viz. emission reduction, feedstock optimization and margin optimization. In the emissions space, Lux has spotlighted two startups offering remote GHG/methane monitoring. GHGSat offers its ‘Claire’ microsatellites with on-board imaging sensors that create spectral footprints of different trace gases. GHGSat is backed by Boeing and Schlumberger and was trialled last year in the Permian Basin. Another of Lux’ spotlights is Satelytics, a provider of analytics for aerial imagery data. The Satelytics platform has been used to detect methane and liquid hydrocarbon leaks in the Eagle Ford basin, identifying leaks as low as 10 ppm, without supervision. Backers include BP and Phillips 66. In the robotics space, Lux sees Inuktun’s robotic pipeline crawler vehicles as promising. Inuktun’s robots perform visual inspections in confined and hazardous environments like inside pipes using amphibious pan, tilt, and zoom cameras, operating on a tether cable up to 7,000 ft long. The system has been used on BP’s Thunder Horse GoM platform. BP and Dow/DuPont are backers. On the AI/software front, Maana and Seeq got a plug from Lux. Sharma concluded that the downstream sector will need to act fast with impending fuel regulations and that automation and robotics will be part of the solution.

NETL’s SIMPA data-driven subsurface leak prediction

Kelly Rose described work performed at the US National Energy Technology Laboratory (NETL) investigating the likelihood of fluid and/or gas migration through the subsurface. SIMPA, spatially integrated multivariate probabilistic assessment, is a science- and data-driven, fuzzy logic approach that is said to improve prediction of subsurface leakage and flow from fractures, wellbores and other pathways. SIMPA computes the magnitude and extent of natural and anthropogenic subsurface pathways and can help with site selection, risk analysis and production optimization. Rose went on to present NETL’s work on subsurface trend analysis (STA) that seeks to constrain estimates of subsurface property values with a combination of geologic knowledge and spatio-temporal statistical methods. STA has analyzed data from 53,000 wells, some 500,000 US offshore datasets and 30 TB of data, to define offshore provinces with a common geological history. The work is to inform subsurface pressure prediction for oil and gas exploration and carbon dioxide sequestration. More from NETL. NETL is now working to incorporate more ML into the spatial analysis of subsurface properties and on a natural language processing (NLP) function which collects information from relevant publications. SIMPA V.1 is available for download from the NETL.
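To give a flavor of the fuzzy logic idea (this is a generic illustration, not SIMPA’s actual implementation), the sketch below turns several lines of evidence into fuzzy memberships in an ‘elevated leakage risk’ set and aggregates them into a single score. The evidence layers, thresholds and weights are all invented.

import numpy as np

def ramp(x, lo, hi):
    """Piecewise-linear fuzzy membership: 0 below lo, rising to 1 at hi."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

# hypothetical evidence for one grid cell
fault_density = 0.8   # normalized 0-1
well_age = 55.0       # years since completion
cement_index = 0.3    # 0 (poor) to 1 (good)

# fuzzy membership in the 'elevated leakage risk' set for each line of evidence
m_fault = ramp(fault_density, 0.2, 0.7)
m_age = ramp(well_age, 20.0, 80.0)
m_cement = ramp(1.0 - cement_index, 0.3, 0.8)

# weighted aggregation into a single pathway-likelihood score for the cell
score = np.average([m_fault, m_age, m_cement], weights=[0.4, 0.3, 0.3])
print(f"leakage pathway likelihood: {score:.2f}")

SIMPA itself integrates many more evidence layers spatially; the above merely illustrates the membership-and-aggregation idea behind a fuzzy logic assessment.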

Machina Automation on office automation trends

Nathan Yeager (Machina Automation) described trends in intelligent (office) automation, notably ‘robotic process automation’ that ‘emulates human interactions with a digital system in the execution of a business process’. Yeager warned that few companies are currently deploying RPA but that the RPA ecosystem is evolving rapidly. Companies to watch include UiPath, Enate and ABBYY. Basic RPA can be augmented with machine learning with tools such as Skymind.

BHGE’s digital drilling twin

Simon Mantle presented BHGE’s use of machine learning and AI to solve complex problems in both upstream and downstream oil and gas. BHGE is working on a digital twin for drilling optimization using deep learning. The idea is to leverage drilling log data to identify formations, combining historical data in the area with logging while drilling (LWD) data on gamma ray logs, neutron density and vibration. Working from training data, formations are identified using an unsupervised clustering algorithm. This data is used to determine the optimal drilling rate of penetration using a ‘global optimization algorithm’. A similar digital twin approach is used to model offshore platforms or refineries as ‘systems of systems’ in a network/graph approach. All is rolled up into BHGE’s AI Pipeline for model building. The Pipeline leverages TensorFlow, Kubeflow, Nvidia NGC containers and GCP Elastic Compute. More from BHGE.
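As a rough illustration of the unsupervised formation identification step (not BHGE’s code; the log values and cluster count are invented), a scikit-learn clustering of LWD-style responses might look like this:

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# synthetic stand-in for LWD samples: gamma ray, bulk density, vibration
rng = np.random.default_rng(1)
shale = np.column_stack([rng.normal(110, 10, 200), rng.normal(2.55, 0.03, 200), rng.normal(0.8, 0.2, 200)])
sand = np.column_stack([rng.normal(45, 8, 200), rng.normal(2.35, 0.03, 200), rng.normal(1.4, 0.3, 200)])
logs = np.vstack([shale, sand])

# unsupervised clustering of the log responses into pseudo-formations
X = StandardScaler().fit_transform(logs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# each depth sample now carries a formation cluster id; a downstream optimizer
# could then propose a rate of penetration per identified formation
print(np.bincount(labels))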

More from Energy Conference Network.


EAGE London 2019

Plenary: Delivering the world’s low carbon energy needs. Geoscientists in changing energy landscape. EAGE Special Interest Community: Decarbonization and the energy transition. Schlumberger Delfi Developer Portal. 2019 EAGE Member Meeting.

Plenary: Delivering the world’s low carbon energy needs

The main plenary session of the 2019 EU Association of Geoscientists and Engineers (EAGE) was provocatively titled ‘Delivering the world’s low carbon energy needs’. The chair, BP’s Sebastian Chrispin, set the scene, reminding us that to combat climate change, CO2 emissions need to halve by 2030, this in the face of ‘rapidly rising energy demands’. Today, oil and gas make up half of the energy mix. In 2030, oil and gas will still be needed ‘even in a Paris world’. WoodMac’s Neil Anderson added that the energy transition is a critical issue for us all and that it is happening now and moving quickly. The divisions between ‘them’ and ‘us’ and ‘good’ and ‘bad’ are unhelpful. Oil and gas has a critical role in driving the energy transition but needs to be more involved in the debate. Hydrocarbons have a role ‘out to 2040’. The question is how to make this role sustainable. Oils need to capture carbon and be a part of the transition. Luca Bertelli (ENI) added that natural gas has a role as a bridge fuel, displacing coal, which currently powers 33% of the world’s electricity. OMV’s Gary Ingram listed three key elements of the transition: carbon emissions, affordability and reliability of supply. OMV is engaged in the transition, ‘the future of the company is at stake’. Carbon capture and storage is necessary for the future of the oil and gas business. Luke Warren from the Carbon Capture and Storage Association was not going to disagree. CCS is expected to play a big role in the 21st century, although it is ‘frustrating in that there has been little progress so far’. Having said that, there are 23 large industrial projects either operating or under construction. CCS now has to follow the economic path of renewables, driving down costs. While this is a harder problem, companies, countries and regions are addressing the issue. The US provides tax credits and the UK an investment framework. But where are the big champions? Industry needs to get behind this. Emily Shuckburgh from the British Antarctic Survey started with a disclaimer, ‘no, I’m not part of the green movement, I’m a climate scientist’. Already the world has seen 1°C warming above pre-industrial levels, deadly heatwaves, floods and drought. At current rates of emission, warming will exceed 1.5°C between 2030 and 2050 and could reach 3°C by the end of the century. In its largest ever field expedition, BAS studied a remote part of the West Antarctic ice sheet which is showing signs of retreat. A total collapse of the ice sheet would produce a 3m sea level rise. We have only 10 years at current rates of consumption before we exhaust our carbon budget from fossil fuels. From 2040 on, net negative emissions will be needed.

Chrispin asked how the debate is playing out internationally. Anderson stated that in the EU ‘everyone wants to talk about this’ and there are concerns in regard to oils’ license to operate. The US is behind on the debate, especially with the Trump administration questioning global warming. Elsewhere, access to energy is the priority for the developing world.

In the Q&A the panel was asked, ‘if we have already discovered enough oil, should we stop exploring?’ Anderson observed that the energy transition means moving to low cost and low carbon resources. This means that Canadian oil sands are unlikely to be developed, as are Arctic fossil resources. But ‘yes, we still need to invest in exploration to plug a looming production gap, otherwise there will be a supply crunch’. Chrispin agreed that we should continue to explore, but maybe not develop everything. It will be the cheaper stuff that gets developed. Bertelli said ‘I’m not nervous’. If we stop exploring there will be a supply crunch. There is a lot of coal to be replaced. Oil and gas is to remain an important part of the energy mix. Exploration is currently only replacing 30% of production and demand is going up! Warren stated that there was huge uncertainty, but the transition will happen. Coal missed out on the CCS journey, lacking a proactive role, and has lost its future in many markets. Oil and gas needs to play an active role in CCS. Bertelli agreed, CCS is not broadly accepted, we need to convince people of its importance. Warren added that the IEA puts the CCUS market at a possible $100 bn/year.

A questioner raised the issue of the CCS energy balance, asking ‘Is it really useful and economic?’ Warren agreed that there will be an energy/cost penalty, but CCS and bioenergy are a good complement to nuclear. The IEA foresees a need for 5 Gtonnes/year of CCS by 2050. But if you break down the costs, these are comparable with the development of other industries.

Chairman Chrispin asked the panel what the impact of digital would be in the energy transition. Ingram ventured that digital would enable operations with fewer people and less aviation. Shuckburgh opined that artificial intelligence is being applied to environment/low carbon issues in Cambridge. AI is used in energy balancing and to improve predictions with data-driven forecasting.

On the roles of IOCs and NOCs in the energy transition, Anderson observed that some NOCs have plentiful low cost reserves and are ‘sitting this one out’. EU IOCs are in the vanguard of the energy transition. The Middle East NOCs need to be on board. BP is addressing its role in the energy transition, in part to recruit talent. Ingram asked ‘is the industry in terminal decline?’ Regarding new hires with geo-skills, you need to be honest and authentic. How will you handle these things that will come up in 10-20 years’ time? Otherwise people will doubt you and go into other industries. Again, CCS is seen as saving the day in a recruitment context. It will be a major topic where geoscience will make a difference.

Another question from the floor asked about shareholder value in a shrinking industry. Anderson suggested looking forward to what the energy transition means to us as individuals. Probably solar panels, a Tesla, a battery in the basement. What kinds of companies will prosper here? Chinese PV, utilities, battery providers, EV manufacturers. In other words, a stark picture for the oil and gas industry. On the other hand we need low cost resources to avoid a supply crunch. Unfortunately, much of the industry (read shale?) is predicated on high oil and gas prices ... ‘I doubt these will be the case’.

Finally, the possibility of a world carbon tax was raised. Anderson thinks yes, of course the world needs one! We need incentives for CCS etc. This raises challenges from, say, a manufacturer in India sans a C-Tax, and from the Trump administration. Shuckburgh thinks it would have been a good idea if it had been done before, ‘today we need more regulation than tax’.

Comment: The subject of a carbon tax was raised briefly at the opening ceremony by the IEA’s Tim Gould who stated that the EU carbon trading scheme was partly responsible for the demise of British Steel. While this narrative is convenient for anti-taxers, it is not quite what happened. British Steel had a carbon tax liability of €100 million, but this was because the EU withheld a carbon credit pending Brexit negotiations. The UK Government agreed to sub the EU fine, so this is unlikely to have played a direct role in British Steel’s demise.

Geoscientists in changing energy landscape

The opening plenary set the scene for this special session on geoscience hiring. Iain Manson (Korn Ferry) spoke of disruption from the changing landscape of the energy transition and a ‘deep trough’ in oil and gas that ‘will not peak as before’. This is down to environmental pressures and the fact that baby boomers are on the way out and millennials are now in the majority. 35% of important skills today will not be so, real soon now. Enter the Korn Ferry ‘self-disruptive leader’. CGG’s Sophie Zurquiyah offered a more conventional view of a ‘cyclical’ industry; CGG lost 50% of its people in the last downturn. In digital, there is competition from Google and Microsoft. Is it hard to recruit? No. CGG gets lots of CVs, although in some circles oil and gas is seen as an industry of the past. Ann Muggeridge (Imperial College London) thinks it is a great time to be an earth scientist as the world needs ‘more energy and minerals, more sustainably’. But demographics mean there are fewer 18-year-olds, and there has been an even larger fall in earth science entrants. Why? Because of a lack of awareness of geology as a subject and because parents tell their kids ‘you won’t get a job in this dodgy/dying industry’. There are concerns about the environment and climate change, and latterly plastics. Muggeridge sees openings in CCS, geothermal and in making industry carbon neutral.

Halliburton’s Naphtali Latter warned of the threat of further reductions in green energy costs and of lower demand due to the sharing economy. Regarding climate change, what’s the real sentiment in society? Do we really care? Are we overreacting to millennials/greens? The oil price drop is reducing margins and industry is looking to cut costs. ML/AI are not new, they have been used for ‘over a decade, there is no disruption, just an evolution driven by low margins’. In 10 years’ time there will be new oil and gas businesses with chief sustainability officers. Latter thinks that we are being a bit dramatic regarding the threat to geoscientists. Oil and gas will be 40% of the energy mix in 2040.

A questioner asked if geoscience jobs will be completely digitalized. BP’s James Hamilton-Wright believes that no matter how automated your workflows are, you will need people to make sense of data and noise. Automation gives you freedom to think about your work.

EAGE Special Interest Community – ‘Decarbonization and the energy transition’

Philip Ringrose (NTNU & Equinor) kicked off the first meeting of the EAGE Special Interest Community on ‘Decarbonization and the energy transition’. Ringrose ran through the many climate forecasting scenarios (Bloomberg, IPCC, IEA, New Green Deal, EU Clean Planet 2050, BP, Shell, Equinor and others). In particular, the IPCC’s 1.5° report is an ‘essential read’. Latterly the CO2 trend has eased off but is still increasing. It needs to flatten and then reduce by 50% by 2050, the IPCC says by 80%. How is this to be achieved? Investors want to invest in this, but in what? The IPCC offers several mitigation pathways. Of interest to the EAGE/geoscience community is fossil fuel emissions reduction along with CCS. Across the board, decarbonizing is hard, expensive and not happening, at least not in heavy industry and transport. CCS is moving slowly but has the potential to decarbonize parts that other approaches cannot reach. There are positive signs. In Norway, Sleipner has a single well sequestering 1 million tonnes/year, equivalent to 300k cars or a 500MW gas-fired power station. Overall, Norway’s emissions are around 50 million tonnes/annum. Sleipner works! It doesn’t leak and could easily scale. Last week Angela Merkel said ‘maybe Germany needs CCS’! Hydrogen from methane is also looking encouraging. The H21 Green project in the North of England is extracting hydrogen from methane. The remaining carbon turns to CO2 and is sequestered offshore. Hydrogen is a promising fuel for heavy industry and shipping.
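
As a rough order-of-magnitude check on these equivalences (our round numbers, not the speaker’s):

    300,000 cars x ~3.2 t CO2/car/year (200 g/km x 16,000 km) ≈ 1 million tonnes/year
    500 MW x 8,760 h x ~65% load factor x ~0.35 t CO2/MWh ≈ 1 million tonnes/year

Both figures are consistent with Sleipner’s 1 million tonnes/year injection rate.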

Karin de Borst reported on Shell’s ‘QUEST’ to put carbon back underground. CCS is the only technology that can decarbonize heavy industry (cement, steel …) but CCS needs to accelerate. Today there are 18 large-scale projects in operation and 5 in construction. Nearly 40 million tonnes was sequestered in 2017. The IEA 2°C scenario implies a 100x scale-up to gigatonne levels by 2050, ‘a significant and urgent increase in current investment levels’. The technology is there. Capture solvents have been in use since the 1930s and CO2 transport and storage since the 1970s in oil and gas. Shell is involved in Northern Lights, OGO Clean Gas, Quest, TCM, Pernis, Gorgon and Boundary Dam. Quest is commercial-scale CCS in Alberta, sequestering a million tonnes/year for 25 years, i.e. 1/3 of the CO2 from the upgrader. But to reach 3GT/year we need a few thousand Quests and a shift in mindset. As Professor Wallace Smith Broecker of Columbia University, (allegedly) the coiner of the phrase ‘global warming’, has put it, ‘humanity has dealt with garbage, sewage and now CO2, we must learn how to capture and bury it’.

Iain Stewart (Plymouth University UK) wondered aloud how geoscience should be framed for the public and for policy makers. In regard to the world’s sustainable development goals, geology can be ‘squeezed-in’, but it is not central except to CCS. Unfortunately, there is little sense of urgency regarding CCS. ‘If you do a geoscience degree today you don’t see sustainability (or CCS)’. The same can be said of UK PLC’s priority research challenges. Geology is getting marginalized and there is a new cohort of students who ‘don’t want to work in oil and gas (or in mining)’, hence the need for a rebrand around sustainability. Sign up to the EAGE sustainability LinkedIn group or visit the EAGE decarbonization minisite.

Schlumberger Delfi Developer Portal

In a Schlumberger booth presentation, Ahmed Adnan Aqrawi unveiled the Delfi developer portal (DDP). Aqrawi stressed the importance of ‘open software’, claiming openness in Schlumberger’s Ocean (2005), Ocean Store (2010) and now Delfi. ‘Open’ allows users to ‘take things out and bring things in’. Today ‘1500 developers are using our openness’. Delfi is Schlumberger’s data ecosystem running under shared services in support of apps such as Petrel, DrillPlan, Olga and general-purpose analytics/ML. The DDP leverages the Google Apigee gateway technology in an E&P developer sandbox. SmartDocs and Drupal also ran. Delfi services provide coordinate conversion and access to Petrel objects from Jupyter notebooks and third-party map software. More from Developer.delfi.slb.com and the SIS Forum (which is not ‘open’) to be held in Monaco 17-19 September 2019. Comment: It is tempting to see the DDP as a counter to the embryonic OSDU!

2019 EAGE Member Meeting

In 2018, the then president elect Juan Soldo quit to return to his daytime job. This year, Peter Lloyd, president elect for 2020-2021, also resigned to enjoy his retirement. Jean-Jacques Biteau’s extended presidency will end next year as secretary treasurer Everhard Muijzert was unanimously voted in as the new president elect. Biteau (ex Total) observed that artificial intelligence has been applied to reservoir characterization for some 40 years! But now it’s firmly on the EAGE radar in the form of an AI special interest community. The EAGE is also jumping onto the digital bandwagon with an inaugural EAGE Digitalization Conference and Exhibition scheduled for 7-10 April 2020 in Vienna. EAGE finances appear somewhat challenged. The EAGE holding company loses money from its conferences and EAGE continues to spend its cash reserves. But Biteau assured members that EAGE is on the road to full recovery while maintaining service levels.

Next year’s EAGE will be in Amsterdam from 8-11 June 2020.


Folks, facts, orgs …

AADE, Bruel & Kjaer, BCCK, Eliis, Energistics, Morgan & Eklund, Okeanus, TechnipFMC, DNV GL, ATI, C-Innovation, Bracewell, Schlumberger, Calfrac, Helmerich & Payne, Environmental Partnership, ONE Future, Cheniere, RocketFrac, Aker Solutions, Data Gumbo, Pioneer, Hexagon, Produced Water Society, Newpark Resources, OspreyData, Canada Oil Sands Innovation Alliance, ENGlobal, Parker Drilling, IHRDC, Purple Land Management, BHGE, Liberty, Total Safety, Helix, Aquilon, Landdox, Argus, Atwell, GRC, Katalyst.

The American Association of Drilling Engineers’ Lafayette Chapter has elected Jude Boudreaux (Offshore Energy Services) as president.

Bruel & Kjaer Vibro has appointed Marcel van Helten as president. He hails from Current (a GE company).

John Peterson has been promoted to SVP of business development at BCCK.

Eliis is now a member of the Energistics data transfer standards consortium.

Robert Collaro is director of the hydrographic and land survey business unit at Morgan & Eklund.

Phil Clements is Okeanus’ lead engineer.

Olivier Piou and John Yearwood are now members of the TechnipFMC board. Catherine MacGregor is CEO of the TechnipFMC spin-out ‘SpinCo’. Doug Pferdehirt heads-up the ‘RemainCo’ organization.

Kenneth Vareide is now CEO of DNV GL’s digital solutions business area.

David Hess and Marianne Kah are now members of the ATI board.

John Michael Boquet is now HSE manager at C-Innovation.

Bracewell has hired Nina Howell as a partner in its London energy team. Howell was previously with King & Spalding International.

Olivier Le Peuch is now CEO at Schlumberger, succeeding retiree Paal Kibsgaard. Mark Papa is non-executive chairman of the board. Jamie Cruise has joined Schlumberger to head up the data management products department. Cruise hails from Meera/Target.

Fernando Aguilar is to retire as Calfrac’s president and CEO. Ronald Mathison is now executive chairman. Greg Fletcher has been appointed independent lead director. Lindsay Link has been promoted to president and COO.

Tina Wininger (Next Wave Energy Partners) has been appointed to ION Geophysical’s board.

Mary VanDeWeghe (Forte Consulting) is now a member of Helmerich & Payne’s board.

Warwick has joined the Environmental Partnership.

Williams has joined the ONE Future natural gas coalition.

Michele Evans (Lockheed Martin Aeronautics) is now on the Cheniere board.

RocketFrac has appointed Edward Loven as president and director and Pavan Elapavuluri as CTO.

Ole Martin Grimsrud is CFO at Aker Solutions replacing Svein Oskar Stoknes, who is taking over as CFO of Aker ASA.

Sergio Tuberquia is Data Gumbo’s new chief commercial officer.

Tamara Morytko (Norsk Titanium) is now independent director on Pioneer’s board.

Økokrim, the Norwegian economic crime authority, has decided not to appeal the acquittal verdict in the case against Hexagon president and CEO Ola Rollén.

Lisa Henthorne, SVP and CTO at Water Standard, has been elected president of the Produced Water Society.

David Paterson has been named corporate VP and president of the fluids systems business at Newpark Resources. He hails from Weir Oil and Gas.

OspreyData has hired Richard Wuest as VP sales and Alex Lamb as data scientist. Wuest was previously with Oracle.

Wes Jickling is now CEO at COSIA, the Canada Oil Sands Innovation Alliance. He was previously deputy minister of intergovernmental affairs at the government of Saskatchewan.

Michael Clark, former Honeywell UOP executive, is now ENGlobal’s VP business development.

Gary Rich is to retire as president, CEO and director of Parker Drilling. He remains in his current roles until the board finds a permanent successor.

Christopher Davin is director of instructional programs at IHRDC, the International Human Resources Development Corporation.

Mark Morrison has joined Purple Land Management as VP technology and innovation. He was most recently SVP at RMinds.

Jud Bailey is BHGE’s VP investor relations. He hails from Wells Fargo.

Gale Norton, former US Secretary of the Interior, is now a member of the Liberty board and chairs the nominating and governance committee.

Paul Tyree has been promoted to CCO at Total Safety.

Amy Nelson (Greenridge Advisors) is now Class III Director at Helix.

Aquilon Energy Services has appointed Randy Wilson as CEO and board member. He was previously with Deloitte.

Brandon Sage is Landdox head of business development. He hails from Oseberg.

Argus has appointed Christopher Johnson as Global SVP for Editorial, leading Argus’ global editorial team.

Jim Bieda has joined Atwell as Project Engineer.

GRC, a wholly-owned Sercel unit, has opened a new state-of-the-art facility in Tulsa, Oklahoma.

Katalyst Data Management has opened a new digital transformation center in Houston.



Done deals

American Energy Partners acquires Hickman. AspenTech acquired Mnubo. Competition Bureau OK’s Thoma Bravo/Aucerna deal. Drilling Tools bags WellFence. Emerson buys Zedi businesses. MicroSeismic spins-out FracRx. Endeavor Business Media acquires PennWell unit PNEC. Halliburton buys Promore. Sure Shot Holdings acquisitions. TechnipFMC to ‘demerge’. Terra Drone buys RoNik. TGS acquired Spectrum. Weir Flow Control sold to First Reserve.

American Energy Partners has acquired Hickman Geological Consulting in exchange for 40,500,000 AEP preferred shares (approx. 5% of outstanding common stock). The deal is said to be ‘one of a series of acquisitions planned for the future’ and the beginning of AEP’s ‘buy and build’ strategy.

Aspen Technology has acquired Mnubo, a Montreal-based provider of artificial intelligence and analytics infrastructure for the internet of things. AspenTech also recently acquired Sabisu, a UK-based specialist in visualization and workflow solutions for real-time decision support.

The Canadian Competition Bureau has come to an agreement with Thoma Bravo in respect of ‘competition concerns’ over its acquisition of Aucerna (previously 3esi-Enersight). The Bureau found that the transaction resulted in a monopoly in the supply of oil and gas reserves valuation and reporting software to medium and large producers in Canada. The agreement means that Thoma Bravo is to sell Quorum’s Mosaic business to an ‘acceptable’ purchaser that will preserve competition. San Francisco-headquartered Thoma Bravo is a private equity investment firm. Its Calgary-based Aucerna markets the ValNav reserves software. In a separate deal, Quorum Software (another Thoma Bravo unit) has acquired Archeio Technologies, a provider of oil and gas document classification and ‘smart search’ technology. The acquisition expands Quorum’s software and services with a cloud-based document management tool.

Drilling Tools International has acquired WellFence, a provider of patented data automation systems and credentialed wellsite access services for drilling and production locations. WellFence deploys check point towers with 360° cameras to monitor personnel entering and exiting a wellsite. DTI is majority-owned by Hicks Equity Partners, a Dallas-based family office investment group.

Emerson has acquired Calgary-based Zedi’s software and automation businesses. Zedi’s cloud-based scada platform augments Emerson’s portfolio of automation technologies. The technology currently monitors over two million sensors and ‘thousands of devices and applications’.

MicroSeismic has spun out ‘FracRx’, a data analytics company that serves unconventional upstream operators. FracRx is said to be the ‘formalization’ of MicroSeismic’s data analytics and synthesis platform that integrates multi-physics data in a ‘prescriptive solution’ for improved economics, completion optimization and risk mitigation. MSI continues to operate as a geophysical company that acquires and images microseismic data.

Endeavor Business Media has acquired multiple PennWell operating units from the previous owner Clarion Events, including the PNEC E&P data management conference.

Halliburton has acquired the Promore downhole pressure gauge business. The permanent well monitoring systems will be integrated into Halliburton’s Intelligent Completions group.

Sure Shot Holdings (the parent company of Sure Shot Drilling) has acquired four Colorado-based utility service companies: DrillTech Directional Services, Pinnacle Development, BAS Rentals, and A to Z Cable Construction.

TechnipFMC is to ‘demerge*’ into two units: ‘RemainCo’, a technology and services provider incorporated in the UK with headquarters in Houston, and ‘SpinCo’, a spin-off of TechnipFMC’s onshore/offshore segment that will be headquartered in Paris, France.

* Paris-based Technip merged with Houston-based FMC in 2017.

Terra Drone Corp. has acquired RoNik Inspectioneering, a Dutch provider of visual and ultrasonic inspections using wireless robotics. RoNik Inspectioneering will be renamed Terra Inspectioneering. The company will focus on the inspection of industrial, hazardous, and enclosed spaces, such as storage tanks, boilers, super heaters, furnaces, stacks, pipelines and more.

TGS-Nopec has acquired Spectrum ASA in a cash and paper deal that puts a $422 million value on Spectrum.

The Weir Group has sold its Weir Flow Control unit to First Reserve, a private equity investment firm ‘exclusively focused on energy’. The unit will henceforth operate as Trillium Flow Technologies.


IBM Develops ‘world’s most powerful commercial supercomputer’ for Total

Pangea III, with 25 petaflops, 50 petabytes of storage and a hybrid CPU/GPU architecture, comes in at overall number 11 on the Top500 list.

IBM has built Pangea III, ‘the world’s most powerful commercial supercomputer’, for Total. The IBM Power9-based supercomputer will be used for seismic imaging, reservoir modeling and ‘asset valuation and selectivity’. Pangea III has a compute capacity of 25 petaflops with 50 petabytes of storage and comes in at N° 11 in the latest Top500 ranking of supercomputers overall, and at N° 1 amongst ‘commercial’ machines*. Pangea III is relatively economical power-wise, requiring a mere 1.5 MW, compared to 4.5 MW for its predecessor, i.e. around 10% of the energy consumption per petaflop. The previous machine, Pangea II, was a 6.7 petaflop Silicon Graphics machine installed in 2016.
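
The 10% figure is borne out by a quick back-of-the-envelope calculation on the numbers above:

    Pangea III: 1.5 MW / 25 petaflops  = 0.06 MW per petaflop
    Pangea II:  4.5 MW / 6.7 petaflops ≈ 0.67 MW per petaflop
    0.06 / 0.67 ≈ 9%, i.e. roughly a tenth of the energy per petaflop.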

To satisfy Total’s requirements for GPU-accelerated computing, IBM worked with Nvidia to jointly develop the ‘industry’s only’ CPU-to-GPU Nvidia NVLink connection, linking the Power9 with Nvidia Tesla V100 Tensor Core GPUs. Pangea’s operating system is Red Hat Linux 7.6. The main compiler is IBM’s ‘XL’ 16.1 C/C++ with the IBM ESSL 6.2 engineering and scientific subroutine library. More from IBM.

* As we report elsewhere in this issue, ExxonMobil claims a 50 petaflop peak capacity for its Houston-based HPC, which would put it at N°10 on the Top500 list if it had entered the challenge.


MQTT, OPC-UA, HTTP, REST ... qu'est-ce que c'est?

Kepware specialist helps Oil IT Journal get to grips with the novel ‘internet of things’ technologies that are gaining traction in oil and gas production and process control.

In our write-up of the ABC Wellsite Automation conference earlier this year we reported on emerging internet of things-style communications protocols, with what appeared to be a battlefront between OPC-UA and MQTT. In an effort to get a grip on the technologies we quizzed the experts at Kepware, asking the following:

Oil ITJ - We see that interest continues in MQTT. But also, some talk of OPC-UA. These two protocols have both been touted as being vehicles for the other – i.e. MQTT over OPC-UA and OPC-UA over MQTT? Does either of these ways of using both protocols make any sense – or are they even possible?

Sam Elsner (Kepware) - MQTT is a data transfer ‘framework’, i.e. a combination of a data transport protocol (TCP) and a level of standardized application-layer behavior sets. In MQTT these include usernames, passwords and client IDs, along with sets that specify what MQTT publisher/consumer/server roles should do for message confirmation beyond what TCP provides. MQTT doesn’t define what is transferred, it doesn’t define the data payload, just a way to transfer it. You could transfer pictures, music, video, read commands, write commands, text, binary encoded values – it’s all just bytes to standard MQTT.

There are a few emerging MQTT-based data models: JSON for general-purpose, text-based messages; generic, binary protocol buffers for specific use cases (Google Protobuffers, for example); OPC UA data models for OPC UA Pub/Sub (more on that below); and SparkplugB, a new, formal specification built on binary protocol buffers that includes a number of application-layer behavior sets above and beyond what’s found in standard MQTT. It was created by the same people who invented MQTT. You would probably want to consider SparkplugB a new, MQTT-based protocol rather than just a data model for MQTT.
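
To make this concrete, here is a minimal publisher-side sketch using the Eclipse Paho Python client. MQTT itself only moves the bytes; the choice of a JSON payload, and the broker address, topic and credentials below, are illustrative assumptions on our part, not part of the standard or of any Kepware product.

    import json
    import paho.mqtt.client as mqtt

    # Connect to a broker (host, port and credentials are placeholders).
    # Note: this uses the paho-mqtt 1.x style constructor.
    client = mqtt.Client(client_id="wellsite-42")
    client.username_pw_set("user", "password")   # MQTT-level credentials
    client.connect("broker.example.com", 1883)

    # MQTT says nothing about the payload format; here we happen to choose JSON.
    payload = json.dumps({"tag": "tubing_pressure", "value": 1250.0, "units": "psi"})
    client.publish("field/pad7/well42/pressure", payload, qos=1)
    client.disconnect()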

In contrast, OPC UA (and most OPC specifications) defines both the data transfer framework and the data payloads, so comparing MQTT and OPC UA is a bit like comparing a basket (MQTT) with a basket of apples and oranges (OPC UA). OPC UA defines that apples and oranges exist in a basket, whereas MQTT just defines that a basket exists.

OPC UA Pub/Sub is a mapping of OPC UA data models onto MQTT (and AMQP, a similar protocol). In other words, OPC UA Pub/Sub actually uses MQTT (or AMQP). More from the OPC Foundation.

Oil ITJ – That’s clear. Another question: where do HTTP and REST fit into this picture (if at all)? We are curious to know how a protocol like Energistics’ Witsml (which is REST-based) fits into the picture. This, afaik, defines both the low-level protocol and the payload.

Elsner - REST is a client/server protocol design paradigm for Web Services that is stateless, where ‘stateless’ means that each request contains the complete amount of information needed for the server to process a response: authentication information and request details and any other information that’s required in order for the server to properly identify the requester and service the request command. The RESTful client requires no higher level (presentation, application layer) setup messages to begin sending requests to the RESTful server, and there’s no expectation by the server of the client issuing subsequent requests. The data transport protocol can be either TCP or UDP. With TCP, the higher layers (presentation, application layers) can define that the TCP socket be maintained but these higher layers do not ‘maintain’ a connection with the server themselves. There’s a ton of great info on Wikipedia.

This contrasts with a stateful interface where multiple messages are used to construct an active ‘session’ or conversation between client and server before the client can request certain actions to be taken by the server. The server is also given the indication that the client will be using this active conversation for a period of time, so the server should expect additional requests. The paradigm doesn’t make any statements about the actual protocols that follow this paradigm, meaning that it’s possible for a RESTful protocol to define only the ‘data transfer framework’ as I define MQTT above, or the whole transfer framework and the data payload.

HTTP is an example of a RESTful protocol where, native to HTTP’s design, each request contains the total information needed to process its response. There are others, but HTTP is by far the most commonly utilized, so the names REST and HTTP are often used interchangeably. HTTP, like MQTT, does not define a data payload. From Wikipedia I learned that the HTTP 1.1 protocol and the RESTful protocol design paradigm were developed at the same time by the same person, which is probably why the terms are so interchangeable today. Witsml is an HTTP-based and therefore RESTful protocol that also defines a data payload.
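
By way of illustration, the following sketch shows what ‘stateless’ means in practice: each HTTP request carries its own credentials and full query context, and the server retains nothing between calls. The URL, query parameters and credentials are invented for the example and do not correspond to any actual Witsml server.

    import requests

    # Stateless request: authentication and all request details travel with every call.
    resp = requests.get(
        "https://example.com/witsml/store/wells/1234/logs",   # placeholder endpoint
        params={"depthStart": 1000, "depthEnd": 2000},        # placeholder query
        auth=("user", "password"),                            # credentials sent each time
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.headers.get("Content-Type"), len(resp.content), "bytes")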


HPC in O&G at Rice

Imperial College on Devito Python framework for HPC. Chevron trials seismic imaging in the cloud. Halliburton’s ‘generic and holistic’ distributed HPC. Nvidia on Rapids.ai, ‘open’ data science for GPUs. ExxonMobil moves multi-petabyte dataset across Houston. Shell trials latest AMD chips.

Gerard Gorman (Imperial College London) presented ‘Devito’, a ‘high-productivity, high-performance’ Python framework for finite-differences that is used by DownUnder Geosolutions and Shell. Devito is an abstraction layer for processing kernels that avoids ‘impenetrable code with crazy performance optimizations’. Devito generates optimized parallel C code and provides support for multiple architectures including Xeon and Xeon Phi, ARM64 and GPUs (real soon now). Devito was used by Georgia Tech’s Seismic Library for Imaging and Modeling (SLIM) to develop its JUDI seismic migration code base. Shell, DUG and Nvidia are to kick off a consortium to support open source development. More from Devito.
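
For a flavor of what the ‘high-productivity’ claim means in practice, here is a minimal sketch loosely modeled on the published Devito tutorials. The grid size, velocity and time stepping are arbitrary illustrative values, and this is not DUG’s or Shell’s code; Devito generates and compiles the optimized C stencil behind the scenes.

    from devito import Grid, TimeFunction, Eq, Operator, solve

    # 2D grid and a time-dependent field with symbolic derivatives
    grid = Grid(shape=(201, 201), extent=(2000., 2000.))
    u = TimeFunction(name='u', grid=grid, time_order=2, space_order=4)

    # Constant-velocity wave equation, written symbolically
    c = 1500.0
    pde = u.dt2 - c**2 * u.laplace
    stencil = Eq(u.forward, solve(pde, u.forward))

    # Devito turns the symbolic update into generated, optimized parallel C
    op = Operator([stencil])
    op.apply(time_M=500, dt=0.001)  # run 500 time steps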

Ling Zhuo and Tom McDonald presented Chevron’s trials of running seismic imaging applications in the cloud. These currently run on Chevron’s on-premises HPC clusters. The software runs in a master/worker mode that initiates multiple seismic migrations during a workflow, an ‘ideal’, embarrassingly parallel candidate for the cloud. The test compared a 900-node public cloud with a 128 node/24 core on-premises machine. Performance speedup was an underwhelming 2x due to the performance bottlenecks of sending large data sets to multiple nodes simultaneously and a single-threaded task scheduler. However, ‘HPC in the cloud is feasible if you know your application well’ and can manage communication patterns, I/O and computing requirements. Performance speedup is limited without changes in application design. Chevron is now revisiting its application architecture and design assumptions and performing a cost analysis to find the most effective solution.

Halliburton’s Lu Wang and Kristie Chang presented a ‘generic and holistic’ high performance distributed computing and storage system for large oil and gas datasets. The cloud provider-agnostic solution leverages a generic processing microservice running on a stable distributed cluster computing framework. The stack includes Apache Spark, Hadoop, Cassandra, MongoDB and Kafka to name but a few components. Users interact via Tensorflow, Scikit-learn and Keras. Seismic processing runs on RabbitMQ microservices. All of the above can be deployed either on premise or in the cloud.

Ty McKercher presented Nvidia’s ‘open’ data science offering, a.k.a. Rapids, which exposes Cuda-based tools for analytics (cuDF), machine learning (cuML) and graphs (cuGraph). Alongside these, PyTorch, Chainer deep learning and Kepler.gl visualization, Apache Arrow in GPU memory and the Dask scheduler also ran.
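
A minimal sketch of the Rapids idiom follows; the CSV file and column names are made up for illustration and stand in for real production data.

    import cudf
    from cuml.cluster import KMeans

    # Load tabular data straight into GPU memory (file/columns are hypothetical)
    df = cudf.read_csv("well_production.csv")
    per_well = df.groupby("well_id").agg({"oil_rate": "mean", "water_cut": "mean"})

    # Cluster wells on the GPU with cuML's KMeans
    km = KMeans(n_clusters=5)
    labels = km.fit_predict(per_well)
    print(labels[:5])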

Ken Sheldon (Schlumberger) enumerated some of the challenges facing reservoir modeling for unconventionals. User workstations have limited capacity and simulation times are ‘hours to days’. Also, engineers are not software developers or system administrators. Fracture simulations are best performed in a quasi-batch mode aka ‘simulation as a service’, offering parametric sweeps for sensitivity analysis. Schlumberger provides access to remote HPC services, tightly integrated with the fracture design workflow in a hybrid cloud. Intriguingly, Sheldon reports that ‘different hardware can yield different results’ (and this is not unique to cloud solutions) and QA/QC can be challenging. And ‘Peter Deutsch’s Fallacies of Distributed Computing still apply’.

Mike Townsley explained how ExxonMobil moved its multi-petabyte dataset across Houston and into its new Spring Campus data center. XOM’s total HPC capacity is currently in the 50 petaflop range, which would put it around system #10 in the TOP500. The new machine is a Cray Discovery 3. The move involved the creation of an 800Gb/s network between data centers (24 LNET routers) and an in-house developed high-performance Lustre-aware copy tool. The transfer worked at 1-2PB/day, slowed by metadata and small files.

Ron Cogswell presented early results from Shell’s trials of the latest AMD HPC chips warning that ‘all projects have minor hiccups at the start and this one was not different in that regard’. Background is the observation that the addition of cores to the Intel platform over the years has moved many algorithms from being compute-bound to being memory-bound. The AMD platform promises a greater memory bandwidth. The tests performed on the first generation ‘Naples’ architecture showed a reduction in flop performance for seismic imaging, mitigated by the higher core count and more memory. But ‘current pricing allows us to get more cores on an AMD node to make up for it’. ‘For small jobs that could live in cache, Intel is the way to go, but for our seismic code we need the higher memory bandwidth’.

More from the Rice Oil & Gas HPC home page.


Sales, partnerships, deployments …

Ikon Science, Fairfield. Kerlink, Tata. AspenTech, Hexagon. Bardasz, RigNet. Ping Identity, Bentley Systems. ION, iSeismic. Carbo Ceramics, FracGeo. Jacobs, Coreworx. Dover Fueling Solutions, Gazprom Neft. SAP, Esri. Groupon, GasBuddy. SKK Migas, Halliburton. OGsys, Landdox. McDermott, YPF. Cimarex, OAG. CGG, Microsoft. Energy Engine, Destwin, Qualpay. Recon Technology, CNPC. Chevron, OneSubsea. TechnipFMC, Novatek. DriverDash, Shell. Wood Mackenzie, P2 Energy Solutions. Geo R&D, Rock Flow Dynamics. Toho Gas, Fracta.

Ikon Science and Fairfield Geotechnologies have jointly opened a new shared office in Denver to reinforce their commitment to offer seismic data and rock physics driven geoprediction to the region.

Kerlink and Tata Communications Transformation Services are teaming up to promote and deploy LoRaWAN IoT networks globally.

AspenTech and Hexagon have signed a MoU to help accelerate their joint customers’ digital transformation journeys by providing a data-centric workflow supporting the flow of digital project information from conceptual design and FEED to detailed design and, ultimately, into the systems that operate and maintain the plant.

Bardasz has partnered with RigNet to become a regional provider of RigNet’s Intelie Solutions, a complete suite of machine learning, AI and real-time solutions, to Mexico’s oil and gas market.

Ping Identity’s intelligent identity platform has been chosen by Bentley Systems. Bentley will deploy PingFederate, PingAccess, PingDataGovernance and PingDirectory across its global business.

ION Geophysical and iSeismic have signed a MoU to collaborate on seabed acquisition technologies. iSeismic is to deploy the full suite of ION’s 4Sea ocean bottom acquisition and imaging technology to enhance the safety, efficiency, quality and turnaround time of seabed surveys.

Carbo Ceramics has signed a long-term, strategic agreement with FracGeo, integrating FracGeo’s products with Carbo’s FracPro software.

Jacobs Engineering is to use Coreworx interface management system at a $400 million US refinery rebuild project.

Dover Fueling Solutions and Gazprom Neft have partnered to deliver DFS’ Wayne Helix dispensers in Russia, Belarus, Kazakhstan and Kyrgyzstan.

SAP and Esri have partnered to offer fully integrated spatial analytics and advanced visualizations in one multi-model database management system. The SAP HANA service is now a supported enterprise geodatabase with Esri’s ArcGIS Enterprise and ArcGIS Pro.

Groupon and GasBuddy have partnered on card-linked offers distribution. Consumers will be able to load offers directly to an eligible payment card after enrolment in the GasBuddy app.

Indonesia’s SKK Migas, the special task force for upstream oil and gas business activities, has selected Halliburton/Landmark’s DecisionSpace suite.

OGsys has joined the Landdox partnership ecosystem. Integration with OGsys’ oil and gas accounting software enables real-time synchronization of revenue decks, JIB decks and owner information between customers’ Landdox and OGsys accounts.

McDermott has been awarded a pre-FEED contract by YPF for the Vaca Muerta LNG project in Argentina.

Cimarex has implemented OAG’s AI platform to improve drilling unit economics, completion design, well spacing and timing.

CGG has partnered with Microsoft to deliver its geoscience products, data and services on Microsoft Azure and accelerate exploration and development workflows for clients.

Energy Engine and Destwin have integrated Qualpay’s software into their platforms to eliminate downgrade charges incurred with traditional payment processors.

Recon Technology, Zhejiang CNPC and Alipay have signed a tripartite agreement to collaborate on a smart gas station project, combining the gas station resources of Zhejiang CNPC, Recon’s software, hardware and e-voucher capabilities, and the digital technology and marketing capabilities of Ant Financial Services Group (operator of Alipay).

Chevron has awarded Schlumberger’s OneSubsea unit a 20-year subsea equipment and services master contract for the Gulf of Mexico. A OneSubsea custom equipment catalog includes HPHT equipment that can withstand up to 20,000 psi.

TechnipFMC has been awarded a major EPC contract by Novatek and its partners for the Arctic LNG 2 project located in the Gydan peninsula in West Siberia, Russia.

Wex’ DriverDash is now available to fleet drivers at Shell-branded locations. DriverDash enables drivers to authorize a fuel transaction from inside their vehicles.

Wood Mackenzie and P2 Energy Solutions are combining forces to deliver data-driven insight to the US energy sector. This will be achieved by leveraging WoodMac’s well data and P2’s lease and production unit content, accessible on Wood Mackenzie Lens, an entirely new analytics platform offering real-time data, intuitive design and powerful visualization.

Geo Research & Development (Uzbekistan) has acquired licenses of Rock Flow Dynamics’ tNavigator.

Toho Gas has awarded Silicon Valley-based Fracta a contract for a proof of concept trial of Fracta’s proprietary artificial intelligence algorithm to assess the condition of its gas pipelines. The trial will use some 1,000 environmental variables and inspection data acquired along Toho’s 30,000 km network across Japan.


Standards stuff …

API, IOGP sign MoU. ExxonMobil accelerates Open Process Automation standard. FERC adopts XBRL. PIDX releases Supply Notification standard. Data Gumbo joins PIDX.

The American Petroleum Institute (API) and the International Association of Oil and Gas Producers (IOGP) have signed a memorandum of understanding to ‘establish a stronger partnership, share world-class practices and collectively engage on the industry’s contributions to society’.

ExxonMobil has signed collaboration agreements with six other companies to accelerate the development of its Open Process Automation (OPA) standard. Companies in the initiative are now Aramco Services Co., BASF, ConocoPhillips, Dow Chemical, Georgia-Pacific and Linde. The partners are members of the Open Process Automation Forum (OPAF), a group established by The Open Group. The goal is a standards-based, open, interoperable and secure automation architecture. An OPA test bed is targeted to be operational by year-end 2019.

The Federal Energy Regulatory Commission (FERC) is to adopt the XBRL standard for utilities reporting, transitioning from a legacy Visual FoxPro system. FERC is to build an XBRL taxonomy to accommodate its reporting forms and will convene staff-led technical conferences to discuss the draft.

PIDX, the Petroleum Industry Data Exchange, has released its Supply Notification standard, a royalty-free public standard that enables suppliers, trading partners and terminal operators to notify when an event has occurred that could prevent loading at a given terminal. Supply events are important to all parties in the supply chain and proactive notification ‘improves the efficiencies for dispatching trucks to terminals where product is available’. In a separate announcement, PIDX reports that ‘blockchain-as-a-service’ provider Data Gumbo has joined the organization.


Safety first …

CSB tells API ‘create drilling alarm management guidance’ and ‘promote equipment data sharing’. Det-Tronics - gas safety standards/software verification. E2S sounds horns, flashes lights. Hexagon applies AI to driver safety. Schneider Electric announces Safety Advisor.

The US Chemical Safety Board (CSB) has released its report on the 2018 blowout and fire at the Pryor Trust gas well in Pittsburg County, OK, that killed five workers. The CSB calls for safety changes from regulators, trade associations and companies. The report also recommends that API create guidance on alarm management for the drilling industry to ensure that systems are effective in alerting drilling crews to unsafe conditions. Following its investigation of an explosion and fire at the Enterprise Products Pascagoula Gas Plant in Pascagoula, MS, the CSB has pitched into the data sharing debate with a recommendation to the API that it should promote information sharing relating to failure hazards of heat exchangers from thermal fatigue.

A white paper from Det-Tronics reports on new gas safety standards to be introduced in 2020, asking ‘Will your plant be in compliance?’ The new International Electrotechnical Commission standards, notably IEC 60079-29-1:2016, ‘Explosive atmospheres — Part 29-1: Gas detectors — Performance requirements of detectors for flammable gases’, may mean that existing equipment and practices for gas detection no longer suffice. The new standard includes a test clause for software function verification to improve the reliability of software functions. Manufacturers will need to provide evidence of full compliance. The white paper concludes that ‘In their efforts to comply with these new standards, facility owners and operators may discover that the most effective approach is to abandon traditional detection technology in favor of state-of-the-art fire and gas safety systems’, with an invitation to check out Det-Tronics’ own fault tolerant safety controller, Eagle Quantum Premier.

UK-based E2S Warning Signals has announced the STEx family of explosion/flameproof audible, visual and combined warning devices and manual call points. The devices are designed for harsh onshore and offshore environments and are IECEx and ATEX approved for use in hazardous environments. The STExS alarm horn sounders provide 64 alarm tone frequency patterns and outputs up to 123dB(A). The STExB beacon light source is available as an ultra-bright 21 Joule Xenon strobe, a halogen rotating beacon or an array of high output LEDs.

Hexagon has announced the HxGN MineProtect operator alertness system for light vehicles (OAS-LV), a fatigue and distraction detection unit that continuously monitors operator alertness inside the cab of light vehicles, buses and semi-trucks. OAS-LV protects light-vehicle operators from falling asleep at the wheel, crashing or other fatigue- or distraction-related incidents. The in-cab device scans a driver’s face to detect any sign of fatigue or distraction, such as a microsleep. A machine-learning algorithm leverages this facial-feature analysis data to determine whether an alert should be triggered. Data can be transmitted to the cloud for real-time notifications, allowing supervisors and controllers to apply an intervention protocol and perform further forensic analytics.

Schneider Electric has announced EcoStruxure Process Safety Advisor, an IIoT-based digital process safety platform and service that enables customers to visualize and analyze real-time hazardous events and risks to their enterprise-wide assets, operations and business performance. The solution leverages Schneider EcoStruxure SIF Manager for tracking and validating safety instrumented function performance. Safety Advisor is said to be ‘platform and vendor agnostic’. A secure, cloud-based infrastructure aggregates real-time data, analytics and insights from multiple sites and geographies into a single user interface so customers can create an accurate, enterprise-wide risk profile in real time.


Grant Thornton gives Data Gumbo blockchain a clean bill of health

Financial ‘System and Organization Controls’ (SOC) methodology extended to cover blockchain. Study finds ‘transparency of blockchain transactions negated by their complexity and nascence of the underlying technology’.

Houston-based blockchain boutique Data Gumbo has announced completion of a ‘SOC 1 Type 1 audit*’ of its blockchain-as-a-service platform for industrial customers. The SOC 1 audit focuses on ‘controls likely to be relevant to an audit of a customer’s financial statements’. Data Gumbo retained Grant Thornton, an independent audit, tax and advisory firm, to perform its SOC 1 work. Grant Thornton’s testing of Data Gumbo’s blockchain included examination of Data Gumbo’s platform for smart contract set-up, maintenance and execution. Data Gumbo received an unqualified opinion that its procedures and infrastructure meet or exceed the SOC 1 Type 1 criteria.

SOC validation for blockchain is a rather novel field. A recent publication from Grant Thornton titled, ‘Managing virtual currency and blockchain risk with SOC attestation’ argues that traditional SOC audit controls for company finances can be extended to cover blockchain risk. SOC validates that an organization is operating as efficiently as possible, minimizing risks and offering a best quality service. SOC level one covers an organization’s financial controls. A Type 1 report provides a snapshot of such controls as provided by management.

We asked GT for more on its blockchain SOC certification and were pointed to ‘The Challenges of Auditing Crypto’, an interview that appeared in Coin Notes News. Here we read that ‘the transparency of blockchain transactions is negated by their complexity and nascence of the underlying technology’. There are problems with many implementations of blockchain explorers, which may not satisfy the conditions required for an audit. There are further problems for cryptocurrencies: blockchain addresses hide coin owners’ identities, although it is unclear to what extent such issues affect non-cryptocurrency applications of blockchain.

* System and Organization Controls. More from the American Institute of CPAs.


Going, going, green…

Equinor to disclose Sleipner CCS data. EY finds world decarbonizing fast. Gaffney Cline advocates more carbon management support from government. BP’s Dudley on ‘net zero’. Shell’s van Beurden on ‘misalignment’ with API on carbon. Total joins EU 3D CCS storage project. Wabash Valley Resources kicks off Indiana CCS project. Aker ‘Catch’ contract for Twence. US DoE funds for CCUS. Energy Watch Group study finds 100% decarbonization possible. EU geologists and the energy transition. IEA tracks clean energy progress. IOGP investigates the physical risk of climate change to the oil and gas industry. IASB chair warns of ‘exaggerated expectations’ for sustainability reporting. Sabine Pass leaks ‘threaten safety and success of US LNG’. NETL report on offshore CCS. US NAP report on Negative emissions technologies and reliable sequestration. Shell Canada’s Quest CCS milestone. ‘Tipping point’ for renewables ‘real soon now’. McKinsey sees peak oil by 2025 (maybe). ExxonMobil signs with Global Thermostat. Silicon Kingdom’s passive carbon capture. Graforce ‘feces, the energy source of the future’.

Equinor and its partners will disclose datasets from the Sleipner field, the world’s first offshore CCS plant, in a push to advance CO2 storage innovation and development. All data will be published in September via the SINTEF-led CO2 Data Share Consortium, a partnership supported by the Norwegian CLIMIT research program and the US Department of Energy. A prototype of the data sharing platform is available online for selected test users. The digital platform for sharing CO2 storage data is planned to be online in September 2019.

A report from EY and IDC has it that decarbonization, digitization and decentralization are accelerating the countdown to a new energy world faster than expected. Better, cheaper (green) technology is ‘speeding up the journey to tipping points by as much as two years’ and most markets have revised policies towards more ambitious clean energy targets.

A recent Gaffney Cline Focus newsletter asks ‘How can governments get carbon management right for oil and gas?’ Governments are struggling to find solutions to develop or sustain their economic competitiveness, whilst also achieving the ambitious GHG emissions reductions goals and commitments of the Paris Agreement. Less than 10% of the 550 individual energy policies and regulations relate to key technologies for carbon management in oil and gas (i.e., venting, flaring and fugitives reduction) and carbon capture, use and storage (CCUS) implementation. More must be done by governments and industry to develop supporting policies and regulations in these areas.

Speaking at a recent Chatham House event BP Group CEO Bob Dudley explained how the world is to ‘get to net zero’ carbon. Recent climate protests brought parts of the capital to a standstill and the BP AGM was interrupted by demonstrators, ‘even as we were passing a very progressive shareholder resolution’. But according to Dudley, BP agrees with more of the demonstrators’ view of the future than is realized. ‘Like our critics, BP believes the world is not on a sustainable path’, particularly as this year, global carbon emissions are rising at their fastest rate in years. BP supports ‘a rapid transition to a lower carbon future’ involving renewables and supported by decarbonized gas, including the use of carbon capture use and storage. In the net-zero world, oil ‘will likely play a smaller role’.

Concomitant with Shell’s publication of reports on industry associations, sustainability and payments to governments, CEO Ben van Beurden voiced Shell’s concern over ‘misalignment with an industry association on climate-related policy’ (read the American Petroleum Institute), adding that ‘in cases of material misalignment, we should also be prepared to walk away’. Shell aims to cut the net carbon footprint of the energy products it sells by around 20% by 2035 and by around half by 2050. More from Shell.

Total is to take part in the ‘EU 3D’ CCS project for the industrial-scale capture and storage of CO2. EU 3D is a component of the future Dunkirk North Sea capture and storage cluster. The project is to capture CO2 from a steel plant, using the DMX capture technology developed by IFPen and licensed through Axens, for sequestration in depleted North Sea reservoirs. DMX is funded by the EU Horizon 2020 R&D program with a €19.3 million budget over 4 years.

Phibro unit Wabash Valley Resources (a fertilizer manufacturer) has received an undisclosed amount from OGCI Climate Investments to develop a 1.5-1.75 million tons per annum CCS project near West Terre Haute, Indiana. CO2 will be sequestered 7,000 ft below the surface into Mount Simon saline sandstone. More from WVR.

Aker Solutions has signed a contract with Twence for the supply of carbon capture and liquefaction technology at Twence’s waste-to-energy plant in Hengelo in the Netherlands. Aker Solutions’ ‘Just Catch’ modular carbon capture system has a capacity of 100,000 tons of CO2 per annum and is planned to be in operation by 2021. Once captured and liquefied, the CO2 will be trucked to users such as nearby greenhouses*.

* The amount of CO2 sequestered by growing plants has been put at a few percentage points of that used in a greenhouse.

The US Department of Energy has announced $20 million in federal funding for cooperative agreements that will help accelerate the deployment of carbon capture, utilization, and storage. The selected projects will support the Office of Fossil Energy’s Carbon Storage Program. More from the DoE.

A new study by the Energy Watch Group and Finland’s LUT University describes a 1.5°C scenario with a ‘cost-effective, technology-rich, 100% renewable energy’ system that does not require negative CO2 emission technologies. The study ‘proves that the transition to 100% renewable energy is economically competitive with the current fossil and nuclear-based system and could reduce greenhouse gas emissions in the energy system to zero even before 2050’.

The European Federation of Geologists (EFG) has published a position paper on the role of the geologist in the energy transition. EFG believes geothermal energy (both shallow and deep), CO2 capture and mineral extraction are part of the answer to meeting the Paris goals. CO2 capture and storage is considered a much-needed part of the plan and a short-term answer to fighting climate change.

Of the 45 energy technologies and sectors assessed in the IEA’s latest Tracking Clean Energy Progress, only 7 are on track for reaching climate, energy access and air pollution targets. The latest findings follow an IEA assessment published in March showing that energy-related CO2 emissions worldwide rose by 1.7% in 2018 to a historic high of 33 billion tonnes. On the positive side, energy storage is now ‘on track’, with a doubling of new installations. Reduction of flaring and methane emissions from oil and gas operations, put at 7% of the energy sector’s greenhouse gas emissions, are still falling well short of targets.

An IOGP-led, three-day gathering chez BP investigated the physical risk to the oil and gas industry of a changing climate. The brainstorming session determined that oil production may be impacted by increases in air and sea temperatures and that personnel run an increased risk of heat stress. Coastal facilities are at risk of flooding and extreme waves. Increases in extreme precipitation may impact industry supply chains and operations, and there is a risk of drought due to poleward migration of drier areas, increased duration between rainfall events and changes to water policy. More from IOGP.

Speaking at a Cambridge University conference on Climate-related financial reporting, Hans Hoogervorst, chair of the International Accounting Standards Board cautioned against ‘exaggerated expectations’ for sustainability reporting as a catalyst for change, ‘in the absence of policy and political intervention’.

A report in The Economist, ‘The truth about big oil and climate change’ showed that while the annual reports tell a positive green story, investment in fossil fuels continues to grow strongly and dwarfs that going into renewables. Oil companies see the demand for energy surging and have no immediate reason to fear drastic carbon pricing measures in many parts of the world. For investors, these companies remain attractive, with four of the 20 biggest dividend payers being oil majors.

A report in the Houston Chronicle describes major leaks at Cheniere Energy’s Sabine Pass LNG export terminal that ‘threaten the safety and success of America’s top natural gas exporter’. The report tells of ‘gashes up to six feet long that opened up in a massive steel storage tank at Sabine Pass, releasing super-chilled LNG that quickly vaporized into a cloud of flammable gas.’

A 500-plus page report from the US National Academies Press investigates Negative Emissions Technologies (NET) and Reliable Sequestration. The report finds, inter alia, that CCS needs to scale up at 10% year on year to meet a 2° target, and that comparable or greater rates of growth will be required of every available NET. At these rates, scale-up could become limited by materials shortages, regulatory barriers, infrastructure development (i.e., CO2 pipelines and renewable electricity), the availability of trained workers and many other barriers.

The US National Energy Technology Laboratory has published a report, ‘Estimating carbon storage resources in offshore geologic environments’. The report finds that, despite important differences between onshore and offshore systems, carbon can be stored safely and permanently in offshore saline geologic formations. This research proposes using the NETL’s saline storage methodology with an integration of ‘spatial-statistical’ tools to adjust for uncertainties.

Shell Canada’s Quest CCS has reached a major milestone with the capture and storage of 4 million tonnes of CO2 in under four years of operations. Shell Canada president Michael Crothers said, ‘Quest’s costs are coming down. If Quest were to be built today, it would cost about 20-30% less to construct and operate’. Quest received $865 million from the governments of Canada and Alberta to build and operate the facility.

A different Canadian ‘Quest’, the Quality urban energy systems of tomorrow, and the Pollution Probe Foundation have published ‘2019’s energy transformation, evolution or revolution?’, a discussion paper on the risks and opportunities in low-emission energy systems. The cost of wind, solar and batteries has dropped faster than expected (see McKinsey below) and a ‘tipping point’, where the combination of renewables and storage will be less expensive than traditional energy supplies, is nigh, in the next 5 to 10 years. Canada’s diverse energy mix spans decades-old hydro power, nuclear and legacy coal plants, now a source of intense CO2 emissions. Curiously, the Quest report does not consider the success reported by its Shell-owned namesake. Indeed, CCS is only mentioned in the context of natural gas. It seems like Canada’s oil sands have already been written out of the energy picture!

McKinsey Insights sees oil demand growth slowing substantially, with production peaking at 108 million bbl/day around 2030, possibly as early as 2025 in an accelerated ‘greening’ scenario. The report warns that a 1.5° or even a 2° scenario is now only a ‘remote possibility’.

ExxonMobil has signed a joint technology development agreement with Global Thermostat to develop its atmospheric carbon capture technology. Global Thermostat’s technology captures and concentrates carbon dioxide emissions from industrial sources, including power plants and the atmosphere. If technical readiness and scalability is established, pilot projects at ExxonMobil facilities could follow.

Arizona State University and Silicon Kingdom holdings have laid claim to the world’s first commercially viable passive carbon capture technology. ‘Powerful’ mechanical trees are to remove CO2 from the air to ‘combat global warming at-scale’. The trees were developed by Prof. Klaus Lackner, director of ASU’s Center for Negative Carbon Emissions and are claimed to be ‘thousands of times more efficient at removing CO2 from the air’ than the natural variety.

Finally, more brown than green, Germany’s Graforce has announced that ‘feces are the energy source of the future’. Green hydrogen from excrement could cover half of the world’s annual energy demand and reduce global CO2 emissions by 20%. Graforce’s plasmalysis splits hydrogen out of chemicals like ammonia contained in manure using a high-frequency electrical field, a plasma. The atoms then recombine into green hydrogen and nitrogen, with purified water left behind as a ‘waste’ product. The process is claimed to be 50% cheaper than conventional electrolysis. More from Graforce.


Back to school …

TOG/IBM to certify data scientists. Wild Well Control’s e-learning for drilling operations. Texas RRC to put more ‘boots on the ground’. Ikon offers ‘bite-sized’ e-learning for rock physics and reservoir characterization. GreaseBook’s artificial lift 101. Opto 22 e-learning for Groov/Epic and MQTT explainer video.

The Open Group has announced a data scientist certification program, available to those who can ‘demonstrate skills and experience in the data science profession and who have applied those skills to solve business problems and drive change within an organization’. The certification program was initially brought to TOG by IBM. IBM is the first accredited organization to offer the certification.

Wild Well Control has launched an online ‘Introduction to drilling operations’ (IDO) e-learning course. IDO meets all regulatory guidelines and is IADC WellSharp accredited. The course targets operators, contractors and service companies, as well as non-technical and non-industry personnel. The 20 visually animated 3D lessons are said to increase students’ awareness and understanding of drilling and of the importance of maintaining well control at all times.

The Texas Railroad Commission has rolled out ‘Boots on the Ground’, an oil and gas inspector training program and its first-ever new inspector training school. Training targets oil and gas inspectors with less than two years’ tenure at the RRC and provides candidates with an understanding of the agency’s inspection process, oil and gas rules and the technical knowledge needed to provide consistent enforcement across Texas.

Ikon Science has made its rock physics and reservoir characterization training classes available online. Classes have been ‘chopped into bite-sized chunks’ offering access to easy-to-follow mini-workflows at any time. A seismic data conditioning and AVO inversion course is also in preparation. Ikon’s e-learning courses cost $750 per user for three months, and are free for those on current maintenance and support contracts. More from Ikon.

The informative GreaseBook has published a blog and guide to operating and servicing pumping units. The guide provides practical advice and tips on operating electric motor-driven pumping units and their control systems, and on natural gas engine mechanical lifts.

Opto 22 has released the Groov/Epic Learning Center, a hardware and software benchtop system for learning and developing process control systems using Opto 22’s technology. Hardware is assembled on a desktop operator load panel with illuminated pushbuttons, a potentiometer, a temperature probe, a Sonalert alarm and LED indicator, all delivered in a robust ‘Pelican’ case. The unit comes pre-loaded with software packages for configuration and troubleshooting, a real-time control engine to run flowchart-based control programs and a Codesys runtime engine for IEC-61131-3 compatible controllers. Other goodies include Ignition Edge from Inductive Automation, MQTT/Sparkplug, Node-RED and Python, C/C++ interfaces. A free online version is available. Opto 22 has also released a new video in its Automation 101 series providing ‘a thorough explanation of the MQTT broker’. MQTT enables device and application data exchange ‘with little or no IT involvement’. The tutorial explains how the MQTT broker works in IoT and IIoT applications. MQTT’s Publish/Subscribe model is said to best ‘the traditional poll and response model’. The broker acts as a post office providing ‘topic’ endpoints to subscribers.
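
To make the post-office analogy concrete, here is a minimal subscriber-side sketch using the Eclipse Paho Python client; the broker address and topic scheme are placeholders of our own and are not tied to any Opto 22 product.

    import paho.mqtt.client as mqtt

    # Subscriber side: the broker (the 'post office') delivers any message published
    # to a matching topic; publisher and subscriber never talk to each other directly.
    def on_message(client, userdata, msg):
        print(msg.topic, msg.payload.decode())

    # paho-mqtt 1.x style constructor; broker host is a placeholder
    sub = mqtt.Client(client_id="scada-dashboard")
    sub.on_message = on_message
    sub.connect("broker.example.com", 1883)
    sub.subscribe("field/pad7/+/pressure", qos=1)  # '+' is a single-level topic wildcard
    sub.loop_forever()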


2019 IQ Hub’s Canadian Wellsite Automation Conference

IDS automates wellsite reporting. DrScada on the wellhead manager. ABB Edge devices for the IIoT. Flicq AI at the edge. Epsis Team Box for Chevron.

IDS automates wellsite reporting

David Shackleton (Independent Data Services) presented on automating wellsite reporting with Witsml and Anova, providing comprehensive analytics on machine-readable data. The idea is to move from the traditional ‘static’ daily drilling report to ‘lean’ automated reporting leveraging Energistics’ Witsml data exchange standard. The focus is on the ‘activity’ part of the report, the payload. IDS has developed a user-friendly front end for manual data capture. Other data can be extracted automatically from the rig (via Witsml) and from sources such as SAP. The combined data set can be piped into third-party drilling data systems from Peloton and Halliburton. The IDS system is coupled with an Anova database for storage, analytics and graphics.
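To illustrate what a machine-readable ‘activity’ payload buys over a static report, the following Python sketch parses a simplified, Witsml-style XML fragment into records; the element names are hypothetical stand-ins, not the full Energistics schema used by IDS.

```python
# Illustrative sketch only: parses a simplified, WITSML-like activity payload.
# Element names below are invented stand-ins, not the Energistics schema.
import xml.etree.ElementTree as ET

SAMPLE = """
<drillReport well="A-12">
  <activity>
    <startTime>2019-06-01T06:00:00Z</startTime>
    <endTime>2019-06-01T09:30:00Z</endTime>
    <code>DRL</code>
    <comment>Drilled 12-1/4in section 2,450-2,610 m</comment>
  </activity>
</drillReport>
"""

root = ET.fromstring(SAMPLE)
activities = []
for act in root.findall("activity"):
    activities.append({
        "well": root.get("well"),
        "start": act.findtext("startTime"),
        "end": act.findtext("endTime"),
        "code": act.findtext("code"),
        "comment": act.findtext("comment"),
    })

# Structured activity records can now be piped to analytics or a third-party
# drilling data system instead of being retyped from a static daily report.
print(activities)
```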

DrScada on the wellhead manager

Dan Mackie presented case histories of DrScada’s wellhead managers: reducing and optimizing workover frequency, minimizing chemical use and avoiding issues such as high fluid levels. What is a wellhead manager? An edge device that uses load and stroke data to optimize operation of the pumpjack. In some situations, such as sand inflow, the pump stroke can be automatically adjusted to keep sand production below a threshold. ‘Instead of learning that production is down today, the wellhead manager can alert you to a problem ahead of time.’ Leak detection and wax build-up are also candidates for DrScada’s early warning system. Mackie also gave a shout-out to Adoil’s Titan spill protection and wellhead containment system for soil and groundwater protection. More from DrScada.
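By way of illustration only, a toy threshold controller of the kind described might look like the following Python sketch; the parameter names and limits are invented and do not reflect DrScada’s actual algorithm.

```python
# Hypothetical threshold controller, for illustration only; names and limits
# are invented and do not represent DrScada's wellhead manager logic.
def adjust_strokes(current_spm, sand_ppm, sand_limit=150.0,
                   step=0.5, min_spm=2.0, max_spm=8.0):
    """Nudge pump speed (strokes per minute) down when sand production
    exceeds the limit, and back up when there is comfortable margin."""
    if sand_ppm > sand_limit:
        current_spm = max(min_spm, current_spm - step)
    elif sand_ppm < 0.8 * sand_limit:
        current_spm = min(max_spm, current_spm + step)
    return current_spm

# Example: successive sand readings drive the setpoint down, then let it recover.
spm = 6.0
for reading in [120, 180, 200, 140, 90]:
    spm = adjust_strokes(spm, reading)
    print(f"sand={reading} ppm -> setpoint={spm} spm")
```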

ABB Edge devices for the IIoT

Zack Munk (Rockwell Automation) explained how, in shale production, production characteristics change quickly and may require multiple artificial lift methods. As these change, instrumentation and control systems need to adapt. Rockwell’s ISaGraf IEC 61131 environment is the system of choice, with embedded Linux and C-language programmability. The system consolidates pump data from load cell, vibration monitor, leak and tubing pressure sensors, allowing an optimum production rate without excess gas, sand or water. Well test information is pushed into the edge device’s algorithm over Microsoft Azure IoT/MQTT. The system permits auto-discovery of new devices at switch-on. The cloud/on-premises system analyzes incoming data and acts as a gateway to the production management system (Avocet). Event-driven workflows use real-time data to constantly check for known equipment states. For instance, the system may warn of a potential gas lock; operators can then click through to a gas lock wiki page for help and recommended actions. The MQTT IoT protocol is said to be key to security and data reliability.
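To illustrate the idea of checking incoming real-time data against a known equipment state, here is a hypothetical Python sketch of a gas-lock rule; the field names and thresholds are invented and are not Rockwell’s implementation.

```python
# Illustrative only: a rule that watches recent pump readings for a gas-lock
# signature. Field names and thresholds are invented for this sketch.
from dataclasses import dataclass

@dataclass
class PumpReading:
    fillage_pct: float      # pump fillage from the load/position card
    tubing_pressure: float  # psi
    flow_rate: float        # bbl/d

def gas_lock_suspected(window):
    """Flag a possible gas lock when fillage stays low while flow collapses
    and tubing pressure holds, over a window of recent readings."""
    return all(r.fillage_pct < 40 and r.flow_rate < 5 for r in window) \
        and window[-1].tubing_pressure > 0.9 * window[0].tubing_pressure

window = [PumpReading(35, 410, 3), PumpReading(32, 405, 2), PumpReading(30, 402, 1)]
if gas_lock_suspected(window):
    # In the workflow described above this would raise an alert linking to the
    # gas-lock wiki page, rather than just printing.
    print("Possible gas lock - see recommended actions")
```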

Flicq AI at the edge

Karthik Rau gave another ‘edgy’ presentation, on Flicq’s ‘AI at the edge’. Flicq’s smart sensors run algorithms and analytics at the edge, i.e. on site. An asset’s performance history and operational parameters can be embedded into the devices. The system analyzes data in real time, ‘obviating the need for post-processing’. There is ‘no infrastructure to set up and no wires to worry about’, making ubiquitous condition monitoring possible. More from Flicq.

Epsis Team Box for Chevron

Bart Stafford presented Epsis’ Team Box, the user-configurable engine behind Chevron’s integrated operations center (IOC). Team Box is presented as an operating system for the IOC that consolidates information from PI, SAP, SharePoint and Oracle, for presentation on the wide screen of the control room, or other Team Box connected endpoints. Chevron is the Epsis poster child through a worldwide master software license. More from Epsis.

More from IQ Hub.


AI automates geoscience data management

2019 AI Paris conference hears from Total and Amayas Consulting on deep learning-based approach to document classification.

Florian Bergamasco (Total) and Dayron Cohen (Amayas Consulting) presented on the application of artificial intelligence to automating the management of geoscience data. Total’s geoscience data is growing at around 20% per year; the company now holds around 15 petabytes of seismic data. Last year’s acquisition of Maersk Oil brought a considerable data integration challenge. Following brainstorming sessions and workshops between Total’s data managers and data scientists, a Data Lab was established. The Lab first developed its own expert system internally before turning to third parties (LumenAI and Amayas) to enhance and industrialize the solution.

The result is a data package classifier that uses OCR to extract metadata from reports and performs data validation: checking for duplicates, verifying that navigation data is present, and so on. Amayas’ metadata extractor applies deep learning, OCR and image recognition, notably for company logo detection. An 88% accuracy is claimed in the extraction of well metadata from reports. The system works on PDF documents, text or scanned images, each routed through a different process. A list of ‘bigrams’ (pairs of words) is used to score and identify a page. The deep learning runs on a Linux blade on the network; Xception and ResNet architectures were also evaluated. More from AI Paris.
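As a rough illustration of bigram scoring, the following Python sketch classifies a page of OCR’d text against per-class bigram lists; the bigram sets and example text are hypothetical stand-ins for the curated lists described in the talk.

```python
# Minimal sketch of bigram-based page scoring; the bigram lists below are
# invented stand-ins, not Total's or Amayas' actual vocabulary.
import re
from collections import Counter

CLASS_BIGRAMS = {
    "well_report": {("total", "depth"), ("casing", "shoe"), ("mud", "weight")},
    "seismic_doc": {("shot", "point"), ("navigation", "data"), ("bin", "size")},
}

def page_bigrams(text):
    # Tokenize the OCR'd page and count consecutive word pairs.
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(zip(tokens, tokens[1:]))

def classify_page(text):
    counts = page_bigrams(text)
    scores = {label: sum(counts[bg] for bg in bigrams)
              for label, bigrams in CLASS_BIGRAMS.items()}
    # Highest-scoring class wins; a real system would add a confidence threshold.
    return max(scores, key=scores.get), scores

label, scores = classify_page(
    "Final well report. Total depth 3,200 m, casing shoe at 2,950 m, mud weight 1.25 sg.")
print(label, scores)
```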


Consortia – who pays, who gets the results?

University of Houston professor rails against non-sponsoring companies that manage to sneak access to his code base!

In a recent communication, Arthur Weglein, professor at the University of Houston and director of the Mission-oriented seismic research program set out to define the return on investment for M-OSRP sponsors in these times of ‘scrutinized capital expenditures and research budgets’. Weglein raises some good points regarding the way collaborative research is financed across the industry.

Like other university consortia, M-OSRP has two goals. The first is to educate and mentor students and to ensure that their contributions and advances are recognized and published. The second is to provide sponsors with relevant added value by solving seismic challenges and by publishing reports and computer code. Unfortunately, some companies that are not sponsors provide interns or hire alumni and thereby gain access to the M-OSRP algorithms. They thus derive value without supporting the source of their profit.

Weglein proposes that M-OSRP will continue with its seismic research program as before, but that its ‘proprietary, well-documented code’ will from now on only be available to sponsors. The code will not, in general, be published. Weglein’s position is understandable and is how many industry consortia operate, but the approach is rather different from that seen in the open source software movement, where code is in the public domain and exposed to scrutiny. More on Weglein’s M-OSRP from U Houston.


© 1996-2021 The Data Room SARL All rights reserved. Web user only - no LAN/WAN Intranet use allowed. Contact.