A recent meeting of the SEG standards committee addressed the problematic issue of storing large, sometimes very large, sequential data files such as seismic recordings in the public cloud. Current practice is to use the ‘object’ storage services that are available in all of the cloud vendors’ portfolios. The problem with this naïve approach is that the cloud APIs retrieve an object in its entirety, making data access sluggish, especially if only a small number of traces is required. A similar issue is roiling the research community, as large scientific data storage formats like HDF5 (as used by Energistics in RESQML) suffer from the same problem. HDF5 ‘works’ in the cloud but runs extremely slowly. Such issues have been taken up by the data science community to the extent that vanilla HDF (and SEG-Y) are now widely deprecated for cloud use. But what is going to replace them?
The issue of storing sequential data in the cloud echoes earlier agonizing over storing seismics on disk. See for instance our 2007 article, ‘Was the future random? Disk vs. tape-based data archival’. There are many solutions to this problem, for instance Troika’s Minima, Bluware’s volume data store (VDS) and others. Such solutions re-arrange sequential seismic data into an indexed format suitable for random access. At issue today is how such work-arounds can be ported to the cloud. The HDF world has addressed the problem with ‘Kita’. But Kita is not open source and thus fails to meet the requirements of both Energistics and OSDU.
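Such workarounds are possible because, contrary to first appearances, object stores do support partial reads via HTTP range requests; the difficulty is knowing which byte offsets to ask for, which is exactly what an index or re-arranged format provides. As a minimal sketch, assuming a classic fixed-trace-length SEG-Y rev 1 layout (the bucket URL is hypothetical):

```python
# Sketch: random access to a single SEG-Y trace in object storage via an
# HTTP range request. Offsets assume a classic rev 1 file: a 3600-byte
# file header (3200-byte textual + 400-byte binary), then fixed-length
# traces of a 240-byte header plus 4-byte samples.

SEGY_FILE_HEADER = 3600  # 3200-byte EBCDIC header + 400-byte binary header
TRACE_HEADER = 240       # bytes per trace header
SAMPLE_SIZE = 4          # 4-byte (IBM or IEEE float) samples

def trace_byte_range(trace_no: int, samples_per_trace: int) -> tuple[int, int]:
    """Return the inclusive (start, end) byte range holding one trace."""
    trace_len = TRACE_HEADER + SAMPLE_SIZE * samples_per_trace
    start = SEGY_FILE_HEADER + trace_no * trace_len
    return start, start + trace_len - 1

def range_header(trace_no: int, samples_per_trace: int) -> dict:
    """Build the HTTP Range header for a partial GET of one trace."""
    start, end = trace_byte_range(trace_no, samples_per_trace)
    return {"Range": f"bytes={start}-{end}"}

# e.g. requests.get("https://bucket.example.com/survey.segy",
#                   headers=range_header(1000, 1500))
```

With variable-length traces, or with samples-per-trace known only from the binary header, a separate index object becomes unavoidable, which is the gap that re-arranged formats like VDS set out to fill.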
OSDU came out of the starting blocks advocating ‘OpenVDS’, an open source subset of Bluware’s proprietary VDS. Since then, Bluware has been pushing OpenVDS quite aggressively, notably in an AAPG interview where chief product officer Andy James dissed the 1975 SEG-Y format, citing ‘inherent limitations’ in existing seismic data formats and vaunting the merits of VDS.
The SEG’s standards committee is not terribly pleased by this turn of events. In the virtual meeting, the discussion addressed the perceived ‘uselessness’ of SEG-Y as a cloud format. The committee believes that ‘uselessness’ is a matter of implementation and that the SEG needs to provide guidance on how SEG-Y should be used in the cloud, rather than abandoned. The idea is to publish a Guidance Note, similar to those that accompany IOGP’s standards, that will keep SEG-Y alive in the modern world without being beholden to a single vendor solution, even one that is ‘open’. What instructions the Note will contain remains to be seen. This is quite a thorny problem but, as one committee member remarked, ‘We just need to combat the erroneous opinion that SEG-Y is useless’.
Comment: The Bluware/SEG-Y kerfuffle recalls the Petroware/LAS tiff we reported on from last year’s ECIM. In both cases, a vendor is offering a ‘modern’ version of a legacy standard. It is probably a good thing that these initiatives are forcing the standards community to react, at least for the SEG. The LAS custodian, the Canadian Well Log Society, has not reported any issues with LAS. A CWLS spokesperson told Oil IT Journal, ‘LAS 2.0 is the standard used by the majority of users and is fully supported and regularly updated’. LAS applications, ‘LasApps’, software and certification tools are available from the CWLS website.
As has been widely reported, Google is to stop making customized artificial intelligence tools for oil and gas firms. Early reports appeared to suggest that Google had caved in to pressure from Greenpeace. In fact, Google’s announcement came before the publication of Greenpeace’s report ‘Tech companies are helping big oil profit from climate destruction’. According to CNBC, Google Cloud took $65 million from oil and gas companies in 2019, less than 1% of total Cloud revenues. One company that is likely to be impacted by Google’s stance is Total, which in 2018 signed with Google to develop artificial intelligence solutions for geoscience.
The Greenpeace report is an extraordinarily well documented summary of current oil and gas cloud computing initiatives which may make oils think twice before publicising their computing expertise. Greenpeace observes that ‘contracts between tech firms and oil and gas companies are now found in every phase of the oil and gas production chain and significantly undermining the climate commitments that Microsoft, Google, and Amazon have made’. Greenpeace cites a popular upstream IT trope that ‘while the near-term outlook for the oil and gas industry is bleak, AI partnerships [… represent …] a critical toolkit that fossil fuel companies will use to bounce back from this downturn’.
It’s not just Google that Greenpeace has in its sights; Amazon and Microsoft are also called out as ‘partnering with oil companies to use AI to unlock oil and gas deposits in the US and around the world’. The size of the prize for Greenpeace is evidenced by some rather convenient analyses from Accenture ($425 billion from advanced analytics by 2025) and BloombergNEF (annual spend on cloud and advanced analytics at $15.7 billion in 2030).
Greenpeace has it that Microsoft ‘appears to have the most AI contracts with oil and gas companies’. Which means that Microsoft can ‘never truly achieve its recently announced carbon negative’ goal while continuing to aid the oil and gas sector. Amazon chief Jeff Bezos is likewise criticized for allowing AWS to support oil companies while announcing a ‘climate pledge’ and $10 billion ‘earth fund’.
The Greenpeace attack on oil and gas cloud computing is extraordinarily well informed, touching on AI/ML, big data and the internet of things. The study explains in some detail how oils are using cloud technology to plug gaps in data, perform 3D modelling and seismic processing. OSDU, the Open Subsurface Data Universe, gets a special mention, as does the fact that all three cloud vendors are on board OSDU.
Greenpeace is on shakier ground when it beats up on the use of high performance computing in leak and spill detection, putting oils (and their cloud service providers) in a ‘damned if you do and damned if you don’t’ position. Greenpeace goes as far as naming some ex-oil company execs who are now heading up cloud initiatives chez the providers. But then Greenpeace is not one for softball tactics – witness the misplaced zeal with which it attacked the Brent Spar decommissioning.
Perhaps the biggest risk for oils lies in Greenpeace’s attack on Microsoft as a major oil and gas service provider. Citing a Microsoft mole called ‘Zero Cool’ and a ‘Microsoft Workers 4 Good’ coalition, Greenpeace thinks it has put its finger on ‘a gaping hole in Microsoft’s carbon math equation’. Greenpeace also calls out Microsoft and Google for rebranding their oil and gas minisites as addressing the ‘energy sector’ (rather than ‘oil and gas’) in a thinly disguised bit of greenwashing.
The report winds up with a ‘roadmap’ for the cloud providers involving a moratorium on all new machine learning and high-performance computing contracts for oil and gas companies, an environmental impact assessment of ‘the most problematic contracts’, a winding down of existing contracts and a public commitment to ‘pivot’ AI contracts towards the renewable energy sector.
This is another of those ‘be careful what you wish for’ deals. As folks have been banging on for years about ‘digital transformation’ without knowing or caring what it means, a transformation comes along fully formed and smacks the world in the face. Enter the covid-19 digital transformation. No more being stuck in traffic jams or squeezed into the underground. No more chatting around the water cooler. No more expensive downtown office rentals. Working from home is where it’s at. And all thanks to Zoom, a technology that nobody saw coming and that isn’t even really new. The question is, will it last?
In the oil industry, the direct impact of remote working is difficult to evaluate. On the one hand, for many, it has been a way of life for quite a while. Smaller operators may call on the services of favored consultant seismic interpreters or reservoir engineers. While such folks may have required rather chunky IT for a home office in the past, having their tools of the trade available in the cloud removes this obstacle. In fact, it puts the remote worker on an equal footing with the majors’ engineers, an interesting corollary of the push towards the cloud. But, as those pushing for digital transformation are keen to point out, ‘most digital transformation projects fail’. So how might the shift to teleworking ‘fail’?
A decade or so ago, some of the business-oriented consultants tried to convince oil bosses that geoscience was just like any other work process. Seismic interpretation was a ‘commodity’ activity that could be improved by the application of Taylorism and ‘lean’ processes. I don’t think they got very far. A quick google search for ‘oil and gas shift work’ suggests why. All shift work, i.e. where productivity is really at stake, is in the field, not in the office. Interpreters and, for that matter, software developers don’t work shifts. Despite the push for efficiencies in geoscience and for the rapid development of software as ‘minimum viable products’, the fact is that, in the oil and gas industry, as in software, work is a lot more relaxed than, say, working in meat packing*.
The really key activity in oil and gas is the wheeling and dealing that goes on prior to the award of a concession. The geoscientists and engineers inform this activity of course but it is executed by folks who have a special relationship with the license holders or governments. Such special relationships may be arms-length and completely above board. Or, as Transparency International likes to remind us, not so transparent. Other wheeling and dealing opportunities abound in the secondary market as permits are exchanged, farmed-out and sold. This wheeling and dealing defies a Taylorism/efficiency analysis as does the ‘financial engineering’ that accompanies the deals. What is key here is confidence, persuasion and, as in much of human endeavor, the chance of financial gain.
I now invite you to a future meeting between a foreign potentate, some influential financiers and an independent oil company exec. The meeting takes place in a small downtown apartment with a brass plate on the door.
Potentate: ‘Pleased to meet you, but where are all your employees?’
Oil exec: ‘Ah, now we are pretty much a virtual company, all our employees work from home now’.
Potentate: ‘Really, how interesting. Our National oil company has ten thousand people working in a steel and glass tower’.
Oil exec: ‘Err right, we used to do it like that …’
Potentate: ‘Our people swear by their in-house IT. We have a 5 petaflop computer on site…’
Oil Exec: ‘We do all our computing in the cloud now, no need for anything on site’.
Potentate: ‘What was that I was just reading about Google and oil and gas IT?’
Oil Exec: ‘Err, right… Let’s talk finance.’
OK, I'm not very good at dialog. What I am trying to get across is that an oil and gas company is a builder of confidence in its capacity to execute geoscience, engineering and finance. The geoscientists and engineers working visibly inside a glass and steel tower are, along with their day jobs, building confidence in the operation, like so many actors in a play. Back in the day, the majors would present their proposed work in theatrical 3D visionariums. Today, expertise in artificial intelligence may play a similar role.
Of course, as now, one size does not fit all in oil and gas theater. I can imagine shale operators working from less ostentatious locales than for instance ExxonMobil’s mega campus in Spring, Texas, or Total’s yet-to-be-built high-rise in La Défense. Both of these are designed to impress, and it doesn’t look like teleworking was part of the program when they were planned.
Along with the covid transformation, oil and gas is currently being disrupted by the green movement, by shareholder revolts and by un-friendly comment in the financial community. WoodMac’s Luke Parker was quoted in the FT as saying, ‘Make no mistake, companies the likes of Shell and BP are already ushering in the twilight years.’ In their twilight years, maybe the majors need the theater of a major downtown location more than ever.
* I am of course supposing that the geoscientists and engineers are lucky enough to still have a job. My sympathies to those who are currently ‘resting’, as they say in the theater.
How to Store* begins with a short history of fossil fuels and CO2 in the light of recent targets for global warming. The CO2 story is quite an old one. Fourier, the geophysicist’s friend, first mooted the warming potential of the earth’s atmosphere. In 1896, Arrhenius pinned most of the warming on CO2. In 1958, collection of atmospheric CO2 data began at the Mauna Loa observatory on Hawaii. Ice core data pushed the curve back to 1815. CO2 has risen inexorably, from 280 to over 400 ppm currently. A telling graph summarizes the ever-rising CO2 levels along with the hypothesized trajectories that could result from achievable (but as yet unrealized) reductions in output. The maximalist hypothesis, a 50% reduction by 2050, still falls considerably short of the 1.5° target of the 2018 IPCC report.
The case for CCS is thus clear, particularly for those working in the oil and gas industry. Today, fossil fuel makes up some 80% of the world’s energy supply, and transport, manufacturing and agriculture are highly dependent on fossil energy. A transition period is required, during which CCS (and other measures) will be urgently needed.
Ringrose observes that the ‘likely level of growth’ of these energy options is widely debated. CCS provides a mechanism both for decarbonising existing power supply and for reducing emissions from industry (e.g. cement and steel manufacture). CCS allows the energy transition to be achieved faster and cheaper than by using only renewable energy sources. In conjunction with bio-energy combustion, CCS enables ‘negative net-CO2 emissions’. The counter arguments are that CCS is too expensive and that it encourages fossil fuel usage to continue longer than necessary. Ringrose does not see these as either-or options; rather, it is just a question of how much CCS will actually be used during the energy transition.
How to Store runs through the three pillars of CCS, capture, transport and storage. Each process may seem quite simple, but the quixotic behavior of CO2 (phase transformations, compressibility and thermodynamics, and corrosion) make for pitfalls in each stage of the process.
Storage can be achieved in saline aquifers (the greatest potential) and in depleted oil and gas reservoirs (with the merit that they are well understood and that infrastructure is in place). Storage can also be achieved as part of enhanced oil recovery (EOR) projects, as per carbon capture, utilization and storage (CCUS).
To store significant volumes of CO2, the gas must be compressed into a dense phase; thus, a depth of over 800 m is necessary. Storage also requires a seal to prevent CO2 from migrating upwards and out of the formation. The parallels with oil and gas exploration are clear. At depths of over a kilometer, ‘we know that natural gas has been trapped beneath geological seals for millions of years, and so the potential for long-term trapping of CO2 at these depths is also clearly possible’.
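The 800 m figure follows from CO2’s critical point (around 31 °C and 7.38 MPa): below roughly that depth, hydrostatic pressure holds the CO2 in a dense, supercritical state. A back-of-envelope check, assuming a simple fresh-water hydrostatic gradient:

```python
# Back-of-envelope check of the 800 m rule of thumb: at hydrostatic
# pressure, CO2 passes its critical pressure (~7.38 MPa) at roughly
# 750 m, above which depth-pressure it behaves as a dense fluid.
# A fresh-water gradient is an approximation; brines are slightly denser.

RHO_WATER = 1000.0       # kg/m3, fresh-water approximation
G = 9.81                 # m/s2
CO2_CRITICAL_P = 7.38e6  # Pa, CO2 critical pressure

def hydrostatic_pressure(depth_m: float) -> float:
    """Pressure in Pa at a given depth, ignoring atmospheric pressure."""
    return RHO_WATER * G * depth_m

# At 800 m the hydrostatic pressure (~7.85 MPa) exceeds the critical point.
assert hydrostatic_pressure(800) > CO2_CRITICAL_P
```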
The essential questions for a project: how much can we inject? And can we store it safely and cost-effectively? Such considerations translate into geology, well design, reservoir modelling and other issues familiar to the upstream. The activity is regulated, in the EU by the EU CCS Directive (EC 2009; annex 1), which mandates data collection, modelling and monitoring to detect any leakage. In anticipation of future projects, the EU GeoCapacity Project has evaluated and mapped the potential for CCS; likewise, the North American Carbon Storage Atlas covers the USA, Canada and Mexico. These government-sponsored projects demonstrate that there is plenty of capacity available, although things are more nuanced when considering the economics, true storage capacity and the matching of CO2 sources with sinks.
The poster child for Norwegian CCS (and for How to Store) is Sleipner, which has been in operation since 1996. Sleipner has been extensively modelled and monitored, with 20 years of time-lapse seismic imaging of the CO2 plume. The In Salah (Algeria) and Snøhvit (Norway) projects are used to illustrate CO2 storage flow dynamics, again with many parallels with oil and gas such as static rock property models, two-phase fluid flow and dynamic flow simulations.
The niceties of transporting CO2, which may be held just above its boiling point for smaller shipping solutions or chilled to around −45 °C, recall the technology of the LNG business.
How to Store concludes with a chapter on ‘what’s next’ for CCS. Today there are 19 large-scale CCS facilities in operation with an installed capture capacity of 36 Mtpa, most acting in the CCUS space for EOR. The IEA states that ‘two decades of CCS has led to a growing recognition by climate experts of the value and potential of the technology’. Unfortunately, such recognition has not been matched with increased support. CCS has been ‘hampered by fluctuating policy frameworks and lack of financial support’. CCS is essential to meeting the greenhouse gas reduction goals, but it is progressing much too slowly. The main problem is ‘socio-economic’. The cost is perceived to be too high and the benefits are perceived to be too low. Such perceptions need to change: ‘CO2 storage is a lot safer and better than putting the same CO2 into the atmosphere’. A single CO2 injection well can make for very significant emissions reduction. Increases in the ‘carbon price’ will make CCS increasingly attractive, although sequestration from the iron, steel and cement industrial sectors is only likely to proceed with carbon prices of over $100/tonne.
In this review we have cherry-picked the narrative. In fact, How to Store provides a wealth of accessible technical information on CCS from a geological and engineering standpoint. Mathematical equations are presented for reference and the book is well illustrated with 4D seismics, well schematics, cross sections and more. For researchers, How to Store provides some 12 pages of references! Coverage is not so strong in the fields of CCUS/EOR, perhaps reflecting its EU bias. Also, the economics and social acceptability of CCS are only touched upon. There is no mention, for instance, of Germany’s ‘No’ to CCS. As Ringrose observes, successful projects to date all occur in national jurisdictions ‘where some kind of legal and financial framework is in place’.
* How to Store CO2 Underground: Insights from CCS Projects. 140 pages. Springer Briefs in Earth Sciences. ISBN
Total is to open a ‘Digital Factory’ in central Paris next year and is looking to hire data scientists, data engineers, developers, DevOps specialists and coaches. The DF will house Total’s high-profile ‘Digital At Total’ initiative, which has enthusiastic backing from CEO Patrick Pouyanné. Total’s businesses now span oil, gas, solar, wind and a ‘low-carbon’ electricity business. Across the board, Total’s employees are said to ‘share a common conviction that digital is a key lever for innovation and value creation’. The initiative is also considered essential to Total’s ‘NetZero*’ carbon neutrality climate ambition for 2050. The DF is to create tailor-made digital solutions, accelerate at-scale digital delivery at all of Total’s sites around the world and be a catalyst in Total’s digital transformation.
The DF team is to be assembled over a two-year period. Skill sets sought include Azure, SQL DB, DataBricks, Cosmos and the usual data science portfolio of TensorFlow, Scala, Python, PostgreSQL, Node.js and ElasticSearch. The DF is located in Paris’ ‘Silicon Sentier*’, in close proximity to Paris’ dynamic startup ecosystem. A ‘family spirit’ centered on ‘convivial rituals’ is promised for the digital factory workers!
* Silicon Street is a reference to Silicon Valley. In fact the DF will be located in Paris’ Rue des Jeuneurs which does not, as the name translates, mean the street of fasting people, but rather, as Wikipedia tells us, refers to an older name, the Rue des Jeux-Neufs, a reference to a 17th century bowling alley.
The June 2020 edition of the Open Subsurface Data Universe’s ‘In the Pipeline’ newsletter introduces the new Energistics integration team. OSDU plans to use recognized industry standards and expand them into critical areas of its business, notably the reservoir domain. The Energistics Integration workstream (a component of OSDU Release 3, due out later this year) is to ‘identify opportunities to leverage, adopt and implement Energistics standards’. Energistics is currently devoting almost its entire effort to the OSDU Forum and initiatives. The OSDU R3 data platform will include ingestion and delivery of WITSML standard-format files, a related JSON style guide and a ‘manifest generator’ for WITSML. Energistics has also proposed a reservoir domain data management services capability, along with related reservoir data definitions for incorporation into OSDU. These will embed Energistics’ RESQML standard, with associated parsers, ingestion pipelines and more. Energistics’ units of measurement and standard reference values are also included in this activity. OSDU is also said to be interested in expanding into the production domain with PRODML.
PPDM has also announced further allegiance to OSDU, which will be able to leverage PPDM standards and best practices. These include knowledge embedded in the PPDM data model, including terms and definitions, use of the PPDM Data Rules Library and the reference lists ‘What is a well’, ‘What is a completion’ and ‘Well status and classification’.
On the membership front, SparkCognition and Emerson/Paradigm have signed up with OSDU, bringing the overall corporate headcount to around 750! SparkCognition is to help develop the forum’s standard data platform and Emerson is to contribute RESQML expertise and software to support multi-disciplinary workflows on top of the OSDU platform. Emerson has also announced that its entire software portfolio will be connected to the OSDU data platform. Geolog already runs atop OSDU. More from The Open Group’s OSDU minisite.
In a web-based event hosted from Microsoft’s AI Centre of Excellence for Energy in Abu Dhabi, Microsoft announced the ‘Microsoft Energy Core’ (MEC), a grouping of energy industry players, technology partners and academic institutions who are to ‘infuse the energy sector with the power of the intelligent cloud, enabling innovation to flourish’.
The MEC is to ‘harness the power of artificial intelligence, cloud technologies and the internet of things to transform businesses, increase productivity and run more efficient and sustainable operations’. The MEC covers upstream, with data-driven value creation for the oil field and downstream with ‘frictionless experiences’ at gas stations and intelligent energy services to smart cities and prosumers. MEC’s ten founding partners are ABB, Accenture, Aveva, BakerHughes, Emerson, Honeywell, Maana, Rockwell, Schlumberger and Sensia.
The MEC website includes some rather old case histories from BP, Shell and Equinor covering computer vision deployed at Shell’s gas stations, Equinor’s Azure data center in Oslo and BP’s corporate-wide digital transformation and embrace of artificial intelligence. BP makes special mention of Azure automated machine learning deployed at its own ‘AI Center of Excellence’.
The MEC recalls an earlier Microsoft digital initiative, the Microsoft Upstream Energy Reference Architecture. As we mused back in April 2012, MURA did not make much sense beyond the statement that ‘involvement in MURA equates to the use of any Microsoft product—especially SharePoint’. MEC appears to update the marketing spiel with Azure replacing SharePoint!
Energistics is working on a JSON style guide for upstream data exchange standards. The JSON style guide (JSG) will cover Energistics’ WITSML, RESQML and PRODML standards. These are all currently based on XML, which remains the format of reference for the time being.
The JSG reflects JSON’s popularity for data science and cloud applications. OSDU, the Open Subsurface Data Universe, uses JSON, and the JSG is a pathway for the integration of the Energistics data standards into the platform.
Notwithstanding JSON’s popularity, the Energistics release emphasizes that XML allows for schema-based data verification. The JSG is an acknowledgement that a proposed JSON schema has now evolved to a ‘degree of usability’. But this, and other current JSON inadequacies, have led to various workarounds currently used to map from XML. The JSG is an attempt to rationalize these and allow for lossless transfer into JSON from the fuller-featured but more verbose encoding that is XML. More on the JSON Style Guide from the Energistics/OSDU web page.
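To see why a style guide is needed at all: JSON has no native notion of XML attributes or namespaces, so any XML-to-JSON mapping must adopt conventions, and differing conventions break interoperability. The sketch below uses hypothetical ‘@’ and ‘#text’ markers, not the JSG’s actual rules:

```python
# Sketch of the XML-to-JSON mapping problem a style guide must settle.
# Convention used here (illustrative only): attributes get an '@' prefix,
# element text goes under '#text'.
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem: ET.Element) -> dict:
    """Map one XML element to a dict using the '@'/'#text' convention."""
    node = {f"@{k}": v for k, v in elem.attrib.items()}
    for child in elem:
        node[child.tag] = element_to_dict(child)
    text = (elem.text or "").strip()
    if text:
        node["#text"] = text
    return node

# A toy WITSML-flavored fragment (hypothetical, not a real schema extract).
root = ET.fromstring('<well uid="w-001"><name>Demo 1</name></well>')
doc = {root.tag: element_to_dict(root)}
print(json.dumps(doc))
```

Note that repeated child elements, common in real WITSML documents, would overwrite each other in this naïve mapping; a real guide must also specify when to emit JSON arrays, which is the kind of ambiguity the JSG exists to remove.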
Brüel & Kjær Vibro has launched its ‘next-generation’ VCM-3, a 12-channel data acquisition hub for monitoring pumps, fans, motors and other machinery. The VCM-3 offers advanced condition monitoring, easy system integration, built-in cyber security and simple IT installation.
The 2020 version of Blue Marble’s Geographic Calculator comes with a new remote desktop protocol-enabled single user floating license option, new magnetic declination models including World Magnetic Model 2020 and IGRF13, and Lidar data conversion. The release introduces support for Geoid 18 for the United States.
CGG GeoSoftware has released new versions of its ‘cloud-ready’ reservoir characterization and petrophysical interpretation software. All GeoSoftware portfolio applications (Jason, HampsonRussell and PowerLog) now run on both Azure and AWS. The new releases offer advances in machine learning and artificial intelligence and new Python notebooks.
Cold Bore Technology has announced a beta version of ‘Frac Action’, its frac optimization software. Frac Action overlays high resolution, real-time stage frac data on prior stages for comparison with an ‘ideal’ pre-set stage. Frac Action leverages Cold Bore’s ‘SmartPAD’ well site data collection system.
Endress+Hauser has announced ‘Netilion’, an industrial internet of things ecosystem for maintenance and life cycle asset management. Netilion provides documentation and data management, and plant performance and health status. A Netilion Scanner app captures asset data using QR codes or RFID tags. Netilion Analytics can be used to create a digital twin of the system for proactive instrument maintenance. E+H has also announced the Micropilot FWR30 cloud-connected tank radar level sensor. The device uses 80 GHz wireless IIoT communications to connect with E+H’s Netilion Value hosting service.
Emerson’s Mimic Field 3D immersive virtual reality environment is now available from AspenTech’s digital twin portfolio. Operators can interact with valves or other field equipment and see the impact on the process. Mimic Field 3D is used in an operator training context and integrates with existing 3D CAD facility models.
Exprodat has announced V2 of Exploration Analyst, a geoscience/oil and gas toolset for ArcGIS Pro. The new release includes a Well Results tool for after-action drilling reviews and investigation of underlying spatial relationships. An Analyse Prospects tool adds portfolio management, including prospect summaries by stratigraphic stage, volumes and play success. More from the Exploration Analyst blog.
The 2020.0 release of Safe Software’s FME geographic data manipulation toolset includes an interface to Google BigQuery GIS, Google’s cloud data warehouse for geographical data types and functions.
Honeywell has announced the new Enraf Tank Inventory System (ENTIS). ENTIS leverages the Experion platform to provide terminal operators with a ‘powerful, modular and easy to use’ solution for distribution and bulk terminals. The ‘next generation’ tank inventory system supports weights and measures-certified applications for custody transfer, accounting and reporting. The solution is said to comply with cyber security standards.
Integrity Plus by New Century Software, a MISTRAS company, provides on-demand pipeline integrity services to fulfil operator requirements and assure that regulatory requirements are met.
OAG Analytics’ new raster image server handles 100GB plus sized images for use in subsurface data science workflows. Map caching enables imagery to be rendered smoothly in a web browser. OAG Analytics provides ‘hybrid machine learning’ tools that can be deployed as stand-alone applications or integrated into a company’s data science workflows. The image server is available now for on-prem deployment and will be available on the AWS marketplace real soon now.
The 2020b release of OriginLab’s eponymous data analysis and graphing software adds over 75 new features, apps and improvements. A new data navigator panel accesses multi-sheet Excel files. OriginLab provides data connectors to Matlab, HDF, Excel, NetCDF and more.
TRC Consultants has announced PHDwin V3, a new version of its economics and decline curve analytical package. V3 adds a customizable user interface, improved calculation and reporting speeds, new graphics that include a ‘bendy B-factor’. Revenue, expenses, and investments can be displayed and scenarios captured as qualified data benchmarks. PHDwin V3 is currently available to beta program participants.
ResFrac has added a new user interface, modernizing what the company describes as ‘the industry’s first genuinely coupled hydraulic fracture, wellbore and reservoir simulator’. Hydraulic fracture simulations now run in the Microsoft Azure cloud. A locally installed GUI sets up the simulation, visualizes the results and manages simulations.
ResInsight 2020.04.01, the latest version of the open source 3D visualization, curve plotting and post-processing tool for reservoir models and simulations, is a ‘major update’ that supports import of 3D surfaces, Allan diagrams and cumulative phase distribution plots. Well disks display color-coded production and injection rates and new features have been added to the Python API. Read the release notes here and visit ResInsight on GitHub.
Rose & Associates has released RoseRA, a new prospect volume, chance and aggregation tool. RoseRA is said to provide comprehensive functionality that incorporates corporate best practices. Assessments can be run over any or all prospective zones and results aggregated in a single file. RoseRA was named in honor of R&A founder Pete Rose.
Schneider Electric has released a public API for its cloud-based EcoStruxure IT Expert hosted, IoT-enabled electrical systems monitoring platform. The API will allow solution providers to integrate EcoStruxure with third party systems, adding remote monitoring of power and critical infrastructure into their portfolio. Last year BP signed a five-year agreement with Schneider for the deployment of EcoStruxure architecture across its upstream assets.
Siemens has announced the Sitrans LR100 series of 80 GHz radar tank level transmitters, described as compact, narrow-beam instruments for installation in existing vessel openings.
VELO3D has launched Sapphire, an industrial 3D metal printer with a vertical axis of 1 meter, the ‘world’s tallest’ laser-powder additive manufacturing system. The system will be commercially available starting late 2020 and is compatible with nickel-based alloys.
Christine Rhodes and James Clark (both with Chevron) presented on the Open Subsurface Data Universe’s venture into high performance computing. The OSDU HPC Project sets out to ensure that future OSDU releases are aligned with the specific requirements of upstream workflows, running either on premise or in the cloud. The OSDU/HPC reference architecture should support future OSDU goals such as on-demand computing, AI and analytics, and HPC ‘edge’ computing.
The idea is to ‘ensure that traditional HPC workflows are not hindered by emerging OSDU technology or data standards’ and also (rather enigmatically) that ‘emerging HPC technologies are not strictly bound by OSDU standards which may hinder innovation.’ During 2020, OSDU R2 will see the light of day with a common code release spanning OpenVDS (Bluware’s SEG-Y substitute format – see this edition’s lead) and Schlumberger’s OpenDES API. Before year-end 2020, R3 will see a multi-cloud ‘deployment-ready’ edition. The intersection of the interpretation world of vanilla OSDU and HPC is work in progress. The current OSDU data store is object based and accessed through domain-specific APIs and data management services and may have limited intersection with oil and gas HPC. Storing pre-stack data is considered a ‘stretch goal’ for OSDU so current HPC workflows might be connected as external data sources. Most majors are on board the OSDU HPC Project as is Schlumberger (but not Halliburton). Amazon and Azure are on the list (but not Google).
Paulo Souza Filho (Atrio) and Luiz Felipe (Petrobras) presented tests of large-scale seismic processing in the public cloud. The Atrio Composable Cloud (ACC) exposes a unified way to launch and manage computation workloads on multiple target machines – on-premises or in the cloud. Number crunching can be farmed out to AWS, Azure, Google Cloud, OpenStack and other services. A ‘pop-up’, disposable cluster can be launched from the Atrio app store. The system evaluates the time and cost of running, say, a reverse time migration in a selected container prior to run time. When all the parameters look right, the job is run. In the Petrobras trial, a 5 petaflop cluster was assembled in the public cloud, made up of 320 NVIDIA V100 GPUs and a 100TB Lustre file system, achieving 99% of the on-premises performance. But whereas the pop-up cluster was created in one hour, ‘procuring a 5PF system would take months’.
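The quoted 5 PF figure squares with the GPU count. A back-of-the-envelope sketch, assuming the petaflop figure refers to single-precision peak and a per-V100 throughput of roughly 15.7 TFLOPS FP32 (neither assumption is stated in the presentation):

```python
# Sanity-check the Petrobras pop-up cluster's quoted peak throughput.
# Assumes single-precision peak; exact per-GPU figures vary by V100 SKU.
GPUS = 320
TFLOPS_PER_V100_FP32 = 15.7  # assumed peak, not a measured value

peak_petaflops = GPUS * TFLOPS_PER_V100_FP32 / 1000  # TFLOPS -> PFLOPS
print(f"Peak throughput: {peak_petaflops:.1f} PF")   # ≈ 5.0 PF
```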
Fabio Luporini and Gerard Gorman (both from Imperial College, London) provided an update on Devito, now at V4.1. Devito is an abstraction layer that hides HPC code complexity by automatically generating GPU code ‘without the excruciating pain’. The open source Devito consortium has financial support from BP, Down Under Geophysics, Microsoft and Shell. The high-performance Python framework is driven by commercial and research seismic imaging demands. The authors observed that ‘open source is still a novel idea in this industry despite clear evidence from the tech industry that it is a critical business strategy, please engage’. More from the Devito Project.
Download these and other presentations from the Rice O&G HPC home page.
Kent Masters has been appointed Chairman, President and CEO at Albemarle Corporation, following the retirement of Luke Kissam. Current board member James O'Brien will replace Masters as Lead Independent Director.
Lem Smith is now VP Upstream Policy at the American Petroleum Institute. He hails from Squire Patton Boggs. Russell Holmes has been appointed Director of the Center for Offshore Safety replacing retiree Charlie Williams.
Deepa Poduval has been appointed Senior MD and Associate VP at Black & Veatch’s Management Consulting business to lead its Advisory and Planning practice area.
David Lawler is to succeed Susan Dio as chairman and president at BP America. He will assume the duties in addition to his role as BPX Energy CEO.
C.J. Hughes has appointed Charles Austin as President. Former president Douglas Reynolds remains as president of Energy Services.
Katherine Lemos has been appointed Chairperson and CEO of the U.S. Chemical Safety Board.
Cynet has opened its North American sales operations in Boston. Avi Mileguir has been appointed VP of cybersecurity sales. Mileguir hails from Namogoo.
Petrofac’s Daniel Gear has been named as one of five new board members to lead the Engineering Construction Industry Training Board (ECITB).
Equinor has appointed Tim Dodson as VP Strategy Execution in Global Strategy and Business Development. He is succeeded as EVP Exploration by Tore Løseth.
IOGP’s Joint Industry Program (JIP) 33 has opened a new delivery center at Aker Solutions’ Houston facilities. Ted Fletcher is stepping down as JIP33 Integration Chair to take up a new role as GM Quality at Woodside. He stays on the Steering Committee and continues as JIP33 implementation champion.
Glynn Lockyer has joined LYTT in a business development role. He was previously with Archer.
Marathon Petroleum has appointed John P. Surma as non-executive chairman of its board of directors. President and CEO Michael Hennigan is now a member of the board.
Nine Energy Service has appointed Guy Sirkes as SVP and CFO following the departure of Clinton Roeder.
Hege Kverneland is to retire as National Oilwell Varco VP and CTO. David Reid is to replace Kverneland as CTO while continuing in his role as CMO.
Oceaneering has announced a voluntary base salary reduction of the executive management team and other members of senior management during the ongoing public health and energy market crises.
Okeanus has appointed Amanda Ingram as Senior Project Manager. She hails from Oceaneering.
EagleClaw Midstream has joined ONE Future, bringing the total number of member companies to 24.
Charissa Santos has been promoted to Product Manager at OspreyData.
Quality Companies has appointed Derek Bollom as VP of offshore and offsite construction and production. He was previously with Zadok Technologies.
Mark Zoback and Joe Frantz have joined ResFrac as senior advisers.
SeekOps has appointed Jim Rutherford as VP of Engineering. He was previously with Heath Consultants.
Song Hu is the new sales and support manager at Sercel-GRC. He hails from TianJin Allwell Oilfield Technology.
Elizabeth Viator has been promoted to Stratagraph’s controller and human resources manager.
Christine Whelchel has joined Tatanka Midstream as COO. Whelchel hails from Marathon Petroleum.
Cooper Oil and Gas’ Cye Wagner is now Chairman of The Texas Alliance of Energy Producers (TAEP) board of directors.
Aberdeen-based AVC Immedia has launched TigerLive radio, a digital radio platform for the energy sector.
Kevin Raymond is Director of Services at Validere. He was previously with Accenture Strategy.
Chad Teply has been named SVP of project execution at Williams. Teply was recently with PacifiCorp.
Co-founder of Gaffney Cline & Associates, William Benjamin (Ben) Cline IV passed away on April 19, 2020 at the age of 87. There is an online obituary in the JPT.
Vitol chairman Ian Taylor has died aged 64. The Financial Times obituary described him as ‘a founding father of the modern commodity trading industry’.
Agile Scientific’s Software Underground is ‘going legal’ with a move to incorporate in Canada as a non-profit. The plan is to change ‘as little as possible’ with access to the Slack channel remaining free for everyone. The inaugural AGM is slotted for 2021.
CUI Global is changing its name to Orbital Energy Group. The new name reflects its transformation to a ‘diversified energy and infrastructure solutions platform’. The name change follows the 2020 acquisition of Reach Construction Group which added renewable energy EPC expertise to the portfolio.
In a recent tender offer, Forum Energy Technologies has repurchased some $58 million (15%) of its outstanding notes at a clearing price of $400 per $1,000 of principal. Forum has now repurchased a total of $70 million worth of notes at an average price of $385.
Flotek Industries has acquired JP3 Measurement, a data and analytics technology company, in a $34.4 million cash and paper deal. Flotek president John Gibson observed, ‘This market has created such chaos and uncertainty that numerous growth opportunities – both organic and inorganic – have emerged. We have been disciplined in vetting those opportunities with a desire to reduce our dependence on rig count and the US unconventional market, while establishing an offering in the digital transformation market. An estimated $1 billion addressable market [for JP3’s technology] in the US alone provides significant revenue growth opportunities’. JP3 provides real-time analysis of flowing products and cloud-based monitoring. JP3 recently announced a joint marketing agreement with refiner Phillips 66.
Oil and gas MRO (maintenance, repair and operations) service provider GoExpedi has raised a $15 million debt facility from Silicon Valley Bank. The monies will be used to expand warehouse capacity across North America and to hire additional software developers. GoExpedi claims a 25% price advantage over its competitors, enabled by its own last mile transport and accurate inventory tracking solution.
Intelsat has filed for bankruptcy protection in the US in what the Financial Times described as ‘a move that enables the indebted private equity backed satellite operator to prepare for a government auction of its airwaves that could raise $4.9 billion’.
IIA Technologies unit KeyLogic Systems has acquired OnLocation, a provider of mathematical models that address energy and environmental policy issues. OnLocation’s latest analysis explores the impact of a hypothetical US fracturing ban. The study, initiated by the American Petroleum Institute, leveraged a custom version of the US Energy Information Administration’s National Energy Modeling System, ‘NF-NEMS’ and found that an outright ban would produce a GDP decline of $1.2 trillion and 7.5 million jobs lost by 2022.
Kongsberg Digital has increased its holding in NorSea Digital, which is to change its name to KONCIV. NorSea is part-owned by private investor Jacob Møller who is to become the new chairman of KONCIV. The company provides cloud-based digital logistics services. Kongsberg Digital’s share now amounts to some 40%. KONCIV’s cloud solution is developed and operated as part of Kongsberg Digital’s platform.
Nabors Industries has received notification from the New York Stock Exchange that it is no longer in compliance with the NYSE continued listing criteria that requires a share price of at least $1.00 over a period of 30 consecutive trading days. The company has six months to regain compliance. A reverse stock split is planned.
In a $7 billion transaction, NVIDIA has completed its acquisition of Mellanox, a provider of high-performance networking technology.
Pacific Drilling has received notice from the NYSE that it is not in compliance with the minimum share price standard for continued listing.
Peloton has acquired ExproSoft, a provider of well integrity, reliability and data modelling software and consulting services based in Trondheim, Norway. ExproSoft develops WellMaster and Miriam RAM Studio software that provides data-driven performance analytics for the oil and gas and process industries. The solutions will expand Peloton’s Platform solutions for lifecycle well and production data management and enhanced decision making.
Simulation software provider ResFrac has closed a preferred share financing with its financial sponsor Altira Group. The (undisclosed) funds will be used to support the company’s growth. ResFrac customers include Hess, QEP Resources, ConocoPhillips and Shell.
3D printing specialist VELO3D has raised $28 million in a Series D funding round with new investors Piva and TNSC. The deal brings VELO3D’s funding to $138 million. Piva, the largest investor in the new round, is backed by Malaysia-based Petronas.
Following ‘careful deliberations’ the board of Weatherford International has withdrawn its appeal to the delisting proceedings by the NYSE. The Company’s shares of common stock will be delisted from trading on the NYSE. Weatherford continues to trade on the OTC Pink Marketplace.
Speaking at the 2020 LBCG Pipeline leak detection congress in Houston earlier this year, Barrett Walker presented Cheniere Energy’s drone-based approach to pipeline surveillance. A drone-mounted Velodyne LIDAR detector provides orthomosaic mapping with ground control points. DroneDeploy software extracts features such as buildings and equipment. Such surveying is rapidly becoming a viable solution for fugitive gas emissions as the software improves quantitative measurement. A wide range of sensors is available, including the ICI OGI Inspector (www.infraredcameras.com), the Mirage HC optical gas imaging (OGI) camera and tunable diode laser absorption spectrometers (TDLAS). Moreover, proposed changes to the EPA air emission measurement center’s Method 21 are making these types of sensors ‘more viable and cost effective’. Some can even detect leakage from buried pipe. Infrared thermo-photography shows leaks vividly! Walker strongly advocates putting drones to work across your business as ‘unmanned systems impact every department!’ But a word of warning: you need to be aware of the evolving regulations of the dronosphere and work with law enforcement to know the process and contacts in the event of an incident. One useful resource is the API Guide to Developing a UAS Program in the oil and gas industry.
Chris Minto reported on 2019 trials of OptaSense’s optical fiber-based DAS technology carried out in conjunction with Brazil’s Centro de Tecnologia em Dutos (CTDUT) pipeline testing facility in Rio de Janeiro. The facility’s closed-loop pipeline allows engineers to test leak detection systems in a realistic situation. The trials have validated OptaSense’s leak detection technology, which was shown to detect leaks in under 10 seconds. Sensitivity was good too, at 15 litres/minute. Minto concluded that validation of external leak detection systems is only possible with representative tests at full scale and that the CTDUT site is an excellent location with a full-bore line and an existing fibre optic cable. Watch the movie.
Eric Bergeron presented FlyScan’s airborne UV spectroscopy which fills the gap between cheap, low-sensitivity SCADA or eyeballed air patrols, and high-end, expensive fiber/pigs/DAS or satellites. Bergeron cited a probability of failure vs. cost of failure analysis from pipeline risk consultants WKM Consulting. This showed, from an analysis of PHMSA reportable incidents, that the cost of failure amounts to some $5,000 per leaked barrel. FlyScan’s long term vision is to provide a right-of-way air patrol leveraging artificial intelligence, imaging and laser technology to fulfil inspection requirements as per 49 CFR Parts 192 and 195. Bergeron referred to Kent Muhlbauer’s classic, ‘Pipeline risk assessment, the definitive approach’.
Heath Spidle from the Southwest Research Institute in San Antonio showed how machine learning has been applied to fiber optic and Lidar data. ML has been used on data from MWIR OGI cameras to ‘autonomously and reliably’ detect methane with low false alarm rates. The approach is now being adapted for aerial (drone-based) detection. ML has also been successful in detecting crude oil sheen on multiple surfaces including water. Currently, ‘detecting leaks from liquid pipelines [using distributed temperature sensing] continues to pose a significant challenge’ and ‘small leaks are of particular concern’. SWRI has investigated several machine learning and deep learning techniques and has come up with an in-house ML algorithm that detects both large and small leaks with zero false positives. Spidle concluded that ‘ML can find the hard to find leaks.’
Jay Almlie manages iPIPE, the intelligent pipeline integrity program at the US Energy & Environmental Research Center (EERC). The program scouts for new technology to be co-funded by iPIPE. iPIPE then organizes trials and demonstrations on live pipelines. An iPIPE success is the Satelytics project, an ongoing trial of leak detection algorithms working with hyperspectral satellite data.
John Hull (HiFi Engineering) presented the results of field trials of distributed fiber optic sensing* (DFOS) on a pipeline conducted for ExxonMobil. DFOS is an emerging technology for real-time pipeline monitoring. In particular, a new class of DFOS, ‘high fidelity sensing’, has been specifically designed to directly sense low volume leaks. In a 90-day DFOS field trial for ExxonMobil, on a West Texas pipeline, water and nitrogen were used to simulate liquid and gas leaks. In a blind test, the system detected 118 out of 134 simulated leaks, with zero false positives. The system was sensitive down to 200 psi, at low flow rates. Ongoing algorithm improvements and sensor proximity are expected to further improve leak detection accuracy.
More from the conference home page.
* DFOS, distributed fiber optic sensing, also called distributed acoustic sensing (DAS) involves observing backscattered light in a fiber optic cable. By accurate time-gating of the returned light, continuous measurements of sound, temperature and stresses along the fiber can be made.
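The time-gating described above amounts to simple distance arithmetic: a scattering event's position along the fiber is the round-trip travel time scaled by the speed of light in the glass, halved. A minimal sketch, assuming a group refractive index of about 1.468 for silica fiber (real interrogators calibrate this per cable):

```python
# Map a DAS/DFOS backscatter arrival time to a position along the fiber.
C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s
N_GROUP = 1.468           # assumed group index of silica fiber

def position_along_fiber(round_trip_time_s: float) -> float:
    """Distance (m) from the interrogator to the scattering point.

    The probe pulse travels out and the backscatter travels back,
    hence the division by two.
    """
    return (C_VACUUM / N_GROUP) * round_trip_time_s / 2.0

# A return arriving 100 microseconds after the pulse left maps to ~10.2 km.
print(f"{position_along_fiber(100e-6) / 1000:.1f} km")
```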
Total has entered into a multi-year agreement with Cambridge Quantum Computing (CQC) to develop quantum algorithms for carbon capture, utilization and storage (CCUS). CCUS begins with the capture of CO2 from sources such as coal and oil-power plants, steel manufacturers and cement works. The ‘use’ part of the equation can involve assisted recovery of oil and gas and also the transformation of CO2 into synfuels (albeit with some serious thermodynamic challenges). CQC’s ‘Eumen’ software is a ‘quantum chemistry’ application, so it seems probable that the deal will address the development of novel molecules for capture.
As we reported last year in ‘Quantum computing in oil and gas’, potential applications of quantum computing in oil and gas include computational material science, where the ability to accurately model ground states of fermionic systems would have significant implications for many areas of chemistry and materials science. Such uses may include the catalysts and solvents used in CO2 capture.
How far off are such developments? CQC’s Eumen is described as a ‘complete (software) package to facilitate the design of pharmaceuticals, speciality chemicals, performance materials and agrochemicals’. Last year, CQC reported a ‘breakthrough in quantum chemistry’ with an enhanced ‘variational quantum eigensolver’ designed to calculate the energy of molecular ground states on ‘near-term quantum computers’. Which of course raises the question of how ‘near’ the near term is. At the Paris quantum computing event, the consensus was that for industrial quantum computing, the time scale is elusive, but that quantum chemistry (along with cryptography) should be one of the first applications.
Last year the UK National Physical Laboratory launched a joint project with CQC to develop a validation framework for quantum experiments running on physical (i.e. real) quantum computers. The project is funded by Innovate UK to ‘enhance the UK’s competitive advantage in quantum computing’. Yes, it’s Cambridge UK, not Cambridge MA! More from CQC.
McKinsey bloggers report on scouting Permian basin activity from space. Earth observation satellites provide an accurate view of shale activity in near real time, now that high and very high-resolution imagery is commercially available. On-demand image acquisition with up to daily revisits and advanced-analytics image processing provides an ‘outside-in’ way of reporting on activity, independent from other market sources.
McKinsey uses AI image-processing to provide ‘absolute accuracy’ and detail fine enough to identify a frack job and count the number of trucks involved. McKinsey has monitored 30,000 square kilometers of the Permian twice a month since September 2019 and has identified every newly cleared well pad and drilling or fracking event.
Satellite-derived observations have been combined with publicly-available regulatory data using ‘advanced analytics’ to provide insights into shale oil and gas activity. The McKinsey analysis found that new well-pad clearances have been increasing steadily since November 2019, with large public E&P companies leading the pack and a decline in the share of majors. Intriguingly, 20% were cleared without filing for a permit!
McKinsey is now working to add new data attributes such as working capital (for rigs, pipes, sand, and frac fleets) and pad-specific information on water and sand injection, opening up new opportunities for optimizing operations and monitoring competitors.
The McKinsey approach mirrors that used by Verisk/WoodMac, as we reported from the 2019 Esri PUG, and Baker Hughes’ truck-mounted ‘AI-to-go’ surveillance, the lead in our last issue. We also heard at the PUG from Orbital Insight on how oil storage can be tracked by checking the shadow on oil storage tanks.
Alan Bryant (Occidental) provided a sneak preview of the ISA112 Scada standard. Work on ISA112 began in 2016 on reference architectures, a common terminology and lifecycle guidance for Scada systems. ISA112 is also to clarify how other ISA standards for cyber security, alarm management and safety relate to Scada. A first draft of the new standard is now out for comments with publication set for 2022. More information from the ISA.
Arlen Nipper (Cirrus Link) described MQTT and Sparkplug as central to the digital transformation of the oilfield. Cirrus Link has layered the Sparkplug B interface onto MQTT to create a ‘complete’ industrial internet environment. The company has worked with some 35 manufacturers to implement Sparkplug B natively on their devices. Cirrus Link’s Chariot can be deployed as a standalone MQTT server, or data can be ‘injected’ into Azure, Google, AWS, IBM and others (Rockwell, ABB) from nearly 100 Scada electronic flow meter protocols. The ‘simple and open’ MQTT spec ‘has become the dominant IoT/IIoT messaging transport’ thanks to its low bandwidth requirements, secure comms and access control. Sparkplug B adds an operations technology-centric tag namespace and payload definition that supports auto-discovery and a ‘single source of truth’ at the edge.
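The Sparkplug B tag namespace mentioned above is just a structured MQTT topic of the form spBv1.0/&lt;group_id&gt;/&lt;message_type&gt;/&lt;edge_node_id&gt;[/&lt;device_id&gt;]. A minimal sketch of a topic builder; the group, node and device names below are invented for illustration:

```python
# Build Sparkplug B topics layered over MQTT.
# Message types are those defined by the Sparkplug B specification.
VALID_MSG_TYPES = {"NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
                   "NDATA", "DDATA", "NCMD", "DCMD"}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    if message_type not in VALID_MSG_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    parts = ["spBv1.0", group_id, message_type, edge_node_id]
    if device_id:  # device-level messages append a device id
        parts.append(device_id)
    return "/".join(parts)

# A device birth certificate for a (hypothetical) wellsite flow meter:
print(sparkplug_topic("PermianOps", "DBIRTH", "Pad-07-RTU", "EFM-1"))
# spBv1.0/PermianOps/DBIRTH/Pad-07-RTU/EFM-1
```

The BIRTH/DEATH message types are what give subscribers the auto-discovery and state-awareness that plain MQTT lacks.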
Brandon Davis, a regular at the Wellsite Automation event since 2016, presented his evaluation of Scada systems for Red Bluff Resources. The company previously had multiple 3rd party systems that would not talk to each other, had multiple interfaces for operators and lacked data consistency. Similar issues were found in the automation package and artificial lift systems. Davis has evaluated message-based systems and found that these technologically attractive solutions (especially to OEMs) currently suffer from poor support from device vendors, although this is changing. Another consideration is data storage, either to a real-time data historian or to the cloud. The historian offers strong ad-hoc visualization and analytics tools with both MQTT and OPC interfaces. The cloud offers more IoT functionality but is currently not designed for oil and gas and requires a significant development effort.
Red Bluff’s current solution now embeds Signal Fire’s Pressure Scout wireless telemetry solution, Redlion Graphite HMI and Crimson software that offers protocol conversion to MQTT and cloud storage to AWS and/or Azure. The Apache ActiveMQ MQTT broker also ran.
Evan Rynearson (Middle Fork Energy Partners) showed how MQTT has been deployed to decrease communications bandwidth from the wellsite and increase efficiency. Rynearson recently challenged the LinkedIn community to ‘give me some examples of real world dollar value from a digital transformation’. The post generated considerable interest*. Scada and automation folks have been building infrastructure for their entire careers. But MQTT ‘takes it up a level’ and drives huge efficiencies. Problem is that ‘Scada folks hate change’. The appropriate strategy for MQTT deployment is therefore to keep and utilize what has already been built. Scada today is complex, intertwined and interdependent. But big benefits are achievable with little investment. The secret is templating common solutions, an approach that is already well understood by construction, maintenance and other fields. Templates leverage a unified namespace and data sources now supply unambiguous tag names. As field devices still run their native formats, these need to be converted to MQTT, adding the template-derived metadata. Middle Fork uses the low-cost Ignition Edge Scada server with an OPC DA module. The whole Scada system is now published into the MQTT infrastructure and namespace. Scada data is now available to all parts of the business. Rynearson advises starting with alarms that are pushed to operators in real time. Looking further up the value chain, ‘predictive models are impossible to scale without a unified namespace and known data intervals’. Rynearson acknowledged Sigit’s help with the namespace.
* Although not too many ‘real world examples’.
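The templating idea Rynearson describes can be sketched as a simple mapping from native device tags to canonical metric names in a unified namespace. The template fields and tag names below are invented for illustration, not taken from the talk:

```python
# Map native device tags into a unified namespace so every consumer
# sees unambiguous, template-derived names instead of raw mnemonics.
WELL_TEMPLATE = {
    "PT1": "Tubing Pressure",  # native tag -> canonical metric name
    "PT2": "Casing Pressure",
    "FT1": "Gas Flow Rate",
}

def to_unified_namespace(site, well, native_tag):
    metric = WELL_TEMPLATE.get(native_tag)
    if metric is None:
        raise KeyError(f"tag {native_tag!r} not in template")
    return f"{site}/{well}/{metric}"

print(to_unified_namespace("MiddleForkDemo", "Well-12", "PT1"))
# MiddleForkDemo/Well-12/Tubing Pressure
```

One template serves every well built to the same pattern, which is what makes the approach scale across a field.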
Hoss Belyadi showed how Vine Oil & Gas has leveraged machine learning to develop a ‘dynamic completions optimization workflow’ for its Haynesville Shale operations. Data mining and machine learning techniques are essential to extract information and knowledge from very large raw data sets. Working with a dataset of 222 Haynesville wells, Belyadi used the Pearson correlation coefficient to detect and remove collinearity in the data. A variety of ML approaches (feature selection and ranking, grid search…) were trialled. Feature ranking with Random Forest was deemed a ‘very powerful’ supervised ML algorithm; combining many decision trees into a single model is the fundamental concept behind random forest. Support vector machines and neural nets were also ‘powerful’, as was a ‘one variable at a time’ sensitivity analysis of the SVM output. The results showed, inter alia, that water/proppant loading and cluster spacing have the most significant impact on production performance across operated and non-operated wells.
More from the conference website.
* For more on workflows for building ML models read Belyadi’s book ‘Hydraulic Fracturing in Unconventional Reservoirs’, ISBN 9780128176658, Elsevier, June 2019.
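The collinearity screen Belyadi describes reduces to computing pairwise Pearson correlations and dropping one feature of any pair above a threshold. A minimal sketch; the feature names and toy values are invented, the real input being the 222-well dataset:

```python
# Pearson-correlation screen: drop the later feature of any collinear pair.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def drop_collinear(features, threshold=0.9):
    """Return the feature names to keep, in original order."""
    keep = []
    for name in features:
        if all(abs(pearson(features[name], features[k])) < threshold
               for k in keep):
            keep.append(name)
    return keep

toy = {
    "proppant_per_ft": [1200, 1500, 1800, 2100],
    "fluid_per_ft":    [30, 37, 45, 52],   # nearly proportional to proppant
    "cluster_spacing": [50, 25, 40, 30],
}
print(drop_collinear(toy))  # fluid_per_ft dropped as collinear with proppant
```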
ARRIA and TIBCO are teaming to provide a ‘factual, socially responsible’ view of COVID-19 data by adding Arria’s NLG to TIBCO’s freely available COVID-19 Live app.
Capgemini’s SAP S/4HANA-based solutions for the Oil and Gas industry, EnergyPath and READY Upstream, are now ‘optimized’ for Microsoft Azure.
Sword Venture and Target Energy Solutions are to combine Sword’s data science and analytics-led services with Target’s MEERA to deliver a ‘next generation’ trusted data platform to the oil and gas industry.
Total has chosen eDrilling’s well construction planning technologies, as part of the Total ‘T-Desk’ drilling engineering software kit and for an upgrade of its Drilling & Wells digital platform.
CNPC Tarim has selected eDrilling for more realistic dynamic modelling of the downhole environment for their well designs, accounting for transient effects in the wellbore.
Equinor reports real-world use of the Microsoft Hololens to conduct checks and controls during the construction of Johan Sverdrup Phase 2. Aibel’s engineers use the headset to view a 3D model superimposed on their view of the actual platform. The model is Echo, Equinor’s digital twin of the platform, currently being built at Aibel’s construction yard in Haugesund, Norway. Echo is also a window into data stored in Omnia, Equinor’s cloud solution for construction data, and other Equinor databases including SAP and ProCoSys, Equinor’s commissioning database. More from Equinor’s in-house magazine.
Fusionex has been awarded a contract by an unnamed major oil and gas company to implement a data-driven digital platform to enhance customer experience.
Shell has acquired a five-year software subscription of GSE Solutions’ EnVision cloud-based simulator learning software for $1.65 million.
ICIS has expanded its strategic partnership with Enverus (formerly DrillingInfo) to bring ICIS energy news to the MarketView platform, allowing traders to access pricing and latest energy news in one place.
IDS has been awarded the 'DDR Plus’ certification from the International Association of Drilling Contractors (IADC).
Implico’s iGOS data exchange has been implemented at the Evos terminal in Hamburg. The tank terminal and the HPA now exchange train data automatically over an interface to a cloud communication service. The implementation runs on cloud technologies including Docker and Kubernetes.
KBC (a Yokogawa company) is to deliver a site-wide process digital twin to Rompetrol. The digital twin is based on KBC’s Petro-SIM process simulation software and will automate modelling activities, reconcile data from a wide range of refinery sources and provide site-wide optimizations.
Koch has selected C3.ai’s model-driven AI architecture and SaaS AI applications for its digital transformation.
Hibernia Resources has chosen NarrativeWave’s software under a two-year partnership-agreement to optimize production and detect anomalies in the operation of its oil and gas wells.
Energean and Democritus University of Thrace (DUTH) have partnered to deploy the ODYSSEA platform (an EU-funded project), on Energean’s gas production platform in South Kavala, Greece. The ODYSSEA equipment will monitor selected oceanographic parameters such as conductivity, water temperature, acidity, water level, turbidity, dissolved oxygen, currents at various depths over the water column.
Petrofac has secured a 3-year extension to its existing maintenance contract and a new 4-year metering contract with BP. Petrofac will continue to provide campaign inspection and maintenance services on BP’s North Sea assets. The metering agreement includes on and offshore consulting and support services.
CNX has implemented Quorum’s myQuorum division order and revenue accounting solutions, a purpose-built solution to drive operational efficiency in oil and gas operations. The ‘cloud-ready’ solution was delivered and populated in ‘just 9 months’.
Future Gas Station, a Recon Technology subsidiary, and China Petroleum Planning and Engineering Institute, a CNPC unit, have signed a joint operation agreement to deploy the DT Refuel mobile app in Zhejiang. Recon also has a $2.8M contract with Grand Energy Development for the design of a heavy oil transportation system in the Garraf Oilfield, Iraq.
Rotork’s IQ range of intelligent electric actuators is to be used in remote locations on an (unnamed) Indian pipeline. The actuators are linked by Rotork’s Pakscan network.
SAP and Accenture have co-developed a solution for upstream oil and gas operations based on SAP S/4HANA to streamline processes and cash flow. The solution includes contributions from ConocoPhillips and Shell.
Sercel has deployed its 508XT land-based acquisition system on a large-scale (60,000 channels and 45 Nomad 65 Neo vibrators) seismic survey being carried out by Sinopec Geophysical.
SparkCognition has joined The Open Group OSDU Forum to develop a standard data platform for the oil and gas industry.
TC Energy has migrated ‘almost 90%’ of its corporate and commercial applications, including SAP, to Amazon Web Services to automate workflows, unlock data and improve efficiency for its pipeline and power generation businesses.
Woodside has awarded TechnipFMC a second consecutive five-year iEPCI frame agreement for the development of its Lambert Deep and Phase 3 of the Greater Western Flank fields, located offshore North Western Australia.
Teledyne Marine is to provide its ‘eco-friendly’ eSource seismic source to WesternGeco. eSource will be used on an upcoming wide-azimuth survey in Brazil.
Knust-Godwin has secured the first order to produce parts for an oil and gas application with VELO3D’s new Sapphire 3D metal printer.
Emerson has signed a multi-year agreement with YPF to provide its E&P software suite for seismic data interpretation and visualization which is now YPF’s corporate standard.
The Global Legal Entity Identifier Foundation has published the Legal Entity Identifier (LEI) taxonomy as a proposed recommendation. Download the taxonomy here. We were curious to know how the LEI relates to the ISO 8000-116 standard we learned about from Peter Eales in our 2017 article. As Eales explained to us, ‘During the development of the ISO 8000-116 standard there was a liaison between the two working groups, and an acceptance by GLEIF that, by definition, the LEI as defined in ISO 17442 is a ‘proxy identifier’. Whereas the LEI (a.k.a. ISO 17442) creates its own registry and assigns a number to a legal entity, ISO 8000-116 uses a country and registry code from ISO 3166 and adds the existing local identifier. For example, MRO Insyte has been assigned a company number of 06236771 by Companies House in England & Wales, so its authoritative legal entity identifier is GB-EAW.CR:06236771.
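The ALEI in the example above decomposes into jurisdiction, registry and local identifier. A sketch of a parser for that shape; the regular expression is inferred from the GB-EAW.CR:06236771 example rather than from the ISO 8000-116 text itself, whose formal grammar may differ:

```python
# Parse an ISO 8000-116-style ALEI:
#   <ISO 3166 country[-subdivision]>.<registry code>:<local identifier>
import re

ALEI_RE = re.compile(
    r"^(?P<jurisdiction>[A-Z]{2}(?:-[A-Z]{1,3})?)\."  # e.g. GB-EAW
    r"(?P<registry>[A-Z0-9]+):"                       # e.g. CR (Companies House)
    r"(?P<local_id>\S+)$"                             # e.g. 06236771
)

def parse_alei(alei: str) -> dict:
    m = ALEI_RE.match(alei)
    if not m:
        raise ValueError(f"not a well-formed ALEI: {alei!r}")
    return m.groupdict()

print(parse_alei("GB-EAW.CR:06236771"))
# {'jurisdiction': 'GB-EAW', 'registry': 'CR', 'local_id': '06236771'}
```

The point of the format, as Eales notes, is that the local identifier already exists in an authoritative government register; the ALEI merely qualifies it.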
The XBRL US Data Quality Committee has published its 12th ruleset for public review. The DQC helps enhance data accuracy by providing US GAAP and IFRS filers with freely available automated checks that can test an XBRL financial statement prior to SEC submission.
The front cover of the April 2020 newsletter from ECCMA, the international standards body, rather grandly sports the tag line, ‘data science specialists, shaping and implementing data quality standards for two decades and counting’. The contents are rather more prosaic, addressing the 2019 ISO 8000-116 standard for corporate ‘authoritative’ legal entity identifiers (ALEIs). ECCMA’s public website is said to allow companies to locate their on-line government register, find their ALEI, and format it in accordance with ISO 8000-116.
The US SEC has updated its Test Suite for software developers working on filings that conform to the Edgar Filing Manual.
The Open Geospatial Consortium (OGC) seeks public comment on the candidate CDB* version 1.2 standard. The CDB is a conceptual model for data management in a synthetic environment as required in high-fidelity simulation or mission rehearsal, such as battlefield simulation. The standard is said to address the challenge of ‘plug-and-play interoperability’ and reuse of geospatial data in a modeling and simulation environment. More from OGC.
* Formerly the Common DataBase
The OGC has published the outcomes of its 2019 R&D initiative, Testbed-15. The program tested OGC-compliant earth observation technology, geospatial data security, federated web service cloud environments and more. The testbed was backed, inter alia, by Natural Resources Canada, the USGS and NASA.
Energistics has announced a candidate release, V3.0, of PWLS, the Practical Well Log Standard. PWLS enumerates the terminology used by service companies to describe proprietary tools and associated data deliverables. The standard categorizes these marketing names and obscure mnemonics in plain English. PWLS provides an industry-agreed list of logging tool classes and a hierarchy of measurement properties. The previous PWLS V2.0 dates back to 2003. V3 has been expanded to cover four product lines: wireline, drilling, mudlogging and MWD/LWD, associating the logging tools with the curves they measure. The PWLS version 3.0 candidate release is available for public review until August 15, 2020. Download the zip file here.
PPDM reports that its participation in the Open Subsurface Data Universe project will allow OSDU to leverage PPDM standards and best practices by extracting terms and definitions embedded in the PPDM data model, using the PPDM data rules library to improve data trust and leveraging reference lists such as the What is a Well, Completion and Status documents. PPDM has announced the imminent release of Well Status & Classification V3 with updated facets, clarifications and sample rules.
The technical standards committee of the Society of Exploration Geophysicists has announced the imminent release of SEG-Y Release 2. The update to the SEG’s standard for stacked seismic data (the original SEG-Y was published in 1975) is intended to simplify data ingestion with a machine readable format and a comprehensive header that is suited for use in the cloud (see also the lead article in this issue). A companion guidance document will be issued to show oils and contractors how to leverage the new mandatory fields. The standard and guidance document will be available on the TC web page real soon now.
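The ‘implementation matters’ argument is easier to see with the SEG-Y byte layout in mind: the fields a reader needs to characterize a file sit at fixed offsets in the 400-byte binary header that follows the 3,200-byte textual header, so a cloud client can fetch just the first 3,600 bytes with a ranged GET rather than the whole object. A minimal sketch in Python (the synthetic header values are invented for illustration; the offsets follow the published SEG-Y layout):

```python
import struct

TEXT_HEADER = 3200   # EBCDIC textual header, bytes 1-3200
BIN_HEADER = 400     # binary file header, bytes 3201-3600

def read_binary_header(buf: bytes) -> dict:
    """Pull three well-known fields from a SEG-Y binary file header.
    All values are big-endian 16-bit integers."""
    bh = buf[TEXT_HEADER:TEXT_HEADER + BIN_HEADER]
    return {
        "sample_interval_us": struct.unpack(">h", bh[16:18])[0],  # bytes 3217-3218
        "samples_per_trace":  struct.unpack(">h", bh[20:22])[0],  # bytes 3221-3222
        "format_code":        struct.unpack(">h", bh[24:26])[0],  # bytes 3225-3226
    }

# Build a synthetic file header: 4 ms sampling, 1500 samples, format code 1
hdr = bytearray(TEXT_HEADER + BIN_HEADER)
hdr[3216:3218] = (4000).to_bytes(2, "big")
hdr[3220:3222] = (1500).to_bytes(2, "big")
hdr[3224:3226] = (1).to_bytes(2, "big")
print(read_binary_header(bytes(hdr)))
```

Random access to individual traces is the harder problem, since trace lengths must be known or indexed, which is presumably where the Guidance Note comes in.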
The Industrial Internet Consortium (IIC) has published a short note on ‘Enabling digital transformation (DX) IoT performance and properties measurement’. The note describes the IIC’s IoT security maturity model, a framework that ‘informs compromises that improve the trustworthiness of a system’.
The OPC Foundation has announced OPC UA Safety Release 1.00. SR 1.00 aka Part 15 of the OPC UA core spec is based on the ‘black channel’ principle and addresses controller-to-controller communication using OPC UA clients/servers.
In the OPC’s March 2020 newsletter, OPC president Stefan Hoppe bemoaned the ‘downside of the user group boom’ and the emergence of independent ‘splinter groups’ and ‘completely new consortia’ that are taking up the cause of interoperability. ‘Instead of investing a lot of effort in the foundation of new organizations, I invite you to join the OPC Foundation directly’.
The World Wide Web Consortium has published a web of things architecture and WoT ‘Thing’ description. The new recommendations are said to enable ‘easy integration’ across IoT platforms and applications. The WoT covers smart home, industrial, smart city and other domains where IoT systems combine devices from multiple vendors and ecosystems.
The Object Management Group has formed a ‘Digital Twin Consortium’ along with Ansys, Dell Technologies, Microsoft and Australian property developer Lendlease. The plan is to create standard terminology and reference architectures, and to share use cases across industries ‘from aerospace to natural resources’. DTC early innovators, a.k.a. ‘groundbreakers’, include the US Air Force Research Lab, Bentley Systems and others.
Susan Blevins (ExxonMobil) and Leslie Savage (Texas Railroad Commission), speaking on behalf of the National Petroleum Council (NPC), described how the ‘dual challenge’ of providing affordable and reliable energy while addressing the risks of climate change is to be met. Following a 2017 request from the Secretary of Energy, the NPC has conducted a study into the potential pathways for integrating CCUS at scale into the energy and industrial marketplace. The study found, inter alia, that ‘widespread CCUS deployment is essential to meeting the dual challenge at the lowest cost’ and that ‘CCUS can favorably position the United States in new market opportunities as the world transitions to a lower CO2 intensive energy system’. The report recommends amendments to Section 45Q of the US tax code to simplify CCUS rules and make the activity economic. The NPC argues that CCUS deployment at scale will mean moving from 25 to 500 million tonnes per annum of capacity, involving a $680 billion investment. Read the Roadmap to at-scale deployment of carbon capture, use and storage on the NPC minisite.
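For scale, the NPC’s own numbers imply a unit cost that is easy to back out (illustrative arithmetic only; the report does not itself quote a per-tonne figure):

```python
# NPC figures: CCUS capacity grows from 25 to 500 million tonnes per
# annum, for a $680 billion investment.
current_mtpa = 25
target_mtpa = 500
investment_usd = 680e9

added_capacity_tpa = (target_mtpa - current_mtpa) * 1e6  # tonnes per annum
unit_cost = investment_usd / added_capacity_tpa
print(round(unit_cost))  # ~1432 USD per annual tonne of capacity
```

That order of magnitude helps explain why the report leans so heavily on 45Q amendments to make the activity economic.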
Grant Bromhal presented the NETL-hosted SMART initiative, a.k.a. ‘science-informed machine learning to accelerate real-time decisions in subsurface applications’. Machine learning can transform subsurface operations and improve and extend traditional science-based prediction. SMART combines field data (drilling) plus fundamental theoretical and laboratory studies (shale) with a data-driven approach using deep neural nets. The aim is to transform reservoir management with ‘dramatic improvements’ in subsurface visualization, exploiting ML to achieve speed and enhance detail. The new approach aims at optimizing CO2 injection and brine production across multiple wells to maximize storage and minimize the pressure plume. More from the NETL. The SMART program is described in a 2019 report from Carnegie Mellon.
Jørg Aarnes presented DNV GL’s work on the quality assurance of CO2 storage and on verification of such facilities against ISO 27914:2017. The ISO best practice report incorporates learnings since 1990 from projects including Sleipner, In Salah, Snøhvit, QUEST, Illinois and Gorgon. Note that the standard applies to injection of CO2 into geologic units for the purpose of storage (sequestration) and does not apply to CO2 storage in conjunction with enhanced hydrocarbon recovery. DNV GL-SE-0473 is a certification framework that verifies conformity with ISO 27914 across a project lifecycle.
Shawn Bennett from the Office of Oil and Natural Gas described the DOE Water Security Grand Challenge. Here the DOE, USGS and EPA are partnering to transform produced water from a waste to a resource. The project has resulted in the risk-based data management system (RBDMS), an integrated suite of tools for managing oil and gas regulatory data. The RBDMS was launched in 1992. Today, North Dakota is undertaking a major system upgrade, ‘NorthSTAR’, adapting innovations previously developed for California WellSTAR. A Texas ‘LoneSTAR’ is under development. More from the DoE.
Scott Frailey (Advanced Resources International) presented the CO2 Storage Resource Management System (SRMS), a classification system for CO2 storage sites that derives from the SPE’s Petroleum Resources Management System (PRMS). The SRMS represents a project maturity-based classification and categorization of storable quantities. The classification provides standardized terminology and definitions similar to the familiar resource assessment methodology of the PRMS.
Michael Godec (also with ARI) asked rhetorically if CO2-EOR is a niche or a robust carbon management strategy. The US conventional oil in-place endowment is 624 billion barrels. Primary recovery and water flooding have recovered about a third of this oil endowment, leaving behind 414 billion barrels, much of which is technically favorable for CO2-EOR. Enter ‘next generation’ CO2-EOR, driven by enhancements to the IRC Section 45Q tax code that resulted from the Bipartisan Budget Act of 2018. This, inter alia, eliminates the old 75 million metric ton cap for new facilities that break ground by year-end 2023. The new code provides subsidies of $50/mt for geologic storage and $35/mt for EOR and more support for a CO2 pipeline infrastructure. The next generation initiative seeks to transform CO2-EOR into ‘carbon negative oil’. Previous life-cycle analyses of CO2-EOR do not represent the state of the art in technology and fail to account for the emerging paradigm where CO2 storage is a co-objective. The next generation initiative is backed by the Helen MacArthur Foundation, Hewlett Foundation and Spitzer Charitable Trust.
Read the proceedings from the conference website.
Speaking at the online/virtual 2020 Nvidia Global Technology Conference, Vanessa Kemajou and Joseph Winston teamed to present Halliburton’s work on real-time machine learning and quantitative analytics ‘at the edge’. Applying deep learning models in near real time requires the right hardware and software. On the hardware front, the Nvidia Jetson Nano enables cost-effective deployment of field accelerators. On the software side, Rapids (for ETL), GPyTorch (machine learning) and TensorRT (for optimization) have added data analytics to the DecisionSpace 365 Real-Time Well Engineering cloud application. The solution provides automatic rig state detection for non-productive time analysis, and friction factor calibration through reverse torque and drag analysis. Halliburton is now planning to add natural language generation to alert drillers to changing conditions. The authors referred to Gardner et al.’s 2018 paper, ‘GPyTorch: Blackbox matrix-matrix Gaussian process inference with GPU acceleration’.
Equinor researcher Hongbo Zhou traced the history of AI from ‘GOFAI, ‘good old fashioned AI’ involving symbolic reasoning and logic that ‘did not work well’, through machine learning with decision trees, support vector machines and self-organizing maps (OK for some applications) to the ‘current revolution’ of deep learning with deep neural nets. Equinor’s U-Saltnet leverages Unet’s ‘semantic segmentation’, and a feature pyramid attention network to automate salt body interpretation. The Unet algorithm was tuned with Nsight’s profiling tools. Tensor cores allowed for automated mixed precision calculations. Trials on the SEG’s Gulf of Mexico SEAM dataset ‘seem to match people’s perceptions’ (of where the salt boundary is).
Henrik Ohlsson and Nikhil Krishnan, on behalf of the Baker Hughes C3.ai joint venture, presented BHC3 Reliability, a productized, deep learning-based anomaly detection framework for large-scale oil and gas applications. Reliability uses a ‘system-of-systems’ model to predict failures. More from the Reliability JV.
Total’s Lionel Boillot and Long Qu extolled the merits of GPU-based compute acceleration in seismic applications such as reverse time migration and full waveform inversion. Pangea3 comprises 1116 IBM Power9 CPUs with 3348 Nvidia GPUs. The machine came in at N° 11 in the Nov 2019 TOP500 list. Software is developed in a collaboration between Total, IBM, Nvidia and Altimesh. Seismic workloads involve moving very large amounts of data between memory and disk. The IBM Power9 on-chip accelerator provides the ‘fastest on-chip gzip/gunzip compression engine in the industry’ and an ‘80 to 125x speedup’ over a CPU single thread. Total contrasts its legacy, CPU-based Pangea II with the GPU-based Pangea III. Overall, Pangea III provides an 8x speedup over its predecessor.
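The role of compression in that workload is easy to demonstrate: band-limited seismic samples are highly compressible, so squeezing them through gzip/deflate on the way to disk trades cheap compute for scarce I/O bandwidth, which is what the on-chip engine accelerates. A toy illustration with Python’s zlib standing in for the hardware engine (the synthetic ‘trace’ is invented; real field data would compress less dramatically):

```python
import math
import zlib

# A smooth, band-limited stand-in for a seismic trace buffer. Coherent
# signal compresses well; incoherent noise would not.
samples = bytes(int(127 + 120 * math.sin(i / 50.0)) for i in range(100_000))

packed = zlib.compress(samples, level=6)
assert zlib.decompress(packed) == samples  # lossless round trip
print(f"{len(samples)} -> {len(packed)} bytes "
      f"({len(samples) / len(packed):.0f}x smaller)")
```

Every byte saved here is a byte that never crosses the memory-to-disk boundary, hence the appeal of doing the deflate in silicon rather than on a CPU thread.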
Read these and other presentations and videos on the GTC2020 website.
Shell’s Anders Thostrup reported on the transition of CFIHOS, the capital facilities information handover standard, to the IOGP. CFIHOS, now in Release 1.4, has seen strong growth in participation. USPI-NL itself is transitioning, with a new director (Martin te Lintelo) and new initiatives, notably the Facility Lifecycle 3D Model Standard FL3DMS. USPI is also renewing its relationship with other standards bodies – notably a reactivated ISO 15926 and DEXPI, the P&ID standards body (see below).
Luke Kendall traced the transformation of Norske Shell’s business with the digital twin. The digital twin has three components, human machine interaction (HMI), models and data. The engineering data warehouse is the basis of the DT along with ERP transaction data, permits, status reports and real time process control streams. Data is pushed out to operators and into models. Current work involves bringing the models into the DT for ‘what if’ scenarios such as turning on a new well or forecasting the water cut and its effect on operations. Shell is looking to build the DT at an early stage in an asset’s life, before steel is cut, perhaps even before FID. The DT will also be the last thing switched-off at decommissioning. Current asset maintenance systems rely on people to integrate across equipment, maintenance and projects. In the future the DT will be the integration platform. One motivation for the increased digitalization is that ‘young blood is not inspired by oil and gas’, digital is perceived as a way of attracting and retaining talent.
The first full DT was built for the Nyhamna gas plant in 2019, requiring Shell to get partner agreement in the face of ‘different levels of interest’ in the DT concept. The Nyhamna DT is coupled with a digital twin operated by Statnett, Norway’s electricity grid operator, to reduce or eliminate power dips. Electricity makes up around 50% of Nyhamna’s operating costs. The DT now connects the AMS to the ERP, providing notifications on equipment and work orders. Real soon now, real time information will come into the twin and be compared with models running in the background to show issues with power consumption and under-used resources. Though the DT is in its early phase, it has brought ‘amazing achievements’ and Shell is looking to profoundly change its way of working. The upcoming Ormen Lange DT (2020) sees the DT concept extending to the subsea domain. In the Q&A, Kendall revealed that the DT is a collaboration between Shell and Kongsberg Digital, with all the data in the Kognifai cloud.
Heiner Temmen (Evonik) and Reiner Meyer Rössl (Autodesk) teamed to present the ongoing ‘DEXPI’ (Data exchange in the process industry) project and its alignment with CFIHOS. The authors consider the P&ID (piping and instrumentation diagram) to be the most important plant document. Dexpi sets out to avoid information loss in data transfer, quite a tall order as a P&ID diagram embeds graphics, lists and structures/topologies. Today, information is transferred as paper, PDF, and Excel, but there is no standard. DEXPI has good support from the vendor community (Siemens, Autodesk, Hexagon, Aveva …). Owner operator support is limited to six members and only one EPC (Air Liquide) is on board. Deliverables to date include a P&ID exchange spec, an extension of the Proteus schema and CAE interfaces. A Dexpi ‘verificator’ has been developed by AixCAPE and a third-party test body has been founded by the owner-operators. Work is in progress on alignment between Dexpi and CFIHOS with the goal of a ‘user friendly end user (engineer) tool independent of Autodesk or other packages’. A Sparql endpoint is available for ‘semantic’ interaction with Dexpi data. More from Dexpi.
In the ensuing Q&A the degree to which Dexpi (and other standards) are ‘baking’ legacy data silos into new digital processes was raised. An informed observer commented that ‘we have never really been able to get away from digitizing paper documents, I would like to but never persuaded anyone to do this’.
Koen Penneman, from Belgian PipeXperts, presented digital transformation from the smaller engineering company’s point of view. With parent company Engiplast BVBA, Pipexperts has been conducting laser scanning of petrochemicals and oil and gas facilities since 2004. Today, the company performs P&ID digitization and layouts and has added drones and VR. PipeXperts has teamed with software boutique CEA Engineering to develop ‘Plant4D’, a database for asset data and for ‘1D, 2D or 3D visualization’. Plant4D can ingest a point cloud, create a 3D model and link it to the underlying asset. A flagship PipeXperts deployment is the Belgian Prayon chemical plant.
Leo van Ruijven (Croonwolter&dros) outlined a proposed USPI project to deliver an implementation guide for companies starting out on the systems engineering journey. Systems engineering (SE) is defined in ISO 15288. This is a ‘nice’ definition but does not address the problems of one-off designs and of doing SE in the face of ‘fragmentation’, as SE has many owners. SE projects often fail in the early stages and folks need simple guidance, especially on requirements analysis. Enter READI, the ‘required asset digital lifecycle information’ JIP. This addresses issues such as imprecise, complex requirements that defy automation. Simple sentences in requirements may be ambiguous or hard to interpret. READI sets out to break these down into a taxonomy of specifics. Work has started (as ISO 15926 part 11) on requirements mapping/description. The USPI project targets engineers with limited semantics and information management skills. The project has strong backing from Shell. ‘Procurement is good at messing things up. We need an independent requirements structure to be able to reuse successes and stop making the same mistakes again and again’.
More from USPI-NL.
Aveva has partnered with Axonify, a ‘microlearning’ specialist, to add artificial intelligence to Aveva’s Unified Learning training solution for industrial operations staff. AUL provides an integrated training platform with extended reality capabilities and tools for designing learning programs. Deployment can be on-site, in the cloud or hybrid. More from Aveva and Axonify.
Barco has added Panopto’s enterprise video platform to its weConnect visualization and collaboration technology to comprise a ‘virtual learning environment’ for universities, business schools and corporate learning environments. weConnect enables students to engage simultaneously with their in-person and remote classmates. Sessions are available on-demand and students can search through a video to find any word spoken or shown on-screen. Recordings are available to the learning management system and can be shared with third parties for accurate, human-reviewed closed captioning.
Shell has signed a five-year subscription to GSE Systems’ EnVision cloud-based learning simulator in a $1.65 million deal. Shell has been using EnVision simulations and tutorials at its refining operations since 2005. The new deal sees EnVision delivered from the cloud, providing on-demand access to 60 generic process simulation models, 30 tutorials in five different languages and ad hoc remote training.
Honeywell’s Automation College now offers a training needs assessment service to identify gaps between a corporation’s automation system capabilities and its workforce’s skills. The TNA provides a comprehensive training plan delivered through secure, cloud-based access to the Experion PKS and PMD training environments. A PMD ‘driving license’ certifies trainees’ competency.
IOGP Report 476 (now in its third edition) provides recommendations for enhancements to well control training, examination and certification. The report covers onshore and offshore well control operations worldwide. Coverage spans well design, drilling and completion, work-over and P&A. Version 3 adds a training levels guidance chart, also available as a standalone item from the IOGP Bookstore.
Energy e-learning company Norwell EDGE has released Virtual EDGE 3D, an immersive scenario game that puts oil and gas workers’ skills to the test. Players are transported to virtual rigs where they can explore the layout, equipment and systems needed to drill wells. Tasks are aligned with the content and learning outcomes on Norwell EDGE’s digital training platform. Scenarios cover both onshore and offshore wells and can be tailored to specific company projects and activity.
Siemens is expanding its Digital Industry Academy to include Sitrain Access, a new digital learning platform. Courses are delivered via web-based training, tutorial videos, blended learning and practical exercises.
Italian national oil company Eni has revamped EniSpace, its supplier portal and collaboration environment. EniSpace combines collaboration channels and ‘open innovation instruments’ with the traditional procurement processes. The interactive platform provides self-service management of applications, qualifications and tenders.
EniSpace has four sections. ‘JUST’ (Join Us in a Sustainable Transition), is an invitation to participate in Eni’s energy transition and align with the UN’s 17 sustainable development goals. Section two covers business opportunities. The third, ‘innovation match’ invites submissions of ‘principled and sustainable ideas and solutions’. Finally, ‘Agorà’ is a virtual marketplace for sharing experiences and best practices, ‘in line with JUST principles’. Here suppliers can compete and exchange views on their experiences and the solutions they consider most innovative, principled and sustainable.
At a recent grand opening in Stavanger, oil industry guests heard NORCE researchers present OpenLab, an advanced drilling simulator. OpenLab is the fruit of a five-year, 50 million NOK program funded by Norway’s Research Council. The simulator is said to be ‘one of the world’s most advanced simulators for training and technology development in digital drilling of oil and gas wells’. The simulator can be trialed live.
In an SPE-hosted webinar, speakers from Norway’s NORCE R&D establishment (formerly IRIS) showed how ‘seamless’ interoperability between real-time drilling systems can be achieved with a ‘semantical description’ of drilling signals. The approach leverages Norce researcher Eric Cayeux’s Drilling Data Hub which links multiple rig site data feeds with a ‘semantic data model*’. The model now forms the basis of the Norce OpenLab drilling simulator. Norce’s simulator can also be accessed via a web API or as a hardware-in-the-loop (HIL) simulator running in a control system (à la digital twin). Cayeux’s group believes that their semantical descriptions of drilling signals will ‘significantly reduce the time and resources needed to collect and process recorded data for post analysis and for research and innovation’. AkerBP, Equinor and Maersk (now Total) are users.
* Earlier presentations on DDHUB’s semantic data model stated that ‘the standard will be open and […] if successful, the semantical data model and its associated API will be passed to a standardization organism like Energistics’. This does not appear to have happened yet.
Cegal has announced the GeoCloud*, a vendor-agnostic petrotechnical cloud solution. GeoCloud is advertised as a ‘high-performance platform tailored to meet the challenging requirements of petrotechnical workloads’. The solution facilitates E&P applications and data management in a ‘scalable workspace’. The platform enables cross-border collaboration and global access to data and applications, providing a digitalization hub for data and application consolidation, and workflow automation. While GeoCloud is ‘vendor-agnostic’ regarding geoscience software providers, it does not (yet) appear to be cloud-agnostic: the first GeoCloud release is for the Microsoft Azure cloud platform.
In a separate announcement, BearingPoint has teamed with French startup Geoxilia on a geosciences data lake, LakEasy. LakEasy promises an integrated multidisciplinary approach that makes for an ‘efficient continuum between geosciences and data sciences’. LakEasy provides a global vision of the corporate portfolio. Data ingestion is fully automated and monitored to ensure clean and secure data availability with direct access from geosciences software. LakEasy is currently implemented on AWS, with other cloud implementations available on demand.
* Cegal’s GeoCloud should not be confused with Pays International’s GeoCloud we reported on back in 2017.
Canada’s Transition Accelerator has announced a framework to advance the hydrogen economy in Alberta’s industrial heartland, leveraging Alberta’s strength as one of the world’s lowest-cost producers of hydrogen. Alberta blue hydrogen is made with ultra-low emissions by upgrading natural gas. The carbon by-product generated from this process can then be captured and permanently sequestered underground or used for another purpose. The Canadian Energy Systems Analysis Research (CESAR) initiative at the University of Calgary was a catalyst for the Transition Accelerator.
DNV GL has published a report, ‘Heading for Hydrogen’ that outlines hydrogen’s ‘central role’ to the oil and gas industry’s decarbonization effort. DNV GL research has found that over half of senior oil and gas professionals expect hydrogen to be a significant part of the energy mix by 2030 and one fifth of industry leaders say that their companies have already entered the hydrogen market. Heading for Hydrogen includes a survey of some 1,000 senior oil and gas professionals and in-depth interviews with industry executives.
Eneco has joined Neptune Energy as a partner on the PosHYdon pilot, the world’s first offshore green hydrogen project. Eneco will supply simulated wind data from its Luchterduinen offshore wind farm, to support the project which aims to integrate three energy systems in the North Sea: offshore wind, offshore gas and offshore hydrogen. The data will be used to model the use of electricity generated by the windfarm to power sea water electrolysis on the Neptune-operated Q13a platform.
The IOGP has published Report 629, ‘Environmental sampling and monitoring from airborne and satellite remote sensing’, a free 68 page analysis from the IOGP’s environmental and geomatics committees with help from consultant Geocento. The report covers modern satellite-based remote sensing across the oil and gas project lifecycle on land and sea. A companion Report 630 adds a comparison of methane reporting requirements, authored by consultants MACH10.
The Environmental Defense Fund is to launch ‘MethaneSAT’ in 2022, a new satellite to detect methane emissions from oil and gas across the world. MethaneSAT is to identify the areas and extent of methane emissions worldwide, giving companies and governments the opportunity to track, quantify and reduce emissions. MethaneSAT data is to be publicly available. EDF also recently produced a report, ‘Hitting the mark: Improving the Credibility of Industry Methane Data’, an entreaty to improve emissions data accuracy and earn stakeholder confidence. Adding to the EDF report, Gaffney Cline’s Nigel Jenvey reports on his firm’s review of the readiness of the top 10 oil and gas companies for a low carbon transition. While all had started reporting on the carbon intensity of their business, none of the calculations were comparable, owing to differences in the units used and contrasting approaches to accuracy and transparency.
ExxonMobil has proposed a new framework for industry-wide methane regulations. The framework was developed across ExxonMobil’s oil and natural gas operations and has resulted in improvements that ‘demonstrate what’s practicable and achievable’. The framework addresses high-leak potential at production sites and is said to be more comprehensive than current federal rules.
A pre-publication version of the Federal Register notice ‘Controlling air pollution in the oil and natural gas industry’ and fact sheet is available from the EPA. In October 2019 the EPA held a public meeting regarding the proposed policy amendments to the 2012 and 2016 New Source Performance Standards for the oil and natural gas industry. Transcriptions and presentations from the event are now available on the US Regulations.gov website.
Kayrros takes satellite data from the European Space Agency Copernicus program’s Sentinel-5P satellite and uses AI/ML to trace emissions back to their source. The work has shown that there are around 100 high volume-emitting events at any one time around the world, one half of which are in regions with heavy industry ‘such as oil and gas and coal mining’. Kayrros is documenting thousands of major methane plumes from various sources around the world.
Qnergy has closed a $10 million Series B funding round led by OGCI Climate Investments, Tene Capital and Kibbutz EHI. The funds will be used to accelerate deployment of Qnergy’s CAP3 compressed air pneumatics product, a low emission alternative to gas pneumatic devices currently used in upstream oil and gas.
The Global CCS Institute has produced a modestly subtitled ‘thought leadership’ publication, ‘The value of carbon capture and storage’, a 24 page explanation of why CCS is an essential climate mitigation technology, how it will scale and what it will cost.
Equinor reports successful testing of its 31/5-7 Northern Lights CO2 storage well in the North Sea. The Northern Lights Alliance, with operator Equinor and partners Shell and Total, has submitted a plan to the Norwegian government for an ‘important part’ of the Norwegian project for transport and storage of CO2 on the shelf. CO2 will be captured from Fortum’s heat recovery plant at Klemetsrud in Oslo and Norcem’s cement factory in Brevik. The first phase will see the injection of 1.5 million tonnes of CO2.
A release from legal and public affairs consultancy McGuireWoods provides advice and guidance on the February 2020 update to the US IRS carbon capture tax credit regime, Notice 2020-12. Section 45Q of the US tax code allows a federal tax credit for carbon captured from qualified facilities that is used in either EOR ($35/tonne) or secured in a geological formation ($50/tonne).
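The headline economics of the credit reduce to a per-tonne multiplication, sketched below (illustrative only; the statute phases the rates in over time and attaches eligibility and monitoring rules that this ignores, and the function name is our own):

```python
RATE_STORAGE_USD_PER_T = 50.0  # secure geological storage
RATE_EOR_USD_PER_T = 35.0      # use in enhanced oil recovery

def annual_45q_credit(tonnes_captured: float, geological_storage: bool) -> float:
    """Gross annual Section 45Q credit at the headline (fully phased-in) rates."""
    rate = RATE_STORAGE_USD_PER_T if geological_storage else RATE_EOR_USD_PER_T
    return tonnes_captured * rate

# A hypothetical 1 Mt/yr capture facility
print(annual_45q_credit(1_000_000, geological_storage=True))   # 50000000.0
print(annual_45q_credit(1_000_000, geological_storage=False))  # 35000000.0
```

The $15/tonne spread between the two rates is what tilts the incentive toward dedicated storage over EOR.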
In a similar vein, the Global CCS Institute has published a seven page brief explaining the US Section 45Q tax credit for CO2 sequestration.
Vello Kuuskraa (Advanced Resources) and co-authors have published an article, ‘Reconsidering CCS in the US fossil-fuel fired electricity industry under section 45Q tax credits’, suggesting that Section 45Q tax credits may need to be modified to achieve their intended impact.
The US Bureau of Economic Geology’s Gulf Coast Carbon Center presented its research into offshore CCS. Current projects include the Gulf of Mexico Partnership for Offshore Carbon Storage (GoMCarb), and SECARB Offshore, covering the potential in the eastern Gulf region from eastern Louisiana to Florida.
Chevron Technology Ventures is to explore the use of Svante’s (formerly Inventys) large scale CCS technology with a pre-front end engineering design study of Chevron’s operations. The study will evaluate the feasibility and design of a 10,000 tonne-per-year carbon capture unit in one of Chevron’s California facilities and is expected to be complete in the first half of 2020.
The API, IPIECA and the IOGP have published the 4th edition (2020) of ‘Sustainability reporting guidance for the oil and gas industry’, a 200-plus-page free download.
GARP, the Global Association of Risk Professionals, has opened registration for its Sustainability and Climate Risk (SCR) certificate, a new risk management training program to start in September 2020. The SCR Certificate is designed to help businesses address the risks associated with climate change. The program includes modules on policy and regulation, sustainable finance, and scenario analysis, among others. The exams can be taken via remote proctoring or at testing sites around the world. Register at GARP.
France’s Autorité des Marchés Financiers (AMF) is to tackle greenwashing in asset management with a set of minimum requirements that fund managers must meet in order to market themselves as green. These include quantitative thresholds for investment in issuers with a higher environmental, social and governance (ESG) rating. A critique from the XBRL organization warned that such ‘laudable’ efforts will be hampered by the lack of standardization in non-financial disclosures, advocating consideration of its global XBRL taxonomy, based on the work of the Task Force on Climate-related Financial Disclosures.
The European Union has adopted a unified EU sustainability taxonomy to provide businesses and investors with a common language to identify environmentally sustainable economic activities. The taxonomy is to ‘encourage private investment in sustainable growth and contribute to a climate neutral economy’ and is said to be key to the EU target of climate neutrality by 2050. The XBRL organization regretted, however, that ‘this is a classification not a taxonomy’, adding that ‘to be usefully and effectively monitored, an XBRL taxonomy enabling comparable, machine-readable disclosures of business-material information is the next step’.
The IOGP has built a new website for environmental performance data to house HSE data reported by its members. Environmental performance data for 2018 is already online, and coverage is to expand to other reporting areas through 2020 and 2021. Visit the IOGP Data Series and read the executive summary.
The US Interstate Oil and Gas Compact Commission (IOGCC), a multi-state government agency, has published a study of idle and orphan oil and gas wells. The study covers the number of such wells, plugging and restoration costs, regulatory tools and funding sources. The work was based on a survey of 30 IOGCC member states and five Canadian provinces. Almost 300,000 wells, some 15.6% of all wells drilled and not plugged, are ‘approved’ idle wells. States report from zero to over 13,000 ‘orphan’ wells, idle wells for which the operator is unknown or insolvent. Canadian provinces reported a total of 3,818 documented orphan wells. Per-state estimates of undocumented orphan wells range from under 10 to 100,000 or more. These figures are a ‘serious concern’. States and provinces plugged a total of 3,356 orphan wells in 2018, with Texas plugging the most at 1,440. The average P&A cost was $18,940 in the US and CAD 61,477 in Canada. Much more in the 70-page report from the IOGCC.
The Consumer Watchdog has called on California’s Governor Gavin Newsom to prevent oil companies from receiving approvals for new oil wells without first requiring full bonding for their clean-up. Citing the ‘impending bankruptcy’ of oil drillers and the State’s ‘grave deficit’, CW also requested that new permits be issued with a requirement to plug a certain number of idle wells. CW puts the cleanup of California’s wells at a ‘whopping’ $9.2 billion, compared with only $110 million in bonding for those wells. More in the CW letter.
A new, 400-page report from the Alberta Energy Regulator (AER), ‘Measurement Requirements for Oil and Gas Operations’, a.k.a. Directive 017, updates AER requirements for measurement points used for accounting and reporting purposes, as well as measurement required for upstream facilities and pipeline operations. The Directive reads like a comprehensive textbook covering oil, gas and bitumen metrology and regulation.
A study by France’s IDDRI think tank looks into ‘Regulation of the offshore sector 10 years after the Deepwater Horizon oil spill’. Countries’ stances differ widely, with some tightening regulations or banning drilling while others continue to promote development of the sector. Oil and gas extraction remains the least regulated maritime activity under international law, although some international progress has been made in the EU and Africa, notably with the COBIA initiative.
The Texas regulator, the Railroad Commission, is to increase the transparency of its hearings with a new, public online portal for the energy industry. The Case Administration Service Electronic System (RRC CASES) provides public access to operator filings and other documents. The portal covers unresolved issues in oil and gas, pipeline safety, alternative fuels safety, gas utilities, and surface mining matters. Parties involved in a hearing can register as authenticated users and can upload documents for filing and review. Visit the RRC CASES public portal and check out the user guide and video tutorial.
The RRC has also ‘launched’ a drone program. Drones will help inspectors respond quickly to, and inspect, sites that are unsafe or inaccessible during emergencies such as fires, flooding and other natural disasters. To date, nineteen of the agency’s inspectors have received remote pilot certification from the FAA. The agency has a total of eight DJI Mavic Enterprise drones.
The Railroad Commission of Texas has added more data to its public GIS map viewer. The public can now view information on the voluntary cleanup program and brownfield response program sites around Texas. The two programs are designed to incentivize the remediation and redevelopment of abandoned oil and gas sites. More layers have been added to the Environmental Permits information to show commercial waste disposal sites and discharge permits relating to oil and gas activities. The new data is housed in the Public GIS Viewer and the user guide is available in the application as well as on the Public GIS Viewer webpage.
The Alberta Energy Regulator has released a suite of annotated core descriptions and geophysical logs from the Athabasca and Cold Lake oil sands areas, Alberta. The dataset consists of an index of cored wells linked to the individual annotated logs and core descriptions which are provided as raster images in PDF format. Download the 440 MB zip file.
The Norwegian Petroleum Directorate has published high resolution core images of 94 shallow wellbores in the Norwegian Sea and the Barents Sea. The wellbores are said to provide important stratigraphic information about the area. The shallow borehole surveys were carried out by IKU, the Norwegian Continental Shelf Institute (now part of SINTEF Petroleum Research) from 1982 to 1993. A total of over six kilometers of cores are available on the NPD’s Factpages. Upon request, geoscientists can study the cores in person by making the trip to Dora, an old submarine bunker in Trondheim.
Speaking at the 2020 Open Porous Media (OPM) meeting in Eichstätt, Germany, Alf Birger Rustad, from OPM sponsor Equinor, outlined progress on the open source software initiative. OPM ‘encourages open innovation and reproducible research for modeling and simulation of porous media processes’ by coordinating collaborative software development and by maintaining open source software and open data sets. Rustad reported progress on the OPM flagship ResInsight project, a reservoir simulation visualization tool that now offers ‘fast and free’ 3D visualization of simulation results in the Eclipse binary format and seamless integration with GNU Octave, the open source Matlab clone. IFEM, the isogeometric toolbox for the solution of partial differential equations, is ‘still very active’. Flow, the fully-implicit black-oil simulator, now produces Eclipse-compatible restart files.
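As background on what ‘the Eclipse binary format’ involves, the following is a minimal sketch of parsing one Eclipse-style keyword header, assuming the widely documented layout (big-endian Fortran unformatted records: an 8-character keyword, a 4-byte element count and a 4-character type code, bracketed by 4-byte record-length markers). This is an illustration of the record layout only, not ResInsight’s or Flow’s actual reader code.

```python
import struct

def parse_eclipse_header(buf: bytes):
    """Parse one Eclipse-style keyword header record.

    Assumed layout (big-endian, Fortran unformatted): 4-byte record
    length, 8-char keyword, 4-byte element count, 4-char type code
    (e.g. INTE, REAL, DOUB), then the trailing 4-byte record length.
    """
    head, kw, count, typ, tail = struct.unpack(">i8si4si", buf[:24])
    if head != 16 or tail != 16:
        raise ValueError("corrupt Fortran record markers")
    return kw.decode().strip(), count, typ.decode()

# Build a synthetic header for an INTEHEAD keyword with 95 elements,
# then parse it back:
record = struct.pack(">i8si4si", 16, b"INTEHEAD", 95, b"INTE", 16)
print(parse_eclipse_header(record))  # ('INTEHEAD', 95, 'INTE')
```

The point of the indexed-format debate in this issue’s lead article is that such sequential, record-oriented layouts force readers to scan the whole file, which is exactly what makes them slow over cloud object storage.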
Cíntia Gonçalves Machado described recent OPM work at TNO*, in particular, a JIP between Total, Equinor and others that is investigating salt precipitation in tubulars with a view to optimizing washing strategies and increasing production.
Tom Hogervorst and Tong Dong Qiu showed how BigData Accelerate is accelerating OPM run times in hardware using the Xilinx Alveo FPGA.
Joakim Hove, from Norway’s OPM-OP, presented OPMRUN, a graphical user interface for OPM Flow. The GUI offers similar functionality to Schlumberger’s ECLRUN program, giving reservoir engineers a production environment that supports editing and management of OPM Flow run-time parameters. The app is based on the Apache Velocity Template Language (VTL), with some 450 templates currently implemented. The tool has been used to kick off parameter sensitivity tests from an existing input deck.
Read these and other Open Porous Media presentations from the project page.
* TNO, the Netherlands organization for applied scientific research (Toegepast Natuurwetenschappelijk Onderzoek).