2019 Society of Petroleum Engineers Annual Technical Conference and Exhibition, Calgary

A tale of two transitions, in energy and digitalization. Panel session on responsible energy development. Sustainability special session. The ‘good, the bad and the ugly’ of oil and gas data analytics. ‘Operationalizing’ AI. Automating well construction. Panel session on strategies and tactics for digitalization. Nanotechnology for the oilfield. Notes from the exhibition.

Introduction, a tale of two transitions

At last year’s EAGE we reported on the contrasting themes of ‘going green’ and ‘business as usual’. While there was plenty of ‘business as usual’ on the exhibition floor (and in the talks), the SPE ATCE picked up the ‘greening of the industry’ theme big time, in the opening general session and in a panel session on ‘responsible’ energy development. Both sessions demonstrated the difficulty the industry is having in the face of the energy transition, a major issue for Calgary and Alberta, as witnessed by daily coverage in the Globe and Mail and talk in some quarters of ‘Wexit’, Western Canadian separatism.

The SPE, as we have previously reported, has collectively drunk the big-data-machine-learning-analytics Kool-Aid, along with its close cousin, the digital transformation. We report from a session on the ‘good, bad and ugly’ of data analytics. This was of course more about the good than the ugly, although Erdal Ozkan (Colorado School of Mines) did a good job of tempering some of the analytical excesses.

We also report from two sessions on digitalization. It is generally believed (at least by the supermajors) that ‘digitalization’ involves, at least to a degree, a move to the cloud. We report on what this means to the software development community that is now engaged on some transformations of its own with the advent of the cloud, Kubernetes, microservices and so on.

Opening general session

Mayor Naheed Nenshi welcomed the SPE to Calgary, ‘where Bow meets Elbow’ (rivers). Calgary has been home to Canada’s oil and gas industry since the Leduc oil boom in 1947. Today the industry is helping ‘fight poverty with access to clean energy’ (ripple of applause). Nonetheless there are ‘well-placed concerns’ about climate change and sustainability. ‘We can no longer ignore this and just be boosters for industry’. Nenshi was in New York the previous week for the climate summit events. ‘If we fail here, generations to come will never forgive us’ (no applause for that!). We need to tell stories about Alberta GHG reduction and ‘lead the fight against climate change thanks to smart PEs’. Nenshi mentioned Calgary-based hydrogen energy startup Proton as a promising way forward.

Sami Alnuaim (SPE president) agreed that a billion people ‘lack access to basic energy’. Oil and gas ‘will continue to be needed as part of the energy mix’. The value to society is ‘immeasurable’. Alnuaim was also just back from New York, where he participated in the oil and gas showcase. ‘Net zero by 2050 will require cooperation with coal, steel and others’. An SPE video skillfully blended the sustainability theme with the ‘digital transformation’ uber theme: machine learning, digital transformation and efficiency will lead to a reduced CO2 footprint, lower emissions and water use, and improved safety and environmental performance. The video got a ripple of applause.

Panel session – Positively impacting the world through responsible energy development

Moderator Eithne Treanor (ET Media) boldly pronounced that ‘oil and gas will not be part of a climate catastrophe’; the industry is determined to reduce emissions and leverage its expertise. Climate activists call for action right now, but we also need to assure supply and meet growing energy demand. Governments must do more; green business is good business. We need to find solutions like low carbon technology and emissions reduction. ‘No one is ignoring the risk of global warming’.

Jackie Forrest (ARC Energy Research) agreed, adding some ‘context’. Oil and gas has gone from scarcity and high prices to new technology and lower breakeven prices (50% down!). Investors are changing their stance to ‘lower for ever’. Companies are finding it hard to carve out money for dividends and buybacks and to pay for emission reductions. It will be hard to transition fast as fossil fuel use is up year on year. This is a generational challenge.

On the topic of ‘net zero’, Forrest observed that the ‘net’ part is key. Can we sequester enough CO2? This would enable us to continue to use fossil fuels. Canada is a leader here. Another idea is to ‘grow seaweed and bury it’. But net zero will be hard without CCS.

And there is the incremental technological approach. Andrey Bochkov reported that Gazprom’s flaring is ‘down by 95%’ and that seismic surveying ‘cuts down less trees with wireless receivers’. Leigh-Ann Russell added that BP now uses drones for methane leak detection and has reduced flaring with IR cameras. BP is also creating new businesses: Lightsource (solar), Fulcrum and Chargemaster. Jeanne-Mey Sun added that Baker Hughes is also working on venting/flaring reduction and fugitive emissions with its FlareIQ and Lumen methane monitor. Another innovation is Gazprom’s use of integrated compressors to collect separator gas for use or sale, ‘a non-trivial amount’ in Russia.

The Q&A raised the tricky issue of emissions measurement: ‘do these include end-product burn?’ Forrest answered that 80% of emissions come at combustion and these are not included. Indeed, ‘there will be a lot less demand for oil and gas if the aggressive net zero goals are met’. Current emissions reduction efforts (adding a few cents at the pump, planting trees or extracting CO2 from the atmosphere) mean ‘a net zero that would work 100 years from now!’ Industry needs to think harder about this. Individual behaviors need to change. Greta Thunberg points her finger at consumption. Almost half of emissions are within our control: put the lights out, stop traveling for holidays, get a smaller car. Personal sacrifices are needed. Unfortunately, e-car sales are now slowing!

Finally, a suggestion that SPE could transmute into a ‘Society of energy professionals’ met with a mixed reception, with some OK for a name change.

Sustainability special session

This special session addressed ‘what SPE members need to know about sustainability’. Nigel Jenvey (Gaffney Cline) offered some fundamentals that are leading to ‘staggering changes’ in energy. Since the Kyoto agreement, some $3 trillion of public money has been invested in renewables. Meanwhile, oil and gas has incurred a $1 trillion debt across its supply chain. Climate is now a key part of ESG efforts, but more engagement from industry on carbon emissions is needed. For a stable, low carbon transition, tax is critical. The industry is confronted with a ‘real dilemma’.

Shell’s Josh Etkind pointed out that the world population is increasing rapidly, and the world needs more energy with less emissions. Shell’s activity is now framed by the 17 UN sustainable development goals. The plan is to improve the bottom line and lower the carbon footprint. Finance ‘is really playing a role here’. On the plus side (for the industry), Etkind cited Boston Consulting Group’s Jamie Webster who blogged that better oil and gas economics have increased oil and gas reserves by a factor of 2.5.

The debate turned to the ‘innovative business models’ that might enable sustainability. Here Etkind cited the Environmental Partnership’s push to eliminate pneumatic valves and HARC’s work on shared water infrastructure in the Permian basin, driven in part by the fact that, currently, 60% of reinjection wells are souring due to poor biocide use. Etkind cited Data Gumbo’s blockchain as having produced ‘a $3.7 billion saving’ in water trading. The OGCI (Oil and Gas Climate Initiative), a $1bn fund targeting CCUS and emissions reduction, also got a mention.

Jenvey added that the SPE has created a CO2 Storage Resources Management System (SRMS) subcommittee of the SPE CCUS technical section. More from the ‘groundbreaking’ document. Etkind concluded that industry needs to reach out to explain its role in the energy transition, ‘We are seen as dinosaurs that are unable to change. But we do have scale and the ability to move quickly.’ The ensuing debate discussed the role of natural gas as a ‘bridge fuel’ and part of a long-term solution. But this is challenged by the anti-fracking movement, and the debate is ever more polarized. Kamel Bennaceur (Nomadia Energy Consulting) maintained that natural gas remains a ‘destination fuel’ with over 100 years of gas ahead of us. Etkind agreed that the issue is politicized and has reached fever pitch, with both sides thinking the other is stupid. What makes sense for you? The temperature record, reduced arctic ice and poor air quality are real, whether you believe or not. We need to come together with respect. Data is less compelling in this emotional debate. In the summing-up it was agreed that there needs to be more investment in low GHG technology and cleaner fuels. ‘Growing demand does not mean we can’t cut emissions, witness the US where natural gas has largely replaced coal in power generation.’

The good, the bad and the ugly of data analytics in the oil industry

Ashwani Dev (Halliburton) observed that others, like Tesla, have created a ‘value shift’. So what is the value shift for oil and gas? Current processes are too complex; make them simpler with ‘optimization across the board’. Make machine learning systematic for engineers. Halliburton has been collaborating with MIT on analytics to turn petroleum engineers into data scientists. Halliburton’s ‘smart oilfield on an open platform’, the OpenEarth Community, has backing from Shell, RedHat, Equinor, Total and others. Halliburton has ‘completely adopted open source’. ML proofs of concept began four years ago and now represent a major practice. To date, successes have come from automated fault extraction, NPT analysis on daily reports and flow prediction.

Pablo Tejera Cuesta (Shell) bemoaned the ‘fact’ that ‘only 5% of seismic data is actually used’ (‘this is shocking’) and that only 1% of daily data is ‘used’. Shell is reducing the carbon intensity of its production, using digital technology to monitor and improve. One flagship is the Quest carbon capture and sequestration (CCS) project. Digital, analytics and ML are ushering in the fourth industrial revolution thanks to sensors, AI, compute power and wireless connectivity. Data in the cloud is key to faster analytics and decisions. Shell’s in-house ‘X-Digi’ subsurface data universe (SDU) got a mention, even though ‘it’s hard to share some stories because of confidentiality’*. Other projects include seismic image analysis, WLO (well location optimization) and predictive maintenance with a digital twin.

* Although chunks of SDU are now being ‘shared’ as per OSDU.

Robert Heller (BP) reported on the application of data analytics to sand production with an AI-based software tool, the ‘Sandman’ cognitive sand advisor. 50% of BP’s production comes from sand-prone reservoirs that represent a safety, environmental and economic risk. Sandman applies expert knowledge and data science to set an optimal production rate that mitigates sanding. Heller contrasted data-driven, numerical AI with ‘cognitive’, knowledge-driven, symbolic AI. While results from the former are harder to explain, creating a knowledge base for the latter requires handcrafting in a ‘quite painful’ process. Once that is done, knowledge-driven AI offers explainability and a low error count. In fact, Sandman uses both approaches. Regarding the ‘good, bad and ugly’ aspects of AI, BP sees tremendous potential in the approach. But cognitive computing is not right for every problem; it is best applied to a high value problem that can count on help from willing experts. Expert engagement needs to be well-planned, ‘use more than one expert, but not too many!’ Project management is challenging, as software development is interdisciplinary. IT underpins BP’s digital initiatives, and ‘conversations’ with the IT department over connections and permissions are necessary. Sandman was a big software engineering effort. Finally, a good GUI is also important.
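To make the numeric/symbolic contrast concrete, here is a minimal sketch of a hybrid advisor in the Sandman style. Everything below (function names, thresholds, rules, the acoustic sand-count proxy) is a hypothetical illustration of the approach, not BP’s actual logic:

```python
# Hybrid advisor sketch: a handcrafted, explainable rule base combined with
# a data-driven score. All names, thresholds and rules are invented.

def knowledge_based_risk(drawdown_psi: float, water_cut: float) -> tuple[float, str]:
    """Symbolic 'if-then' expert rules: cheap to explain, painful to handcraft."""
    if drawdown_psi > 500 and water_cut > 0.3:
        return 0.9, "High drawdown with water breakthrough: classic sanding trigger."
    if drawdown_psi > 500:
        return 0.6, "High drawdown alone: moderate sanding risk."
    return 0.1, "Within expert-defined safe envelope."

def data_driven_risk(recent_sand_counts: list[float]) -> float:
    """Numerical proxy: trend in acoustic sand-detector counts (illustrative only)."""
    if len(recent_sand_counts) < 2:
        return 0.0
    trend = recent_sand_counts[-1] - recent_sand_counts[0]
    return min(max(trend / 100.0, 0.0), 1.0)

# Combine the two: the symbolic rule supplies the explanation,
# the data-driven term adjusts the final advisory.
kb_risk, why = knowledge_based_risk(drawdown_psi=550, water_cut=0.35)
risk = 0.5 * kb_risk + 0.5 * data_driven_risk([12.0, 30.0, 85.0])
print(f"Sanding risk {risk:.2f}: {why}")
```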

Philippe Herve (SparkCognition) described descriptive, diagnostic, predictive and prescriptive approaches to maintenance as increasing in both difficulty and value. Rule/physics-based models perform poorly in edge cases and fail if one variable is changed. Data scientists are hugely in demand, so Herve proposes automated model building (AMB), aka AutoML, where ‘AI designs AI’ and replicates the mind of a data scientist. AMB uses genetic algorithms and deep learning to converge on a generalized solution. AkerBP is a user. One PoC (not AkerBP) used an unsupervised ML model to identify behaviors and correlate anomalies with specific downtime events. The model predicted 75% of production-affecting events several days in advance. The (unnamed) client is now moving into full deployment. Interestingly, 75% of the failures are ‘system failures’ (as opposed to equipment breakdowns) that occur when operating parameters have changed and instability results. SparkCognition has also run another PoC at the Texmark refinery.
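For readers wondering what such an unsupervised PoC boils down to, here is a minimal sketch using an isolation forest on synthetic operating data. The columns, contamination level and 72-hour look-ahead window are our invented assumptions; SparkCognition’s AMB automates the model search itself rather than hand-picking one model as we do here:

```python
# Flag anomalous operating states, then check how often anomalies precede
# downtime. Data is synthetic; this is an illustration, not AMB itself.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic operating parameters (pressure, temperature, flow), one row per hour.
X = rng.normal(size=(5000, 3))
X[4000:4010] += 4.0  # injected 'system failure': a shift in operating parameters

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
anomaly = model.predict(X) == -1  # True where behavior is unusual

# Correlate anomalies with (synthetic) downtime events a few days later.
downtime_hours = {4070, 4071}
lead = [t for t, a in enumerate(anomaly)
        if a and any(0 < d - t <= 72 for d in downtime_hours)]
print(f"{len(lead)} anomalies flagged within 72 h ahead of downtime")
```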

Erdal Ozkan (Colorado School of Mines) portrayed himself as the ‘average Joe’, a ‘skeptical believer’. Why skeptical? Because there are few clear examples of ML success beyond pattern recognition. So, what is the promise of analytics, AI and ML for reservoir science and engineering? Ozkan referred to work done at the Center for Petroleum Geosystems and Engineering. Current trials of AI/ML include optimizing economic performance, estimating PVT properties without samples and ‘forecasting without physics’. These proxy models are to be taken ‘with a large pinch of salt’; they ‘leave the PEs doing the knitting’. There are lots of gotchas in AI, from model bias to the fact that data-driven models are backward-looking and may miss more recent developments. True physics-constrained/data-driven models have yet to be delivered. So, it’s ‘engineer vs. fortune teller!’ Prediction is particularly hard in shale with changing delivery regimes. If you do get a pattern, should you build a physical model to incorporate the result? Machine learnability has yet to achieve the status and early promise of AI, i.e. to go beyond pattern recognition. Ozkan concluded with some philosophical reflections on Cantor’s uncountable infinities, Hilbert’s open problems, Gödel’s incompleteness theorem and Ben-David’s 2019 demonstration that some problems cannot be solved with AI.

Operationalizing AI

A paper by Peidong Zhao (U Texas at Austin) applied machine learning to a big data set (4,000 wells) across the Eagle Ford formation. The project set out to compute EUR (estimated ultimate recovery) across a 50x400 mile area. EUR is considered a proxy for economic success. A rather intricate workflow applied multivariate linear regression across 20 or so parameters that showed ‘multicollinearity’. Data was massaged with ‘backward elimination’ and manual removal of redundant information. Moran’s I test showed that EUR is spatially autocorrelated, which meant more data ‘cleanup’. A Random Forest model proved robust in creating a prediction model from the spatial data. The models ‘explain’ around 50-70% of observed variation. The ‘most significant’ variables that predict EUR are TOC, vitrinite reflectance, Poisson’s ratio, upper Eagle Ford thickness, well depth and lateral length. That’s just about everything, no?
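A minimal sketch of the reported workflow (collinearity screening, then a Random Forest) might look as follows. The data is synthetic, the feature names merely echo the paper’s list, the VIF-below-10 rule is a common heuristic rather than necessarily the authors’ criterion, and the Moran’s I spatial step is omitted:

```python
# Screen collinear predictors by VIF, then fit a Random Forest to EUR.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
cols = ["TOC", "vitrinite_Ro", "poisson", "uEF_thickness", "depth", "lateral_length"]
df = pd.DataFrame(rng.normal(size=(4000, len(cols))), columns=cols)
# Synthetic EUR: a linear signal plus noise, standing in for the real response.
eur = df @ rng.uniform(0.2, 1.0, len(cols)) + rng.normal(scale=0.5, size=len(df))

# Backward elimination on multicollinearity: drop the worst VIF until all < 10.
X = df.copy()
while True:
    vif = pd.Series([variance_inflation_factor(X.values, i)
                     for i in range(X.shape[1])], index=X.columns)
    if vif.max() < 10:
        break
    X = X.drop(columns=vif.idxmax())

rf = RandomForestRegressor(n_estimators=200, oob_score=True, random_state=0).fit(X, eur)
print(f"Kept {list(X.columns)}, OOB R^2 = {rf.oob_score_:.2f}")  # paper reported ~50-70%
```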

Chevron’s Andrei Popa showed how AI has been used to optimize horizontal well and perforation placement in the venerable Kern River field. Kern River (California) has some 21,000 wells, with 10,000 producers and 52 active rigs. The program started in 2006 and around 1,100 h-wells have now been drilled to reach the thin oil leg, all that is left after decades of production. H-well candidate selection was a ‘laborious manual process’ and eventually Chevron ran out of candidates. The AI program began in 2012, using fuzzy logic on resistivity, oil and gas saturation, reservoir thickness and temperature. ML was used to understand what drives performance and the impact of stratigraphic connectivity and heterogeneity. A novel approach used a ‘dynamic reservoir quality index’ (dRQI), computed for all 155 million grid cells. The workflow involved ‘fuzzification’, ‘if-then’ rule evaluation and ‘defuzzification’. A plot of the Lorenz coefficient showed baffles and barriers to flow. The unsurprising conclusion is that h-well performance is best with good reservoir and no barriers to flow.
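For the curious, the three fuzzy-logic steps named above reduce to a few lines of code. The membership functions, rules and inputs below are invented for illustration; the real dRQI ran over 155 million grid cells with many more variables:

```python
# Toy fuzzy-logic pass: fuzzification, 'if-then' rule evaluation, defuzzification.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def rqi(resistivity_ohmm: float, thickness_ft: float) -> float:
    # 1. Fuzzification: crisp inputs -> degrees of membership.
    res_high = tri(resistivity_ohmm, 5, 30, 100)
    thick_good = tri(thickness_ft, 10, 40, 80)
    # 2. Rule evaluation (min as fuzzy AND):
    #    IF resistivity is high AND thickness is good THEN quality is good.
    good = min(res_high, thick_good)
    poor = 1.0 - good
    # 3. Defuzzification: weighted average of rule outputs (0 = poor, 1 = good).
    return (good * 1.0 + poor * 0.0) / (good + poor)

print(f"dRQI-style index: {rqi(25, 35):.2f}")
```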

Automating well construction

This special session on well construction automation was jointly organized by the SPE DSATS (drilling automation), DUPTS (drilling risk), OGDQ (data quality) and WBPTS (positioning) committees.

For David Reid (NOV), drilling automation is hardly new: ‘the technology is here, what’s stopping us? Culture!’ When DSATS started there was no business case for drilling automation, ‘just a belief that there ought to be something in it!’ Today there is a business case and ‘we are going to go fast’. Jeff Moss (ExxonMobil, retired) added that DSATS started with some irrational exuberance, followed by a long honeymoon period. With today’s unconventionals the business case for automation is good, but it remains difficult to share the spoils of automation between operators and contractors.

In his keynote, Precision Drilling CEO Kevin Neveu described the drilling business in Calgary as ‘on its heels’, which might mean a ‘tipping point’ for drilling automation. Auto-driller software has been around for decades; what is new is full-process, multi-machine control. The next step is to leverage the high volumes of data with AI and complete the transformation. But this will take much longer, and each line of code is a potential failure point. Such development is costly and, unfortunately, ‘finance has abandoned the industry’. Drillers are working against procurement groups here and service companies ‘need to define the value for themselves’. Capital constraints are making industry very risk averse, but risk will be commonplace when field testing AI and data science. Regarding the human impact, some will benefit, others not. The driller’s lot will be greatly enhanced, but the company man and directional driller will be made redundant! Exceptional leadership will be required to push the changes through.

Matthew Isbell observed that Hess has been in the Bakken since 1951. Unconventional wells are ‘moderate to low’ in complexity but need fast-paced supply chain logistics. The key ‘days to TD’ KPI has consistently improved over the years. Where will the value be in automation? A Hess study broke well delivery into 17 phases and found that two, vertical and lateral drilling, dominate. Drilling automation can improve here. Hess began tests in 2015 with wired pipe and identified the problem as ‘people taking bad decisions!’ New workflows have reduced variation in decision making. Hess also uses a central real-time operations center that has enabled fewer drillers to see and learn from more wells. Learning rate improvement is the key. In the future, the plan is for standard operating procedures, set points and drilling in a process control loop, optimizing one section at a time. Today Hess has one well doing this in the Bakken. Drilling automation is not about the technology; it is about people, minimizing variation, leading by the business and ‘lean’ methods à la SQDCP.

Lars Olesen (Pason) described himself as a ‘humble peddler of electronic drilling recorders’. Pason’s automatic driller software controls the rig’s draw-works and top drive, giving faster ROP and decreased NPT. Olesen noted that, in product development, the algorithm is a small part of the whole. Add in control system integration, workflows/UX and support and ‘you have 10x the effort’. ‘Achieving the overall vision for us and for this panel will take some time’. ‘You can’t tell the company man what to do!’ And it can be hard to prove what is contributing to a successful KPI. We need help from the operators here. A meaningful conversation with the contractor is required to deploy an automation product. And ‘most operators are a lot more chaotic than Hess’.

Duane Cuku (Precision Drilling) observed that we are ‘already down from 40 to 23 days per well’ and will soon be ‘down to 13 days thanks to the digital transformation’. But the downturn meant that companies ‘had to offer the moon to survive’, which is unsustainable. A reluctance to invest in technology is understandable; operators are skeptical in the face of ‘extreme cost control’. A fragmented supply chain with multiple providers of the same service doesn’t help, and operators’ personnel are resistant to change. In all events, the rig is the central platform for well construction and drillers are at the heart of the transformation. Precision has developed an ROP optimizer that has now moved from advisory mode to controlling the auto driller. Key to this development was a ‘satisfactory sharing of the spoils’ (IP and cash). The tool is charged per day, as a separate line item in the drilling contract.

Morten Norderud-Poulsen (Maersk Drilling) referred to automation learnings from offshore drilling on Norway’s Martin Linge field*. Wired drill pipe has proved successful in torsional vibration control, where the auto driller compares modeled and measured weight-on-bit (WOB). But the value creation for the contractor is limited and must be set against the cost of wired pipe. What are now needed are discussions on cost sharing, compensation and risk/liability. While the technology is fine, these considerations make it hard to deploy; the current set-up does not drive value creation. But there is the potential for 20-30% savings (on a $400k/day rig) with a relatively small investment. A new value creation concept is needed: an ‘alliance model’ that focuses on outcomes between operator, drilling contractor and service company.

* As presented in IADC/SPE 178863-MS, Wired drill pipe in a complex environment.
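As an aside, the auto-driller principle that runs through these talks (compare a modeled set-point with the measured WOB and trim the draw-works feed rate accordingly) is simple to state. The following toy proportional-control loop, with invented gains, units and plant behavior, sketches the idea; real systems add torsional-vibration management and safety interlocks:

```python
# Minimal auto-driller sketch: trim draw-works feed rate from WOB error.
def autodriller_step(wob_setpoint_klb, wob_measured_klb, feed_rate_fph, kp=0.5):
    """One proportional-control step on the draw-works feed rate (ft/hr)."""
    error = wob_setpoint_klb - wob_measured_klb
    return max(0.0, feed_rate_fph + kp * error)

feed = 60.0
for wob in [18.0, 19.5, 21.0, 20.2]:  # measured WOB drifting around a 20 klb setpoint
    feed = autodriller_step(20.0, wob, feed)
    print(f"measured WOB {wob:4.1f} klb -> feed rate {feed:5.1f} ft/hr")
```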

Strategies and tactics for digitalization

Jim Claunch (Bain & Co.) offered a maturity model for digital technology in oil and gas. The industry is ‘really good at creating silos’, so Level 1 involves scaling digital inside the silo, say with pump jack or gas lift automation. Level 2 extends optimization along the value chain, e.g. using midstream data to optimize the field, which is ‘tough to do’. Level 3 sees a ‘workforce of the future’ and new, agile ways of working. This ‘makes total sense and gets great results’ but involves hiring different people, ‘digital natives’ and data scientists. Finally, Level 4 sees digital in the DNA of the organization, the real ‘digital transformation’. Claunch touched on diversity, gender and inclusion and bemoaned the passing of the ‘great leaders of the past’ who knew their stuff. Today we live in a digital society. Too many people are ‘lost in middle management’.

Michelle Pflueger is yet again involved in a digital transformation initiative at Chevron. Previous efforts (the digital oilfield) did not change people’s work experience significantly. So what is new today? Pflueger believes that there is now the potential for major change, just as the cellphone has changed our lives. A top-down initiative from Chevron’s management has seen Pflueger create a small team of ‘accelerators’. The business units map out their own transformations and the team helps them go faster. The Permian unit is ‘all-in’, with a world-class data set and analytics. This could not have been done with a company-wide ‘standard’ methodology. Gorgon is also different, with around a million sensors deployed, growing at 100k per year. Advanced process control at Gorgon has brought $240 million in ‘value’. Chevron is ‘moving everything to Azure’ and is working to evolve its digital culture with Shark Tank-style initiatives to encourage entrepreneurship and a worldwide ‘Digital DoJo’ championship. The shift is also seen in recruitment: ‘five years from now every engineer will have to know digital like I used to know Excel’. While there are ‘a lot of questions facing oil and gas’, students should realize that ‘it is an important time to be in the industry’, ‘we enable human progress’. There is a great opportunity for folks with deep digital expertise ... we are just starting out!

Hani Elshahawi (Shell) sees digital as disruptive and ‘a potential threat to many incumbents’. Oil and gas is facing a perfect storm as investors leave in a push for decarbonization. The vanishing workforce (crew change/brain drain) means that in five years the workforce will be 75% millennials. How does digital help? By uncovering invisible insights in an ‘avalanche of data’.

In the session on ‘developing and implementing digitalization business values’, Lucas Gonzalez presented YPF’s ‘Dagma’ (data governance, management and analytics) program, a push to create a data-driven culture in Vaca Muerta shale operations. Dagma involves data preparation for data science initiatives to optimize frac hit and screen-out detection. Dagma supports business intelligence in the cloud with easy access to and visualization of data. To date, 200 people have been trained on self-service BI. YPF is now planning a data lake and data warehouse for upstream data, with a ‘balanced scorecard connector’ to these databases. Tools of the trade include Azure and PowerBI. Y-Tech, YPF’s technology arm, helped build the data pipeline.

David Joy (HP Enterprise) presented on the use of ‘edge computing’ and the IIoT to modernize a Texmark petrochemical plant. The comprehensive system provides video surveillance, ‘man down’ alerting and a connected worker (with an electronic hardhat). Connectivity is provided via a central WiFi canopy/umbrella. The system provides condition monitoring (e.g. of clogged filters), augmented reality and more, all served from an ‘Edge Center’ which for all the world looks like a medium-sized server! Here, data is kept ‘at the edge’, i.e. on-site, an interesting redefinition of what ‘edge computing’ is about. The communications infrastructure uses an elevated dual antenna, and ATEX issues with the kit were solved. Your mileage, especially in the upstream, may vary, as the HPE/Texmark solution relies on solid, non-intermittent comms. The ‘core industry architecture’ is now being extended to other petrochemical plants and refineries.

Nanotechnology for the oilfield

First, a confession. We attended the Nanotechnology for the Oilfield session in the vague expectation that we would hear of wonderful nano-robotic devices doing smart, perhaps ‘digital’ things in the reservoir. No such luck. Despite earlier hype, notably Saudi Aramco’s promotion, over a decade ago, of ‘autonomous micro machines’, today’s nanotech is a bit more prosaic. As Steve Bryant (U Calgary) put it, nanotech is the synergy of nanoparticles and chemistry. Today, ‘anything you can draw, a smart chemist can make’. Nanoparticles plus surfactants can be powerful. The synergy makes for a larger, more flexible design space, allowing, for example, the tuning of the electrostatic affinity of foam to enhance propagation.

Hugh Daigle (U TX at Austin) reported on the use of nanotech in ‘sustainable’ resource development, where nanoparticles can replace hazardous chemicals, reduce water use and treat produced water. As an example, particles with a magnetite core and SiO2 shell can be manipulated in a magnetic field. Dirt is attracted to the particles and can be pulled out with a magnet. Another application is the use of superparamagnetic nano paint to coat pipelines. The coating can then be heated with an external alternating magnetic field to melt paraffin accumulations. The technique was presented in an SPE paper that included a COMSOL model of the magnetic field inside a pipeline pig.
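For orientation, the first-order physics of such magnetic heating is a textbook result from the ferrofluid literature (Rosensweig’s relaxation-loss model), offered here for context rather than as the model in the SPE paper. Superparamagnetic particles in an alternating field of amplitude $H_0$ and frequency $f$ dissipate, per unit volume,

\[
  P \;=\; \pi \mu_0 \chi'' f H_0^2,
  \qquad
  \chi'' \;=\; \chi_0\,\frac{2\pi f\tau}{1+(2\pi f\tau)^2},
\]

where $\chi_0$ is the equilibrium susceptibility and $\tau$ the Néel/Brownian relaxation time. Heating peaks when $f$ approaches $1/(2\pi\tau)$, which is what makes particle size and coating tunable design parameters for applications like paraffin melting.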

Miscellaneous

In a booth presentation, Jack Nassab demoed Schlumberger’s Stewardship Tool, first released in 2018. The proprietary wellsite modeling software helps communicate the pros and cons of unconventional wells to the public. Public perception is very powerful, ‘we are judged on our few mistakes rather than the many successes’. The tool computes model KPIs for fracking, including emissions on location from operations and flaring, the transport of consumables to site and waste disposal. The web-based tool covers SOx, NOx, CO2, engines, noise, VOCs, worker safety, traffic etc. An augmented reality function encourages stakeholder engagement. At the time of the ATCE, the stimulation module was operational. In early 2020, V3.0 will roll out with a production module. Schlumberger plans to release a single-well version as an ‘open source tool for all’.

INT presented IVAAP as a subsurface workspace, customizable with GeoToolkit widgets that fit into the IVAAP canvas. Data selected in a map search flows into a Jupyter notebook for AI. INT is working with CGG on a geophysical Jupyter notebook. Weatherford Central is now on IVAAP, along with Slack-style messaging for well ops.

Touring the exhibition floor, we chatted with some knowledgeable members of the upstream IT vendor community. We learned that some majors are requiring a microservices architecture running on Pivotal’s RabbitMQ bus, for integration with company development environments or embedding in control systems for machine-to-machine interaction.
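For the record, ‘integration over a RabbitMQ bus’ amounts to something like the following sketch using the pika Python client: one microservice publishes a machine-to-machine event, any number of others subscribe by topic. The exchange, queue and routing-key names are invented, and a broker is assumed on localhost:

```python
# Publish/subscribe over RabbitMQ: illustrative names, local broker assumed.
import json
import pika  # RabbitMQ client library

conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
ch = conn.channel()
ch.exchange_declare(exchange="field.events", exchange_type="topic", durable=True)

# Publisher side: a sensor-facing service emits a reading.
ch.basic_publish(
    exchange="field.events",
    routing_key="pump.1042.pressure",
    body=json.dumps({"psi": 1870, "ts": "2019-10-02T14:03:00Z"}),
)

# Consumer side: another service binds a queue to the topics it cares about.
q = ch.queue_declare(queue="", exclusive=True).method.queue
ch.queue_bind(queue=q, exchange="field.events", routing_key="pump.*.pressure")
method, props, body = next(ch.consume(q, inactivity_timeout=5))
print("received:", body)
conn.close()
```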

We also learned that OSDU V1 (as it was handed over from Shell) was AWS-based and used technology from 47 Lining, now a Hitachi Vantara unit. See the 47L oil and gas case history (which is probably Shell’s SDU). This code base has now been replaced with OpenDES, the low-level layer of Schlumberger’s Delfi. Some expressed the opinion that this represented a de facto take-over of OSDU by Schlumberger, whose ‘professional code’ will include Petrel log formats. While OSDU was originally on Amazon, objections from Microsoft saw the code re-jigged to be multi-cloud, although ‘multi-cloud’ is considered by some to be more or less impossible!

Silverwell Energy is working on a low-end edition of its DIAL (digital intelligent artificial lift) device for unconventionals.

CMG is using AWS S3 storage, DynamoDB, Lambda and SNS to bundle data plus application (IMEX, GEM, STARS) into a single Docker container. This is said to be more secure and faster. MPI is used to distribute big jobs across multiple machines.
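One plausible reading of that AWS wiring, sketched below with boto3: pull the bundled simulation deck from S3, record job state in DynamoDB and announce completion on SNS. Bucket, table and topic names are invented; CMG’s actual plumbing is not public:

```python
# Illustrative AWS glue code for a containerized simulation job (boto3).
import boto3

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")
sns = boto3.client("sns")

def run_job(job_id: str) -> None:
    # Pull the dataset that ships alongside the containerized simulator.
    s3.download_file("cmg-sim-inputs", f"{job_id}/deck.dat", "/tmp/deck.dat")
    # ... launch IMEX/GEM/STARS against /tmp/deck.dat here ...
    dynamodb.Table("sim-jobs").put_item(Item={"job_id": job_id, "status": "DONE"})
    sns.publish(
        TopicArn="arn:aws:sns:us-east-1:123456789012:sim-jobs",
        Message=f"Simulation {job_id} complete",
    )

run_job("demo-0001")
```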

ColdBore’s SmartPad IoT ‘digital wellhead’ data combines with other data feeds from multiple service providers, adding authoritative time stamps for CT/wireline operations and completions. Events are recorded chronologically and are ‘impossible to deny’. Previously, each contractor had its own time stamp. An operations database is delivered to the operator and can also populate WellView. NPT alerts allow for on-the-spot reconciliation.

StoneRidge Technology’s Echelon runs on GPUs and has displaced Schlumberger’s Intersect at Eni. Marathon provided the original Echelon code and funded early development. The tool has been tested on a billion-cell model.

Peloton’s mobile app monitors emissions and pushes notifications to personnel and to a C-suite corporate reporting dashboard with company-specific KPIs.
