At the opening general session of the SPE Annual Technical Conference and Exhibition (ATCE), moderator Eithne Treanor asked, ‘$60 oil? What happened?’ Jarand Rystad (Rystad Energy) replied, ‘It all started here. The people in this room were too smart and too clever. You created, not oil from shale, but money from the bond market!’ Shale, along with weaker demand, led Saudi Arabia to start a ‘volume war,’ with a ‘2 million barrels per day increase’ from OPEC over the last year. This has produced a ‘perfect storm, not the short sharp shock the Saudis wanted.’
All the operators agreed that costs have gotten out of control over the last five years. In reaction, cost cutting is rife. Bernard Looney reported that the initial estimate for BP’s Gulf of Mexico Mad Dog Phase II development was $20 billion. It is now down to $14 billion and BP is asking, ‘can we do it for $10 billion?’ Scott Tinker (U. Texas Bureau of Economic Geology), like many, has been through several major energy cycles, but previously the global energy mix ‘never noticed.’ For the last 35 years, fossil fuels were steady at around 85%. Today this is changing, with a slow, steady decrease in fossil fuels’ share. Jorge Leis (Bain & Co.) added that as the world economy (with the possible exception of the US) slows, we ‘should not hold out hope for a demand side solution.’ Looney rejoined, ‘Hope is not a strategy. We have to manage and control what we can and drive costs down. There is plenty of opportunity to do this.’
Treanor asked what the impact of renewables was going to be. Leis said that it depends on whom you ask. Bain & Co. does not advocate oils diversifying away from their core competency. But we are entering an era of structural change, with climate change legislation and new technologies for renewables. Battery technology is a game changer. The diversification of energy sources is happening. But oil and gas companies should ‘stick to your guns but be cognizant of the impact risk of renewables.’ Looney stated that BP forecasts over 80% of world energy out to 2035 as coming from fossil fuels. Even the IEA’s greenest scenario has half of world energy being met by oil and gas, so the latter has a huge role to play. Oils also have big alternative energy businesses: Shell in biomass, Total in photovoltaics and BP (erstwhile ‘Beyond Petroleum’) in wind.
Tinker was skeptical as to the impact of renewables. ‘I don’t do personal opinion or hope.’ Wind, sun and tide are all resources, like oil and gas. We develop the best one first. But things bump into challenges of scale: for instance, many renewables technologies may have issues with the supply of raw materials as they scale up.
Treanor asked what the panel thought of a carbon tax. Looney said, ‘it depends on where you are.’ There are a billion poor people in the world who just want the things we take for granted. But BP thinks that carbon does need to be priced, a tax is necessary and will be high on the COP21 agenda. EU IOCs have signed a letter supporting a tax. Tinker, ‘playing the devil’s advocate,’ observed that the biggest reduction of CO2 has occurred in the US, where fracking has replaced coal and where there is no carbon tax. In the EU, with its emissions trading, there is now a moratorium on nuclear and on fracking! ‘You can’t not like everything!’ Of course the biggest emissions come from China but, it’s not their fault, it’s ours: we buy their stuff!
During the debate, participants were invited to vote on a range of topics using the Freeman XP Touch online polling system. Asked ‘should oils diversify into wind and other renewables,’ a surprising 86% of the engineers voted ‘yes.’ And 63% thought that energy companies can transition to new energy companies.
The digital energy session heard from Maithem Al Nakhil (Aramco) on novel automated workflows for a ‘large carbonate reservoir.’ These were developed in response to the ‘challenging’ amount of data streaming in from Aramco’s i-field, where it is proving hard to monitor production and water injection and to arrive at timely decisions. Enter the ‘integrated dynamic surveillance tool’ (IDST). This allows for data triage with a ‘heterogeneity index,’ i.e. Value(well)/AvgValue(wells), used to recommend wells for workover. A real time display shows production, the evolving water cut and non-compliant wells. The system will (note future tense!) ‘help and support us in our daily work.’
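The heterogeneity index as stated is just a well’s value over the field average. A minimal Python sketch (well names, rates and the 0.5 cut-off are our own illustrative assumptions, not Aramco’s) shows how such a triage might work:

```python
def heterogeneity_index(rates):
    """Heterogeneity index per well: Value(well) / AvgValue(wells).
    Wells well below 1.0 are underperforming relative to the field."""
    avg = sum(rates.values()) / len(rates)
    return {well: rate / avg for well, rate in rates.items()}

# Hypothetical daily oil rates (bbl/d) for four wells
rates = {"W-1": 1200.0, "W-2": 300.0, "W-3": 900.0, "W-4": 600.0}
hi = heterogeneity_index(rates)

# Flag workover candidates below an (assumed) threshold of 0.5
candidates = [w for w, v in hi.items() if v < 0.5]
```

The same ratio can of course be computed on water cut, injection or any other surveillance variable; the index is a screen, not a diagnosis.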
Sunitha Gyara introduced Halliburton’s framework for scalable and reusable digital oilfield implementations. The digital oilfield has evolved such that today, systems can detect and rank production anomalies and perform diagnostics. Software models can be corrected and physical infrastructure adjusted. But there are new challenges of equipment downtime, system complexity and, echoing the Aramco presentation, information obesity. There are also the old silo-related barriers to collaboration and to new ways of working. Individuals’ goals may not be aligned, and folks may have only a partial view of the production process. The answer (here comes the sales pitch!) is Halliburton’s new production architecture, a flexible IT architecture offering integration services, data quality analytics and an engineering modeling suite. The system includes ‘virtualization’ of disparate data sources and is built on Microsoft SharePoint.
Andrei Popa described Chevron’s use of ‘big data’ in heavy oil reservoir management. First, Chevron tried to understand what big data meant with a literature review. It turned out to be an evolving concept that was ‘difficult to grasp’ and that has come to mean any ‘advanced analytics.’ Whatever. Chevron’s Kern River business unit was defining its own ‘big data,’ with 17,000 active wells pushing a billion points/day into Energy Components, Hollysys RMIS and other systems. Most of this is structured data and, at ‘only’ 8–10 GB/day, falls short of big data à la Google. Chevron’s big data solution does not leverage the new data lake/Hadoop technologies, but rather an in-house custom app to ‘bring all this together.’ Popa claimed ‘$10 - $100 million’ in value creation from the system’s use in analyzing distributed temperature gauge data during steam flood.
Ryder Scott’s He Zhang has been using neural nets to classify malfunctions occurring in electric submersible pumps (ESPs). Root causes of failure include insufficient well inflow, wax, emulsion, reverse rotation, leakage and outright failure. For a long time engineers have been using ammeter card readings to evaluate ESP malfunction. Classifying this data previously required a huge manual effort. Today the data set is a perfect candidate for artificial neural net diagnostics. Ryder Scott now has a database of failure modes. Most interesting are the cases which slip through the neural net. Analysis of one ‘unexplained’ pattern suggested a tubing leak which was later verified by the operator.
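The paper does not detail the network architecture or features used, so as a sketch of the general idea we substitute a much simpler nearest-centroid classifier over invented ammeter-card features (mean current, current variance, spike count) and invented failure-mode labels:

```python
import math

# Hypothetical training examples: feature vectors extracted from
# ammeter cards as (mean current, current variance, spike count).
# Labels are illustrative ESP failure modes, not Ryder Scott's.
TRAINING = {
    "normal":              [(50.0, 1.0, 0), (52.0, 1.5, 0)],
    "insufficient_inflow": [(30.0, 8.0, 2), (28.0, 9.0, 3)],
    "gas_interference":    [(20.0, 20.0, 8), (22.0, 18.0, 7)],
}

def centroid(points):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify(card):
    """Assign an ammeter-card feature vector to the nearest class centroid."""
    return min(CENTROIDS, key=lambda lbl: math.dist(card, CENTROIDS[lbl]))
```

A real deployment would use a trained neural net over the raw card traces, and, as the presentation noted, the interesting cases are the ones that fit no learned class at all.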
Our old friend the semantic web cropped up in a couple of presentations. Randall McKee presented Chevron’s work with USC/CiSoft on ‘rapid data integration and analysis for upstream applications.’ The work addresses the perennial problem of multiple data sources, multiple databases and the fact that ‘everyone has their own view of the data.’ Enter the new data integration and analysis framework (DIAF), which uses semantic web technology to automate the discovery of links between data, a process called ‘unified fuzzy ontology matching’ (UFOM). Queries can thus execute across multiple ontologies. The UFOM approach compressed eight months of mapping work into two weeks. Other CiSoft work involved machine-learning driven ‘shapelet’ investigation of anomalies in time series data such as ESP intake pressure. The semantic approach allows for data mining across heterogeneous data sources and the integration and analysis of text in reports and operator logs. More from CiSoft.
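UFOM’s actual matching algorithm was not spelled out in the talk. As a toy illustration of the general idea (fuzzy matching of attribute names across two sources), here is a string-similarity matcher; the schema names and the 0.6 threshold are our own assumptions:

```python
from difflib import SequenceMatcher

# Hypothetical attribute names from two upstream data sources
SOURCE_A = ["WellName", "SpudDate", "TotalDepth"]
SOURCE_B = ["well_name", "spud_dt", "td", "operator"]

def similarity(a, b):
    """Crude fuzzy similarity: string ratio, case/underscore-insensitive."""
    norm = lambda s: s.lower().replace("_", "")
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

def match(attrs_a, attrs_b, threshold=0.6):
    """Propose a mapping for each attribute whose best candidate on the
    other side exceeds the similarity threshold."""
    mappings = {}
    for a in attrs_a:
        best = max(attrs_b, key=lambda b: similarity(a, b))
        if similarity(a, best) >= threshold:
            mappings[a] = best
    return mappings
```

A production ontology matcher would also weigh data types, value distributions and structural context, not just names, which is presumably where the ‘unified’ in UFOM comes in.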
Bob Rundle (Baker Hughes) has also been using semantic web technology to help validate a subsurface model by tracking data provenance across the workflow. Geomodeling suffers from sparse data and bags of uncertainty. Ideally we want to capture the provenance of all a model’s components. Enter an object repository to capture provenance and track an object’s change history. All objects (logs, deviation surveys, grids) are given a unique IETF RFC 4122 identifier. An entity-attribute-value (EAV) triple store was used to manage object and version identifiers, ensuring that everyone works from current data. The EAV data is housed in a NoSQL data store. A ‘single version of the truth’ is not good enough. What is required is a ‘local but reconcilable version of the truth!’ The fixed schema repository is a barrier to innovation. In the Q&A Rundle revealed that this is a prototype and that, to benefit from the approach, applications would need to be re-engineered.
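The combination of RFC 4122 identifiers and EAV triples is easy to sketch. The following minimal in-memory stand-in (our own invention, not Baker Hughes code; object kinds and the `derivedFrom` attribute are assumptions) shows how provenance queries fall out of the triple representation:

```python
import uuid

# EAV triple store: a plain list of (entity, attribute, value) tuples,
# standing in for the NoSQL back end described in the talk.
triples = []

def register(kind, name):
    """Register a model object under a fresh RFC 4122 identifier."""
    oid = str(uuid.uuid4())
    triples.append((oid, "kind", kind))
    triples.append((oid, "name", name))
    triples.append((oid, "version", 1))
    return oid

def derive(child_kind, child_name, parent_id):
    """Register a new object and record which object it was built from."""
    cid = register(child_kind, child_name)
    triples.append((cid, "derivedFrom", parent_id))
    return cid

log = register("log", "GR_run1")
grid = derive("grid", "structural_grid", log)

# Provenance query: what was the grid derived from?
parents = [v for e, a, v in triples if e == grid and a == "derivedFrom"]
```

Because the schema is just triples, adding a new attribute (say, a QC flag) needs no schema migration, which is exactly the ‘fixed schema is a barrier to innovation’ point made in the talk.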
A presentation on the Kappa Engineering booth on unconventional workflows revealed that ‘We do not yet know what we do not know!’ Shale presents a steep learning curve. Of the three flow regimes predicted by theory, we only expect to see the first two in the practical life of a well. Decline curve analysis remains popular, but a plot of Eagle Ford wells’ declines shows a cacophony. Normalization and cross plots (using Citrine) help, but while all models give good agreement with the data, forecasts differ. Kappa’s KURC App is only available to members of the unconventional resource consortium. A Schlumberger booth presentation on well integrity compared standards from the API, Norsok and others. Schlumberger’s InVizion integrity service uses its Techlog software, with added InVizion plug-ins, to collect integrity data and documents. Check out the Eagle Ford case history. Schlumberger and Aramco also showed off the ‘Manara’ system, a commercialization of Aramco’s ‘extreme contact’ hybrid multilateral well watch and inflow control system. More from the SPE ATCE in our next issue.
© Oil IT Journal - all rights reserved.