Kappa Engineering MD Olivier Houzé presented a paper on ‘New tools and workflows for dynamic data processing in the i-field’ to the Association of French Oilfield Professionals (AFTP) this month. Oilfield dynamic data analysis involves applying a disturbance to flow that produces pressure changes which can be compared with a model. Such techniques can be used for operational decisions and for real time forecasts, and are an essential component of today’s ‘intelligent’ fields. The technology toolset includes permanent gauges, production logs, temperature measurements and production data. Kappa is currently working with Total on a joint industry project to understand fiber optic temperature data which is ‘complex and poorly constrained.’ But it is expected that the huge data volumes streaming from instrumented wells will yield something useful.
Production data itself is potentially a longer-term, more representative measurement. But the new tools and data are creating their own problems as resources are stretched: Houzé estimates that engineers waste 50% of their time ‘fighting the data,’ a disgraceful situation in the context of the ‘i-field’ buzz. On this topic Houzé is circumspect. The buzzwords (‘i-field,’ ‘field of the future’ and so on) are a dangerous trend; we run the risk of re-inventing the wheel and revisiting the whole ‘buy or build’ debate of the 1980s. Deployment is further complicated by demarcation lines between engineering and IT, and between the human brain and automation. Here Houzé is clear: we need to use all available techniques, plus data models, plus integrated workflows. But in the end it is human intelligence that must be leveraged in the decision-making process.
For Houzé, automation has a role—in data preparation and analysis. For instance, the ‘dirty data’ of production monitoring can be made amenable to pressure transient analysis: it can be cleaned up with tools like Kappa’s Ecrin, deconvolved or wavelet-filtered, and compared with models to present engineers with actionable information. Deconvolution can also be used to combine two successive well tests into a valuable ‘pseudo long-term test’ that provides an estimate of reservoir extent.
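The article does not describe Kappa’s algorithms, but the wavelet-filtering step mentioned above can be illustrated with a minimal sketch: a one-level Haar wavelet decomposition of a noisy pressure series, with soft thresholding of the detail coefficients to suppress gauge noise. The function name and threshold parameter are illustrative, not part of Ecrin’s API.

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet denoise: decompose a pressure series,
    soft-threshold the detail (high-frequency) coefficients, and
    reconstruct. A sketch only; production tools use multi-level
    decompositions and data-driven thresholds."""
    n = (len(signal) // 2) * 2          # work on an even-length prefix
    s = np.asarray(signal, float)[:n]
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)   # low-frequency trend
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)   # high-frequency noise
    # soft thresholding shrinks small detail coefficients to zero
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty(n)
    out[0::2] = (approx + detail) / np.sqrt(2)  # inverse Haar transform
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out
```

With a large threshold every sample pair collapses to its mean, smoothing the series; with a zero threshold the input is returned unchanged, so the aggressiveness of the cleanup is a single tuning knob.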
The i-field workflow today involves a background, automated data collection process, back allocation of production, history matching and ‘virtual metering.’ The model is used as a proxy of the reservoir, generating artificial production data that is compared with the cleaned-up real time field data. If the two diverge, alarms are sent to the engineers for in-depth analysis. The aim is to replace the ‘dumb’ bits of the workflow and supply engineers with high quality actionable information.
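The model-versus-measurement comparison in this workflow can be sketched as a simple divergence check: flag an alarm when the measured rate strays from the model proxy by more than a relative tolerance for several consecutive samples. The function, tolerance and window parameters are assumptions for illustration, not a description of any vendor’s implementation.

```python
import numpy as np

def divergence_alarms(model_rates, measured_rates, rel_tol=0.10, window=3):
    """Compare real-time measured rates against model-proxy rates.
    Return the start indices of runs of `window` consecutive samples
    where the relative deviation exceeds `rel_tol` (a persistence
    requirement avoids alarming on one-off noise spikes)."""
    model = np.asarray(model_rates, float)
    meas = np.asarray(measured_rates, float)
    deviating = np.abs(meas - model) > rel_tol * np.abs(model)
    alarms, run = [], 0
    for i, bad in enumerate(deviating):
        run = run + 1 if bad else 0
        if run == window:
            alarms.append(i - window + 1)  # index where the excursion began
    return alarms
```

For example, with a constant model rate of 100 and a measured series that drifts above 110 for three samples, the function reports the index where the sustained excursion started, which is the point an engineer would be asked to investigate.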
Houzé was pressed in the Q&A on the way the i-field is sometimes ‘sold’ as a deskilling process, with IT and automation replacing knowledge workers. Houzé observed that today’s engineers are not poorly qualified; rather, they carry a very high workload. Hence the requirement for workflows that minimize data handling. More from email@example.com.
© Oil IT Journal - all rights reserved.