Later this year, Kappa Engineering will release a software tool that addresses the massive data volumes generated by the ‘e-field’. Diamant is described as a ‘software cross-over’, a hybrid of data management and reservoir surveillance. Diamant accesses and processes data from production data historians and permanent gauges.
Current downhole and surface acquisition generates vast amounts of raw data. Straightforward filtering can distort significant data ‘signatures’, while recording everything will saturate the CPU and memory of any application.
Diamant’s filtering reduces data volumes by two orders of magnitude without missing significant events such as choke changes and shut-ins. Diamant uses wavelet filtering to process billions of permanent gauge data points, extracting only the useful information. Low-frequency producing pressures are de-noised and filtered for production analysis and history matching, while high-frequency events, such as build-ups, are detected and loaded. New data is loaded with per-gauge filter settings and appended to existing data. Users can return to any part of the data and locally re-populate sequences of interest. Diamant also loads and updates rate data from a production database via ODBC.
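To illustrate the general idea behind wavelet-based data reduction (this is a minimal sketch, not Kappa’s actual algorithm), a single-level Haar transform splits a gauge signal into a half-rate smooth trend plus detail coefficients; small details (noise) are discarded while large ones flag abrupt events such as shut-ins. The function name, threshold value, and sample data below are all hypothetical.

```python
# Hypothetical sketch of wavelet-style gauge-data reduction, NOT the
# vendor's algorithm: one Haar decomposition level yields a half-rate
# "approximation" (the de-noised trend) and "detail" coefficients, of
# which only those above a threshold (candidate events) are retained.

def haar_reduce(samples, threshold):
    """Return (approximation, thresholded_details) for one Haar level."""
    approx, details = [], []
    for i in range(0, len(samples) - 1, 2):
        a, b = samples[i], samples[i + 1]
        approx.append((a + b) / 2)  # pairwise mean: low-frequency trend
        d = (a - b) / 2             # pairwise half-difference: detail
        details.append(d if abs(d) > threshold else 0.0)  # de-noise
    return approx, details

# Synthetic pressure series: slow decline with noise, then an abrupt
# jump (e.g. a shut-in) at index 9, which falls inside a sample pair.
pressure = [100.0, 99.9, 99.8, 99.7, 99.6, 99.5, 99.4, 99.3,
            99.2, 150.0, 150.1, 150.0, 149.9, 149.8, 149.9, 150.0]
approx, details = haar_reduce(pressure, threshold=1.0)

# Point count is halved; the only surviving detail coefficient marks
# the pair containing the jump.
events = [i for i, d in enumerate(details) if d != 0.0]
print(len(approx), events)  # → 8 [4]
```

One caveat visible even in this toy version: a single Haar level only sees jumps that fall inside a sample pair, which is why practical schemes use multi-level (shifted or overlapping) decompositions before thresholding.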
Data can be dragged and dropped (or passed via the clipboard) from Diamant to Kappa’s Saphir for pressure transient analysis, or to Topaze for production analysis. Permanent gauge data contains useful information, allowing accidental or planned shut-ins to be used as transient tests. The Diamant browser positions files in a logical hierarchy of fields, tanks, well groups and wells, irrespective of the files’ actual location. Data management functionality further enables files to be ‘gathered’ into a rational data structure.
© Oil IT Journal - all rights reserved.