InnerLogix User Group—Dominion, Newfield and ‘MetaLogix’

New quality-centric data infrastructure unveiled at Chevron-hosted InnerLogix user group meet.

Tina Warner presented Dominion's well data quality initiative. Dominion has a PIDM 2.5 well master database which feeds data to three regional master OpenWorks databases. These in turn are used to create interpretation projects. All transfers are controlled by InnerLogix (ILX) data quality management (DQM) rules. Similar rules govern back-population of value-added data from the regional masters to PIDM. Business units determine area-specific naming standards which are 'hardwired' into ILX Synch jobs and DQM processes.
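
By way of illustration, a naming rule of the sort 'hardwired' into a sync job might reduce to a few lines of script. The sketch below is a hypothetical Python rendering; the pattern, field names and function are invented for illustration, not ILX rule syntax.

    # Hypothetical sketch of an area-specific well naming rule of the kind
    # a business unit might 'hardwire' into a sync job. Names and the
    # pattern are illustrative, not InnerLogix's actual rule syntax.
    import re

    # e.g. wells in one business unit: 'DOM-AP-' prefix, five-digit
    # sequence, optional sidetrack suffix
    WELL_NAME_PATTERN = re.compile(r"^DOM-AP-\d{5}(-ST\d)?$")

    def check_well_name(record: dict) -> list[str]:
        """Return a list of DQM exceptions for one well record (empty = pass)."""
        exceptions = []
        name = record.get("well_name", "")
        if not WELL_NAME_PATTERN.match(name):
            exceptions.append(f"Well name '{name}' violates BU naming standard")
        return exceptions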

IHS

The PIDM datastore is updated nightly by IHS; other vendors' data is loaded by Dominion's IT group. A PIDM composite builder runs nightly to promote the highest-quality well data. Project data loading is achieved with similar ILX rules controlling data movement and overwrite protection. Data managers are notified of exceptions by email.
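
The composite-builder idea, promoting the most trusted non-null value for each attribute from competing vendor records, can be sketched as follows. The source ranking and attribute list are assumptions for illustration, not Dominion's actual configuration.

    # Illustrative sketch of a 'composite builder': for each well attribute,
    # promote the value from the highest-ranked source that supplies one.
    # Source ranking and attributes are invented, not Dominion's setup.
    SOURCE_RANK = {"dominion_edits": 0, "ihs": 1, "other_vendor": 2}  # lower = preferred
    ATTRIBUTES = ["surface_lat", "surface_lon", "kb_elevation", "total_depth"]

    def build_composite(records: list[dict]) -> dict:
        """Merge per-source well records into one best-quality composite record."""
        ranked = sorted(records, key=lambda r: SOURCE_RANK.get(r.get("source"), 99))
        composite = {}
        for attr in ATTRIBUTES:
            for rec in ranked:
                if rec.get(attr) is not None:
                    composite[attr] = rec[attr]
                    break  # take the first (most trusted) non-null value
        return composite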

Friction points

Careful evaluation is required of 'friction points' such as missing values for priority or new wells from vendors, the risk of overwriting or losing value-added data, synchronizing deleted data across projects and master stores, and the usual issues of inconsistent naming and data quality such as coordinate reference systems and well elevations.
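
As an example of the kind of automated check these friction points call for, here is a sketch of a coordinate reference system and elevation sanity test. The expected CRS code, field names and elevation bounds are invented for illustration.

    # Sketch of two 'friction point' checks: coordinate reference system
    # consistency and a plausibility test on well elevation. The expected
    # CRS code and elevation bounds are illustrative assumptions.
    EXPECTED_CRS = "EPSG:4267"            # e.g. NAD27 geographic, per project standard
    ELEVATION_RANGE_FT = (-100.0, 15000.0)

    def check_spatial_integrity(record: dict) -> list[str]:
        exceptions = []
        if record.get("crs") != EXPECTED_CRS:
            exceptions.append(
                f"CRS {record.get('crs')} differs from project standard {EXPECTED_CRS}")
        elev = record.get("kb_elevation_ft")
        if elev is None:
            exceptions.append("Missing KB elevation")
        elif not ELEVATION_RANGE_FT[0] <= elev <= ELEVATION_RANGE_FT[1]:
            exceptions.append(f"KB elevation {elev} ft outside plausible range")
        return exceptions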

Merge

Dominion worked with ILX on procedures for merging multiple event wells and child objects into a single wellbore for working projects with approved log curves and horizon picks. These involve very detailed standards for master OpenWorks project data. The ILX DQM approach is now being extended to other business units.
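
Schematically, such a merge re-parents approved child objects from duplicate event wells under a single surviving wellbore, along these lines. The data structures are invented for illustration; this is not ILX's actual procedure.

    # Generic sketch of merging several 'event wells' into one wellbore:
    # the survivor keeps its header; approved child objects (log curves,
    # horizon picks) from the duplicates are re-parented under it.
    def merge_event_wells(survivor: dict, duplicates: list[dict]) -> dict:
        merged = dict(survivor)
        # copy child-object lists so the survivor record is not mutated in place
        merged["log_curves"] = list(survivor.get("log_curves", []))
        merged["horizon_picks"] = list(survivor.get("horizon_picks", []))
        for dup in duplicates:
            for kind in ("log_curves", "horizon_picks"):
                for child in dup.get(kind, []):
                    if child.get("approved"):  # only approved curves/picks are promoted
                        merged[kind].append(
                            {**child, "wellbore_id": survivor["wellbore_id"]})
        return merged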

Newfield

Jim Day described how Newfield is using ILX to assure quality during data movement between OpenWorks and, again, a PIDM master dataset of well and directional survey data. Newfield has implemented scripts for well naming conventions and other data standards.
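
A naming-convention script of this kind typically normalizes incoming names to a canonical form before comparison. A minimal sketch, with the convention itself invented for illustration rather than Newfield's actual standard:

    # Sketch of a well-name normalization script of the sort described:
    # collapse whitespace, standardize case and the '#' well-number marker.
    import re

    def normalize_well_name(raw: str) -> str:
        name = " ".join(raw.strip().upper().split())    # trim and collapse whitespace
        name = re.sub(r"\s*#\s*", " #", name)           # standardize well-number marker
        name = re.sub(r"\bNO\.?\s+(\d)", r"#\1", name)  # 'NO. 1' -> '#1'
        return name

    assert normalize_well_name("smith  no. 1") == "SMITH #1"
    assert normalize_well_name("Smith # 1") == "SMITH #1"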

Heggelund

ILX CEO Dag Heggelund's presentation on geodesy, subtitled 'geography without geodesy is a felony,' covered familiar ground for Oil IT Journal readers (see also the EPSG Guidance Note announcement on page 2). Heggelund's presentation on the future development of ILX's products outlined how the ILX Back Office tools were being extended to monitor data movement throughout the workflow. ILX is introducing new measurement categories for audit, data change and 'relevancy rules.' 'Near real time' data availability will be facilitated with 'data event detection' in QCMonitor, eliminating manual data loading to corporate datastores. This means that, for instance, a marker pick in Petra is simultaneously broadcast to SMT and a PPDM corporate datastore in near real time, with on-the-fly quality checks assured by QCPro.
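
Schematically, 'data event detection' amounts to an event handler that quality-checks a new object and, if it is clean, fans it out to subscribing datastores. The interfaces below are assumptions for illustration, not the actual QCMonitor/QCPro API.

    # Schematic sketch of event-driven propagation: a new marker pick is
    # quality-checked and, if clean, broadcast to subscribing datastores.
    from typing import Callable

    subscribers: list[Callable[[dict], None]] = []   # e.g. SMT loader, PPDM loader

    def quality_check(pick: dict) -> list[str]:
        """On-the-fly QC; a real system would apply the full DQM rule set."""
        problems = []
        if pick.get("depth_md") is None or pick["depth_md"] < 0:
            problems.append("Missing or negative measured depth")
        if not pick.get("well_id"):
            problems.append("Pick not tied to a well")
        return problems

    def on_pick_created(pick: dict) -> None:
        problems = quality_check(pick)
        if problems:
            print("Exception report:", problems)     # stand-in for email notification
            return
        for deliver in subscribers:                  # near-real-time fan-out
            deliver(pick)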

MetaLogix

ILX is working on the development of 'MetaLogix,' a.k.a. 'a roadmap to trustworthy data translations.' This will extract and collect metadata, perform data translations and capture translation 'context.' MetaLogix will add search, traceability and comments. 'Relevancy rules' will allow AOI boundaries to be put on data transfer to reduce transfer bandwidth (a sketch of such a filter follows). ILX is working towards a 'virtual data environment,' making a heterogeneous data environment appear homogeneous. This will be achieved by capturing data context and model translations and by embedding knowledge of corporate processes.
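
An AOI-bounded 'relevancy rule' could reduce to a simple spatial filter applied before transfer. The rectangular AOI and record fields below are illustrative assumptions.

    # Sketch of an AOI 'relevancy rule': only records falling inside the
    # area of interest are queued for transfer, cutting bandwidth.
    AOI = {"min_lat": 39.0, "max_lat": 41.5, "min_lon": -81.0, "max_lon": -78.0}

    def is_relevant(record: dict) -> bool:
        lat, lon = record.get("surface_lat"), record.get("surface_lon")
        if lat is None or lon is None:
            return False          # no location -> not transferable under this rule
        return (AOI["min_lat"] <= lat <= AOI["max_lat"]
                and AOI["min_lon"] <= lon <= AOI["max_lon"])

    def filter_for_transfer(records: list[dict]) -> list[dict]:
        return [r for r in records if is_relevant(r)]

ILX user group presentations can be downloaded from www.innerlogix.com. More from gaym@innerlogix.com.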

Comment

ILX is expanding its compelling data quality offering into a veritable data infrastructure. But instead of interoperability (now seemingly a given) at its center, it has quality. Is that smart or what?


© Oil IT Journal - all rights reserved.