In a recent blog post, UK-based ETL Solutions’ Richard Cook offered some advice to users of the Professional Petroleum Data Management Association’s data models. While standards like PPDM are important to the upstream, they present challenges both at adoption and when transitioning between versions. Databases grow in size and complexity as volumes increase, as metadata is added and, especially, when interpretation results are captured. ETL uses the example of how checkshot data is captured to a variety of PPDM databases. The 3.2 edition represented checkshot surveys simply, with little metadata. 3.3 added many new checkshot attributes and foreign keys to reference tables where more data can be stored.
Intermediate PPDM flavors introduced subtle albeit important semantic changes, but there was a major shift with 3.6. The checkshot table has gone and checkshots are now considered seismic data. 3.7 added yet more attributes to record the lifecycle of these objects and 3.8 introduced ‘potentially far-reaching’ changes. Several tables’ use is now deprecated. Others have moved, and the precision of some numeric values has been increased, which could cause tests that use them to fail.
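To see why a precision increase alone can break existing tests, consider a minimal, purely illustrative Java sketch (the column precisions and values are invented for illustration and are not taken from PPDM): when a quantity stored at two decimal places is reloaded from a column widened to four, a scale-sensitive equality check fails even though the value is numerically unchanged.

```java
import java.math.BigDecimal;

// Illustrative only: simulates the same quantity read from an old
// schema (two decimal places) and from a migrated schema where the
// column precision has been widened (four decimal places).
public class PrecisionCheck {
    public static void main(String[] args) {
        BigDecimal before = new BigDecimal("1234.50");   // old column, scale 2
        BigDecimal after  = new BigDecimal("1234.5000"); // widened column, scale 4

        // equals() on BigDecimal compares scale as well as value,
        // so a naive regression test written against the old schema fails...
        System.out.println(before.equals(after));          // prints false

        // ...whereas compareTo() treats the two as the same quantity.
        System.out.println(before.compareTo(after) == 0);  // prints true
    }
}
```

The fix in such tests is typically to compare with `compareTo` (or to normalize scales) rather than `equals`, a detail that is easy to miss when a migration silently widens numeric columns.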
For those scratching their heads, ETL’s Transformation Manager offers a structured approach to data migration, separating the underlying data structure from its implementations. TM is claimed to be ‘the key to handling data stored in PPDM and other highly normalized models.’ TM generates rule-based Java integration components to manage complex data translations.
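A rule-based translation of the kind described can be sketched as follows. This is a hypothetical illustration, not TM’s actual generated code, and the table and column names (`UWI`, `DEPTH`, `TIME`, `SEIS_DATA_TYPE`, `STATION_MD`, `ONE_WAY_TIME`) are invented rather than PPDM’s: each rule maps one source column of an old-style checkshot row onto the column of a newer seismic-data row.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of a rule-based migration component: moves a
// 3.2-style checkshot row into a 3.6-style seismic-data row.
// All names are illustrative, not taken from the PPDM model.
public class CheckshotMigration {
    static Map<String, Object> migrate(Map<String, Object> checkShotRow) {
        Map<String, Object> seismicRow = new LinkedHashMap<>();
        // Rule 1: carry the well identifier across unchanged.
        seismicRow.put("UWI", checkShotRow.get("UWI"));
        // Rule 2: checkshots are now a kind of seismic data, so tag the type.
        seismicRow.put("SEIS_DATA_TYPE", "CHECKSHOT");
        // Rule 3: rename depth and time columns to the new schema's names.
        seismicRow.put("STATION_MD", checkShotRow.get("DEPTH"));
        seismicRow.put("ONE_WAY_TIME", checkShotRow.get("TIME"));
        return seismicRow;
    }

    public static void main(String[] args) {
        Map<String, Object> oldRow = new LinkedHashMap<>();
        oldRow.put("UWI", "W-001");
        oldRow.put("DEPTH", 1250.0);
        oldRow.put("TIME", 0.412);
        System.out.println(migrate(oldRow));
    }
}
```

The appeal of expressing a migration as explicit per-column rules is that each schema change between versions becomes one reviewable mapping rather than logic buried in ad hoc SQL.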
© Oil IT Journal - all rights reserved.