PPDM Houston Professional Petroleum Data Expo 2021

What is a Facility? PPDM and OSDU. Chevron implements What is a Well taxonomy. Resolve and Talus on seismic data prep for OSDU. PetroDataOnline’s ‘serverless’ PPDM 3.9. EnergyIQ as data staging platform for OSDU. Deloitte’s graph technology for LNG operations.

PPDM CEO Trudy Curtis welcomed participants to the 2021 Houston Expo, ‘our first major PPDM Association event in nearly two years’. She summarized current PPDM Association activities, announcing the launch of a ‘What is a Facility?’ initiative using the same methodology as ‘What is a Well’. The PPDM reference lists and data rules are available to anyone online. PPDM Association products are leveraged by the OSDU Platform to optimize data’s value. PPDM material is ‘incorporated by reference’ into the OSDU Platform when required (Curtis chairs the PPDM/OSDU integration team). PPDM training will focus on supporting data professionals while OSDU Platform-specific certification and related training is owned and accredited by the Open Group.

Patrick Meroney presented the Open Subsurface Data Universe, or rather just ‘OSDU’ as we are to call it today. OSDU represents something of an ‘all-in’ approach to cloud computing. Meroney observed that while the ‘rush to the cloud’ has brought some of the predicted benefits, the wins come with a cost, notably that ‘you still need to manage the data’. A 2021 Flexera ‘State of the cloud’ report has it that 82% of respondents reported a hybrid cloud (on and off site) as the architecture of choice. For Katalyst (and maybe other PPDM relational aficionados) this means a dual deployment of an on-site PPDM database (possibly from Katalyst) and an offsite OSDU infrastructure. The two are kept in sync with Katalyst’s ‘KWSR API’, an implementation of the OSDU external data services functionality (members-only content) on GitHub.
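The KWSR API itself is members-only, but the hybrid pattern it implements can be sketched in a few lines: poll the on-site PPDM relational store for records changed since the last sync and push them to a cloud ingestion endpoint. The table and column names below follow PPDM 3.9 conventions (WELL, UWI, ROW_CHANGED_DATE); the endpoint and payload shape are illustrative assumptions, not the actual Katalyst API.

```python
import json
import sqlite3
import urllib.request

def changed_wells(conn, since):
    """Return WELL rows modified after the given timestamp (PPDM-style columns)."""
    rows = conn.execute(
        "SELECT uwi, well_name, row_changed_date FROM well "
        "WHERE row_changed_date > ?", (since,))
    return [dict(zip(("uwi", "well_name", "row_changed_date"), r))
            for r in rows]

def push_to_cloud(records, url):
    """POST changed records to a (hypothetical) cloud OSDU-style ingestion endpoint."""
    req = urllib.request.Request(
        url, data=json.dumps(records).encode(),
        headers={"Content-Type": "application/json"})
    return urllib.request.urlopen(req)
```

Run on a schedule, this keeps the cloud copy eventually consistent with the on-site master, which is the essence of the dual-deployment architecture described above.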

Jess Kozman (Katalyst Data Management) argued for a ‘Data Fit Organization’, a concept developed by a steering committee of Australian R&D establishments and industry partners. The DFO movement has backing from the CORE Innovation Hub, an Australian technology collaboration initiative focused on natural resources (petroleum and mining). A Data Fit Organization (DFO) is one where ‘data culture is a ubiquitous part of work, like safety is today, where all employees have data competencies and capabilities, and demonstrate behaviors that deliver strategic value from data, and where data roles and responsibilities are measured and incentivized’. The DFO is working on a simple, repeatable and accessible framework, inspired by the ‘Fitness-To-Operate’ safety competency framework developed for the Australian offshore petroleum industry by the University of Oxford and NOPSEMA, the Australian regulator. This was developed to reduce the risk of an accident in Australian waters similar to the Gulf of Mexico Macondo disaster. Kozman drew attention to a ‘data management and delivery failure’ that was part-responsible for the Macondo incident, citing a 2015 SAS Institute study that found that ‘the first clear data indicator of fluid flow imbalance appeared 43 minutes before the blowout’ and that ‘the rig operators had the data they required to prevent the accident’. Kozman has kindly agreed to let us host a preprint of his PPDM paper giving more on the relationship between Macondo and the origins of the DFO, which will likely form the basis of a future PPDM Data Examiner article.

John Thibeaux outlined how Chevron has implemented the WIAW taxonomy in its DIAL staging database. WIAW provides an industry standard taxonomy for various well components, allowing those involved in upstream operations to ‘speak a common language’. Data passes through the DIAL access layer for ingestion into consuming applications. WIAW/OSDU terminology is shared across reservoir, drilling, completions and production.

Don Robinson (Resolve GeoSciences) and Paul Thompson (Talus Technologies) offered advice on preparing seismic data for the OSDU platform. Reliable data is needed prior to OSDU ingestion. Robinson proposes an automated seismic data validation workflow, as this is a chance to scan data and extract as much information as possible. A machine-readable manifest file of required metadata is generated in the process. This can capture data lineage and ‘simplify your seismic ancestry management’.
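The machine-readable manifest idea above can be illustrated with a short sketch. The field names here are assumptions for illustration, not an OSDU schema: the point is to capture required metadata, an integrity checksum and an ordered lineage record in one JSON-serializable document at scan time.

```python
import hashlib
import json
import pathlib

def build_manifest(path, survey, lineage):
    """Scan a seismic file and emit a (hypothetical) ingestion manifest.

    survey  - survey name or acquisition identifier
    lineage - ordered list of processing steps, capturing data ancestry
    """
    data = pathlib.Path(path).read_bytes()
    return {
        "file_name": pathlib.Path(path).name,
        "byte_size": len(data),
        "md5": hashlib.md5(data).hexdigest(),  # integrity check at ingestion
        "survey": survey,
        "lineage": lineage,
    }
```

In a real workflow the scan step would also parse trace headers (e.g. SEG-Y binary and textual headers) to populate geometry and acquisition metadata; the checksum plus lineage list is what makes the ‘seismic ancestry’ auditable downstream.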

As much of the PPDM community is looking towards OSDU, PetroDataOnline’s (PDO) Vidar Andresen offered a back-to-the-future presentation of a ‘serverless’, hosted edition of the PPDM 3.9 relational database. In this context, ‘serverless’ is a compute tier for single databases running in Azure SQL Database. PDO’s Database Manager is a free, open source, web-based tool to manage PPDM databases and data science projects. Current functionality allows users to load a PPDM database, manage PPDM, CSV and LAS data connectors and perform other data transfer and QC tasks. The tool is ‘built for the cloud’. The user interface is based on Blazor. The serverless paradigm includes ‘durable functions’, leveraging tools such as CSVHelper, MathNet and AutoMapper. Visit the PDO Database Manager on Git and contribute to the project.

Quorum’s EnergyIQ is now proposed as a staging platform for data curation prior to ingestion into OSDU. Duncan McDonald argued that ‘data is the fuel that is driving the digital transformation’. Companies are focused on building an energy data platform based upon established standards and ‘The OSDU initiative shows the direction that the industry is heading’.

Deloitte’s Nishanth Raj presented on the use of graph technology to combine and analyze data from a variety of sources to support LNG operations. Graph technology provides global users with insights into LNG shipping and transportation, including vessel information, shipping costs, weather and volumes sold. The solution provides ‘self-service’ data access and analysis of disparate data sources. A ‘multitude’ of data-driven techniques such as pattern recognition and machine learning provides data profiling, data quality rules configuration and further analysis of LNG data. The use case is reminiscent of our 2016 report from the Neo4J user meet where graph technology was used to uncover the bad actors behind the Panama Papers.
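The graph approach can be sketched minimally: model LNG entities (vessels, ports, cargoes) as nodes and relations as labeled edges, then traverse to answer questions that span sources. The entity names and schema below are illustrative assumptions, not Deloitte’s implementation.

```python
from collections import defaultdict

class LngGraph:
    """Toy labeled-edge graph: node -> [(relation, node), ...]."""
    def __init__(self):
        self.edges = defaultdict(list)

    def add(self, src, rel, dst):
        self.edges[src].append((rel, dst))

    def neighbors(self, node, rel):
        return [dst for r, dst in self.edges[node] if r == rel]

g = LngGraph()
g.add("vessel:Arctic-1", "SAILED_TO", "port:Sabine Pass")
g.add("vessel:Arctic-1", "CARRIED", "cargo:C-1001")
g.add("vessel:Polar-2", "SAILED_TO", "port:Sabine Pass")

# Cross-source question: which vessels called at Sabine Pass?
callers = [v for v in g.edges
           if "port:Sabine Pass" in g.neighbors(v, "SAILED_TO")]
```

A production system would use a graph database (such as the Neo4J deployment mentioned in connection with the Panama Papers) so that such traversals run as declarative queries over much larger, continuously updated data.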

More on the PPDM Houston meet in the Agenda.



© Oil IT Journal - all rights reserved.