2020 LBCG Oil & Gas Pipeline Integrity & Data Utilization Solutions

Virtual event hears from API on latest PHMSA pipeline safety rules. Analysis of the new ‘mega rule’ from Paramount Energy and Crestwood. Enterprise Product Partners presents PipelineML. Williams’ plans for inspections out to 2035, big data and risk analysis.

Speaking at the online 2020 LBCG Oil & Gas Pipeline Integrity & Data Utilization Solutions congress, the American Petroleum Institute’s David Murk explained the implications of the new PHMSA hazardous liquid pipeline rules. The rules were updated to reflect current integrity management practices in light of high-profile incidents such as San Bruno CA, Marshall MI and the Yellowstone River. The update removes older ‘one-size-fits-all’ proposals and allows for the use of advanced technology and engineering assessment of pipeline integrity. The Final Rule, ‘Pipeline Safety: Safety of Hazardous Liquid Pipelines’, was published on October 1, 2019. The rule has support from the API, which has, however, argued for an ‘appropriate timeline’ for implementation and for the exemption of offshore and rural gathering lines. The rule mandates data analysis and review of pipeline safety status, integrated with geospatial information systems. Many of PHMSA’s requirements are already covered by the API’s various recommended practices (API RP 1173: Pipeline Safety Management Systems, API RP 1160: Managing System Integrity for Hazardous Liquid Pipelines and others). The API is now revising its safety management and integrity standards and developing an RP for Pipeline Public Engagement. More from the API and its pipeline standards home page.

Brandi Wolfe (Paramount Energy Consulting) offered an analysis of the PHMSA ‘Mega Rule’ for gas transmission pipelines. First published as a Notice of Proposed Rulemaking (NPRM) in 2016, the Mega Rule has since been updated and now covers MAOP*, repair, corrosion control, integrity management and management of change. The final version of the Mega Rule was published in 2019. 2020 saw feedback in the form of draft FAQs along with a COVID-19 ‘stay of enforcement’. The stay ended on December 31, 2020. Wolfe enumerated the considerable number of requirements that now need to be addressed including records management, inline inspection and the extension of coverage to ‘moderate consequence areas’ (MCAs). Interested parties should contact Paramount for more.

* Maximum allowable operating pressure.

Clem Chuck (Crestwood) warned that the Mega Rule has brought a 20% increase in the number of regulated pipelines in the United States. Operators will face significant operating challenges and increased costs. In-line inspection (ILI) preparation and inspections easily run into the hundreds of thousands of dollars per line. High consequence areas (HCAs) are 200-meter buffer zones where a pipeline passes through developed areas or places where people frequently gather, such as schools. Urban development has created many new HCAs, meaning that many previously unregulated pipelines now require regular in-line inspections. Even areas previously classified as non-HCA or moderate consequence areas have experienced recent incidents, and PHMSA is extending its requirements to some of these. Inspections can be robotic or tethered. Although expensive, robotic inspections are suited to ILI of newly identified HCAs. The devices assess pipe wall integrity with ultrasonic testing. Electromagnetic acoustic transducers measure pipeline wall thickness, while laser profilometers and high definition cameras can detect internal surface irregularities such as pitting. For ‘unpiggable’ pipelines, i.e. those without launching or receiving facilities or with complex geometries, ice or gel pigging can be used prior to decommissioning.
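At its simplest, the 200-meter HCA buffer test Chuck describes reduces to a point-to-pipeline distance check. A minimal sketch in plain Python (assuming projected, meter-based coordinates; the function names are illustrative and not part of any PHMSA tooling):

```python
import math

def min_distance_m(point, polyline):
    """Minimum distance (meters) from a point to a pipeline centerline.

    Coordinates are assumed to be in a projected, meter-based CRS.
    """
    px, py = point
    best = float("inf")
    for (ax, ay), (bx, by) in zip(polyline, polyline[1:]):
        dx, dy = bx - ax, by - ay
        seg_len_sq = dx * dx + dy * dy
        if seg_len_sq == 0:
            t = 0.0  # degenerate zero-length segment
        else:
            # Project the point onto the segment, clamped to [0, 1].
            t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
        cx, cy = ax + t * dx, ay + t * dy
        best = min(best, math.hypot(px - cx, py - cy))
    return best

def within_hca_buffer(site, pipeline, buffer_m=200.0):
    """Flag a site (e.g. a school) that falls inside the 200 m buffer."""
    return min_distance_m(site, pipeline) <= buffer_m
```

Real HCA determination also weighs population density, identified sites and ‘could-affect’ analysis; this only illustrates the geometric buffer.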

John Tisdale (Enterprise Product Partners) presented the Open Geospatial Consortium’s PipelineML data standard. PipelineML builds on the OGC’s GML standard, so any software that can read GML can visualize the spatial information in PipelineML. The OGC PipelineML working group was established in 2014 and the standard was approved by the OGC in 2019. PipelineML enables pipeline data to be recorded in an industry-standard format as it is acquired. Data can be captured and verified ‘while the ditch is still open’ and added to throughout the life of the asset. PipelineML can ingest design-time data from CAD software like AutoCAD, Bentley, or Intergraph. Data can be added as material tests and other records are first attached to components. Construction management systems can output PipelineML files to show up-to-the-minute progress. ‘PipelineML makes it fast and easy to capture information whenever data is discovered, such as when the ditch is uncovered during a rehab project or non-destructive examinations’.
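To give a flavor of how a GML-based component record might be assembled programmatically, here is a hypothetical sketch using Python’s standard library. The element names and record structure are invented for illustration only and do not reflect the actual PipelineML schema, which should be consulted at the OGC; only the GML 3.2 namespace is real.

```python
import xml.etree.ElementTree as ET

# The GML namespace is real; everything else here is illustrative.
GML = "http://www.opengis.net/gml/3.2"
ET.register_namespace("gml", GML)

def component_record(component_id, material, centerline):
    """Build a hypothetical component record with a GML centerline geometry."""
    comp = ET.Element("Component", attrib={"id": component_id})
    ET.SubElement(comp, "material").text = material
    curve = ET.SubElement(comp, f"{{{GML}}}LineString")
    # GML encodes coordinates as a flat, space-separated posList.
    ET.SubElement(curve, f"{{{GML}}}posList").text = " ".join(
        f"{x} {y}" for x, y in centerline
    )
    return comp

rec = component_record("PIPE-001", "API 5L X52", [(0.0, 0.0), (12.2, 0.0)])
xml_text = ET.tostring(rec, encoding="unicode")
```

The point is the workflow, not the schema: a record like this could be emitted ‘while the ditch is still open’ and appended to over the asset’s life.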

PipelineML files support validation. When a file passes a validation test, it receives a unique validation certificate which proves the quality of the data. Asset data can be exchanged between different parties without custom translation or reformatting. Data must still be reviewed by a subject matter expert prior to ingestion, but the costly bottleneck of data manipulation is avoided. ‘PipelineML does the heavy lifting so staff can focus on work requiring subject matter expertise’. PipelineML solves the most difficult aspect of information sharing: resolving differences in vocabularies. PipelineML 1.0 embeds some 178 code lists, each containing a standardized set of codes and values. Code lists from the API, ASTM and others have been consolidated and standardized. PipelineML natively archives information. Every time a PipelineML file is generated, it captures a snapshot of asset information flows, either within the company or outside its firewall with service providers. Snapshots can be archived to the project or as part of an asset data management system that supports PHMSA-style TVC* compliance. PipelineML can be used to create situational awareness across operator departments with TVC-complete record management across the enterprise. PipelineML is a ‘free open standard that is available today’.
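The code-list check can be illustrated with a toy validator. The code list, field names and the content-hash ‘certificate’ below are stand-ins for illustration only; the real PipelineML code lists and validation certificates are defined by the standard itself.

```python
import hashlib

# Illustrative code list; PipelineML 1.0 ships some 178 standardized lists.
PIPE_GRADE_CODES = {"X42", "X52", "X60", "X65", "X70"}

def validate_records(records):
    """Check each record's grade code against the code list; collect errors."""
    return [
        f"record {r['id']}: unknown grade {r['grade']!r}"
        for r in records
        if r["grade"] not in PIPE_GRADE_CODES
    ]

def certificate(payload: bytes) -> str:
    """A content hash standing in for a validation certificate (illustrative)."""
    return hashlib.sha256(payload).hexdigest()[:16]
```

Standardized codes are what let files move between parties without custom translation: both sides validate against the same list.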

* Traceable, verifiable, and complete.

See also our 2019 coverage for more on PipelineML and its relationship with other pipeline data standards.

Amy Shank (Williams) discussed the impact of new regulations on integrity management with reference to the expanding scope beyond HCAs. For Williams, this entails the assessment of around 2,200 miles of previously unassessed MCAs in a 14-year time frame, out to 2034. The new hazardous liquid rules mandate integrity assessments at least every 10 years. Within 20 years, all liquid pipelines in HCAs must accommodate ILI tools. The rules also call for more leak detection surveys and expanded reporting requirements. Williams is currently re-evaluating its impact analysis and will be developing an implementation plan.
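Back-of-the-envelope, the figures Shank cites imply a steady annual assessment workload:

```python
mca_miles = 2200    # previously unassessed MCA mileage (from the presentation)
window_years = 14   # assessment window out to 2034

miles_per_year = mca_miles / window_years
print(f"{miles_per_year:.0f} miles of MCA assessments per year")  # ~157
```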

Kelly Thompson presented Williams’ strategy for analyzing the large data volumes required to meet the new regulations. Williams leverages a set of data standards and uniform practices to support operations. Labor-intensive, repetitive tasks have been automated using tools such as Safe Software’s FME, SQL, R and Python. The idea is to ‘get risk data into the customer’s hands’ and help protect against losses. This is achieved by ‘making a multitude of databases work together’. SharePoint lists are leveraged as repositories for baseline assessment, prevention and mitigation activity. FME extracts data from native sources and pipes it into a SQL risk engine for calculation. Data is analyzed in PowerBI and visualized in ArcMap. The risk system informs Williams’ preventative maintenance system. Data is the foundation of the compliance-integrity-risk pyramid.
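The ‘multitude of databases’ pattern can be sketched as a simple merge-and-score step. The segment IDs, field names and scoring formula below are invented for illustration; per the presentation, Williams’ actual risk calculation runs in a SQL engine fed by FME.

```python
# Two illustrative source tables, standing in for separate databases.
assessments = {  # e.g. extracted from a GIS or a SharePoint list
    "SEG-01": {"last_ili_year": 2012, "wall_loss_pct": 18},
    "SEG-02": {"last_ili_year": 2019, "wall_loss_pct": 4},
}
consequence = {  # e.g. from an HCA/MCA classification table
    "SEG-01": "HCA",
    "SEG-02": "MCA",
}

CONSEQUENCE_WEIGHT = {"HCA": 3, "MCA": 2, "other": 1}

def risk_score(seg_id, as_of_year=2020):
    """Toy risk = likelihood (wall loss + inspection age) x consequence weight."""
    a = assessments[seg_id]
    likelihood = a["wall_loss_pct"] + 2 * (as_of_year - a["last_ili_year"])
    return likelihood * CONSEQUENCE_WEIGHT.get(consequence.get(seg_id, "other"), 1)

# Rank segments so the preventative maintenance system sees the worst first.
ranked = sorted(assessments, key=risk_score, reverse=True)
```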

More from the LBCG Conference home page.


© Oil IT Journal - all rights reserved.