Hortonworks’ Hadoop for oil and gas

Grand claims for the big data lake in seismic, production optimization and HSE reporting.

A rather overblown marketing document from Hortonworks sets out the case for the use of innovative 'big data' technology in oil and gas. The release makes grandiose claims for Hadoop's contribution to US energy independence and to mitigating declining world oil production, and proposes three use cases: seismic, lease bidding and compliance with health, safety and environmental (HSE) reporting.

Machine learning algorithms running against massive volumes of sensor data from multiple wells can be used to optimize production and extend a well's life. Once an optimization strategy has been obtained, optimal set points can be maintained with Apache Storm's fault-tolerant, real-time analytics and alerts. Storm, running on the Hadoop cluster, can monitor variables such as pump pressure, RPM, flow rate and temperature, and raise an alert or take corrective action if any of them strays outside its pre-determined range.
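To give a flavor of what such set-point monitoring looks like, here is a minimal sketch written as an Apache Storm bolt. The field names (wellId, pumpPressure), the pressure thresholds and the alert stream are hypothetical illustrations, not part of any Hortonworks reference design; only the Storm base classes (BaseBasicBolt, Tuple, Values) are real API.

```java
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

// Hypothetical bolt: checks each incoming sensor reading against a
// fixed operating range and emits an alert tuple when it deviates.
public class SetPointMonitorBolt extends BaseBasicBolt {
    // Illustrative set-point range for pump pressure (assumed units: psi).
    private static final double MIN_PRESSURE = 1500.0;
    private static final double MAX_PRESSURE = 3200.0;

    @Override
    public void execute(Tuple tuple, BasicOutputCollector collector) {
        // Field names ("wellId", "pumpPressure") are assumptions about
        // the upstream spout feeding well telemetry into the topology.
        String wellId = tuple.getStringByField("wellId");
        double pressure = tuple.getDoubleByField("pumpPressure");

        if (pressure < MIN_PRESSURE || pressure > MAX_PRESSURE) {
            // Downstream bolts could page an operator or trigger
            // corrective action at the well head.
            collector.emit(new Values(wellId, pressure, System.currentTimeMillis()));
        }
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("wellId", "pumpPressure", "alertTime"));
    }
}
```

In a real topology this bolt would sit downstream of a spout reading live well telemetry (from Kafka, say), with further bolts turning alert tuples into operator notifications or automated set-point adjustments.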

In lease bidding, Hadoop is claimed to provide competitive advantage by efficiently storing image files, sensor data and seismic measurements, adding context to third-party surveys of a tract open for bidding.

Apache Hadoop also offers a 'secure data lake' of compliance-related information, where improved data capture and retention make compliance reporting easier. Because Hadoop does not impose a schema on load (structure is applied later, at read time), data can be captured in its native format, whether PDF documents, videos, raw sensor feeds or structured ERP records. More from Hortonworks.
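To make the 'no schema on load' point concrete, the following sketch lands a raw document in HDFS using the standard Hadoop FileSystem API. The namenode address and file paths are hypothetical; the point is that the bytes are stored as-is, and any structure is imposed by whatever job later reads them.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Minimal sketch: land a raw document in the 'data lake' unchanged.
// No schema is declared at load time; readers apply structure later
// ('schema on read'). The namenode URI and paths are hypothetical.
public class RawFileLoader {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

        FileSystem fs = FileSystem.get(conf);
        // Source file on local disk, destination inside HDFS.
        fs.copyFromLocalFile(
            new Path("file:///incoming/hse/inspection-report.pdf"),
            new Path("/datalake/compliance/raw/inspection-report.pdf"));
        fs.close();
    }
}
```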
