SPE Digital Energy, The Woodlands, Texas (Part I)

Is oil and gas at the bottom of the digital league table? Or is it the ‘greatest show on earth?’ Former Texas railroad commissioner on ‘getting regulation right.’ IDC on big data’s potential. BP, Chevron, Devon, Intelligent Solutions and USC on data mining. IBM, Cloudera, PointCross bang the Hadoop drum.

Around 600 attended the 2013 edition of the Society of Petroleum Engineers’ Digital Energy conference and exhibition. An almost chance remark in Louie Ehrlich’s keynote set the tone for much of what followed. A ‘survey’ from Gartner seemingly places the oil and gas industry at N° 38 out of 39 on ‘adoption of digital strategies’ as compared with other industries. Many (including ourselves) questioned the veracity of the Gartner finding and there was a suggestion that Gartner should be invited to next year’s Digital Energy to set the record straight.

Chairman Phiroz Darukhanavala (BP) traced the recent turnaround in the US oil and gas landscape from a ‘grim’ picture only five years ago to the current boom and promise of energy independence within a decade. Regarding the decade-old digital oilfield concept, industry has made slow and steady progress toward the vision. We now have more instrumented operations and sensors ‘everywhere’ generating enough data for us to drown in! Daru twisted the Ancient Mariner’s words thus: there is ‘data, data everywhere, but not a bit to use.’ Hence the focus of the 2013 Digital Energy show, ‘from information to insight.’

Chevron CIO Louie Ehrlich set out to be provocative, emphasizing that we need to be better at combining the strengths of engineering and IT. The healthcare, retail and financial services sectors recognize the critical nature of IT and do a better job of embedding IT into their businesses. While oil and gas is a technology leader and already gets a lot from IT, ‘We are leaving value on the table because of the mixed reception given to the partnership of engineering and IT.’ Digital technology has a lot to offer in support of projects, execution and capability, with the multiplier of the strategic partnership. Our ability to gain insights from information is increasing and we’ve only just begun. We have made progress in the fields of production reliability, equipment status, reservoir condition and more but are still not fully leveraging the IT multiplier. Oil and gas information is growing exponentially and we should be using all of it along with our knowledge, experience, tools and technology. There is light on the horizon. Today’s petroleum engineers are much more digitally literate—they are digital ‘natives’ as opposed to digital ‘immigrants.’ The natives are leading the paradigm shift from traditional IT to ‘integral IT’ by acting as ‘grand connectors’ and ‘data-driven’ engineers. In the Q&A, Ehrlich was asked if field automation should reside in IT or operations. He replied that it doesn’t matter. What is important is to be clear on how it will work and to enable information and work flows. Looking to the future, Ehrlich sees a world where the big game changer will be the impact of IT on knowledge workers. Computers may get to be better than humans—‘that’s what we need to watch.’

Halliburton’s Duncan Junor moderated a debate on the challenge of regulation in the face of a booming industry. Former Texas Railroad Commissioner Elizabeth Ames Jones (now with Patton Boggs LLP), who describes herself as a ‘recovering regulator,’ paid homage to George Mitchell whose company, Mitchell Energy, was first to frack the Barnett shale and also came up with the idea of a planned community at The Woodlands. Jones congratulated the audience on progress driven by digital and petroleum engineers who have created the ‘greatest show on earth.’ Jones is determined that regulation keeps up with the activity and avoids destroying jobs. ‘You are either with us or against us. I believe in energy security, energy is essential for our quality of life. We had better start being one issue people now.’

Michael Krancer of the Pennsylvania Department of Environmental Protection observed that Halliburton now has 3,000 employees in Pennsylvania, wisecracking that he sees Houston as ‘the Pittsburgh of the Gulf Coast.’ Krancer doubted the 38th out of 39 rating, comparing energy with the space program’s blend of engineering and IT. Krancer is no liberal: ‘the power to tax is the power to destroy.’ In Pennsylvania regulation is based on ‘sound fact and sound science—not on the opinion of Hollywood movie stars who come here and tell us they know more about it than we do.’

Darren Smith (Devon Energy) observed that ‘digitization and data can alleviate green attacks on industry.’ Devon’s move into non conventionals, with the acquisition of Mitchell Energy, has not gone unnoticed. Some of the new stakeholders are hostile and would like to ban fracking. Concerns revolve around water contamination, methane migration, air quality, flaring and induced seismicity. Devon counters this with a data driven social license to operate. Data transparency can counter ‘factoids,’ i.e. tidbits of misinformation. The fracfocus.org chemical disclosure registry now has some 39,000 reported wells and has been adopted by eight states. A balance needs to be struck between reporting and confidentiality. From the regulator’s perspective freedom of information requests mean that everything can become public.

Bob Moran reported from Washington, where he represents Halliburton. Halliburton has hired 13,000 people in the US on the strength of the non conventional boom and the offshore. What will it take to make shale go global? Local culture is key. In Texas and Pennsylvania the activity is well received. Elsewhere, less so. Tax regimes, expertise, market and favorable geology are also key. Texas is good on all fronts, as is Australia. In Ohio, the Utica oil shale is hampered by regulations. There are 14 federal agencies trying to get a toehold in fracking. The States should be in charge. Moran agreed with Ames Jones, ‘We need to turn up the heat on elected officials who are not on message.’

If non conventionals dominated the politics and economics of the conference, the main technology theme was the old chestnut of data mining. Data mining, a.k.a. analytics, predictive optimization and so on has been around for years but has been rejuvenated by the buzz and hype surrounding the ‘big data’ movement.

One of the best turn-outs of the show was for Jill Feblowitz’ (IDC Energy Insights) look at the ‘big deal in big data.’ Feblowitz traced the big data movement from its origins in Google and Amazon’s requirements for massive ‘ingestion’ and analysis. This has sparked a ‘new generation of technologies and architectures running on commodity hardware.’ Big data is characterized by the three (or maybe four—see this month’s editorial) Vs—volume, variety and velocity. Volume as in a wide azimuth seismic survey, variety as in the jumble of documents and data sources that support a company’s decision making process, and velocity as in real time data streaming from a production platform or drill bit, perhaps via wired drill pipe. IDC has surveyed 144 companies to find, unsurprisingly, that ‘big data’ is not a ‘familiar concept’ in oil and gas. However, some trials have been reported. Chevron is using IBM InfoSphere BigInsights (a Hadoop-based environment) in its seismic data processing. Shell is piloting Hadoop in a private Amazon cloud. Researchers at the University of Stavanger have demonstrated the application of ‘Chukwa,’ a Hadoop sub-project devoted to large-scale log collection and analysis, to oil and gas data mining. Cloudera has launched a ‘seismic Hadoop’ project and PointCross has Hadoop-based solutions for well and seismic data. Feblowitz suggests that use cases include data mining 3D seismic volumes for particular attributes, geospatial analytics on non conventional acreage, drilling data anomaly detection and production forecasting. So where should you start? By recognizing the value of your untapped data asset, performing a gap analysis for technology and staff and formulating a big data strategy.
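As an illustration only (not code from any of the cited pilots), here is a minimal Hadoop Streaming sketch in Python for one of the suggested use cases, anomaly detection on drilling sensor data. The input layout, field names and threshold are assumptions.

```python
#!/usr/bin/env python
"""Hypothetical Hadoop Streaming job that flags anomalous drilling
sensor readings. Assumes input CSV lines of the form
well_id,timestamp,channel,value (e.g. WOB, RPM, standpipe pressure).
Test locally with:
  python drilling_anomaly.py map < readings.csv | sort | \
      python drilling_anomaly.py reduce"""
import sys


def mapper():
    # Key each reading by well and channel so one reducer sees a full series.
    for line in sys.stdin:
        try:
            well, ts, channel, value = line.strip().split(',')
            float(value)                      # discard unparseable readings
        except ValueError:
            continue
        print('%s:%s\t%s,%s' % (well, channel, ts, value))


def reducer(threshold=3.0):
    # Buffer each key's series, then flag readings more than `threshold`
    # standard deviations from the series mean (a crude z-score test).
    def flush(key, points):
        if len(points) < 2:
            return
        values = [v for _, v in points]
        mean = sum(values) / len(values)
        var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
        std = var ** 0.5 or 1.0
        for ts, v in points:
            if abs(v - mean) / std > threshold:
                print('%s\t%s\t%.2f\tANOMALY' % (key, ts, v))

    current, points = None, []
    for line in sys.stdin:
        key, payload = line.rstrip('\n').split('\t')
        ts, value = payload.split(',')
        if key != current and current is not None:
            flush(current, points)
            points = []
        current = key
        points.append((ts, float(value)))
    if current is not None:
        flush(current, points)


if __name__ == '__main__':
    {'map': mapper, 'reduce': lambda: reducer()}[sys.argv[1]]()
```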

Shawn Shirzadi outlined BP’s data mining and predictive analytics effort, part of its ‘field of the future’ program. Since 2010 upstream data and applications have changed significantly with the advent of ‘high velocity’ real time data, unstructured data, ‘supercomputers’ on desktops and the sensor ‘explosion.’ All of which present data-driven analytical opportunities. In the field of pipeline corrosion threat management, BP estimates that software could contribute $4 mm/year in risk reduction, leveraging in-line inspection data. Seven wells in the Gulf of Mexico are now equipped with autonomous production allocation, 24/7 reconciliation of sales meter data with production using 95% accurate virtual gauging. A third data workflow around waterflood optimization uses a ‘top-down’ workflow combining data from the historian and performing injector producer connectivity analysis with a ‘capacitance-resistance’ model. This uses a non-linear solver to estimate injector allocation factors and to compute a ‘value of injector water’ factor used to rank injection opportunities in the face of a shortage of injection capacity. For Shirzadi, real time data-centric decision making has only just begun in the upstream.
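The capacitance-resistance idea can be sketched in a few lines. The following hypothetical Python example (not BP’s implementation) fits injector allocation factors and a single time constant to synthetic data with a non-linear least-squares solver; numpy and scipy are assumed.

```python
"""Minimal capacitance-resistance style waterflood sketch. Production at
one producer is modelled as a weighted sum of exponentially filtered
injection rates; the fit returns allocation factors usable for ranking."""
import numpy as np
from scipy.optimize import least_squares


def filtered_injection(inj, tau, dt=1.0):
    """Exponential (capacitance) filter applied to one injection series."""
    out = np.zeros_like(inj, dtype=float)
    alpha = dt / (tau + dt)
    for t in range(1, len(inj)):
        out[t] = (1.0 - alpha) * out[t - 1] + alpha * inj[t]
    return out


def fit_connectivity(injection, production, dt=1.0):
    """injection: (n_injectors, n_times) array; production: (n_times,).
    Returns (allocation factors, time constant)."""
    n_inj = injection.shape[0]

    def residual(params):
        f, tau = params[:n_inj], params[n_inj]
        model = sum(f[i] * filtered_injection(injection[i], tau, dt)
                    for i in range(n_inj))
        return model - production

    x0 = np.r_[np.full(n_inj, 1.0 / n_inj), 30.0]        # initial guess
    bounds = (np.r_[np.zeros(n_inj), 1e-3],               # f >= 0, tau > 0
              np.r_[np.ones(n_inj) * 2.0, 1e3])
    sol = least_squares(residual, x0, bounds=bounds)
    return sol.x[:n_inj], sol.x[n_inj]


if __name__ == '__main__':
    rng = np.random.default_rng(0)
    t = 200
    inj = rng.uniform(500, 1500, size=(3, t))              # three injectors
    true_f, true_tau = np.array([0.6, 0.3, 0.05]), 20.0
    prod = sum(f * filtered_injection(inj[i], true_tau)
               for i, f in enumerate(true_f)) + rng.normal(0, 10, t)
    f_hat, tau_hat = fit_connectivity(inj, prod)
    print('allocation factors', np.round(f_hat, 2), 'tau', round(tau_hat, 1))
```

The fitted allocation factors play the role of the injector ranking described above; a field-scale model would add terms for primary depletion and bottom-hole pressure, which are omitted here.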

Derrick Turk described Devon’s venture into ‘analytics,’ which he defines as ‘the discovery and communication of meaningful patterns in data,’ opening up the field to classical statistics, computing, AI, machine learning and domain expertise. As the industry is now drilling large numbers of wells in poorly understood reservoirs, a ‘big data’ approach is appropriate. Devon encountered analytics at the 2011 Digital Energy conference and engaged a consultant on a proof of concept analysis of the key drivers for a major new play. This produced enough encouragement for Devon to bring the evidence-based decision making capability in-house. The technique has evolved into what is now the Devon analytics cycle—moving from what data is available, to hypothesis testing and finally to predictive analytics and machine learning. The technique, still at the pilot stage, is being trialed on KPI drivers in a resource play. To date the main impact has been in spotting data quality issues. It can be hard to demonstrate quick wins.
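To make the cycle concrete, here is a hypothetical Python sketch, not Devon’s code, that runs the three stages on synthetic well data: a data inventory, a hypothesis test and a predictive model whose feature importances rank candidate KPI drivers. All column names and relationships are invented.

```python
"""Illustrative analytics cycle on synthetic well data (hypothetical)."""
import numpy as np
import pandas as pd
from scipy.stats import pearsonr
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 400
wells = pd.DataFrame({
    'lateral_length_ft': rng.uniform(4000, 10000, n),
    'proppant_lb_per_ft': rng.uniform(800, 2200, n),
    'stage_count': rng.integers(15, 60, n),
})
# Synthetic 12-month cumulative production standing in for the KPI.
wells['cum12_boe'] = (8 * wells.lateral_length_ft
                      + 20 * wells.proppant_lb_per_ft
                      + rng.normal(0, 8000, n))

# Step 1: what data do we have, and is any of it obviously broken?
print(wells.describe())

# Step 2: hypothesis test, e.g. 'proppant loading drives 12-month cum'.
r, p = pearsonr(wells.proppant_lb_per_ft, wells.cum12_boe)
print('proppant vs cum12: r=%.2f p=%.3g' % (r, p))

# Step 3: predictive model plus feature importances to rank KPI drivers.
X, y = wells.drop(columns='cum12_boe'), wells.cum12_boe
model = RandomForestRegressor(n_estimators=200, random_state=0)
print('cv R^2', cross_val_score(model, X, y, cv=5).mean().round(2))
model.fit(X, y)
for name, imp in zip(X.columns, model.feature_importances_):
    print('%-22s %.2f' % (name, imp))
```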

Lisa Brenskelle described research into ‘advanced streaming data cleansing’ (ASDC) conducted by Chevron and the University of Southern California. The result is a scalable system that intercepts data from control systems and detects erroneous or missing data, flatlining readings, spikes and bias. Data is read in from the historian, cleansed, reconstructed and stored back under a new tag number. The system was developed because nothing was commercially available for dynamic data. The technology stack includes OSIsoft’s PI System, Microsoft’s StreamInsight complex event processing tool and custom code from Logica (now CGI). The cleansing algorithm is ‘empirical’ and uses multivariate statistics and dynamic principal component analysis. One questioner asked how the system distinguishes between bad data and a real process fault. This is currently an unresolved issue that the researchers are working on.
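The flavor of such checks can be conveyed with a small sketch. The Python class below is not the ASDC system; it is a simplified, assumed example that flags flatlining and spike readings in a stream and substitutes the last good value. The multivariate statistics and dynamic PCA of the real system are not shown.

```python
"""Hypothetical streaming cleansing sketch: flatline and spike checks."""
from collections import deque


class StreamCleanser:
    def __init__(self, window=20, spike_sigma=4.0, flatline_len=10):
        self.window = deque(maxlen=window)   # recent good readings
        self.spike_sigma = spike_sigma
        self.flatline_len = flatline_len
        self.repeat_count = 0
        self.last_value = None

    def _stats(self):
        vals = list(self.window)
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / max(len(vals) - 1, 1)
        return mean, var ** 0.5

    def process(self, value):
        """Return (cleansed_value, flag); flag is None, 'spike' or 'flatline'."""
        flag = None
        if self.last_value is not None and value == self.last_value:
            self.repeat_count += 1
            if self.repeat_count >= self.flatline_len:
                flag = 'flatline'
        else:
            self.repeat_count = 0
        if flag is None and len(self.window) >= 5:
            mean, std = self._stats()
            if std > 0 and abs(value - mean) > self.spike_sigma * std:
                flag = 'spike'
        self.last_value = value
        if flag is None:
            self.window.append(value)
            return value, None
        # Simple reconstruction: hold the last good value.
        return (self.window[-1] if self.window else value), flag


if __name__ == '__main__':
    import random
    random.seed(0)
    cleanser = StreamCleanser()
    stream = [100 + random.gauss(0, 1) for _ in range(30)] + [250] + [105.0] * 15
    for raw in stream:
        clean, flag = cleanser.process(raw)
        if flag:
            print('raw %.1f -> %.1f (%s)' % (raw, clean, flag))
```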

In his second day keynote, Greg Powers (Halliburton CTO) observed that the next trillion barrels will be really hard to access, will come in smaller amounts and be more costly and risky for oils. Exploring and producing the new resources (hydrocarbons) will be performed with fewer experienced resources (people). Half of all discoveries are now in deepwater, where spend is rising at 18% per year. In terms of value creation, Petrobras is ‘off the scale,’ BG is N° 2 and Chevron N° 3. ‘The harder we look, the more we find.’ But, ‘You don’t get to drill deepwater without a safe, consistent and sustainable process.’ To assure this, we need more automation leveraging real time data. We are not there yet because we’ve not got the telemetry and communications bandwidth. Powers bought in to the Gartner rating, ‘We are at the end of the train and we cannot stay there.’ Remote operations centers represent a sea change for industry, offering the best expertise available remotely. Dual fuel vehicles and rigs running on natural gas are greening the non conventional drive, as is ‘drinkable’ frac fluid. Oil and gas is piggybacking on the miniaturization of electronics with better, higher capacity units—but ‘We’re doing it downhole’ in a much tougher environment. ‘Try putting your iPhone in the oven overnight!’ The fiber optic revolution promises many new applications. Fiber resolution is now down to atomic levels. A video showed how speech at one end of a fiber can be picked up kilometers away. Drilling automation will be enabled by sensors, telemetry, actuators and ‘a lot of math.’ Autonomous machines will perform better than humans. This will greatly enhance capital efficiency—but will also force us to rethink many of our assumptions as to what wealth creation is about. All the rules of the game are changing. In the Q&A, the cyber security question was raised. Powers responded that all communications from the well site are encrypted and transit via a private satellite network. ‘Everything is secured by ourselves, we assiduously stay clear of public networks.’ Powers was also asked if he was comfortable with the idea of machines taking over. He answered, ‘Did you fly here? If it was a short hop, your pilot did not fly the plane!’

Shahab Mohaghegh (West Virginia University and Intelligent Solutions) returned to the data mining theme with a presentation on synthetic geomechanical log generation for Marcellus shale development. Mohaghegh’s Petroleum Engineering and Analytical Research Lab (PEARL) specializes in ‘data-driven analytics’ (DDA). DDA derives relationships directly from large data sets with no attempt made to understand the underlying physical processes—which are often too complex or poorly understood to model. Geomechanical properties are key to designing frac jobs and completions. But the factory drilling paradigm means that geomechanical logging is not common practice. Following calibration against the few existing wells with geomechanical logs, neural networks and sequential Gaussian simulation are used to convert regular (sonic, density, gamma ray) log data to geomechanical properties over the whole region.
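As a hedged illustration of the mapping step only (the sequential Gaussian simulation stage is omitted), the following Python sketch trains a neural network on synthetic ‘calibration well’ samples and predicts geomechanical properties from conventional log curves. None of the ranges or coefficients come from the presentation.

```python
"""Hypothetical synthetic geomechanical log generation from triple-combo logs."""
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)

# Pretend calibration set: depth samples from wells with measured geomechanics.
n = 2000
sonic = rng.uniform(55, 110, n)           # us/ft
density = rng.uniform(2.2, 2.75, n)       # g/cc
gamma = rng.uniform(40, 180, n)           # API
X = np.c_[sonic, density, gamma]

# Stand-in 'measured' targets: Young's modulus and Poisson's ratio.
youngs = 80 - 0.5 * sonic + 20 * density + rng.normal(0, 1.5, n)
poisson = 0.18 + 0.0006 * gamma + 0.02 * density + rng.normal(0, 0.01, n)
Y = np.c_[youngs, poisson]

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0))
model.fit(X, Y)

# Generate a synthetic geomechanical log for a well with only conventional data.
new_logs = np.c_[rng.uniform(55, 110, 5),
                 rng.uniform(2.2, 2.75, 5),
                 rng.uniform(40, 180, 5)]
print(np.round(model.predict(new_logs), 3))
```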

© Oil IT Journal - all rights reserved.