SPE Digital Energy 2015, The Woodlands

Digital oilfield ‘crosses the chasm?’ BP’s analytics and 20 years of drilling software evolution. Chevron’s model-based production optimization. Oil country big data tales from Chevron, Baker Hughes and IBM. New PVT/fluid standards. NASA, GE, Schlumberger’s ‘future vision.’

Geoffrey Moore kicked off the 2015 edition of the Society of Petroleum Engineers’ Digital Energy conference in The Woodlands, Texas earlier this year by asking whether the digital oilfield has ‘crossed the chasm,’ a reference to his seminal work on innovation and disruption. Moore’s framework for analyzing technology adoption ‘plays out in every disruption.’ For instance, 3D printing is today in the chasm but will sooner or later emerge into take-up by the majority. The question for such innovation is ‘how do you light the fire on the other side of the chasm?’ Moore was better at citing examples (Amazon in cloud computing, Apple for smartphones) than at providing matches, although Bezos’ and Jobs’ ‘charisma’ helped these companies ‘get ahead of the herd.’ Oil’s current ‘re-pricing’ may push companies to ‘go digital’ because of the pain. Moore recommends targeting a beachhead segment, a niche market with an intractable problem. What will oil and gas be missing in 2016?

Another problem is doing this from inside a large company, or ‘from the belly of the whale.’ Moore is on a ‘positive jihad’ to help large organizations allocate resources to ‘unwelcome’ next-generation projects that are ‘dilutive before accretive.’ But the digital oilfield is not an innovation challenge; it is a management challenge. Moore covered all bases in his conclusions, observing that ‘digitization is changing everything, it will change oil and gas’ but also that ‘the time horizon for venture capitalists and energy is not aligned,’ citing the ‘big smoking hole’ that green energy investment has left in VC finances.

BP’s Joe Anders presented on the application of analytics to well integrity. The Norsok D-010r4 standard was a starting point. BP’s well integrity workflow relies on copious data collection, even an operator’s smartphone photograph of a gauge, which can be coded by a clerk back at base. For analytics, many different information sources come into play, accessed via a data integration layer and common data model. Anomalous conditions, such as a well’s annulus pressure breaching a limit, trigger an email notification to appropriate personnel, who can use the system to investigate further and generate standardized reports for various conditions. Analytics make it possible to link precursor events, such as wells with two or more bleeds in the last month, to defects before they become serious. BP evaluated various ad-hoc schematics packages and finally went for Well-Barrier. The tool now has 600 users.
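
By way of illustration, here is a minimal sketch of the kind of exception-based surveillance rules described above, checking annulus pressure against a limit and flagging the ‘two or more bleeds in a month’ precursor. The thresholds, field names and notify() stub are hypothetical and not BP’s implementation.

```python
# Hedged sketch of exception-based well integrity surveillance.
# Field names, thresholds and the notify() callback are hypothetical.
from datetime import datetime, timedelta

MAAP_LIMIT_PSI = 1500          # assumed maximum allowable annulus pressure
BLEED_PRECURSOR_COUNT = 2      # two or more bleeds in the trailing month

def check_annulus_pressure(well, readings, notify):
    """Flag any reading that breaches the annulus pressure limit."""
    for ts, psi in readings:
        if psi > MAAP_LIMIT_PSI:
            notify(f"{well}: annulus pressure {psi} psi breached limit at {ts}")

def check_bleed_precursor(well, bleed_events, notify, now=None):
    """Flag wells with two or more bleed-downs in the last 30 days."""
    now = now or datetime.utcnow()
    recent = [t for t in bleed_events if now - t < timedelta(days=30)]
    if len(recent) >= BLEED_PRECURSOR_COUNT:
        notify(f"{well}: {len(recent)} annulus bleeds in the last 30 days")

# notify could simply print, or email the duty engineer.
check_annulus_pressure("A-12", [(datetime(2015, 3, 1), 1620)], print)
```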

Amol Bakshi related some of Chevron’s lessons learned from model-based real time production optimization. Ideally this concerns highly instrumented assets, but this is not always the case. Early deployments used a variety of software packages and integration platforms. Around 2010, Chevron recognized that it needed a coordinated enterprise-wide solution, which is now operational. The first step in production optimization is to make sure that the asset is amenable to such an approach; not all are. Once a commitment has been made to model-based decision making, an 80/20 solution that targets core objectives is designed, resisting scope creep. Key tools of the trade are Petex’ IPM suite and a ‘virtual field simulator’ that generates test data before first oil. We asked Bakshi what had come from his earlier work on semantic technology in Chevron. This appears to have been down-played as ‘finding people with semantic technology expertise is hard.’ SQL would seem to be the tool of choice at Chevron.

David Reed offered a retrospective of 20 years of managing BP’s drilling and completions application portfolio. BP made a move to Landmark’s EDM in 2005 and is currently migrating to the R5000 release as part of BP’s Chili project. The user community’s demands often far outstrip IT’s capability. User needs must align with architectural principles and BP’s bill of IT (buy not build and favor Windows over Linux). The IT architecture has evolved from flat file, through client-server, to a data layer with pipes feeding five ‘pillar’ applications: Landmark’s Engineers’ Desktop, well integrity, Kongsberg’s operations suite and GWET, a SharePoint-based well engineering toolkit. BP has over 300 separate databases and many more stand-alone machines, a situation deemed unsustainable. The company is working to cut the database count by 60%. Chili combines IT discipline with new workflows for well delivery, CRS and deviation data management. The aim is to get the drilling target closer to the geological target and reduce the ‘ping pong’ between engineers and geoscience. Safety-critical data is now a one-way transfer via sanctioned applications. Witsml plays an increasing role in ‘making things less ambiguous.’

Big data and analytics proved popular themes this year. Minshen Hao (Chevron) has used population-based stochastic search to optimize steam generation in an EOR project. The object is to optimize the sum of multiple output vs efficiency curves. ‘Quantum-behaved’ particle swarm optimization is the technique of choice, a ‘simple, flexible cost minimizer that is entirely data-driven.’
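
As a rough illustration of the technique (not of Hao’s actual model), the sketch below runs a quantum-behaved particle swarm over an invented steam allocation problem: three generators with hypothetical quadratic fuel-vs-output curves and a made-up steam demand.

```python
# Illustrative quantum-behaved PSO (QPSO) minimizing a hypothetical total fuel
# cost for three steam generators subject to an assumed 300 t/h steam demand.
import numpy as np

def qpso(cost, dim, bounds, n_particles=30, iters=200, beta=0.75, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))      # particle positions
    pbest = x.copy()                                  # personal best positions
    pcost = np.array([cost(p) for p in pbest])        # personal best costs
    for _ in range(iters):
        g = pbest[np.argmin(pcost)]                   # global best
        mbest = pbest.mean(axis=0)                    # mean best position
        phi = rng.uniform(size=(n_particles, dim))
        attractor = phi * pbest + (1 - phi) * g       # local attractors
        u = np.maximum(rng.uniform(size=(n_particles, dim)), 1e-12)
        sign = np.where(rng.uniform(size=(n_particles, dim)) < 0.5, -1.0, 1.0)
        x = np.clip(attractor + sign * beta * np.abs(mbest - x) * np.log(1 / u), lo, hi)
        c = np.array([cost(xi) for xi in x])
        better = c < pcost
        pbest[better], pcost[better] = x[better], c[better]
    return pbest[np.argmin(pcost)], pcost.min()

# Hypothetical quadratic fuel curves plus a penalty for unmet steam demand.
a, b = np.array([0.02, 0.03, 0.025]), np.array([1.0, 0.9, 1.1])
def cost(q):
    return float(np.sum(a * q**2 + b * q) + 50.0 * max(0.0, 300.0 - q.sum()))

best_q, best_cost = qpso(cost, dim=3, bounds=(0.0, 150.0))
print(best_q, best_cost)
```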

Alireza Shahkarami (Saint Francis University) has been using artificial intelligence to model CO2 sequestration and EOR with a surrogate reservoir model (SRM) from Intelligent Solutions. This provided ‘fast track’ modeling of the complex Permian basin Sacroc field, with some 50 years of injection history. SRM uses ‘Latin hypercube’ experimental design. Results are claimed to be similar to those obtained with a CMG simulator, but while the simulator took 24 hours, the SRM takes ‘under a second.’
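
The sketch below shows only the experimental-design step behind such an approach: Latin hypercube sampling of uncertain inputs, a stand-in for the full-physics simulator, and a trivial regression surrogate. It is not the Intelligent Solutions SRM; the parameters, ranges and response function are invented.

```python
# Latin hypercube design for surrogate model training (illustrative only).
import numpy as np

def latin_hypercube(n_samples, ranges, seed=0):
    """One stratified sample per interval in each dimension, randomly paired."""
    rng = np.random.default_rng(seed)
    d = len(ranges)
    strata = np.argsort(rng.uniform(size=(n_samples, d)), axis=0)  # per-dim permutations
    u = (strata + rng.uniform(size=(n_samples, d))) / n_samples
    lo = np.array([r[0] for r in ranges], dtype=float)
    hi = np.array([r[1] for r in ranges], dtype=float)
    return lo + u * (hi - lo)

# Hypothetical uncertain inputs: permeability (mD), porosity, CO2 rate (MMscf/d).
designs = latin_hypercube(200, [(5, 500), (0.05, 0.25), (10, 100)])

def run_simulator(x):               # stand-in for the 24-hour full-physics run
    k, phi, q = x
    return q * phi * np.log(k)      # fake recovery response

y = np.array([run_simulator(x) for x in designs])
# A surrogate (here a crude linear fit; in practice a neural net) is trained on
# (designs, y) and then evaluated in well under a second per case.
surrogate = np.linalg.lstsq(np.c_[designs, np.ones(len(designs))], y, rcond=None)[0]
```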

Inge Svensson introduced Baker Hughes’ new big data platform for real time drilling data and rig performance optimization. Real time analytics shifts the focus from pure non-productive time to other, ‘invisible’ causes of lost time including human factors. The system includes a NoSQL database, HTTPS connectivity and a Witsml server. KPIs show a fine-grained activity break-down and opportunities for performance improvement. Statoil is a user.
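
The fragment below gives a flavor of such a fine-grained activity break-down, computed from time-stamped rig-state records; the activity codes, sample data and connection-time benchmark are all invented for illustration and are not Baker Hughes’ KPIs.

```python
# Hedged sketch: rig activity break-down and 'invisible lost time' check.
from collections import defaultdict
from datetime import datetime

# (start, end, activity) records as they might come off a real time feed.
log = [
    (datetime(2015, 3, 1, 0, 0),  datetime(2015, 3, 1, 6, 30), "drilling"),
    (datetime(2015, 3, 1, 6, 30), datetime(2015, 3, 1, 7, 10), "connection"),
    (datetime(2015, 3, 1, 7, 10), datetime(2015, 3, 1, 9, 0),  "circulating"),
]

durations = defaultdict(float)
for start, end, activity in log:
    durations[activity] += (end - start).total_seconds() / 3600.0

total = sum(durations.values())
for activity, hours in sorted(durations.items(), key=lambda kv: -kv[1]):
    print(f"{activity:12s} {hours:5.2f} h  {100 * hours / total:4.1f}%")

# 'Invisible' lost time: a slow connection is not NPT in the traditional sense
# but still represents a performance improvement opportunity.
BENCHMARK_H = 0.25  # assumed connection-time benchmark
if durations["connection"] > BENCHMARK_H:
    print(f"connection time exceeds benchmark by {durations['connection'] - BENCHMARK_H:.2f} h")
```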

Peerapong Ekkawong showed how PTT E&P has used the Matlab linear optimization toolbox to increase production from a Gulf of Thailand gas field. The Excel-to-Matlab-to-Excel workflow improves on manual fine tuning. The work was performed in collaboration with Texas A&M’s model calibration and efficient reservoir imaging program, Mceri.
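
PTT E&P’s work used Matlab; the sketch below shows the same kind of linear programming allocation problem in Python with scipy.optimize.linprog. The well rates, water-handling limit and compression limit are invented and are not the field’s actual figures.

```python
# Illustrative LP: choose choke settings to maximize gas within facility limits.
from scipy.optimize import linprog

# Decision variables: choke fraction (0..1) for three hypothetical wells.
gas_rate = [25.0, 18.0, 30.0]     # MMscf/d at fully open choke
water_rate = [2.0, 0.5, 4.0]      # kbbl/d produced water at fully open choke

c = [-g for g in gas_rate]        # linprog minimizes, so negate gas to maximize
A_ub = [water_rate, gas_rate]     # water handling and compression constraints
b_ub = [5.0, 60.0]                # kbbl/d water limit, MMscf/d gas limit

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * 3, method="highs")
print("choke settings:", res.x, "total gas:", -res.fun, "MMscf/d")
```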

IBM’s Mike Brulé provided some evocative imagery of the evolving corporate data lake, ‘from data swamp to data reservoir.’ The big data reservoir (BDR) holds data that has been cleaned-up in the data refinery. The oil and gas industry is still trying to figure out how to use big data. Modeling with data driven methods is all very well but it is better still to understand cause and effect. Empirical ANN modeling is gaining ground alongside traditional physics-based models. The BDR is a Hadoop/MapReduce ‘landing zone*’ and analytics sandbox for all oil country data, both structured and semi-structured. ‘Polyglot’ data repositories include SQL, NoSQL, Hadoop and GraphDB. Use cases include DTS analytics and real time pipeline leak detection. More on the BDR in the IBM Redbook ‘Governing and Managing Big Data.’
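
For the leak detection use case, the generic fragment below shows one simple approach (a rolling z-score on line pressure); it is purely illustrative and not IBM’s implementation.

```python
# Generic anomaly check on a pipeline pressure trace (illustrative only).
import numpy as np

def rolling_zscore_alerts(pressure, window=60, threshold=4.0):
    """Return indices where pressure departs sharply from its recent baseline."""
    p = np.asarray(pressure, dtype=float)
    alerts = []
    for i in range(window, len(p)):
        baseline = p[i - window:i]
        sigma = baseline.std() or 1e-9
        if abs(p[i] - baseline.mean()) / sigma > threshold:
            alerts.append(i)
    return alerts

# Synthetic example: a step drop in line pressure triggers an alert.
trace = np.r_[np.random.default_rng(1).normal(70, 0.2, 300), np.full(50, 64.0)]
print(rolling_zscore_alerts(trace)[:3])
```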

Frans van den Berg reviewed Shell’s experience of smart fields and the collaborative work environment (CWE). Smart and standard processes are used in surveillance, production optimization, maintenance and emergency response. There is ‘one team, one plan and one set of KPIs.’ Exception-based surveillance plugs into guided workflows for follow-up and alerting to asset teams. Today the drive for a CWE comes from local assets who ‘want to work this way.’ The CWE makes decision making more efficient through dialog rather than email.

ExxonMobil’s Robert Aydelotte provided an update on Energistics’ ProdML extension for PVT and fluid characterization data exchange. Modeling has now migrated from Excel to UML. The spec can capture an audit trail of sample chain of custody, adding context and ownership information along the way. Also included are fluid analytical tests and sample descriptions. The spec is currently in the test phase and should be available in Prodml 2.0 later this year. Energistics’ common technical architecture was leveraged such that fluid/PVT data can be included in simulator-ready Resqml decks.
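
The dataclasses below are only an illustrative shape for the kind of chain-of-custody record such a spec carries; the class and field names are invented and do not reflect the actual ProdML 2.0 schema.

```python
# Hypothetical fluid sample record with a chain-of-custody audit trail.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class CustodyEvent:
    timestamp: datetime
    custodian: str            # lab, transport company, operator...
    action: str               # e.g. "sampled", "transferred", "received"
    remarks: str = ""

@dataclass
class FluidSample:
    sample_id: str
    well: str
    sample_kind: str          # e.g. "bottomhole", "separator gas"
    custody_trail: List[CustodyEvent] = field(default_factory=list)

s = FluidSample("S-001", "A-12", "bottomhole")
s.custody_trail.append(CustodyEvent(datetime(2015, 3, 2, 9, 0), "Wireline crew", "sampled"))
s.custody_trail.append(CustodyEvent(datetime(2015, 3, 4, 14, 0), "PVT lab", "received"))
```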

In the ‘Future vision’ session, Mark Little described GE’s FastWorks initiative that seeks to foster a start-up mentality inside the whale that is GE. GE uses around 300 3D printers in various manufacturing contexts. A ‘brilliant’ 21st century factory leverages full physics modeling, model-based manufacturing and no more paper drawings. ‘Everything is becoming connected’ (including incidentally a young lady in the row in front of us who was checking out some shoes on Yahoo shopping).

NASA’s Steven Fredrickson offered insights from space flight. We tend to overestimate what can be done in the short term and underestimate what can be done in ten years or so. NASA was an early adopter of digital representations, moving from early electronics to digital. He mentioned ‘human-in-the-loop’ simulation, Monte Carlo modeling and the ‘digital double’ approach where SysML modeling enables a ‘computable representation of everything.’ On-board intelligence deploys Watson-inspired systems for exploration anomaly resolution, a.k.a. ‘mission control in a box.’ NASA has migrated from paper, through Adobe PDF, to the international procedure viewer (IPV) XML documentation.

David Rossi spoke of Schlumberger’s ‘new’ early stage investment program (actually announced in 2012) in promising start-ups like Foro Energy, whose high powered lasers are used in well abandonment. Schlumberger is also working with Aramco on ‘electric completions’ in multi-lateral extreme contact wells and on a collaboration with Google to use its search in oil and gas. Rossi also revealed that Schlumberger now has a total of 26 petaflops of compute capacity in Houston, which would put it in the top five of the Top500 classification (if it entered). The Q&A revealed that nobody actually wanted to wear Google glasses, considered ‘a bit creepy.’ Rossi opined that ‘if you can write a flowchart for a job, it will disappear and be done by automation or robotics.’ Asked what job they would recommend to youngsters, the engineers recommended ‘engineering.’ Our cheeky question, ‘what would Watson say if he was on the panel,’ was elegantly fielded by IBM’s Curry Boyle who suggested, ‘stop watching those kitten videos!’

* A metaphor too far?


© Oil IT Journal - all rights reserved.