After the bizarre warm-up session (see last month’s editorial), the SPE/Reed Expo Intelligent Energy event regained its composure with a plenary session on ‘preparing to meet the grand challenge.’ Nabeel Al-Afaleg sees the I-Field as an answer to the graying workforce. Saudi Aramco is bridging the experience gap with bespoke education delivered from its upstream development center, where immersive technology is used to run virtual build-up tests. Elsewhere Aramco is automating and integrating business processes, leveraging AI, expert systems, real-time data and remote control. ‘Nano robots,’ gigacell simulation and instrumented wells also got a mention.
Satish Pai observed that the industry is getting younger and smarter with intelligent technology, centralized operations, better knowledge management and remote operations. Schlumberger isn’t scared of the big crew change, which will largely be mitigated by technology; the regulatory/environmental scene is a much bigger headache. Pai noted that today both operators and service companies have their own remote operating centers, and coordinating workflows across them is a challenge. The digital oilfield has been a success but has not yet brought fundamental change. We still have a way to go and ‘most on the rig are not ready for the technology that is coming.’
Gerald Schotman enumerated Shell’s intelligent technology: a million-channel seismic system under development with HP, a low-frequency fiber-optic system (with PGS) and ‘flying nodes,’ hundreds or thousands of sensors in an ‘adaptive mobile grid.’ Shell’s GeoSigns (a single Shell platform for processing and interpretation) and the Bridge (exception-based surveillance) both got a plug, as did ‘autonomous drilling,’ which ‘provides consistency and reduces dangerous human intervention.’
Ellen Williams (BP) observed that while the digital revolution has transformed society, ‘oil and gas has lagged behind.’ This is despite BP’s ten-year-old Field of the Future program, which has brought ‘demonstrable benefits on a business scale.’ Current point solutions must now be grouped at the systems level, and choices must be made as to the degree of automation: either actuation with a ‘man in the loop,’ or systems that override the operator. We are faced with ‘a daunting at-scale infrastructure transformation.’
Kjell Pedersen (Petoro) thinks we may have turned the corner in integrated operations, at least for green fields. Mature fields will take more time; drilling and prepping wells take longer and uncertainty is perceived as high. IT advances are used to manage risk and justify investment.
In the debate, Afaleg observed that you can’t let inexperienced folk loose on a reservoir. We need a sandbox for training, perhaps ‘an open source platform to share across industry.’ On standards, Schotman observed that industry could move faster by leveraging packages across suppliers. A questioner raised the issue of safety and the digital oilfield. Williams believes that digital will make safety part of the culture. Digital lets more people have better access to information. Pedersen thinks intelligent operations is really about culture and safety and that automation will move people away from dangerous offshore platforms. But how do you persuade management to pay for sensors and databases when their focus is on production? As a recent digital convert, Pedersen observed that knowledge is the key and digital—data, communications—means that we can ‘turn around quickly and tell people what to do.’ Cost is an issue, and we can’t continue with current huge day rates for sophisticated equipment.
Keith Killian described ExxonMobil’s upstream digital infrastructure, largely inspired by the company’s downstream business, which has been using digital technology for over a decade to ‘squeeze out every bit of margin.’ One early example of downstream-to-upstream technology transfer is Esso Australia’s Longford plant where, with help from the downstream, several multivariable constraint controllers have improved recovery and optimized plant capacity. But their use in the upstream remains relatively rare. Exxon’s digital technology in asset management (DTAM) program is currently addressing surveillance by exception and predictive monitoring of rotating equipment health.
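Surveillance by exception of this kind boils down to comparing incoming readings against expected operating bands and only surfacing the outliers to engineers. Here is a minimal, purely illustrative sketch; it is not Exxon’s DTAM code and the tag names and limits are invented for the example.

```python
# Illustrative exception-based surveillance sketch (not ExxonMobil's DTAM code).
# Tag names, limits and readings are invented for the example.
from dataclasses import dataclass

@dataclass
class OperatingLimit:
    tag: str      # sensor/tag identifier
    low: float    # lower bound of the normal operating band
    high: float   # upper bound of the normal operating band

def exceptions(readings: dict[str, float], limits: list[OperatingLimit]) -> list[str]:
    """Return alerts only for tags outside their normal band (or missing)."""
    alerts = []
    for lim in limits:
        value = readings.get(lim.tag)
        if value is None:
            alerts.append(f"{lim.tag}: no reading (possible instrument failure)")
        elif not (lim.low <= value <= lim.high):
            alerts.append(f"{lim.tag}: {value} outside [{lim.low}, {lim.high}]")
    return alerts

if __name__ == "__main__":
    limits = [OperatingLimit("compressor_vibration_mm_s", 0.0, 4.5),
              OperatingLimit("pump_discharge_bar", 10.0, 25.0)]
    print(exceptions({"compressor_vibration_mm_s": 6.1, "pump_discharge_bar": 18.2}, limits))
```

Engineers then see only the vibration alert rather than every healthy reading, which is the point of managing by exception.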
Shell’s Ron Cramer offered a slightly different take on the upstream/downstream divide. Cramer observed that process automation systems (PAS) have evolved over time with little attention to oil and gas needs. PAS solutions were designed for refineries and chemical plants are ‘fragmented’ and a ‘force fit’ to the upstream requiring lots of system integration. This is in part because the upstream ‘never really told vendors what its requirements were!’ Which is just what Cramer is attempting to do now. A PAS for the upstream needs to handle the ever increasing data volumes—Shell is to drill 20,000 wells by the end of decade. We need to ensure data is ‘owned,’ otherwise instrumentation is not maintained. Data needs to be grouped into objects such as a ‘well’ along with context and intelligence. The upstream also needs WAN/LAN integration—unlike a refinery which is all on a LAN. Autonomous systems such as beam pump need to be integrated with intelligence at the central facility. Cramer came up with more specs than you could shake a stick at.
BP’s Steve Roberts described some digital oilfield challenges. This is a broad, complex and cross functional activity that impacts many technologies and workflows. Prioritizing activities and making solutions sustainable can be hard. BP is on a path to add 100 million barrels of oil equivalent through digital technology by 2017. The company has already delivered a (remarkably precise) 73 million bbl/d, thanks to its proprietary technology. BP is engaged in building ‘deep capability’ for faster decision making, provide early warning of risks and suggest mitigation and optimization strategies. BP is also planning to make the client ‘irrelevant’ and provide engineers with access to information everywhere. Roberts elaborated on the use of mobile devices in the Q&A observing that most control room operators don’t want another screen. What is needed is more information in a smaller footprint. BP is working on security issues to facilitate access to information from devices such as the iPad, while ‘future proofing’ the technology.
One of BP’s digital flagships is its work with dynamic modeling to control slugging. The P22 well on BP’s West of Shetlands Foinaven field could not initially be flowed because it upset the whole gathering network. Fortunately, BP has seen a ‘revolution’ in dynamic modeling that is now coming into its own thanks to data visualization and analysis. Patrick Calvert described how BP now color codes data to distinguish between stable operations and slugging to pinpoint the safe operating range. Modeling can be used to select an optimum rate as an ‘advisory’ for operators. But true slug control takes this further to automate actuation of production control valves. One 23 km tieback experienced 3,000 bbl/d of deferred production due to high back pressure. This ‘very complex system’ was modeled in SPT’s Olga and Matlab. Dynamic modeling initiatives led to 17.5 million bbl/d of annualized incremental production in 2011.
Total’s Mohamed Haouche described a ‘smart meter’ pilot on joint venture with Qatar Petroleum. Modeling technology from Belsim was used to perform online data validation and reconciliation (DVR) a.k.a. an ‘advanced virtual flow meter.’ DVR, technology transferred from the downstream, is used for production optimization, energy management, condition-based metering and production allocation. The idea is to measure a range of accessible parameters and compare them with models of the field to infer an unknown quantity such as flow in an unmetered well. In this study, the virtual meter provided better consistency than a multi phase flow meter. This is likely due to the fact that MPFMs are also partly software based and present the same kind of problems.
Giulio Gola of Norway’s Institute for Energy Technology has applied artificial intelligence to managed pressure drilling, taking a model-based approach to bottom hole pressure (BHP) estimation. Models can be physical (first principles) or empirical (data driven); the latter can capture unknown processes at work. Gola used a combination of both to analyze a 450,000-sample dataset from four days of North Sea drilling. Four models (a flow model from Sintef, an ensemble Kalman filter, a virtual sensor and a support vector machine) were combined; the median of their estimates was a good approximation to measured BHP. AI appears to work and improves predictions.
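The combination step, taking the median of the individual estimators’ outputs, is simple to sketch. The code below is a generic illustration with dummy stand-in models, not IFE’s implementation; the observation names and formulas are assumptions.

```python
# Generic sketch of combining several bottom hole pressure estimators by taking
# the median of their outputs (illustrative; dummy models, not IFE's implementation).
import statistics
from typing import Callable

# Stand-ins for e.g. a physical flow model, an EnKF estimate, a virtual sensor and
# an SVM regressor. Each maps observations to an estimated BHP in bar.
def flow_model(obs):     return 0.0981 * obs["tvd_m"] * obs["mud_sg"]   # hydrostatic-only stand-in
def enkf_model(obs):     return flow_model(obs) + 4.0
def virtual_sensor(obs): return flow_model(obs) - 2.5
def svm_model(obs):      return flow_model(obs) + 40.0                  # deliberately off

MODELS: list[Callable[[dict], float]] = [flow_model, enkf_model, virtual_sensor, svm_model]

def combined_bhp(obs: dict) -> float:
    """Median of the individual model estimates."""
    return statistics.median(m(obs) for m in MODELS)

obs = {"tvd_m": 3000.0, "mud_sg": 1.4}
print(round(combined_bhp(obs), 1))
```

The median is preferred over the mean here because it remains robust when any single model drifts badly, as the deliberately-off stand-in shows.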
Attending Intelligent Energy is a bit like drinking from the fire hose for us at Oil IT Journal—there will be more in next month’s issue when we’ll report on Exxon’s intelligent agents, Chevron’s McElroy i-field and more artificial intelligence applications.
© Oil IT Journal - all rights reserved.