Keynote speaker Don Moore described how Oxy is using oilfield automation (OFA) to meet production targets and to optimize corporate acquisitions. Oxy’s OFA scope spans well construction, operations surveillance and asset management, and leverages Case Services’ Life of Well Information Software (LOWIS), MRO Software’s Maximo asset management package and Oxy’s ERP system. Field data is served up as key performance indicators on the Oxy Dashboard, improving Oxy’s ability to react quickly and fix production problems as they happen.
Oilfield Automation Systems
Cleon Dunham (Oilfield Automation Systems) pointed out that oilfield automation isn’t new; it started in the 1960s. A good surveillance system goes beyond the presentation of information and focuses on enabling corrective action. Electric submersible pumps (ESP) require top-quality monitoring to avoid damage and capital loss. Dunham maintains a database of ESP and progressive cavity pump data, the ‘Reliability Information and Failure Tracking System.’ Others are invited to share such information on alrdc.com.
Glen Klimchuk described SAIC’s ‘next generation oilfield’ (NGO) program as spanning G&G, development, reservoir engineering, operations—right through to sales. NGO techniques are applicable to mature, marginal fields with large EOR programs and low per-well productivity. Field data, the data historian and applications are linked through an integration layer to the asset dashboard for reporting. SAIC’s NGO solutions leverage components from Cognos, Microstrategy, Business Objects and Hyperion. Real-time architectures can be based on .NET or J2EE web services. Shell is using the NGO program to ‘change its E&P business.’
For Bob Bacon, Pavilion Energy Services, the oilfield is a ‘factory’ to which the same automation techniques can be applied. Pavilion’s I-field Perfecter uses an ‘objective function’ to maximize field value without violating downhole pressure constraints. The model can be run in real time and adapted to sensor data and operator-set objectives. The company’s ‘mission-critical’ software includes over 100 integration drivers and a bespoke GUI, and is backed by over 100 patents.
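Pavilion gives no implementation detail, but the general idea—maximize an objective function for field value while honoring a minimum downhole pressure—can be sketched with a toy model. Everything below (the quadratic value function, the linear inflow relation, the parameter values and units) is hypothetical and purely illustrative, not Pavilion’s method:

```python
def field_value(rate):
    # Hypothetical revenue model ($/d): value grows with rate (bbl/d)
    # but with diminishing returns at higher offtake.
    return 60.0 * rate - 0.002 * rate ** 2

def bottomhole_pressure(rate, reservoir_p=3000.0, pi=2.0):
    # Toy linear inflow model: p_wf = p_res - rate / PI
    # (psi, with productivity index PI in bbl/d/psi).
    return reservoir_p - rate / pi

def best_rate(p_min=1500.0, max_rate=6000.0, step=10.0):
    # Coarse grid search for the rate that maximizes field value
    # without letting bottomhole pressure fall below p_min.
    best = (0.0, field_value(0.0))
    r = 0.0
    while r <= max_rate:
        if bottomhole_pressure(r) >= p_min and field_value(r) > best[1]:
            best = (r, field_value(r))
        r += step
    return best

rate, value = best_rate()
print(rate, value)  # the constraint binds at rate = 3000.0 bbl/d
```

The unconstrained optimum of this value function lies at 15,000 bbl/d, but the pressure constraint caps the rate at 3,000 bbl/d—illustrating why the pressure limit, not the economics alone, drives the answer. A production system would replace the grid search with a proper constrained solver and adapt the model to live sensor data.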
Pat McGinley described how the Baker Expert Advisory Centers/Operation Network (BEACON) evolved from early data centers in the North Sea. Conventional, in-house data centers are ‘great, but not practical.’ McGinley believes that investment in in-house data centers will be pulled back as companies look to more efficient solutions. The BEACON model is just that: a slimmed-down data center aimed less at offshore head count reduction and more at virtual team building. Baker’s Gulf of Mexico center monitors 15-20 plus wells at any given time through an extranet site with secure, entitlement-controlled client access. WITSML is a great enabler in all this and has broken down barriers between service companies that previously operated their own proprietary data systems.
Don Colley’s company, DGC Consulting, specializes in gas management and energy saving. Colley’s Energy Efficiency KPI is obtained by dividing the energy equivalent of production by the energy ‘cost’ of production. The ‘cost’ can be energy, or ‘eco-efficiency’ metrics like CO2 or sulfur emissions. KPI granularity ranges from pump and compressor energy efficiency up to high-level, wellhead-to-point-of-sale energy ‘intensity.’ So far 22 production facilities have been benchmarked. Colley commented that today, most plants are not even monitored.
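As a minimal illustration of Colley’s ratio (a sketch, not his actual tool), the KPI can be computed as produced energy divided by consumed energy, with both sides expressed in a common unit such as barrels of oil equivalent. The function name and the figures below are invented for the example:

```python
def energy_efficiency_kpi(production_boe, energy_cost_boe):
    """Energy-efficiency KPI as described in the talk: the energy
    equivalent of production divided by the energy 'cost' of producing
    it, both here in barrels of oil equivalent. Higher is better."""
    if energy_cost_boe <= 0:
        raise ValueError("energy cost must be positive")
    return production_boe / energy_cost_boe

# Hypothetical facility: 10,000 boe/d produced for 400 boe/d consumed.
kpi = energy_efficiency_kpi(10_000, 400)
print(kpi)  # 25.0
```

Swapping the denominator for CO2 or sulfur emissions per unit of production would give the ‘eco-efficiency’ variants Colley mentions, at whatever granularity (single pump, compressor or whole facility) the metering supports.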
Jaleel Valappil described the Bechtel/ConocoPhillips joint Global LNG Collaboration. A ‘virtual’ LNG plant has been developed in AspenTech’s HYSYS and DeltaV Simulate Pro, enabling testing and optimization without touching the real plant. Process simulation is conducted in Matlab. Valappil advocates ‘open software standards’—specifically ActiveX/COM, CAPE-OPEN and OPC. These enable different simulation tools to cooperate and provide a ‘platform-independent’ modeling framework.
This article was taken from a 9 page report in The Data Room’s Technology Watch series. More from email@example.com.
© Oil IT Journal - all rights reserved.