According to Marathon’s Steve Hinchman, current demand forecasts and declining production point to a 125 million bbl/day shortfall in oil supply by 2020. The problem is not insurmountable: the reserves are there, but most are in the hands of National Oil Companies (NOCs). Historically, NOCs required investment from the International Oil Companies (IOCs), but now they have both capital and technology. IOCs need to adapt, to ‘differentiate and partner with NOCs’. Value-add relationships are critical and need to be ‘more open and transparent’.
Today’s work practices mandate a ‘single source of the truth’ and there is new focus on IT-enabled knowledge sharing. Despite this, petrotechnical and IT professionals are ‘like ships passing in the night.’ To date IT has ‘over promised and under delivered’. But potentially, the competitive advantage goes beyond the digital oilfield of the future (DOFF) to leverage combined talents, workflows and industry standards. Connectivity is the connective tissue that pulls it all together. In the Q&A, Hinchman admitted that for Marathon, ‘standards’ are more likely to mean ‘internal standardization on vendor applications’ than POSC or PPDM.
Hinchman doubted that the data standards problem can be solved across the industry. Another questioner asked if the new base oil price was affecting IT investment. Hinchman opined that ‘Investment levels target sustainable growth and are independent of the oil price. Our focus is on our workflows. IT investment will come later.’ Another questioner asked what Marathon had learned from talking to other industries about managing the ‘data deluge’. ‘So far we have talked to an army of consultants and heard a lot of consultantese. We’ve not yet seen the benchmarks we need.’
Michael Lock (Google) believes that adding metadata tags to unstructured information is a waste of time. Google’s experience with one global energy customer was that content was distributed across geoscience, HR, seismic etc. Archival involved ‘cumbersome’ manual tagging. There was no provision for cross silo search and on retrieval, relevance was ‘spotty’. Following implementation of Google’s Enterprise Search, ‘searches are up and complaints down.’
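Lock’s argument — that cross-silo full-text search beats manual metadata tagging — rests on the classic inverted index. The minimal sketch below (document names and contents are invented for illustration, not taken from the article) shows how indexing raw text makes every silo searchable with no tagging step at all:

```python
from collections import defaultdict

def build_index(documents):
    """Build a simple inverted index: term -> set of document ids."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every query term."""
    hits = [index.get(term, set()) for term in query.lower().split()]
    return set.intersection(*hits) if hits else set()

# Hypothetical content scattered across departmental 'silos'
docs = {
    "hr/policy.txt": "expatriate assignment policy for drilling staff",
    "geoscience/gom.txt": "salt dome interpretation notes gulf of mexico",
    "seismic/survey.txt": "3d seismic survey acquisition gulf of mexico",
}
index = build_index(docs)
print(sorted(search(index, "gulf of mexico")))
```

A real engine adds ranking, stemming and crawling, but the core point stands: relevance comes from the content itself, not from hand-applied tags.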
Marc Sofia described Baker Hughes’ use of commercial off-the-shelf software in a prototype enterprise data integration solution that federates well data across multiple legacy data stores. A ‘search broker’ captures metadata and allows for ‘requirements-driven data synchronization’.
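The article gives no detail of Baker Hughes’ implementation, but the ‘search broker’ pattern can be sketched as a thin layer that queries each legacy store in turn and merges the records for a given well, keyed on its unique well identifier (UWI). Store names, fields and values below are all invented:

```python
# Hypothetical legacy stores, each holding a different slice of well data
WELL_MASTER = [{"uwi": "42-501-20098", "name": "Cactus 1", "td_ft": 14500}]
LOG_STORE = [{"uwi": "42-501-20098", "curves": ["GR", "RHOB"]}]

class SearchBroker:
    """Federates well records across legacy stores without moving the data."""

    def __init__(self, *stores):
        self.stores = stores

    def find(self, uwi):
        """Gather every record for a well (keyed by UWI) across all stores."""
        merged = {}
        for store in self.stores:
            for record in store:
                if record["uwi"] == uwi:
                    merged.update(record)
        return merged

broker = SearchBroker(WELL_MASTER, LOG_STORE)
print(broker.find("42-501-20098"))
```

The design choice is federation rather than migration: the broker holds only metadata and fetches on demand, which is what makes ‘requirements-driven’ synchronization possible.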
Jim Crompton, Chevron, suggests that we have created ‘yet another silo boundary’ between the model builders and operations that ‘deal with reality’. This is reflected in the tools the different communities use, which range from the sophisticated (for the modeler) to the spreadsheets of the operator. Engineering is dominated by Microsoft Office technology with project documentation in PowerPoint, collaboration via Outlook and ‘faxes still work, as a proxy for lack of connectivity’. All of which leads to unstructured data issues as email grows and data hides on ‘O:\ drives’, leading to multiple versions of documents. It’s not that ‘IT didn’t do it right,’ rather, ‘IT didn’t do it at all!’ Many digital technologies, such as SCADA, don’t even belong to IT. Another bleak truth is that ‘data management is worse than you think, it’s amazing we do business at all.’
Data, data, data
The data issue cropped up again in Don Paul’s (Chevron) presentation. ‘People are maxed out on data and we’ve only just started!’ And for Devon Energy CIO Jerome Baudoin, data management is ‘more and more of a problem in our environment, a complex, ill defined activity.’ Despite the best efforts of PPDM, PIDX and POSC, ‘we go over and over again spinning our wheels!’ As an independent oil company, ‘we want to spend energy on what is critical to our organization’. A balance needs to be struck between data overload and data ‘righteous’ (sic). Don Paul noted that the NOCs’ increasing sophistication is eroding the majors’ traditional role of bringing technology to the game. For Alan Huffman (Fusion), the 21st century ‘will be the century of the NOC. These companies are aggressively hiring US engineers and IT people today.’ Katya Casey (BHP Billiton) offered an impassioned plea for data management, ‘Companies don’t put enough value on data. Nobody adds metadata. The discipline grew out of secretarial and drafting departments, there has been no education, training or support for data management.’
Casey presented BHP Billiton’s Technical Information Architecture, a ‘common information platform to solve business problems’. Core application selection is to be ‘workflow-driven’ and delivered on a single technical hardware platform. Components include OpenWorks, Foster Findlay & Associates, Paradigm and Petrel. Seabed and OpenSpirit also ran. BHPB’s Portal, the Petroleum Professional Web Workspace, is under construction. BHPB is looking at ProSource with Seabed as a taxonomy-driven metadata repository. A major workflow analysis is underway prior to consolidation into a ‘true’ 3D environment circa 2008. BHPB is ‘coming to terms with a multi-dimensional integration process.’
Tom Evans detailed Marathon’s in-house seismic processing effort on the Cactus prospect. Kirchhoff and Wave Equation Migration were both used, running on Marathon’s own 1,000-CPU cluster. Re-running processing tests even one year later showed significant image enhancements due to code improvements. ‘As geophysicists, we shouldn’t be worrying about the hardware but the reality is that this is a highly compute-intensive, interactive process.’ Stochastic simulation is also used, with around 100 runs per evaluation, performed on a 256-node SGI box. The engineers liked it so much that ‘they kicked the geoscientists off it and made them buy another one’.
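The article does not say what Marathon’s ~100 stochastic runs compute, but a typical use is Monte Carlo volumetric reserve estimation: sample the uncertain reservoir parameters, compute recoverable oil for each realization, and read off the P90/P50/P10 percentiles. The ranges and parameters below are illustrative assumptions, not Marathon’s figures:

```python
import random

def stochastic_reserves(runs=100, seed=42):
    """Monte Carlo volumetrics: one recoverable-oil estimate per run."""
    random.seed(seed)
    results = []
    for _ in range(runs):
        area_acres = random.uniform(800, 1200)    # areal extent
        net_pay_ft = random.uniform(50, 150)      # net pay thickness
        porosity = random.uniform(0.15, 0.25)
        oil_sat = random.uniform(0.6, 0.8)
        recovery = random.uniform(0.2, 0.4)       # recovery factor
        bo = 1.2                                  # formation volume factor
        # 7758 bbl per acre-ft converts rock volume to barrels in place
        stoiip = 7758 * area_acres * net_pay_ft * porosity * oil_sat / bo
        results.append(stoiip * recovery)
    results.sort()
    return {"P90": results[runs // 10],
            "P50": results[runs // 2],
            "P10": results[runs - runs // 10 - 1]}

print(stochastic_reserves())
```

Each run is independent, which is why the workload parallelizes so naturally across a large cluster.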
John Nieto (Anadarko) contrasted the ‘linear’ approach to problem solving with ‘shared earth’ modeling (SEM). This integrates static, in-place reserves with dynamic, fluid flow modeling. Log-derived facies populate the geological model. If the history match doesn’t work, ‘there are many things you can alter’. The SEM is driven by software like GoCad, Roxar and Petrel. These tools let interpreters see seismic, logs and facies maps all together in one environment.
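The history-matching loop Nieto alludes to — perturb model parameters until simulated production reproduces observed production — can be reduced to a toy sketch. The ‘simulator’, the permeability multiplier and the decline formula below are all invented stand-ins for a real flow simulator:

```python
def simulate(perm_multiplier, timesteps=10):
    """Toy flow 'simulator': decline rate is controlled by permeability."""
    rate, history = 1000.0, []
    for _ in range(timesteps):
        history.append(rate)
        rate *= (0.85 + 0.1 * perm_multiplier)  # faster decline at low perm
    return history

def mismatch(simulated, observed):
    """Sum-of-squares error between simulated and observed rates."""
    return sum((s - o) ** 2 for s, o in zip(simulated, observed))

observed = simulate(0.7)  # pretend the field data came from perm = 0.7
# Sweep the uncertain parameter and keep the best match
best = min((mismatch(simulate(m / 100), observed), m / 100)
           for m in range(1, 101))
print(f"best permeability multiplier: {best[1]:.2f}")
```

Real history matching alters many coupled properties at once, which is exactly why, as Nieto notes, ‘there are many things you can alter’ — the match is non-unique.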
Linda Dodge (Shell) stated that poor data management was at the root of the Piper Alpha disaster and also caused the loss of one of Shell’s oilfields, which was killed by water injection. Dodge traced the move away from the well-maintained, central datastores of the 1980s, with their ‘costly bureaucracy’. In 1995, Shell decentralized and allowed for local customization. But this caused ‘data loss and confusion’. Shell now has a data management community with virtual teams, data quality metrics, global standards and a ‘higher profile for data management’. Shell is involved with POSC and sees standards like the Global UWI project and PRODML as non-competitive.
Oil and gas is a poor performer compared with other verticals, according to Microsoft’s Mike Brulé. Disparate systems lead to ‘cumbersome’ navigation in the data environment. Brulé suggests that the DAMA International organization offers ideas as to how enterprise business intelligence (BI) could help. Our complex, ‘highly engineered’ industry lags others in data management and enterprise BI analytics. Some propose service-oriented architecture (SOA) as the new silver bullet. But for Brulé, SOA ‘is orthogonal, it says nothing about data management, quality and system performance.’ Some argue that our data is different from other industries’. Brulé disagrees, ‘Data use is axiomatic across all industries’. But Brulé did acknowledge that oil and gas has to cope with an application ‘tug of war’ between Maximo, DIMS, SAP etc. ‘It’s real problematic’.
This article has been taken from a 10 page illustrated report produced as part of The Data Room’s Technology Watch Research Service. For more information on this service please email firstname.lastname@example.org.
© Oil IT Journal - all rights reserved.