IQPC Digital Oilfields Summit 2013, London

Kuwait Oil Company—‘There is no such thing as a small digital oilfield project.’ McLaren Software on why you should not pick the ‘low-hanging fruit.’ A brief discussion on structured vs. unstructured data. BP on the role of the ‘cyber librarian.’ Troika on new media for the million channel seismic survey. Talengi on weak out-of-the-box EDMS solutions. Datawatch on its ad-hoc ‘tree map’ data visualization offering.

The second IQPC Digital Oilfields Summit held in London late last year provided a decent combination of oil and gas companies’ experiences and vendor innovation. Hatem Nasr provided an in-depth update on Kuwait Oil Company’s digital oilfield program, KwIDF. In the 2000s, KOC was preoccupied with real time data. As the last decade drew to a close, real time data was available but the question was what to do with it. It had become ‘more of a problem than solution.’ What had been underestimated was the effort required to mine, clean and make sense of all this data. Today companies have the systems and technology to integrate and process data—and are achieving good results. Even so, it remains a struggle to assess the true value of digital, which impacts investment plans.

Many companies are not doing it right—there is too much focus on piecemeal deployment or on a single technology like smart completions. Which, for Nasr, are not the digital oilfield. No more are smart downhole sensors, analytics, data mining or ESP optimization. All these are just pieces of the puzzle. The heart and soul of the digital oilfield is integration and collaboration across all of the above. The true digital oilfield delivers demonstrable value. Are you getting more oil and/or gas?

KOC has initiated multiple large scale DO projects, some covering whole fields and representing investments of tens or hundreds of millions of dollars. These have been carried out with help from different suppliers and at several locations to see what works where and identify best of breed solutions. The key is not to ‘just sit there and watch the service companies.’ Even the big ones like Schlumberger, Halliburton and Emerson do not have all the knowledge and skills to really understand the DO problem. It is the operator that has the requisite knowledge of its fields—hence the need for a true partnership.

Change management is crucial to the DO project which represents an upheaval in work processes. This is an ‘ongoing challenge.’ There is no such thing as a small DO project. It is not just a glorified Scada deployment! The outcome may be uncertain. Production may not ‘rise by 10%’ and mistakes are inevitable. KOC has four projects underway. These can be considered as pilots but are actually very large—Sabriyah covers 49 wells, Burgan GC1 60, Jurassic 30 and West Kuwait 90. The aim is for an environment that encourages collaboration along with improvements to the corporate knowledge base. The idea is to ‘make it smarter.’ One monster problem that has to be overcome is data management. A complete workflow includes real time data, models, ‘lots of AI’—statistics, intelligent agents, numerical simulation and forecasts. In the field there have been major infrastructure changes, with WiMAX communications and a new IT infrastructure. This has enabled continuous well testing. The Jurassic KwIDF project is entirely wireless. Wireless communications are now both a commodity and a game changer—you can just ‘pop a sim card on a well.’ KOC is getting value from its digital effort but Nasr believes that ‘the greatest value has yet to come.’ Projects start but they don’t end—this is a continuous improvement process. Technology helps but the DO is ‘mostly about change management.’ In the end, you ‘go digital or perish!’
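Nasr’s description of a complete workflow, combining continuously acquired real time data with models and forecasts, lends itself to a small illustration. The following is a minimal, hypothetical sketch of one element of such a workflow: comparing continuously measured well rates against a model forecast and flagging deviations for engineering review. The rates, well and the 5% tolerance are invented; this is not KOC’s or any vendor’s implementation.

# Illustrative sketch only: compare continuously acquired well rates with a
# model forecast and flag deviations for review. All figures are hypothetical.
def flag_deviations(observed, forecast, tolerance=0.05):
    """Return indices where the observed rate differs from the forecast
    by more than the given fractional tolerance."""
    return [i for i, (o, f) in enumerate(zip(observed, forecast))
            if abs(o - f) > tolerance * f]

# Hypothetical hourly oil rates (bbl/d) from a continuously tested well
observed = [1210, 1198, 1205, 1190, 1187, 1020, 1185, 1192]
forecast = [1200, 1200, 1195, 1195, 1190, 1190, 1185, 1185]
print(flag_deviations(observed, forecast))  # [5] - the 1020 bbl/d reading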

The discussion on structured vs. unstructured data highlighted the difficulty of making a clear distinction between data, metadata and ‘unstructured’ data. Metadata is a dataset in its own right and one repository’s metadata is another’s data. Nasr asked, ‘How do you know how much a well produces?’ You may have a production test, a test at the gathering station or a reading from a $300k multi-phase meter, all giving very different results. Which is correct? The answer is, ‘it depends.’

Getting back to the unstructured data issues, there was a plea for a solution that would bundle Petrel project files and Adobe PDFs into a ‘standard format for interpretation results.’ One possibility would be RESQML or perhaps the embryonic efforts of the SEG’s EarthIQ Forum.

Wendy Valot described how BP divvies up its knowledge management into project information management (PIM) and KM proper. PIM manages data, files and documents. KM manages know-how, experience and learning in the context of continuous improvement—or, as Abe Lincoln observed, ‘it is a poor man who is not wiser today than yesterday.’

Valot started in BP’s drilling unit in 2004 where there was a strong KM culture. Her role now is to understand what has been achieved in drilling and to expand it to thousands of other users across BP. For drillers this involves an after action review following a well. But in other sectors, there may be a time lag of many years as projects complete and individuals move on. Folks will still use the old best practice in the interim and may fail to leverage the most up to date knowledge. It is therefore crucial to determine the value of a piece of knowledge and to position it in a quadrant defined by term of use and value. The best items are long term, high value. This is allowing BP to deploy a systematic approach, tied to best practices and making knowledge accessible, reliable and shared.

To achieve this the company has defined KM roles and implemented training to make KM ‘systemic and repeatable.’ Previously, knowledge was captured in long reports, books and notes, and there was a reliance on search. This led to ‘big data overload.’ The new combination of support roles and technology is helping to establish connections and consistency and to ‘avoid long reports that nobody reads.’ Roles are being built into the information flow—injecting action items, profiles, alerts and distribution lists. Previously, well-intentioned people stored lots of information in long reports. This is evolving into more succinct information items sent out to distribution lists along with alerts. BP’s ‘cyber librarians’ canvass users to see what products are relevant to them. Cyber librarians classify documents according to domain-specific taxonomies.

The idea is to create ‘purposeful social networks’ starting with project teams and working outwards. Valot’s team makes sure that knowledge gained in, say, the North Sea is immediately available to workers in Australia. BP’s knowledge management effort has now spread out to global wells and operations. Deployment is now in the works for downstream and refining.

Tim Fleet (McLaren Software) is an advocate of business process management. While generic BPM tools are fine for simple workflows, supply chain integration and complex engineering documents need more specialist tools. Fleet recommends that when launching a pilot project, ‘don’t pick the low hanging fruit.’ This is because such projects, while easy to implement, are unlikely to be representative of real world BPM.

Jill Lewis (Troika) continues with her crusade to educate industry on the problems it is facing with exploding seismic trace counts and their solution—standards-based data management. Her poster child use case is Apache’s ‘refresh’ of BP’s legacy seismic data over the UK North Sea Forties field which contributed to an 800 million barrel reserves hike. Apache is now implementing life of field (LoF) seabed seismics to further aid its production effort. It is an exciting time technology-wise for seismics, with the latest 3592 barium ferrite tapes holding 4 terabytes each. Metadata is stored on a microchip on the cartridge and the new robot holds 2.5 exabytes. On the field recording front, Sercel has just announced a million channel system. All of which is mandating a new focus on data management. While SEG-D 3.0 is ‘an absolute necessity,’ there is still no standard for tape on disk. Without proper organization, read times for a 600GB tape ‘go through the roof.’ Future LoF data volumes will be huge compared to today’s. ‘Get a handle on it now.’ Troika, whose data loading package Magma is now embedded in Landmark’s SeisSpace, is training IT folks on seismics. Both IT staff and technical contract writers need to understand these issues.
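By way of illustration of the ‘tape on disk’ organization problem, the sketch below builds a byte-offset index for a SEG-Y file so that individual traces can be read by direct seek rather than by scanning the whole file. It is a minimal sketch under simplifying assumptions (SEG-Y rev 1, fixed trace length, no extended headers); real archive and field formats such as SEG-D are more involved, and this is not Troika’s implementation.

import struct

# Bytes per sample for common SEG-Y data format codes
SAMPLE_SIZE = {1: 4, 2: 4, 3: 2, 5: 4, 8: 1}

def build_trace_index(path):
    """Return (list of trace byte offsets, trace length in bytes) for a
    SEG-Y file with fixed-length traces."""
    with open(path, 'rb') as f:
        f.seek(3200)                                         # skip the textual header
        binhdr = f.read(400)                                 # 400-byte binary header
        nsamples = struct.unpack('>H', binhdr[20:22])[0]     # samples per trace
        fmt = struct.unpack('>H', binhdr[24:26])[0]          # data format code
        trace_len = 240 + nsamples * SAMPLE_SIZE[fmt]        # trace header + samples
        f.seek(0, 2)                                         # file size
        ntraces = (f.tell() - 3600) // trace_len
    return [3600 + i * trace_len for i in range(ntraces)], trace_len

def read_trace(path, offsets, trace_len, i):
    """Read one raw trace (240-byte header plus samples) by direct seek."""
    with open(path, 'rb') as f:
        f.seek(offsets[i])
        return f.read(trace_len)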

The topic of document control on oil and gas projects was addressed by Matthieu Lamy (Talengi) who has been working with GDF Suez. Thousands of engineering documents undergo multiple reviews across the workflow, from design and purchasing on to construction. The whole process today focuses on engineering deliverables rather than on coordination and administrative documents such as deviations, change orders and even meeting minutes and reports. The master deliverable register is the key for all stakeholders and, for a medium size project, might include 10,000 documents. Stakeholders need to know that they are working on the latest version of a document. It can be hard to keep track, and failure to do so may introduce financial or information security risks. In the event of a problem it may be a legal requirement to demonstrate who did what, when and where.

Document control (and the controller) should be at the center of upstream IM. One solution is, naturellement, Talengi EDMS and the ISO 30300 standard which ‘may drive the document controller to a more global IM role.’ One problem is that the EDMS is no iPhone; it may not fulfil immediate expectations. Out-of-the-box solutions are weak and need a lot of work to customize. This is itself a ‘difficult and dangerous business’ due to vendor lock-in, infrastructure requirements and costs. Other tools are also expected—FTP, Excel and email, along with technologies like HTML5, CSS3 and JavaScript. Lamy floated the idea of an initiative to build a ‘better EDMS’ more suited to the diverse requirements of the upstream. In the Q&A, Lamy was asked if the document focus was not a bit anachronistic in today’s 3D world. He responded that documents were still required to define the 3D model. Another comment was that the use of web based document repositories like ShareCat might help solve the versioning problem.
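For a flavour of the ‘latest version’ problem on the master deliverable register, the following minimal sketch records successive revisions per document number and lets a stakeholder check whether the copy in hand is current. Document numbers, revisions and dates are invented; a real EDMS would add workflow states, transmittals and audit trails.

from datetime import date
from collections import defaultdict

register = defaultdict(list)  # document number -> [(revision, issue date, status), ...]

def record_revision(doc_no, rev, issued, status):
    register[doc_no].append((rev, issued, status))

def latest_revision(doc_no):
    return max(register[doc_no], key=lambda r: r[1])  # newest issue date wins

# Hypothetical project document with two successive revisions
record_revision('PRJ-PRO-0001', 'A', date(2013, 3, 1), 'Issued for review')
record_revision('PRJ-PRO-0001', 'B', date(2013, 6, 12), 'Issued for construction')

held = 'A'
current = latest_revision('PRJ-PRO-0001')[0]
if held != current:
    print(f'Revision {held} is superseded; latest is {current}')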

Martin Black wants oils to consider using the same technology as capital markets to ‘get decisions right.’ His company, Datawatch, which acquired Panopticon in 2012, competes with Tableau and Spotfire on analytics of historical and real time data. The technology accesses multiple data sources—Excel, logfiles, feeds and databases—and offers a configurable view across them all in what is described as a ‘tree map.’ These displays were previously referred to as heat maps (2012/03/26). Investigators can drill down through the map and investigate data as scatter plots etc. Black was asked how his software managed to connect with so many different sources. He answered that most data sources today expose an API such as JSON, XML or OData. If a source is really unusual a bespoke solution can be built but in reality, ‘Even if software is said to be radically different, it usually isn’t.’ For vendors who refuse direct data access, ‘Just give us a data file. We have years of experience handling this kind of behavior.’ More from IQPC.
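As an illustration of the approach Black described, the sketch below takes records as they might arrive from a JSON or OData feed and aggregates them into the field-to-well hierarchy behind a tree map, with one measure driving tile size and another driving tile colour. Rendering and the actual data connections are omitted; field and well names and the figures are invented, and this is not Datawatch’s code.

from collections import defaultdict

# Records as they might be returned by a hypothetical JSON/OData production feed
records = [
    {'field': 'Alpha', 'well': 'A-01', 'oil_bpd': 1200, 'target_bpd': 1300},
    {'field': 'Alpha', 'well': 'A-02', 'oil_bpd': 900,  'target_bpd': 850},
    {'field': 'Beta',  'well': 'B-07', 'oil_bpd': 2100, 'target_bpd': 2000},
]

tree = defaultdict(dict)
for r in records:
    size = r['oil_bpd']                                           # drives tile area
    colour = (r['oil_bpd'] - r['target_bpd']) / r['target_bpd']   # drives tile colour
    tree[r['field']][r['well']] = (size, colour)

for field, wells in tree.items():       # drill down: field level, then well level
    print(field, sum(s for s, _ in wells.values()))
    for well, (size, colour) in wells.items():
        print(' ', well, size, f'{colour:+.1%}')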

© Oil IT Journal - all rights reserved.