Jackie Banner presented the UK DTI’s PILOT Data Initiative. The objectives are to provide easy access to UK data and to relieve licensees of their ‘perpetual obligation’ to keep data on behalf of the government. Legacy systems can be a stumbling block for digital datasets: the BBC Micro-based ‘Digital Domesday Book’ project lasted only 15 years before its data became unreadable, while after more than 900 years its paper-based predecessor is still in use today! Numerous workgroups in PILOT and the UKOOA have looked at these issues and there is now one solution, a single data catalogue, the National Hydrocarbon Data Archive.
Benchmark
A benchmark study of archiving costs is underway on Kerr-McGee’s Hutton field. The operator is responsible for data and the catalogue. Paras is acting as a ‘confidential cost advocate’. Problems arise from inconsistent catalogue naming; it is proving very hard to aggregate the information. ‘There are as many E&P catalogue standards as there are operators’. The DTI is spearheading the search for an agreed set of catalogue attributes. Feedback and input from operators and service companies are now required.
Deal Data Registry
Robert Gatliffe (British Geological Survey) presented the new Deal Data Registry (DDR). DDR is a web-based GIS with definitions, accurate metadata and links to vendor data stores. The Sea Fish Authority database is linked to DDR so that when a trawler operates too close to a pipeline, a warning bell rings on the bridge. Data cleanup is the major challenge to this project.
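The presentation did not go into the mechanics of the pipeline warning, but the proximity test it implies is straightforward. A minimal sketch in Python, with made-up coordinates and a hypothetical 500 m exclusion radius, might look like this:

```python
import math

def distance_to_segment(p, a, b):
    """Distance (metres on a local flat grid) from point p to pipeline segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

ALARM_RADIUS_M = 500.0  # hypothetical exclusion distance
pipeline = [((0, 0), (4000, 1000)), ((4000, 1000), (9000, 1500))]  # segments, grid metres
trawler = (4200, 1300)  # vessel position on the same grid

if min(distance_to_segment(trawler, a, b) for a, b in pipeline) < ALARM_RADIUS_M:
    print("WARNING: vessel inside pipeline exclusion zone")
```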
Geological data management
Jenny Walsby (BGS) described the Survey’s goal of creating a ‘secure digital dataset’ of borehole and geological information. BGS’ million borehole paper records, a million maps, photographs and reports have been scanned. The result: 2 million monochrome images and 180,000 color maps—a 17 TB dataset. Data is compressed to JPEG 2000 and archived to LTO 2 tapes. Managing user expectations—of instant access and current metadata—was the hardest part. Users have to be educated as to what is ‘acceptable’ use—‘Digital data still needs managing’.
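The scan, compress and archive workflow Walsby described can be approximated with off-the-shelf tools. A minimal sketch (not the BGS pipeline) using the Pillow imaging library, assuming TIFF scans sitting in a local directory, might be:

```python
from pathlib import Path
from PIL import Image  # Pillow built with OpenJPEG support for JPEG 2000 output

SRC = Path("scans")        # hypothetical directory of TIFF scans
DST = Path("jp2_archive")  # staging area prior to writing to tape
DST.mkdir(exist_ok=True)

for tif in SRC.glob("*.tif"):
    img = Image.open(tif)
    out = DST / (tif.stem + ".jp2")
    # 'rates' quality mode targets a compression ratio (here roughly 20:1)
    img.save(out, "JPEG2000", quality_mode="rates", quality_layers=[20])
    print(f"{tif.name}: {tif.stat().st_size} -> {out.stat().st_size} bytes")
```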
GIS galore
Several speakers came to praise Geographical Information Systems (GIS). For most, GIS is synonymous with ESRI which has, according to Nick Blake (Lynx), a ‘90% share of the upstream market’. So the following presentations were presumably aimed at the remaining benighted 10%. Blake showed Lynx’s (very pretty) GIS-based geological map of Iran.

Gavin Critchley (IHS Energy) introduced the concept of spatial layers—cartographic representations of database features. Critchley pulled another GIS statistic out of the hat—‘90% (again) of oil and gas data has a spatial component’. Oils are ‘not as digital as they think,’ but the availability of map layer information is increasing and end user skills are growing. GIS, in the form of the ESRI desktop, is the silent revolution.

Karen Blohm (Robertson Research) was also pushing at the open door of ‘GIS is good for you’—with a plug for the Explorationist’s Workstation. The key to spatial integration is metadata on author, source, project, coordinate system etc. Robertson has developed ‘rule-based’ utilities for metadata population. Geoprocessing is used to convolve maps of source rock development, reservoir extent etc. for fairway analysis—illustrated with another pretty map of the Murzuq Basin, Libya (a sketch of this kind of grid overlay appears at the end of this section).

Steve Ashley (Venture) also believes that GIS has a lot to offer—but more focus is needed on implementation. Venture uses an ‘Information Maturity Matrix’ after D’Angelo and Troy. The ‘Matrix’ (starring Venture) provides metrics to help companies move from the ‘heroic effort’ state to ‘predictable risk’.

Nathan Balls showed how Petrosys’ DirectConnect (DC) makes more pretty maps from diverse data sources. Many interoperability ‘solutions’ for the upstream have not caught on (Balls cited POSC, PPDM, Geoshare, OpenSpirit—and Petrosys’ own dbMap) because they fail to address the issue of finding and using data across multiple Oracle instances. DC fixes this by retrieving GIS data across multiple databases—including SDE, dbMap, Finder and OpenWorks. Balls bravely claimed that the GIS war was not over—‘Petrosys has more experience of E&P data objects than ESRI’.
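Robertson’s geoprocessing of play-element maps is, at heart, a grid overlay. A toy sketch of the idea in Python with numpy (illustrative grids only, not Robertson’s workflow):

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (200, 300)  # toy map grid

# Hypothetical presence/absence grids for three play elements on a common grid
source_rock = rng.random(shape) > 0.4
reservoir   = rng.random(shape) > 0.5
seal        = rng.random(shape) > 0.6

# Fairway = cells where all elements coincide; the score ranks partial overlaps
fairway = source_rock & reservoir & seal
score = source_rock.astype(int) + reservoir.astype(int) + seal.astype(int)

print(f"{fairway.sum()} fairway cells out of {fairway.size}")
```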
Seismic processing
Adam Mitchell presented the results of Paras’ recently completed survey of the geophysical marketplace. Seismic data management, processing and interpretation are increasingly integrated. Bandwidth and media are ‘struggling to keep up’ with field data volumes. Existing software was not built to handle multiple attributes or depth domain data. As much as 50% of seismic processing is production related. All companies surveyed forecast that in-house processing would grow, believing it provides ‘a major differentiator’.

Eleanor Jack (Landmark) gave the inside track on seismic data management—which can be a risky business: tapes can get lost and be omitted from a job, and files may be only partially transcribed or corrupted in transcription or demultiplexing. A multitude of technical pitfalls await the transcriber—from incorrect header data to disappearing inter-record gaps. Jack showed one survey where a format code snafu scrambled the whole survey in transcription. Transcription programs are ‘very robust’—they keep going even with the wrong input parameters!
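The format code problem Jack described is easy to picture. SEG-Y stores the data sample format as a two-byte code in the binary file header, and a transcription job driven by the wrong code will misinterpret every sample. A minimal pre-flight check (standard SEG-Y Rev 1 byte positions, not Landmark’s software) might read:

```python
import struct

# SEG-Y Rev 1: 3200-byte textual header, then 400-byte binary header.
# The data sample format code sits at file bytes 3225-3226 (big-endian int16).
FORMAT_CODES = {1: "4-byte IBM float", 2: "4-byte integer", 3: "2-byte integer",
                5: "4-byte IEEE float", 8: "1-byte integer"}

def check_format_code(path, expected):
    with open(path, "rb") as f:
        f.seek(3224)
        (code,) = struct.unpack(">h", f.read(2))
    label = FORMAT_CODES.get(code, "unknown/non-standard")
    if code != expected:
        raise ValueError(f"{path}: header says format {code} ({label}), "
                         f"job parameters say {expected}")
    return code

# Example usage (hypothetical file name):
# check_format_code("line_001.sgy", expected=1)
```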
Records Management
Trudy Curtis, PPDM Association CEO, believes that records management is growing in importance. Information needs to be properly managed or else ‘you may go to jail!’ In Canada, information management systems can be audited under PIPEDA; in the UK, the Data Protection Act places similar constraints on what can be done with corporate data. A major problem is that today’s users are, in effect, untrained records managers. Taxonomies and metadata are the way to go. PPDM is working with Dublin Core and the Web Ontology Language (OWL). Jamie Burton (Instant Library) believes that companies need to develop retention strategies and shouldn’t keep records for longer than necessary. Regulatory compliance is a legal necessity – with the Data Protection Act, HSE legislation and legal ‘holds’ for pending court cases. Corporate policy has to be developed and enforced.
Standards
Barry Wells (Conwy Valley Systems) reported that the IUGS commission for systematics in petrography is working on sediments. The IUGS nomenclature is embedded in Conwy Valley’s Petrog product.

Glen Mansfield (Flare Consultants) reviewed the POSC E&P Catalogue Standard—EpiCat V0.5—which consists of a set of attributes and valid values for E&P objects. Catalogues are used in document management for control, retention, bibliographic studies etc. The key to successful integration is to present the catalogue in a meaningful way to the end user. Flare is working on an XML cataloguing standard—CAML. Another initiative, the POSC Business Process Model (ex Shell), will provide a ‘contextually valid value list’ for EpiCat.

POSC’s Paul Maton called for more work to homogenize the ‘huge number’ of curve types and complex curve and tool naming schemas. Work on WellHeaderML has been customized to the UK DEAL environment. Maton described the ongoing E&P data catalogue as ‘wide ranging—possibly too wide ranging’. A project to ‘align’ the catalogue with W3C recommendations on RDF and Dublin Core metadata standards is to kick off soon.
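To make the RDF and Dublin Core alignment concrete, here is a minimal sketch using the Python rdflib package. The attributes, object type and namespace are illustrative only, not the EpiCat schema:

```python
from rdflib import Graph, Literal, URIRef, Namespace
from rdflib.namespace import DC, RDF

EP = Namespace("http://example.org/epcat#")  # hypothetical namespace for E&P object types

g = Graph()
doc = URIRef("http://example.org/docs/well-21-5a-completion-report")
g.add((doc, RDF.type, EP.WellReport))
g.add((doc, DC.title, Literal("Well 21/5a Completion Report")))
g.add((doc, DC.creator, Literal("Example Operator Ltd")))
g.add((doc, DC.date, Literal("1987-06-30")))
g.add((doc, DC.coverage, Literal("UKCS Quadrant 21")))

print(g.serialize(format="turtle"))
```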
2D not dead!
Christine McKay (Landmark) believes that ‘2D data is not dead’. Shell still needs to display 2D navigation data for all of the UKCS, and to see its own data alongside other companies’ data and speculative surveys. A lack of 3D data in the 21st licensing round made old 2D lines from the 1980s vital—one 1973 survey was used, and even some paper sections. Issues arose with multiple versions of the same line, ‘unexpected’ line groupings and inconsistent names. Landmark was asked to ‘streamline’ the Shell dataset of 5,000 surveys and over 120,000 lines. The process was automated ‘as far as possible’ and QC plotting was reduced to a ‘minimal but acceptable risk level’.
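Reconciling inconsistent line names is the kind of chore that gets scripted. A toy sketch (made-up naming patterns, not Landmark’s process) of normalising variant spellings so duplicates can be grouped:

```python
import re
from collections import defaultdict

def normalise(name):
    """Strip punctuation, case and zero-padding differences so variant spellings compare equal."""
    s = name.upper().replace("_", "-").replace(" ", "-")
    s = re.sub(r"-+", "-", s).strip("-")
    # Collapse zero-padded trailing numbers: LINE-007 -> LINE-7
    return re.sub(r"-0+(\d)", r"-\1", s)

lines = ["SNS84-007", "sns84_7", "SNS84 007", "CNS91-102"]  # hypothetical line names
groups = defaultdict(list)
for name in lines:
    groups[normalise(name)].append(name)

for key, variants in groups.items():
    if len(variants) > 1:
        print(f"possible duplicates under {key}: {variants}")
```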
OASIS
Jill Lewis introduced Troika’s new Open Architecture Scalable Information Store (OASIS), a binary object store built on open systems—Linux with MySQL or PostgreSQL—offering seismic metadata management, positioning data and random access to trace data.
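Lewis gave no schema details, but the general pattern (relational metadata plus a byte-offset index into bulk trace files) can be sketched, purely illustratively, as follows:

```python
import sqlite3

# Toy stand-in for the relational side (OASIS itself targets MySQL/PostgreSQL)
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE trace_index (
    survey TEXT, line TEXT, trace_no INTEGER,
    file_path TEXT, byte_offset INTEGER, byte_length INTEGER)""")
con.execute("INSERT INTO trace_index VALUES "
            "('SNS84', 'SNS84-7', 101, 'sns84_7.dat', 240240, 4240)")

def read_trace(survey, line, trace_no):
    """Look up a trace in the index, then seek straight to it in the bulk file."""
    row = con.execute(
        "SELECT file_path, byte_offset, byte_length FROM trace_index "
        "WHERE survey=? AND line=? AND trace_no=?", (survey, line, trace_no)).fetchone()
    if row is None:
        raise KeyError((survey, line, trace_no))
    path, offset, length = row
    with open(path, "rb") as f:
        f.seek(offset)          # random access: no sequential tape-style read
        return f.read(length)
```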
Deviation
Two Johns, Harries and Kelly (Hydro Projects), demonstrated how failure to take account of the earth’s curvature, other errors and misapplied corrections could make for a 100 m difference in bottom hole location for an 11 km horizontal well.
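For scale, a back-of-the-envelope check (not a reproduction of the Hydro Projects analysis) of just one contributor, the departure of the spheroid from a flat tangent plane over the step-out, assuming a mean Earth radius of 6,371 km:

$$\Delta z \approx \frac{d^{2}}{2R} = \frac{(11\,000\ \mathrm{m})^{2}}{2 \times 6\,371\,000\ \mathrm{m}} \approx 9.5\ \mathrm{m}$$

Projection scale factor, grid convergence and any misapplied corrections then compound on top of this.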
Services, not products
Wendy Kitson (Kadme) reckons that services, not products, are the powerhouse of Information Management in the oil industry. Decentralization has created havoc for the data manager—with data everywhere and no checks and balances on use. Business consultancy and services are where the money will be made in the next few years.
eSearch
Ben Trewin reported that Iron Mountain has signed with Schlumberger for the merger of AssetDB and Iron Mountain’s eSearch. The product merge will complete in 2004. AssetDB and OpenRSO clients will be migrated ‘real soon now’.
Disaster!
Paul Duller (Instant Library) asked how quickly your company would be up and running after a disaster. A survey showed that a major IT disaster is invariably bad for a company’s health – and often proves fatal. Companies must plan for disasters, singling out ‘vital records’ as priority targets for the protection effort. Practical suggestions include moving records away from water pipes, keeping them dry with roofing that deflects water from shelving, installing smoke alarms and so on.
This article has been abstracted from an 11 page illustrated report produced as part of The Data Room’s Technology Watch report service. If you would like to evaluate this service, please contact tw@oilit.com.
© Oil IT Journal - all rights reserved.