PESGB Data Management Seminar

The Petroleum Exploration Society of Great Britain’s Data Management Seminar was held in London last month at Burlington House on Piccadilly, the stately home of the Geological Society. This excellent location allows for impromptu viewing of “The Map,” William Smith’s 1815 geological map of England and Wales. Highlights of the meeting were Tim Haynes’ (Ceradon) paper on environmental data management (reported here from his handout, as he was unable to present), Paul Duller’s (Instant Library) heads-up on a new ISO standard for document management and Dave Gunn’s (Southampton Oceanography Centre) review of emerging standards in ‘discovery-level’ GIS metadata. Also worthy of note is ‘E&P Online’, the corporate portal from the new GrandBasin/Accenture joint venture.

Nick Blake (Lynx) explained “Why GIS is good for data managers - you!” Geographical Information Systems (GIS) are set to change the poor image that conventional data management has suffered in the past. GIS-enabled data management offers an immediate and visible return on investment.

Explorationist’s dream world

Noting that Landmark (with Open Explorer) and IHS Energy (with Probe) both deploy ESRI’s ArcView, Lynx has followed suit and is building the explorationist’s ‘dream world’, offering GIS access to the corporate database from the desktop. Lynx software adds SEG Y and other vertical viewers to ArcView, and Adobe Acrobat to view bitmaps, which are compressed with MrSID software. Geological cross sections can be constructed with ArcView scripts, and 3D Analyst offers perspective viewing. Why ESRI? ESRI has 1 million licenses, the ESRI Petroleum User Group is ‘very active’ and ESRI is in the top ten software companies for R&D expenditure. Lynx produces data packages, sold in GIS format for integration with corporate data. In the future, ArcIMS will allow data to be held in SDO/SDE on a server, with web access from a browser.
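
To make the ‘GIS front end over a corporate database’ idea concrete, here is a minimal sketch of the kind of spatial query such a desktop might issue - return the well headers that fall inside the current map extent. The schema and well records are invented for illustration, and a plain bounding-box filter stands in for the spatial indexing that SDO/SDE would provide; this is not ESRI’s API.

```python
# Hypothetical corporate well-header table queried by map extent.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE well_header (uwi TEXT, name TEXT, lon REAL, lat REAL)")
conn.executemany(
    "INSERT INTO well_header VALUES (?, ?, ?, ?)",
    [("21/10-1", "Example-1", 0.97, 57.72),
     ("15/30-2", "Example-2", 1.10, 58.05)],
)

def wells_in_extent(min_lon, min_lat, max_lon, max_lat):
    """Return well headers inside a map extent (simple bounding-box filter)."""
    return conn.execute(
        "SELECT uwi, name, lon, lat FROM well_header "
        "WHERE lon BETWEEN ? AND ? AND lat BETWEEN ? AND ?",
        (min_lon, max_lon, min_lat, max_lat),
    ).fetchall()

print(wells_in_extent(0.5, 57.0, 1.5, 58.0))
```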

Green data

Tim Haynes (Ceradon) was unable to present his paper; we report from his handout. Haynes stresses the growing importance of the environment in oil and gas. Corporations now report a ‘triple bottom line’ based on economic, social and environmental sustainability. Haynes’ thesis is that these new constraints mean that “timely and reliable access to the ‘greener’ part of the data spectrum is now a necessity.”

Waste management

Environmental measures and drivers include emissions to air, discharges to water, waste management, contaminated estate, hazardous substances and nuisance. Companies must manage liabilities such as failure to apply for licenses and permits, bad public relations, emissions trading penalties and compliance audits. Green data management should monitor targets against achievements, and measure environmental impact across multiple business streams. The usual data management issues apply to green data - QC, data preservation and meaningful presentation. Particular green constraints include legal and audit implications and the need for continuous monitoring. Green data is moving from a secondary data source to a mainstream, auditable and publicly available record, so watch out for the legal consequences of poor data quality and management. Inadequate data access, quality and audit can have legal, financial and public relations implications in the broader world. More from www.ceradon.com.
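
As an illustration of monitoring ‘targets against achievements’ across business streams, here is a minimal sketch; the stream names, measures and figures are all invented.

```python
# Invented environmental targets and actuals, keyed by business stream.
targets = {"refining": {"co2_kt": 120.0}, "upstream": {"co2_kt": 300.0}}
actuals = {"refining": {"co2_kt": 131.5}, "upstream": {"co2_kt": 287.2}}

for stream, limits in targets.items():
    for measure, target in limits.items():
        actual = actuals[stream][measure]
        status = "OVER TARGET" if actual > target else "within target"
        print(f"{stream}: {measure} actual {actual} vs target {target} -> {status}")
```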

E&P Online

The recently announced joint venture between Landmark and Accenture is bearing fruit in a new upstream portal “E&P Online.” E&P Online is ‘not a product’ – yet, but sets out to showcase technology from SAP and Plumtree to provide a configurable user interface to corporate and other data sources. The E&P Online home page is a dashboard-like screen, configurable to display news feeds, alerts, key performance indicators, contacts and email linked to tasks and a calendar.

Workflow tools

The intent is to develop workflow tools to guide users through prospect evaluation. The specific challenges of working with high-volume upstream data are addressed by incorporating proven PetroBank technology. Heavy-duty UNIX-based applications can be run over low-bandwidth links (ISDN), using hosting software (Citrix MetaFrame). GrandBasin, Halliburton/Landmark’s e-business unit, intends to leverage the global Halliburton network to support the portal. You can check out the portal on www.eandponline.com.

Data Migration

For Common Data Access’ Peter McCartney, data migration is a human activity that uses technology - not vice versa! McCartney recommends using the ‘best data model available’ and populating it correctly. Provisional standards recognized by CDA include the DTI Well Header, the POSC Data Model, the EPSG coordinate reference system and SEG trace formats. CDA now supplies a backdrop of coastlines and other North Sea data in ‘Open Source’ format. Pressed on which was ‘the best’ data model, McCartney opined that both the POSC and PPDM models were ‘perfectly good’. Both have had ‘lots of energy put into them.’ The real problem is in getting data from one to the other; for McCartney this is ‘rather like the Mac-PC debate.’
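
McCartney’s point that the real problem is moving data between models can be illustrated with a minimal sketch of a field-mapping migration. The field names below are invented for illustration; they are not the actual POSC or PPDM schemas.

```python
# Invented mapping from one data model's well-header fields to another's.
SOURCE_TO_TARGET = {
    "well_identifier": "uwi",
    "well_name": "well_name",
    "surface_longitude": "surface_lon",
    "surface_latitude": "surface_lat",
}

def migrate_header(source_record):
    """Map a source-model record to the target model, flagging unmapped fields."""
    target, unmapped = {}, []
    for field, value in source_record.items():
        if field in SOURCE_TO_TARGET:
            target[SOURCE_TO_TARGET[field]] = value
        else:
            unmapped.append(field)  # needs a human decision, not a script
    return target, unmapped

print(migrate_header({"well_identifier": "21/10-1", "well_name": "Example-1",
                      "spud_date": "1970-09-21"}))
```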

Atlas Online

Steve Kentish, from Baker Atlas’ e-business unit Atlas Online, wants to eliminate paper and tapes from data delivery by replacing them with https (the ‘s’ is for secure) connections. These will facilitate offsite backups and controlled, distributed project areas. It is now possible to run a log in Angola and transmit the data by satellite to Aberdeen for processing, and on to a client in Milan. Better still, a two-way link allows rig-site access to the database, so operators can view offset well data. POSC curve types and units are used; data is archived in Recall in its original format, with entitlements set to allow partner access. This web-enabled archival and retrieval system will support Application Service Provision (ASP) real soon now! Check the system out on www.totalrecall.cc.
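
A minimal sketch of what delivery over a secure https connection might look like from the client side; the URL, token scheme and file name are hypothetical, and Atlas Online’s actual interface may differ.

```python
# Download a log curve over HTTPS, authenticating with a bearer token.
import urllib.request

def fetch_curve(url, token, destination):
    """Retrieve a curve file from a secure endpoint and save it locally."""
    request = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(request) as response, open(destination, "wb") as out:
        out.write(response.read())

# Hypothetical usage - endpoint and token are placeholders:
# fetch_curve("https://example.com/wells/21-10-1/curves/GR", "SECRET", "gr.dlis")
```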

Java GeoScope

Wally Jacubowicz (Hampton Data Services) unveiled the new Java/ArcView interface to Hampton’s GeoScope. The new technology allows GIS themes (data sources) on different servers to be merged. User interface enhancements allow file icons to be dragged and dropped around the tree view to re-organize data on the fly - note that the files themselves do not move. For Jacubowicz, the key to the system is its use of unique document identifiers. GeoScope is said to be highly configurable and is being extended to Health, Safety and Environment (HSE) reporting.
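
A minimal sketch of why unique document identifiers matter: a document’s logical position in the tree view can change while the stored file stays where it is. The structures and paths below are invented, not GeoScope’s.

```python
# Registry keyed by document ID: logical tree node is separate from storage.
documents = {
    "doc-0001": {"file": "/archive/vol3/well_report.pdf",
                 "node": "/UKCS/21-10/reports"},
}

def move_document(doc_id, new_node):
    """Re-parent a document in the tree view; the underlying file stays put."""
    documents[doc_id]["node"] = new_node

move_document("doc-0001", "/UKCS/Example_Field/reports")
print(documents["doc-0001"])  # file path unchanged, logical location updated
```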

ISO 15489

Paul Duller (Instant Library) offered a heads-up on an important new standard coming from the International Organization for Standardization (ISO). The ISO 15489 international standard for information and document management was launched at the American Records Management Association (ARMA) conference in Canada this year. Based on an Australian standard used by BHP, the ISO program covers the design and implementation of record-keeping systems. Duller is as yet unsure what the take-up will be, but the early Australian adopters believe it will have significant impact. More from www.iso.ch.

Metadata standards

Still on the theme of standards, Dave Gunn (Southampton Oceanography Centre) provided an overview of current developments in ‘discovery-level’ geospatial metadata standards. Gunn has been working on the Eurocore project (www.eu-seased.net), a collection of 22,000 sea floor samples, and realized that metadata standards were needed to allow data consumers to ‘discover’ geographical datasets without necessarily accessing the bulk data. Similar needs for ‘discovery-level’ metadata have been noted in fields such as hydrography, telecommunications, environment, resource management and marine geology. There has been some EU work on GIS metadata standards, but this ended up ‘arguing about terminology.’ Other projects include the Federal Geographic Data Committee (FGDC) spatial metadata standard and the Dublin Core metadata standard.
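
As an example of what a ‘discovery-level’ record looks like, here is a minimal sketch using Dublin Core element names (title, creator, subject, coverage and so on); the sample values and URL are invented.

```python
# Invented discovery-level record using Dublin Core element names.
record = {
    "dc:title": "Sea floor sample stations, survey XYZ",
    "dc:creator": "Example Oceanographic Institute",
    "dc:subject": "marine geology; sea floor samples",
    "dc:description": "Station positions and sample types; bulk data held separately.",
    "dc:coverage": "North Sea, 56N-61N, 2W-4E",
    "dc:date": "2001-06-30",
    "dc:identifier": "https://example.org/datasets/xyz",
}

# A catalogue of such records lets users 'discover' datasets by theme and
# geographic coverage before requesting the bulk data itself.
matches = [r for r in [record] if "North Sea" in r["dc:coverage"]]
print(matches[0]["dc:title"])
```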

Internet Geoscience Data Index

Geremy Giles of the British Geological Survey believes that there is no gain without (data) pain! Using a subsurface model from the mining industry, Giles stressed that serious use of spatial data in a business requires knowledge of the degree of confidence in the data and observations making up the model. BGS’ work builds on data acquired over four centuries. This can prove something of a liability - the BGS was recently sued over a 19th-century mapping error involving a single data point!

Geological Lexicon

To reduce ambiguity in geological mapping and modeling, BGS has built a lexicon of formation names used to enforce a standard nomenclature. The BGS has put its data management rationale to the test in the Internet Geoscience Data Index (IGDI) on www.bgs.ac.uk/geoindex/. Here the original flight lines or data points can be seen along with the full data sets. Data management and ‘discovery-level’ metadata are the keys to proper support of modeling. For more on the BGS approach to metadata management see www.bgs.ac.uk/discoverymetadata/home.html.

Data management in practice

Steve Kentish had a second crack of the whip with a paper on data management in practice. Key issues are: who is responsible for data management, who is responsible for the system, and are these functions core business for the organization? Kentish observes that upstream interpretation is frequently done on incomplete data. Data management should aim to use all available data, and to free up the time geoscientists spend chasing lost or missing data and cleaning existing data to a consistent state. Evaluation of such wasted time can provide a powerful business case and cost driver for data management projects. Where are we heading today? Kentish notes, “More data will be acquired in the next 5 years than in the last 50!”
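
A back-of-the-envelope sketch of the ‘wasted time’ business case; all figures are invented for illustration.

```python
# Invented staffing and cost figures to size the data-chasing overhead.
geoscientists = 40
hours_per_week_chasing_data = 4     # time spent hunting or cleaning data
fully_loaded_hourly_cost = 120.0    # currency units per hour
weeks_per_year = 46

annual_cost = (geoscientists * hours_per_week_chasing_data
               * fully_loaded_hourly_cost * weeks_per_year)
print(f"Estimated annual cost of data chasing: {annual_cost:,.0f}")
# -> 883,200: the kind of figure that can underwrite a data management project.
```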

33 million curve meters

For Baker Atlas, such data cleanup is exemplified by a recent BP contract involving the QC/QA of 2,500 wells, 52,000 curves and 33 million curve meters. QC was ‘built in’ to the work process - you cannot load a curve unless it is ‘clean’. Lessons learned from this project: do not start too quickly, do not finalize procedures too soon, and start small. Recall templates ensure consistency and provide an auditable QC trail and standard procedures.
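
A minimal sketch of QC ‘built in’ to the load process - a curve that fails basic checks is rejected before it reaches the archive. The checks, thresholds and field names are illustrative, not Recall’s.

```python
# Validate-before-load: only curves that pass basic checks are archived.
def qc_checks(curve):
    problems = []
    depths = curve["depths"]
    if any(b <= a for a, b in zip(depths, depths[1:])):
        problems.append("depths not strictly increasing")
    if any(v is None for v in curve["values"]):
        problems.append("null samples present")
    if curve["units"] not in {"API", "ohm.m", "us/ft", "g/cm3"}:
        problems.append(f"unrecognized units: {curve['units']}")
    return problems

def load_curve(curve, database):
    problems = qc_checks(curve)
    if problems:
        raise ValueError("curve rejected: " + "; ".join(problems))
    database.append(curve)  # only clean curves reach the archive

db = []
load_curve({"depths": [1000.0, 1000.5, 1001.0],
            "values": [75.2, 80.1, 78.9], "units": "API"}, db)
print(len(db), "curve(s) loaded")
```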

Traceability

Thorough operator training is required, and record sheets ensure traceability of changes through a checklist. These are built around an Access database (with web access for operators) that tracks project progress. Baker mirrors the client’s software system onto a local project network, independent of the Atlas group network. Recall now handles VSP logs, cores and photomicrographs. Similar projects have involved database cleanup and core analysis databasing. Kentish notes that “there are no standards for core analysis data.”
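
A minimal sketch of the kind of traceability record such a system might keep - who changed what, when and why. The field names are illustrative, not those of the Baker Atlas project.

```python
# Append-only audit trail of curve edits; structure is invented.
from datetime import datetime, timezone

audit_trail = []

def record_change(well, curve, action, operator, reason):
    audit_trail.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "well": well, "curve": curve,
        "action": action, "operator": operator, "reason": reason,
    })

record_change("21/10-1", "GR", "despiked interval 2100-2105 m", "j.smith",
              "tool sticking noted on field print")
print(audit_trail[-1])
```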


© Oil IT Journal - all rights reserved.