ESRI European Petroleum User Group, London

ESRI’s ‘mobile first’ approach. Location is the ultimate foreign key. ArcGIS Data Store spatio-temporal repository for big data/AI analytics. Hadoop/NetCDF trials. Apple Macs as de facto endpoint. BP OneMap poster child. Woodside’s cognitive GIS IDs ‘bike incidents.’ CGG NPA’s authoritative satellite backgrounder. IOGP on North Sea geodetic engineering fail and the Helmert transformation. Wintershall’s tricky permissions management. Shell’s myMap for seismic planning. Zolnai on Harvey, GIS and social media. Voyager’s cognitive search. More from ESA, IHS, Statoil, OGA, AkerBP.

Opening the proceedings at the 2017 ESRI EU Petroleum User Group, ESRI’s Danny Spillman advocated building a digital twin in ArcGIS. ArcGIS fits the digital twin paradigm well as a natural system of record for spatial data and more. ESRI is encouraging developers to build a ‘system of engagement’ to manage legacy data in its original format.

Spillman sees five GIS trends: 1) ‘consumerization’ and a mandatory ‘mobile first’ strategy for developers; 2) integration with other systems, with location as the ultimate ‘foreign key’ connecting real-time systems, ERP and more; 3) the cloud, ‘3 of the 5 largest oils will have a cloud-first GIS strategy in 2018,’ leveraging ArcGIS; 4) open standards, which for Esri mean REST, JSON and OData sources (see the sketch below); 5) AI, VR/AR, big data and data science. Esri’s system of record is built around the ArcGIS Data Store, a spatio-temporal big data store for internet of things and big data geoanalytics. This requires its own scalable infrastructure and is ‘highly indexed and fast; you will need to rethink your system of record.’
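To make the open standards point concrete, here is a minimal sketch of what REST/JSON access to an ArcGIS feature service looks like from Python. The service URL, layer and field names are hypothetical stand-ins for illustration, not an actual Esri-hosted service:

```python
import requests

# Hypothetical ArcGIS feature service endpoint -- substitute a real layer URL.
URL = "https://services.example.com/arcgis/rest/services/Wells/FeatureServer/0/query"

params = {
    "where": "STATUS = 'PRODUCING'",   # attribute filter (hypothetical field)
    "outFields": "WELL_ID,OPERATOR",   # attributes to return
    "geometryType": "esriGeometryEnvelope",
    "geometry": "-1.5,57.0,2.5,61.0",  # bounding box: xmin,ymin,xmax,ymax
    "inSR": "4326",                    # coordinates given in WGS84 lat/lon
    "f": "json",                       # JSON response, per the ArcGIS REST spec
}

features = requests.get(URL, params=params).json().get("features", [])
for f in features:
    print(f["attributes"]["WELL_ID"], f["attributes"]["OPERATOR"])
```

The same query parameters work from any client that speaks HTTP, which is the point of the ‘location as foreign key’ argument.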

On the client side, ‘app users are now map makers.’ Web AppBuilder for ArcGIS can create data viewers and social media-style apps ‘without spending days on custom code.’ Such map/apps are launchable from within Excel. They may exist only in a user’s personal project space or be shared across the organization.

Real-time tracking enhances situational awareness of oil and gas operations. Marathon Oil used the ArcGIS GeoEvent Server to track visits to remote well pads. It discovered that its contractor was visiting only 62% of the agreed locations, skipping weekends and overbilling.
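The underlying check is simple to sketch: match tracked GPS fixes against the agreed pad locations within a distance tolerance and compute the visited fraction. A minimal illustration; the coordinates and the 200 m tolerance are invented, not Marathon’s actual parameters:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def visit_compliance(pads, fixes, tolerance_m=200.0):
    """Fraction of agreed pads with at least one GPS fix within tolerance."""
    visited = sum(
        any(haversine_m(plat, plon, flat, flon) <= tolerance_m for flat, flon in fixes)
        for plat, plon in pads
    )
    return visited / len(pads)

pads = [(31.95, -102.10), (31.97, -102.15), (32.01, -102.08)]   # agreed well pads
fixes = [(31.9501, -102.1002), (31.9702, -102.1497)]            # tracked visits
print(f"compliance: {visit_compliance(pads, fixes):.0%}")        # 2 of 3 pads -> 67%
```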

More sensors are bringing ‘big data’ opportunities. Esri is researching Hadoop and artificial intelligence to bring new insights to existing data. A proof of concept involved a million production data records in a NetCDF space-time cube. ArcGIS Pro running on Amazon was used to wrangle the data, edit field names and view it on a base map. A time slider allowed investigation of production history across the field, showing where the production was coming from. The demo had ArcGIS Pro running on a Mac connected to the Amazon workspace. ‘Deployment patterns are changing,’ a fact borne out by the array of Macs used for the demos.
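For readers curious about the plumbing, a NetCDF space-time cube can be sliced in a few lines. A minimal sketch, assuming a hypothetical cube file and variable name (xarray used here for brevity; the demo itself ran in ArcGIS Pro):

```python
import xarray as xr

# Open a NetCDF space-time cube; file and variable names are hypothetical.
ds = xr.open_dataset("production_cube.nc")

# Slice one month -- the equivalent of one position of the time slider.
jan_2017 = ds["OIL_RATE"].sel(time="2017-01")

# Aggregate over time to map where cumulative production comes from.
cumulative = ds["OIL_RATE"].sum(dim="time")   # one value per spatial bin
print(cumulative)
```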

Brian Boulmay presented BP’s OneMap corporate GIS system, ESRI’s oil and gas poster child. OneMap is the largest application in BP’s portfolio and, as a single source of truth, is a key component of BP’s ‘digital transformation.’ OneMap began in the subsurface but is slated to extend across the business. Boulmay insisted that OneMap is a platform, not a point solution. It has applications in pipeline integrity management, oil spill preparedness and as a component of BP’s common operating picture. BP currently operates both in the cloud and with local deployments for Luanda and other harsh environments.

Boulmay observed that ‘GIS has no user manual.’ Deployment and use can differ widely, so BP is working on standardized roll-out and management. The company has over 100 GIS specialists and 9,000-plus users. OneMap innovates with published services, linking to Petrel, Spotfire and PowerBI. Boulmay believed that OneMap has ‘taken GIS out of the closet.’ OneMap provided support to BP’s team working during the Harvey flood. A custom mobile app showing the location of flooding and other key resources was delivered within 4 hours of the request. In the Q&A, Boulmay was quizzed as to the wisdom of letting everyone publish. He answered that ‘wide open is the norm,’ and this has produced benefits, particularly in countering the prevalence of Google Earth skunkworks projects that predated OneMap.

Matthew Griggs is a GIS analyst ‘embedded’ with Woodside’s data science team. His mandate is to integrate geospatial with cognitive computing à la IBM Watson. Woodside uses Watson to ingest and access unstructured information in documents, PDFs and reports. Cognitive computing is the key to ‘unlocking the knowledge.’ Text analytics leverages rules developed by Woodside’s subject matter experts. Cognitive computing is seeing ‘feverish deployment’ at Woodside, with funding from all business units. Spatial’s first intersection with cognitive came when an early drilling events project showed a requirement for spatial search. This sparked a proof of concept for GIS integration. The result is a webmap GUI that identifies documents containing terms like ‘kick,’ ‘leak off test’ and ‘influx’ and plots them within a polygonal area of interest. This was successfully rolled out to the business. Next, HSE came along with 30 years of HSE information in multiple databases. These too have been spatialized, and have pinpointed ‘bike incidents’ around the front gate. A cognitive subsea tool was likewise developed and integrated with ArcGIS and SAP for work orders.
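The spatial filter at the heart of such a webmap GUI is straightforward to sketch: flag documents that mention a term of interest, then keep only those whose geotag falls inside the area-of-interest polygon. A toy illustration with invented documents and coordinates, not Woodside’s actual Watson-based pipeline:

```python
from shapely.geometry import Point, Polygon

DRILLING_TERMS = {"kick", "leak off test", "influx"}  # terms cited in the talk

# Hypothetical document index: snippets already geotagged with well locations.
docs = [
    {"title": "DDR 2014-03-12", "text": "Minor influx observed while tripping.",
     "lon": 115.45, "lat": -20.31},
    {"title": "EOWR well A-3",  "text": "Leak off test at the 13 3/8in shoe.",
     "lon": 116.10, "lat": -19.90},
]

# A polygonal area of interest drawn on the map (illustrative coordinates).
aoi = Polygon([(115.0, -20.6), (115.8, -20.6), (115.8, -20.0), (115.0, -20.0)])

# Keep documents that mention a term of interest AND fall inside the AOI.
hits = [
    d for d in docs
    if any(t in d["text"].lower() for t in DRILLING_TERMS)
    and aoi.contains(Point(d["lon"], d["lat"]))
]
for d in hits:
    print(d["title"])   # -> DDR 2014-03-12
```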

Richard Burren from CGG’s NPA Satellite Mapping unit provided an authoritative backgrounder on satellite technology as used in earth resource mapping. The satellite scene is hotting up with the ‘smallsat’ revolution. Planet’s Dove satellites will provide daily earth imagery at 3-4m resolution. Much of ESA’s Sentinel radar and optical coverage is open access. All onshore areas are now acquired every 12 days with radar. Soon we will have intra-day imagery. Challenges remain in cost and usage terms, data quality, weather and view geometry (most satellites look straight down!). Keeping up with the different offerings is hard. The art is in matching your needs to what is available.

Craig Allinson from the IOGP Geodesy Subcommittee told an interesting tale of an engineering fail on a North Sea facility. The operator tried to install a 60 meter-long prefabricated bridge between two offshore platforms. But once on location, the bridge did not fit! Seemingly the engineers had used 1994 WGS84 reference data to get the location of one platform and a 2013 WGS84 survey for the other. What went wrong? We have a mental image of the earth as static, with national and regional reference systems anchored to it, such that coordinates do not change with time. Unfortunately, the global WGS84 CRS actually moves over time due to plate tectonics, of the order of 3 cm/year in North America and Europe and more elsewhere. Coordinates of a point on the earth are dynamic. Measurements of ‘static’ trig points need coordinates, rates of plate motion and a reference epoch. Enter the ITRF-based coordinate reference system. Allinson offered various approaches to managing geospatial data with time-dependent transforms. The differences between frames of reference can be significant. ITRF and WGS84 are 75cm apart in Europe. In Australia there is now a 1.5m shift between GDA94 and the ITRF. This is rumoured to have caused a traffic accident when a driverless car worked on a different CRS from its road map!
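The arithmetic behind such mismatches is easy to illustrate: a coordinate observed at one epoch must be propagated to the target epoch using the site’s plate velocity, otherwise a spurious shift appears. A minimal sketch with illustrative numbers (the velocity and position are invented, but the ~2 cm/year magnitude matches the figures quoted above):

```python
def propagate(position_m, velocity_m_per_yr, epoch_obs, epoch_target):
    """Propagate an earth-centred XYZ coordinate from one epoch to another."""
    dt = epoch_target - epoch_obs   # elapsed time in decimal years
    return tuple(p + v * dt for p, v in zip(position_m, velocity_m_per_yr))

# Platform surveyed in 1994.0; where is that point in the 2013.0 frame?
xyz_1994 = (3_500_000.0, 300_000.0, 5_300_000.0)   # metres, illustrative
vel = (0.015, 0.012, 0.009)                        # ~2 cm/yr plate motion, illustrative

xyz_2013 = propagate(xyz_1994, vel, 1994.0, 2013.0)
shift = sum((a - b) ** 2 for a, b in zip(xyz_2013, xyz_1994)) ** 0.5
print(f"apparent shift over 19 years: {shift:.2f} m")   # ~0.4 m
```

Nineteen years of unaccounted plate motion is comfortably enough to stop a prefabricated bridge from fitting.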

Rigorous mapping between reference systems requires the Helmert transform, which involves seven parameters, each with a rate. Different reference epochs make matters even more complicated. Allinson concluded that the issue is complex and confusing. Many data sets have inadequate metadata for their epoch. Time-dependent transformations are only available in specialist software. The IOGP has new Guidance Notes in preparation and is adding dynamics to the EPSG dataset. As the bridge builders discovered, WGS84 is dynamic. The first platform’s coordinates should have been corrected or resurveyed to account for plate motion. Software developers should add time-dependent transform methods, add velocity grids and allow for coordinate epoch as a dataset attribute. Be aware that WGS84 is approximate, and the use of ETRF is ‘increasingly unacceptable: for sub-meter accuracy, stop using it.’ In the Q&A it emerged that for Esri, time-dependent transforms have yet to be embedded in the software!
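For reference, the time-dependent Helmert transform evaluates each of the seven parameters (three translations, three rotations, a scale factor) at the coordinate epoch using its value at a reference epoch plus its rate. A compact sketch in the linearized, position-vector convention; actual parameter values and sign conventions would come from the EPSG dataset:

```python
import numpy as np

def helmert_td(xyz, t, params, rates, t0):
    """Time-dependent 7-parameter Helmert (position-vector convention, small angles).

    xyz    : earth-centred XYZ coordinate, metres
    t, t0  : coordinate epoch and parameter reference epoch, decimal years
    params : (tx, ty, tz, rx, ry, rz, s) at t0 -- metres, radians, unitless scale
    rates  : per-year rates of the same seven parameters
    """
    # Evaluate each parameter at the coordinate epoch: p(t) = p(t0) + dp/dt * (t - t0)
    p = np.asarray(params, dtype=float) + np.asarray(rates, dtype=float) * (t - t0)
    tx, ty, tz, rx, ry, rz, s = p

    R = np.array([[0.0, -rz,  ry],     # small-angle (skew-symmetric) rotation
                  [ rz, 0.0, -rx],
                  [-ry,  rx, 0.0]])
    xyz = np.asarray(xyz, dtype=float)

    # Linearized Helmert: X' = X + T + s*X + R*X
    return xyz + np.array([tx, ty, tz]) + s * xyz + R @ xyz
```

With fourteen numbers and two epochs in play, it is easy to see why inadequate epoch metadata is such a trap.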

Karina Schmidt (Wintershall) showed how difficult some apparently straightforward tasks can be. Wintershall uses Schlumberger’s GeoX for prospect and field assessment. GeoX runs on an Oracle server with Citrix/PC clients for some 250 users. Getting spatial parameters from Esri into GeoX was hampered by the fact that both GeoX and ArcGIS have proprietary access permission management. Moreover, users’ roles often change. With help from Conterra, a permissions manager was built to interface with the two systems and open up access according to an independent policy database. This has allowed for fine-grained access control beyond what is possible with ArcGIS alone. The work has now spawned the GeoX SPAR (spatial portfolio analytics and reporting) project. In parallel, Schlumberger has launched a JIP to spatialize GeoX.
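The pattern, stripped of vendor detail, is a single policy store consulted on behalf of both systems. A toy sketch of the idea; the roles, project and layer names are invented for illustration and bear no relation to Wintershall’s actual policy database:

```python
# One independent policy table drives permissions in both systems.
POLICY = {
    # role:            (GeoX projects,                ArcGIS layers)
    "explorationist": ({"NOR-Prospects"},             {"prospects", "leads"}),
    "portfolio":      ({"NOR-Prospects", "DE-Fields"}, {"prospects", "fields"}),
}

def allowed(role, system, resource):
    """True if the policy grants this role access to a resource in a system."""
    geox, arcgis = POLICY.get(role, (set(), set()))
    return resource in (geox if system == "geox" else arcgis)

print(allowed("explorationist", "arcgis", "fields"))   # False
print(allowed("portfolio", "geox", "DE-Fields"))       # True
```

When a user’s role changes, only the policy table is updated; neither system’s native permission store needs touching.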

Ernyza Endot and Nick Kellerman showed how Shell’s myMap ArcGIS development is used to plan land seismic surveys, a ‘complicated process.’ Today, data availability means that it should be possible to optimize a survey in the face of competing requirements. Enter data-driven planning and least-cost routing, staying inside geophysical constraints while allowing use of roads and avoiding obstacles and no-go zones (a toy illustration follows). Human sentiment analysis also ran. The work is used in the impact analysis phase, but it was not clear if the survey plan flows through to the acquisition contractor.
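The least-cost routing idea is easy to demonstrate on a cost raster where no-go zones get a prohibitive cost and roads get a discount. A toy example using scikit-image’s minimum-cost path search, not Shell’s implementation:

```python
import numpy as np
from skimage.graph import route_through_array

# Toy cost raster: 1 = open ground, 0.2 = road, 1e6 = no-go zone/obstacle.
cost = np.ones((50, 50))
cost[:, 25] = 0.2          # a road running north-south
cost[10:40, 10:20] = 1e6   # a rasterized no-go polygon

# Cheapest path between two line endpoints; the path hugs the road and
# detours around the no-go block rather than crossing it.
path, total_cost = route_through_array(
    cost, (0, 0), (49, 49), fully_connected=True, geometric=True)
print(f"{len(path)} cells traversed, total cost {total_cost:.1f}")
```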

Andrew Zolnai told a ‘social media’ story of the Harvey flooding. He began mapping the situation out of curiosity but, as Facebook messages about the floods came in along with reports of the explosions at the Arkema Crosby plant, he realized that this might be of greater usefulness. His post on Twitter was picked up by a local association and, during the subsequent events, Twitter proved an extremely robust communications medium, as cellphone masts were outside or above the flood.

Founder Brian Goldin presented Voyager Search’s technology, which combines documentary inquiry with complex geospatial query. Voyager claims to do IBM Watson-style text analytics (without the marketing), leveraging natural language processing and machine learning. Voyager has Apache Solr/Lucene under the hood, along with other open source tools for data cleansing.
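A combined text-plus-spatial query against a Solr backend looks something like the following: a Lucene text query with a geofilt filter on an indexed location field. The core and field names here are assumptions for illustration, not Voyager’s actual schema:

```python
import requests

# Hypothetical Solr core whose documents carry an indexed lat,lon 'location' field.
SOLR = "http://localhost:8983/solr/documents/select"

params = {
    "q": 'text:"leak off test"',                          # full-text (Lucene) query
    "fq": "{!geofilt sfield=location pt=58.5,1.9 d=25}",  # within 25 km of a point
    "fl": "id,title,location",                            # fields to return
    "wt": "json",
}
docs = requests.get(SOLR, params=params).json()["response"]["docs"]
for d in docs:
    print(d["id"], d.get("title"))
```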

Andrew Cutts (ACGeospatial) provided an update on the European Space Agency’s Earth Observation for Oil and Gas (EO4OG) project, actually four projects that set out to study the geo-information needs of the sector and what services and products might best meet them. The projects identified 224 oil and gas challenges amenable to satellite investigations. These have been whittled down to 19 use cases available on the EARSC website. Earth observation is also benefiting from a new breed of satellites with greater resolution and bandwidth and more flexible deployment. Falling acquisition costs and high-performance processing with GPUs complete the rosy picture.

Robert Long (IHS Markit) discussed approaches to map web services. IHS offers SOAP-based integration, but this doesn’t work with newer analytical tools like Esri Insights or Microsoft Power BI. The alternative REST protocol is under investigation. IHS is interested in the possibilities of the cloud and has some proof of concept projects underway.

Statoil turned to its Esri GIS when looking for an integration/planning platform for its offshore windfarms. Tor Inge Tjelta, from Statoil’s New Energy Solutions unit, presented the offshore Scotland ‘Hywind’ project which, ‘if and when developed’ on the Dogger Bank, will be the largest in the world with a 40 x 20 km extent. GIS has allowed for real-time shipping activity tracking and mapping of unexploded ordnance (the site was a WW2 battleground). The study rolled in more geological and geotechnical data. The Dogger Bank was previously thought to be a sandbank; it turned out to be a glacial moraine. GIS is used to model, visualize and communicate with contractors. The IOGP seabed survey data model also ran.

According to John Seabourn, the UK Oil & Gas Authority’s digital offerings are now ‘spatial by default and web by default.’ OGA, with help from Fivium, has several data sets available from its portal, many in ‘open source,’ i.e. shapefile, format (curiously, OGA eschews the EU’s Inspire mapping standard). A range of Esri technologies has been deployed to promote UK oil and gas, including a 30th Round web app for data release and Story Maps of historical activity.

Vidar Brekke presented AkerBP’s ‘Kartportal,’ a cloud-based data lake combining technology from Microsoft, Esri, Geocap and others. A full stack of Esri software is deployed under an enterprise agreement that underpins AkerBP’s digital transformation. GeoEvent services stream real-time information into the system alongside third-party data sources including Norway’s License2Share and Rystad. The system combines geoscience data from Petrel, Geoteric, LR/IC, PetroMod and Trinity. Interpretations are captured as polygons in the GIS database. Other tools of the trade include Microsoft Office 365 and (real soon now) the Unity rendering engine and Geocap’s seismic-in-ArcGIS. More from the conference website.

