The last few weeks have seen yet more partnerships targeting the application of artificial intelligence and big data in the upstream. Anadarko has entered a ‘collaborative partnership’ with Calgary-based RS Energy Group to combine RSEG’s analytics with Anadarko’s internal data. RSEG’s RS Data solution (released in April 2017) provides multi-sourced permit, completion and production data of ‘unparalleled completeness and quality’ across key North American and international energy plays. Working with Anadarko’s advanced analytics and emerging technologies team, RSEG has developed solutions that ‘integrate seamlessly’ with daily workflows.
Rockwell Automation has invested in The Hive, a Silicon Valley innovation fund and ‘co-creation studio,’ to access its ecosystem of technology start-ups that focus on the application of artificial intelligence to industrial automation. T.M. Ravi, MD and co-founder of The Hive said, ‘AI-powered applications target the cognitive enterprise, edge intelligence, security and smart machines.’
Schneider Electric reports uptake of its IIoT-based asset performance management (APM) solutions and ‘EcoStruxure’ converged OT/IT industrial software platform. Machine learning and pattern recognition are said to play a central role in APM, as does the cloud. A deal with APM/risk consultants MaxGrip and a new ‘digital services factory,’ set up in cooperation with Accenture, are to accelerate development. Schneider’s supply chain management and economic modelling has been deployed by BP refining for feedstock evaluation.
Aker Solutions has formed a collaboration with Cognite to strengthen its digital offering. Cognite CEO John Lervik founded the company following an engagement with Aker ASA. Aker BP is also a partner on the new IIoT platform. The partnership plans to aggregate data from industrial sensors and applications and combine it with 3D models in visualization and machine learning applications for optimization and automation.
Siemens has meanwhile announced the arrival of its MindSphere (Version 3.0) on the Amazon AWS cloud. Siemens has also created the MindSphere World community for its cloud-based ‘open’ IoT ‘operating system.’
At the first GE/Baker Hughes user meet in Florence, BHGE chairman and CEO Lorenzo Simonelli elected to highlight not Predix, its IIoT platform, but JewelSuite, which is now ‘revolutionizing the way Shell executes field development, integrating data and workflows from seismic interpretation all the way through to geological modelling.’
Shell has sold the intellectual property rights to its in-house developed engineering design database and related software EDD and PDMSi to Cambridge, UK-based Aveva*. Aveva’s software portfolio can also now be used by EPCs on Shell capital and operational projects and across the Shell Project Vantage platform. In 2015, Project Vantage was said to provide Shell, its partners and contractors with a ‘single and continuously updated version of data’ throughout the project and asset lifecycle. At the time Vantage represented a shift from a pure-play standards-based solution to a product-based, pragmatic approach to standardization.
EDD and PDMSi functionality will be integrated with Aveva’s software portfolio and will be available to third parties. Aveva’s Steen Lomholt-Thomsen said, ‘Most of Shell’s upstream offshore and downstream operating assets were designed using AVEVA technology.’ Shell reports that PDMSi ‘has reduced drawing production time by up to 85% on deepwater projects.’
* Aveva is currently being acquired by France’s Schneider Electric Software, assuming that the deal (the third attempt since 2015) works this time!
2017 saw a feeding frenzy on several fronts. Digital twin, internet of things, big data, artificial intelligence and predictive analytics have featured large in our (and everyone else’s) reporting. But it’s sometimes hard to tell them apart. The digital twin concept from product lifecycle management was enthusiastically reworked in big data/AI offerings from GE, IBM, Siemens, AMEC/FW, ABB, Emerson and many others. The feeding frenzy continues unabated in 2018, witness our current lead.
My Obama Peak prediction did not stand for long: it was bested in November as US crude oil production blasted through the 2015 high and on to new records. I offer no defense other than the observation that forecasts are a bit like armies of monkeys trying to write Shakespeare: someone is bound to get it right. My resolution for 2018: stick with reporting, no more forecasts.
Mergers and acquisitions in the upstream software space included Palantir and PetroVR/Caesar Systems, Paradigm and Emerson, Pason and Verdazo, Quorum and WellEZ, to name but a few. 2018 starts with a bang on the M&A front too – see our Done Deals section on page 8.
On the standards front, the big thing in 2017 was progress on Cfihos, the capital facilities handover standard. Cfihos, though, reflects a long-term degradation in upstream data knowhow. In the 1990s it was the hard but smart Express language of DLIS and Epicentre. This was replaced by various XML-based ‘utility’ standards and (for ISO 15926 but not much else) the simple but ultimately inadequate RDF in the 2000s. Now there are calls to replace XML with the latest tech du jour, JSON. But Cfihos is going ‘back to basics’ with an Excel spreadsheet-based ‘standard.’ Basics are all very well, but Excel? Really? Other significant standardization efforts last year include the alignment of Energistics’ standards with the new common ETP protocol and ExxonMobil’s initiative, along with The Open Group, for a new process control standard.
2017 saw the rise of a plethora of competing ‘platforms,’ all designed to capture your big data and store it in a vendor’s proprietary/open cloud (cognitive dissonance intended). Such offerings came from GE, DNV GL, SAP, Siemens and Schneider. Others moving cloud-ward included Schlumberger, with its ‘Delfi’ ‘cognitive’ E&P environment in the Google cloud, and Halliburton, with its announcement of a deal with Microsoft on the Azure platform.
Along with the rise of the IoT and the cloud came the realization that it is impossible to have every sensor plugged into the cloud. Enter ‘edge’ computing, again with a plethora of offerings, not least the Linux Foundation’s EdgeX. Other IT novelties included FME’s use of a Docker ‘swarm’ in geoprocessing and INT’s introduction of its Ivaap microservices-based back end. We also reported (less enthusiastically) on an avalanche of blockchain offerings in oil and gas.
Much of the big data/AI hype that has come across our desk is ‘forward-looking’ stuff. As I said, it’s better to report than predict. So we went forth and formed our own picture of what is happening in this space. Agile Scientific’s machine learning hackathon provocatively set out to ‘cut out the science’ in geoscience. In some cases, the application of AI appears to have merit (machine learning for mineral microscopy). Other applications are more contentious, such as replacing seismic modeling or reservoir simulation with ‘black box mapping’ from ‘labelled’ cross sections or sketches. But it seems likely that the AI pioneers may contribute to speeding awkward parts of the workflow, such as handling large volumes of dirty data (see page 4).
At the EAGE Workshop on data science in geoscience (N° 7 2017) Total’s Michel Lutz summarized the AI phenomenon as the ‘democratization’ of neural nets and decision trees thanks to open source software. Lutz reported successful production forecasts in shale wells, augmenting decline curve analysis with data-driven analytics. Shell reported on GeoDNN, its deep neural net-based seismic feature extraction, co-developed with MIT. Shell is also using ML in reservoir engineering with ‘AutoSum,’ a prototype tool for summarizing large ensembles of reservoir models to help understand key sensitivities. Agile Data Decisions reported work on the seminal CDA unstructured data challenge, a ‘fantastic dataset’ of logs and reports from decades of North Sea exploration.
One possibly lasting spin-off of the big data/AI movement is the acceptance of open source software in the upstream. Only a few years ago this was considered anathema. One important piece of open source kit that is getting traction is the venerable Lucene/Solr search engine. This is now baked into commercial offerings from Fuse, EnergyIQ, Voyager Search and this month’s novelty, BCT’s Geodatafy (page 12). Fuse has leveraged Lucene for a decade, so maybe 2017 is more of a coming out for open source software in the upstream.
Looking ahead we have some great big data/ML reporting in store for you in 2018, see page 4 and our upcoming report from the IFPen DataScience in Energy event.
Probably the biggest news of 2017 actually slipped into January 2018 with Schlumberger’s announcement that its seismic acquisition business ‘does not meet our return expectations going forward, even factoring in an eventual market recovery.’ Schlumberger has ‘exited’ land and marine acquisition. The venerable WesternGeco, which historically embeds GSI, Prakla, SSL and others, is going ‘asset light.’ Wow! If acquisition is going ‘asset light’ where will our big data be coming from in 2018?
At the 40th Artificial Lift R&D Council’s gas lift workshop in Houston, Shauna Noonan’s keynote traced Oxy’s efforts to implement gas lift in the Permian basin. Traditional ESP and rod pumps were prevalent but suffered from high failure rates and downtime. Following a three-well pilot in 2013, gas lift has grown steadily and was used on 250 wells in 2017. Well uptime has improved, production is up and wellhead spacing is down, leading to a smaller pad footprint. Today, 20% of Oxy’s unconventional production is gas lifted and all new wells are equipped (where possible).
Stephen Edward Faux presented Silverwell Energy’s DIAL (digital intelligent gas lift device), a.k.a. ‘the future of gas lift.’ The novel surface-controlled downhole tool, currently in commercial pilot phase, allows injection parameters to be changed in real time. Six injection orifices can be independently controlled. The unit includes sensors for real-time annulus and production tubing pressure and temperature data.
Dimitrios Giorgis, speaking on behalf of Process Systems Enterprise’s Kevin Wade, enumerated several use cases of PSE’s ‘gOilfield’ full field and facility production modeling and optimization approach. PSE deploys full physics models of wells and the flowline network to evaluate asset behavior in different situations. Mathematical solvers identify operating conditions that optimize production. gOilfield has been validated in a number of geographies.
One Gulf of Mexico study on 18 offshore wells reported a 6% production hike through ‘better use of riser base gas lift and re-routing.’ PSE deprecates current models that oversimplify facilities as ‘simple constraints.’ Facilities need detailed modeling alongside the production network. The approach was described in a recent paper, SPE-189263-MS, given at the SPE symposium on production enhancement and cost optimization, held in Kuala Lumpur. More from the ALRDC.
At the 2017 Energy Forum in Houston, Tibco strengthened its position in upstream business intelligence with a shift into the world of open source big data and analytical applications. Tibco presents its portfolio as a ‘system of insight’ for energy spanning upstream exploration and production surveillance, midstream to refining. Alongside its business intelligence applications, Tibco offers extensive data connectivity to business and technical data sources.
The extreme data volumes of unconventionals are driving a shift to open source big data technologies. Manny Rosales from ExxonMobil’s shale unit XTO Energy has addressed the challenge of a 400,000 well dataset of log ASCII standard (LAS) files that were stored in disparate systems around the company. XTO, with help from the Exxon parent, used the Apache NiFi data wrangler and Hortonworks’ LAS toolkit to consolidate these into a Hadoop/Hive data lake. Spotfire acts as a query/front end to the system, whose performance has been enhanced with Apache Hive’s LLAP hybrid caching/execution engine. Statistical investigations are carried out with TERR (Tibco Enterprise Runtime for R) and the R-ODBC database connector. A range of log and cross section viewers are now available, driven by ‘simple and intuitive’ Spotfire workflows and the TERR ‘workhorse.’
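For the curious, here is a minimal sketch of the consolidation step, with the open source lasio and pandas libraries standing in for XTO’s NiFi/Hortonworks tooling. The file locations, the UWI-header fallback and the parquet target are our own illustrative assumptions, not XTO’s implementation.

```python
# Minimal sketch: consolidate scattered LAS files into a columnar 'data lake.'
from pathlib import Path

import lasio          # pip install lasio
import pandas as pd

frames = []
for las_path in Path("/data/las_dumps").rglob("*.las"):  # hypothetical source tree
    las = lasio.read(str(las_path))
    df = las.df().reset_index()        # curves as columns, depth as a column
    try:
        df["well_uwi"] = las.well["UWI"].value
    except KeyError:                   # no UWI header: fall back to the file name
        df["well_uwi"] = las_path.stem
    frames.append(df)

curves = pd.concat(frames, ignore_index=True, sort=False)
# Partition by well for Hive/Spark-friendly reads -- the data lake layout.
curves.to_parquet("/lake/logs", partition_cols=["well_uwi"])
```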
Prior to 2015, Spotfire was a niche/specialist app at Linn Energy. But then, as Audrie Luna described, usage went ‘viral,’ with applications across IT, operations and finance. Spotfire Analyst and WebPlayer have now displaced Linn’s legacy Birst business intelligence solution. The solution shows well failure trends, and root cause analysis of well performance distinguishes between design and equipment failures. Operations and IT now have a better understanding of the needs of the business and there is ‘less reliance on third parties that take months to implement solutions.’
Calvin Caraway showed how Chevron uses Spotfire as a ‘one stop shop’ for evaluating its shale well performance. Chevron deploys its own smarts developed in Iron Python and R (via TERR) to address specific Permian basin challenges with decline curve analysis, spatial interpolation and data distribution assessment.
BP’s Gustavo Carvajal has also packaged decline curve analytics in Auto-DCA, a Spotfire-based, semi-automated process, again leveraging TERR. The solution provides ‘unprecedented’ evaluation of production and reservoir performance and is used to optimize well and fracture spacing.
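Both the Chevron and BP workflows boil down to fitting a decline model to rate data. Here is a minimal sketch of that core computation, an Arps hyperbolic fit, in Python with scipy rather than the TERR/R runtime the presenters used; the production data and parameter bounds are synthetic and illustrative.

```python
# Minimal sketch of the decline-curve fitting at the heart of tools like Auto-DCA.
import numpy as np
from scipy.optimize import curve_fit

def arps(t, qi, di, b):
    """Arps hyperbolic decline: rate as a function of time on production."""
    return qi / np.power(1.0 + b * di * t, 1.0 / b)

t = np.arange(1, 37)                                   # 36 months on production
q = arps(t, qi=950.0, di=0.30, b=1.1) * np.random.default_rng(0).normal(1, 0.05, t.size)

(qi, di, b), _ = curve_fit(arps, t, q, p0=[1000, 0.5, 1.0],
                           bounds=([0, 0, 0.01], [1e5, 5, 2]))
cum_36m = np.trapz(arps(t, qi, di, b), t)              # crude cumulative over history
print(f"qi={qi:.0f}, Di={di:.3f}/mo, b={b:.2f}, 36-month cum={cum_36m:.0f}")
```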
Neha Reddy showed how Spotfire underpins ConocoPhillips’ Supply chain integrated visibility (SIV), a new reporting solution that represents a shift from finding and curating data into ‘more data-driven actions.’ SIV is used to create dashboards which are ‘reactive, easy-to-use, and intuitive for individuals with varying levels of Spotfire skill.’
Joe Dominick presented NRG Energy’s partnership with Tibco that has seen the development of a comprehensive dashboard for its power generation operations. Diverse data sources including Oracle, OSIsoft and SAP are consolidated into a single data warehouse that allows NRG to keep tabs on its corporate cash burn rate and other KPIs. More from the Energy Forum home page.
At a meeting of the SAID, the French chapter of the Society of Professional Well Log Analysts, Emmanuel Caroli* described a trial of machine learning with neural nets to see if it would be possible to screen a large set of well logs in a data room context. In a major acquisition there may be thousands of logs available, which precludes a ‘classical deterministic approach’ to formation evaluation. Total tried deep feed-forward and deep convolutional neural nets in a variety of geological facies. Training was performed on a minimal log suite (GR, neutron, resistivity, density) against interpretations of poro-perm, water saturation and clay volume. The results were complex but interesting. It emerged that the neural nets performed better when left to their own devices. Separating different facies for training was not successful: it’s better to use all the data and let the machine sort things out. Overall the best ML-driven interpretations were good, with only 5% errors. But there are a few enigmas. The machine can produce spurious, physically impossible results (volumes adding up to over 100%). When constraints are added to mitigate such aberrations, the error rate rises. But the results were deemed encouraging, particularly in the light of a very large amount of rather poor data, where a massive amount of preparation would be required to perform a conventional petrophysical analysis. ML provides a quick look interpretation and could be seen as a pre-processor for a physics-based workflow.
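For readers wanting to try this at home, here is a minimal sketch of the kind of experiment Caroli described, with scikit-learn’s MLPRegressor standing in for Total’s deep networks and synthetic data in place of real logs and interpretations.

```python
# Minimal sketch of ML-driven log interpretation: a feed-forward net mapping
# a minimal log suite (GR, neutron, resistivity, density) to interpreted
# porosity, water saturation and clay volume. Data is synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 5000
X = np.column_stack([
    rng.uniform(10, 150, n),     # GR (API)
    rng.uniform(0.05, 0.45, n),  # neutron porosity (v/v)
    rng.lognormal(1.0, 1.0, n),  # deep resistivity (ohm.m)
    rng.uniform(2.0, 2.9, n),    # bulk density (g/cc)
])
# Toy 'interpretations' standing in for a petrophysicist's labels.
y = np.column_stack([
    0.6 - 0.18 * X[:, 3],                # porosity falls with density
    1.0 / (1.0 + 0.1 * X[:, 2]),         # Sw falls with resistivity
    np.clip((X[:, 0] - 10) / 140, 0, 1), # Vclay tracks GR
])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_tr)
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(scaler.transform(X_tr), y_tr)
print("R^2 on held-out logs:", net.score(scaler.transform(X_te), y_te))
# Nothing constrains the outputs to sum sensibly -- exactly the
# 'volumes add up to over 100%' aberration Total reported.
```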
* With help from Quentin Groshens of France’s Supelec.
No, it’s not Frankenstein. Halliburton’s hybrid digital twin (HDT), as revealed in a recent white paper, represents an attempt to marry the two worlds of forward, physical modeling and data-driven analytics*. The authors recognize that in oil and gas, ‘modeling has existed for decades and the availability of high-performance computing and software tools has allowed for their widespread acceptance.’ But HPC can’t simulate everything. There is a need to couple physical models with data-driven analytics into a ‘hybrid digital twin.’
Halliburton proposes a digital twin for predicting future behavior and performance of the physical asset, and even a digital twin for ‘system of systems thinking,’ to cater for ‘interoperability and emergent behaviors.’ Physical models are routinely based on engineering assumptions and validated on limited data sets. Such weaknesses can be offset with data-driven models. The adaptive nature of the HDT is claimed to provide ‘significant benefits’ in well construction and production planning, ‘where variations between wells and fields are the norm.’
The authors argue against the use of ‘fashionable’ neural networks alone. Neural nets ‘only comprehend’ the data and ignore the underlying physics. The HDT promises a system that understands both and ‘will have a widespread impact.’
The HDT leverages concepts developed by Matthew Franchek of the University of Houston. See, for instance, his work on BOP condition monitoring (SPE 189987-PA).
* See also our editorial on this subject.
INT blogger Thierry Danard has ‘simplified the learning curve’ of the free and open source Seismic Unix library. Seismic Un*x, as it is curiously, if more correctly, known, comes from the Center for Wave Phenomena at the Colorado School of Mines. SU is hosted on GitHub by John Stockwell. It is a powerful package that is widely used in the geoscience community.
INTViewer (which isn’t free, unfortunately) includes a Seismic Workbench plugin with SU documentation and an SU command line builder. INTViewer displays seismics in real time as the SU commands run. Parameters and workflows can be saved for later reuse. On Windows, SU can be run under INT’s control using the Cygwin Unix emulator.
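By way of illustration, here is a minimal sketch of the kind of SU pipeline the command builder assembles, composed and run from Python; suplane and sufilter are standard SU programs, while the output path is our own.

```python
# Minimal sketch: compose and run a Seismic Unix pipeline from Python.
import subprocess

pipeline = "suplane ntr=64 nt=256 | sufilter f=10,20,40,50 > /tmp/filtered.su"
# SU tools read/write traces on stdin/stdout, so a shell pipe chains them
# exactly as on the command line (requires SU on the PATH, or Cygwin on
# Windows as noted above).
subprocess.run(pipeline, shell=True, check=True)
```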
Geosoft’s 2017 survey of 1,400 geoscience data managers in a thousand companies provides food for thought for upstream data managers. Of the key challenges facing geodata managers, the need for an ‘integrated search tool’ comes out top. ‘Support for big data’ ranked lowest across all industries, including energy. Time spent on managing geodata is down from the early days, when some startling numbers were routinely cited. Still, 36% of respondents spend at least 30% of their time on data management tasks (some of it may be time well spent!).
64% of respondents were confident in their data, again across all industries and roles. Two challenges arise in collaborating with outside consultants and contractors or in joint ventures: ensuring that the most current and best quality version of data is used, and providing this in a timely manner in a usable format.
The preferred route to data success remains the purchase of a commercial solution over in-house development. About half of respondents were ready to entertain a cloud-based geoscience data management solution.
Katalyst has announced a new well file and log scanning ‘paper to digits’ service.
Librestream has announced the Onsight Cube, an industrial, Ex-rated wearable device for inspecting, diagnosing and troubleshooting complex assets.
The 2017b release of Sintef’s Matlab Reservoir Simulation Toolbox (MRST) includes better documentation, speed improvements and new modules. Also new is support for the open source Matlab clone Octave, allowing most of the solvers to run without Matlab.
LMKR has launched Gverse Go, a pay per use subscription program for access to the GeoGraphix and Gverse suites.
The 2018.1 release of MVE Move includes elliptical fault flow, a new 2D kinematic algorithm that models complex deformation of isolated normal faults.
P2ES’ Qbyte FM 2.0 adds support for multiple monitors, context sensitive links to documentation and pervasive drill-down and export to Excel.
Petrolink’s new PetroVue Mobile can be downloaded from the App Store and from Google Play. The app connects to Petrolink’s Witsml store and displays time and depth-based data for active wells.
Petrosys PRO 2017.1.2 introduces map templates to speed map creation by allowing a corporate standard set of map styles to be maintained. The new PRO release adds drag and drop integration with Landmark’s DecisionSpace Geoscience suite to display OpenWorks data in the Petrosys canvas.
Blue Marble GeoCalc SDK 7.4 adds support for several new projections and JSON wrapper classes for interacting with the GeoCalc cloud web service.
Safe Software’s FME 2018 includes bi-directional Hadoop connectivity, web 3D views, mosaics and read/write support for Bentley Microstation format. Check out the (great) FME 2018 minisite.
Rock Flow Dynamics’ tNavigator 17.4 includes coupled geomechanical modeling and an enhanced designer for multiple realizations of a dynamic model.
Ikon Science’s RokDoc 6.5.1 adds a new attrimod module for seismic forward modelling and updates to Ji-Fi’s data trending capability for frontier and sparse data exploration well prediction. The multi-well analysis Python toolkit is now ‘AI and machine learning ready.’
SharpReflections’ Pre-Stack Pro V5.0 features a new pre-stack well tie module, for tying synthetic angle gathers or stacks to real seismic at well locations, along with an improved 3D viewer.
A new release of Weatherford’s ForeSite software extends the platform to naturally flowing wells, gas-lift and ESP systems.
WellSim’s HiDrill v3.0 drilling simulator upgrades the BOP accumulator system and stuck pipe functionality.
Honeywell has rolled out Uniformance, its data historian in the cloud, fusing real-time process data analysis with a data lake to integrate production, ERP and other business data and analytic tools.
The Fall 2017 release of Arundo’s industrial analytics solution introduces an Edge Agent for connections to rugged, remote or disconnected environments.
Machfu’s MACH-3 IIoT gateway is a hardware device that connects legacy infrastructure to cloud-based IoT and scada systems.
Intertek has announced a new pipeline quality verification solution, PipeAware.
Siemens has rolled-out Pipelines 4.0, tailored to North American operations.
Energistics’ National Data Repositories (NDR) event is changing its format. The biennial, itinerant gathering will be replaced with a ‘smaller, low-key conference, with a focus on collaboration and discussion.’ Meetings will be hosted by the TNO-sponsored North Sea data management forum, comprising regulators from Denmark, Holland, Ireland, Norway and the UK. The next gathering is planned for Q3/Q4 2019.
The UK regulator, the Oil and Gas Authority (OGA) is to transform UKOilandGasData.com from a member-funded platform provided by Common Data Access (CDA) into a key component of a new central UK National Data Repository. The service will be delivered by CDA under contract to OGA. UKOilandGasData, in operation since 1995, allows industry to meet its statutory obligations to share data on wells and seismics and provides data management services to industry. The new two year contract begins on 1st January 2019.
CDA has issued new guidance for the retention of information and samples after decommissioning. The guidance was developed in collaboration with the Shell Brent Decommissioning Project and Aberdeen University’s School of Law. Feedback on the initial guidance document will be gathered at an open workshop to be held in March 2018.
Ryder Scott reports that comments on the 2017 Society of Petroleum Engineers Petroleum Resources Management System are to be finalized this year. The draft recognizes the role of the learning curve and step changes in the field. SPE-PRMS is backed by the Society of Petroleum Evaluation Engineers, the World Petroleum Council, the American Association of Petroleum Geologists and the Society of Exploration Geophysicists.
Ryder Scott also reports a challenge to Alberta Securities Commission interpretations of a 2015 regulation covering abandonment and reclamation costs (ARC). A recent interpretation excluded ARCs on exploration wells. The Calgary chapter of the SPEE is on the case and is ‘tweaking’ a new edition of the Canadian Oil & Gas Evaluation Handbook in the interests of clarity.
The Texas Railroad Commission’s electronic well log filing system has saved more than $300,000 for industry and thousands of man-hours for the agency to date. Since its launch, the system has received approximately 8,000 submissions, saving the RRC nearly $50,000 in staff time and scans. Commissioner Ryan Sitton is pushing for more ‘smart’ IT solutions.
Opening the proceedings at the 2017 Esri EU Petroleum User Group, Esri’s Danny Spillman advocated building a digital twin in ArcGIS. ArcGIS fits the digital twin paradigm well as a natural system of record for spatial data and more. Esri is encouraging developers to build a ‘system of engagement’ to manage legacy data in original format.
Spillman sees five GIS trends: 1) ‘consumerization’ and a mandatory ‘mobile first’ strategy for developers; 2) integration with other systems, with location as the ultimate ‘foreign key’ connecting real time systems, ERP and more; 3) the cloud, with ‘three of the five largest oils’ set to have a ‘cloud first’ GIS strategy in 2018, leveraging ArcGIS; 4) ‘open standards,’ which for Esri mean REST, JSON and OData sources (see the sketch below); 5) AI, VR/AR, big data and data science. Esri’s system of record is built around the ArcGIS Data Store, a spatio-temporal big data store for internet of things and big data geoanalytics. This requires its own scalable infrastructure and is ‘highly indexed and fast; you will need to rethink your system of record.’
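By way of illustration of trend 4, ‘open standards’ access to a feature service boils down to a parameterized HTTP query returning JSON. A minimal sketch follows; the service URL and field names are hypothetical.

```python
# Minimal sketch of REST/JSON access to an ArcGIS feature service:
# query a (hypothetical) wells layer for active wells inside a bounding box.
import requests

url = ("https://services.example.com/arcgis/rest/services/"
       "Wells/FeatureServer/0/query")
params = {
    "where": "STATUS='ACTIVE'",
    "geometry": "-102.5,31.5,-101.5,32.5",   # xmin,ymin,xmax,ymax
    "geometryType": "esriGeometryEnvelope",
    "inSR": 4326,
    "outFields": "UWI,OPERATOR,SPUD_DATE",
    "f": "json",                              # JSON out -- the point of trend 4
}
features = requests.get(url, params=params, timeout=30).json()["features"]
print(f"{len(features)} active wells in the AOI")
```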
On the client side, ‘app users are now map makers.’ WebApp Builder for ArcGIS can create data viewers and social media-style apps ‘without spending days on custom code.’ Such map/apps are launchable from within Excel. They may exist only in a user’s personal project space or can be shared across the organization.
Real time tracking enhances situational awareness of oil and gas operations. Marathon Oil used the ArcGIS geo event server to track visits to remote well pads. It discovered that the contractor was only visiting 62% of the agreed locations, skipping weekends and overbilling.
More sensors are bringing ‘big data’ opportunities. Esri is researching Hadoop and artificial intelligence to bring new insights to existing data. A proof of concept involved a million production data records in a NetCDF space-time cube. ArcGIS Pro running on Amazon was used to wrangle the data, edit field names and view the result on a base map. A time slider allowed investigation of production history across the field, showing where the production was coming from. The demo had ArcGIS Pro running on a Mac as an endpoint to the Amazon workspace. ‘Deployment patterns are changing,’ a fact borne out by the array of Macs used for the demos.
Brian Boulmay presented BP’s OneMap corporate GIS system, Esri’s oil and gas poster child. OneMap is the largest application in BP’s portfolio and is a key component of BP’s ‘digital transformation’ as a single source of truth. OneMap began in the subsurface but is slated to extend across the business. Boulmay insisted that OneMap is a platform, not a point solution. It has applications in pipeline integrity management, oil spill preparedness and as a component of BP’s common operating picture. BP currently operates both in the cloud and with local deployment for Luanda and other harsh environments.
Boulmay observed that ‘GIS has no user manual.’ Deployment and use can differ widely, so BP is working on standardized roll-out and management. The company has over 100 GIS specialists and 9,000-plus users. OneMap innovates with published services, linking to Petrel, Spotfire and PowerBI. Boulmay believes that OneMap has ‘taken GIS out of the closet.’ OneMap provided support to BP’s team working during the Harvey flood: a custom mobile app showing the location of flooding and other key resources was delivered within four hours of the request. In the Q&A, Boulmay was quizzed as to the wisdom of letting everyone publish. He answered that ‘wide open is the norm,’ and this has produced benefits, particularly in countering the prevalence of Google Earth skunkworks projects that predated OneMap.
Matthew Griggs is a GIS analyst ‘embedded’ with Woodside’s data science team. His mandate is to integrate geospatial with cognitive computing à la IBM Watson. Woodside uses Watson to ingest and access unstructured information in documents, PDFs and reports. Cognitive computing is the key to ‘unlocking the knowledge.’ Text analytics leverages rules developed by Woodside’s subject matter experts. Cognitive computing is seeing ‘feverish deployment’ at Woodside, with funding from all business units. Spatial’s first intersection with cognitive came when an early drilling events project showed a requirement for spatial search. This sparked off a proof of concept for GIS integration. The result is a webmap GUI in which documents with terms like ‘kick,’ ‘leak off test’ and ‘influx’ are plotted within a polygonal area of interest. This was successfully rolled out to the business. Next, HSE came along with 30 years of information in multiple databases. These too have been spatialized and have pinpointed ‘bike incidents’ around the front gate. A cognitive subsea tool was likewise developed and integrated with ArcGIS and SAP for work orders.
Richard Burren from CGG’s NPA Satellite Mapping unit provided an authoritative backgrounder on satellite technology as used in earth resource mapping. The satellite scene is hotting up with the ‘smallsat’ revolution. Planet’s Dove satellites will provide daily earth imagery at 3-4m resolution. Much of ESA’s Sentinel radar and optical coverage is open access. All onshore areas are now acquired every 12 days with radar. Soon we will have intra-day imagery. Challenges remain in cost and usage terms, data quality, weather and view geometry (most satellites look down!). Keeping up with the different offerings is hard. The art is in matching your needs to what is available.
Craig Allinson from the IOGP Geodesy Subcommittee told an interesting tale of an engineering fail on a North Sea facility. The operator tried to install a 60 meter long prefabricated bridge between two offshore platforms. But once on location, the bridge did not fit! Seemingly the engineers had used 1994 WGS84 reference data to get the location of one platform and a 2013 WGS84 survey for the other. What went wrong? We have a mental image of the earth as static relative to national and regional reference systems that are anchored to it: coordinates do not change with time. Unfortunately, the global WGS84 CRS actually moves over time due to plate tectonics, of the order of 3 cm/year in North America and Europe and more elsewhere. Coordinates of a point on the earth are dynamic. Measurements of ‘static’ trig points need coordinates, rates of plate motion and a reference epoch. Enter the ITRF-based coordinate reference system. Allinson offered various approaches to managing geospatial data with time-dependent transforms. The differences between frames of reference can be significant: ITRF and WGS84 are 75cm out in Europe. In Australia there is now a 1.5m shift between GDA94 and the ITRF. This is rumoured to have caused a traffic accident as a driverless car worked on a different CRS from its roadmap!
Rigorous mapping between reference systems requires the Helmert transform and involves seven parameters, each with a rate. Different reference epochs make matters even more complicated. Allinson concluded that the issue is complex and confusing. Many data sets have inadequate metadata for their epoch. Time-dependent transformations are only available in specialist software. The IOGP has new Guidance Notes in preparation and is adding dynamics to the EPSG dataset. As the bridge builders discovered, WGS84 is dynamic: the first platform’s coordinates should have been changed or resurveyed to account for plate motion. Software developers should add time-dependent transform methods, add velocity grids and allow for coordinate epoch as a dataset attribute. ‘Be aware that WGS84 is approximate and that the use of ETRF is increasingly unacceptable; for sub-meter accuracy, stop using it.’ In the Q&A it emerged that for Esri, time-dependent transforms have yet to be embedded in software!
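To make the bridge anecdote concrete, here is a minimal sketch of epoch propagation using a plate velocity. The Eurasian-plate velocity components are illustrative; a rigorous treatment would use the time-dependent (Helmert plus rates) transform Allinson described.

```python
# Minimal sketch of why the bridge didn't fit: propagate a surveyed position
# from its reference epoch to another epoch using a plate-velocity model.
import numpy as np

def propagate(e_n, epoch_from, epoch_to, vel_en):
    """Shift local east/north coordinates (m) by plate motion between epochs."""
    return np.asarray(e_n) + (epoch_to - epoch_from) * np.asarray(vel_en)

# Illustrative NE-trending plate velocity of ~2.3 cm/year.
platform_a = propagate([0.0, 0.0], 1994.0, 2013.0, vel_en=[0.017, 0.015])
print(f"apparent shift after 19 years: {np.hypot(*platform_a):.2f} m")
# ~0.43 m here -- at bridge-installation tolerances, enough for a misfit.
```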
Karina Schmidt (Wintershall) showed how difficult some apparently straightforward tasks can be. Wintershall uses Schlumberger’s GeoX for prospect and field assessment. GeoX runs on an Oracle server with Citrix/PC clients for some 250 users. Getting spatial parameters from Esri into GeoX was hampered by the fact that both GeoX and ArcGIS have proprietary access permission management. Moreover, users’ roles often change. With help from Conterra, a permissions manager was built to interface with the two systems and open up access according to an independent policy database. This has allowed for fine-grained access control beyond what is possible with ArcGIS. The work has now begotten the GeoX SPAR (spatial portfolio analytics and reporting) project. In parallel, Schlumberger has launched a JIP to spatialize GeoX.
Ernyza Endot and Nick Kellerman showed how Shell’s myMap ArcGIS development is used to plan land seismic surveys, a ‘complicated process.’ Today, data availability means that it should be possible to optimize a survey in the face of contrasting requirements. Enter data-driven planning and least-cost routing, staying inside geophysical constraints while allowing the use of roads and avoiding obstacles and no-go zones. Human sentiment analysis also ran. The work is used in the impact analysis phase, but it was not clear if the survey plan flows through to the acquisition contractor.
Andrew Zolnai told a ‘social media’ story of the Harvey flooding. He began mapping the situation out of curiosity but, as the Facebook messages about the floods came in, along with reports of the explosions at the Arkema Crosby plant, he realized that this might be of greater usefulness. His post on Twitter was picked up by a local association and, during the subsequent events, Twitter proved an extremely robust communications medium as cellphone masts were outside or above the flood.
Founder Brian Goldin presented Voyager Search’s technology, which combines documentary inquiry with complex geospatial query. Voyager claims to do IBM Watson-style text analytics (without the marketing), leveraging natural language processing and machine learning. Voyager has Apache Solr/Lucene under the hood, along with other open source tools for data cleansing.
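A minimal sketch of the kind of combined text-plus-spatial query this enables, issued directly against Solr’s HTTP API: geofilt is standard Solr spatial syntax, while the core, endpoint and field names are our own assumptions.

```python
# Minimal sketch: full-text match filtered by a geodetic radius in Solr.
import requests

params = {
    "q": 'content:"leak off test"',                        # free-text clause
    "fq": "{!geofilt sfield=location pt=57.1,1.9 d=25}",   # within 25 km
    "fl": "id,title,location",
    "wt": "json",
}
hits = requests.get("http://localhost:8983/solr/docs/select",
                    params=params, timeout=30).json()
print(hits["response"]["numFound"], "documents match")
```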
Andrew Cutts (ACGeospatial) provided an update on the European Space Agency’s Earth Observation for Oil and Gas (EO4OG) project, actually four projects that set out to study the geo-information needs of the sector and what services and products might best meet them. The projects identified 224 oil and gas challenges amenable to satellite investigations. These have been whittled down to 19 use cases available on the EARSC website. Earth observation is also benefiting from a new breed of satellites with greater resolution and bandwidth and more flexible deployment. Falling acquisition costs and high-performance processing with GPUs complete the rosy picture.
Robert Long (IHS Markit) discussed approaches to map web services. IHS offers SOAP-based integration but this doesn’t work with new analytical tools like Esri Insight or Microsoft Power BI. The alternative REST protocol is under investigation. IHS is interested in the possibilities of the cloud and has some proof of concept projects underway.
Statoil turned to its Esri GIS when looking for an integration/planning platform for its offshore windfarms. Tor Inge Tjelta, from Statoil’s New Energy Solutions unit, presented the offshore Scotland ‘Hywind’ project which, ‘if and when developed’ on the Dogger Bank, will be the largest in the world with a 40 x 20 km extent. GIS has allowed for real time shipping activity tracking and mapping of unexploded ordnance (the site was a WW2 battleground). The study rolled in more geological and geotechnical data. The Dogger Bank was previously thought to be a sandbank; it turned out to be a glacial moraine. GIS is used to model, visualize and communicate with contractors. The IOGP seabed survey data model also ran.
According to John Seabourn, the UK Oil & Gas Authority’s digital offerings are now ‘spatial by default and web by default.’ OGA, with help from Fivium, has several data sets available from its portal, much in ‘open source,’ i.e. shapefile, format (curiously, OGA eschews the EU’s Inspire mapping standard). A range of Esri technologies has been deployed to promote UK oil and gas, including a 30th Round Web App for data release and Story Maps of historical activity.
Vidar Brekke presented Aker BP’s ‘Kartportal,’ a cloud-based data lake combining technology from Microsoft, Esri, Geocap and others. A full stack of Esri software is deployed in an enterprise agreement that underpins Aker BP’s digital transformation. GeoEvent services stream real time information into the system alongside third party data sources including Norway’s License2Share and Rystad. The system combines geoscience data from Petrel, Geoteric, LR/IC, PetroMod and Trinity. Interpretations are captured as polygons in the GIS database. Other tools of the trade include Microsoft Office 365 and (real soon now) the Unity rendering engine and Geocap’s seismic-in-ArcGIS. More from the conference website.
Are Føllesdal Tjønn heads up Aker Solutions’ new software unit. He hails from DNV GL. Sastry Yagnanna Kandukuri leads the new 3D printing center in Singapore. Ben Oudman is EAME oil & gas regional manager.
Michel Alain Proch is group chief digital officer with Atos. Patrick Adiba is SEVP, CEO North America operations.
Adrian Reyes has joined Atwell as senior project manager, oil & gas.
David Stroble is CFO at CAM Integrated Solutions. He hails from Talisman Energy.
Florence Verzelen has joined Dassault Systèmes as EVP. She hails from Engie.
Dennis O’Neill (W. hemisphere) and Dave Wallis (EAME) have joined Energistics as advisors.
Johan Kinck is now business development, data management specialist at ET Geo AS. He hails from Cegal.
Knut Eriksen (recently retired from Oceaneering) has joined FairfieldNodal’s board of Directors.
Halliburton and Akwa Ibom have opened Nigeria’s first oil and gas training center.
Jennifer Presley has been promoted to executive editor at Hart Energy’s E&P magazine. Rhonda Duey has retired, but will continue to contribute content as senior editor, exploration.
Geoff Wagner is EVP and chief commercial officer at Helix Energy Solutions. He hails from Atwood.
Victor Barcot is MD of Houlihan Lokey’s oil & gas E&P group. He was previously MD of Deutsche Bank’s oil & gas unit.
OPIS has named Alexandra Kern director of business development for Mexico. IHS Markit has appointed Lord Browne and Nicoletta Giadrossi as directors. Lance Uggla is now Chairman and CEO following Jerre Stead’s retirement.
Tom Wilson has joined the iLandMan sales team in Dallas.
Kadme has hired Dirk Adams as Houston-based sales rep.
Anne Siw Uberg Berge is VP MarCom with Kongsberg Maritime. Ariane Jayr is VP Sales at Kongsberg Digital Energy.
Nilesh Dayal and Senjit Sarkar lead LEK Partners’ Houston oil and gas unit.
Howard Gruenspecht has joined the MIT Energy Initiative as senior energy economist. He hails from the US DoE.
Brian Coffman is Motiva Enterprises’ president and CEO replacing Dan Romasko. He hails from Andeavor Corp.
NEN has appointed Rik van Terwisga as its MD, replacing Piet-Hein Daverveldt. He was previously with Ecorys.
Elisabeth Maråk Støle is CEO of Norce, a new merger of Uni Research, Christian Michelsen Research, the International Research Institute of Stavanger, Agderforskning and Teknova.
PG Flow Solutions has promoted Steve Paulsen to CEO. He succeeds Roy Norum who moves to EVP sales.
Armando Gomez (Halliburton) and Louis Hendriks (Global Value Web) are now PIDX country ambassadors, Gomez for Latin America and Hendriks for the EU.
RPSEA has released its Technology Roadmap, outlining the US’ oil & gas research needs for the coming decade.
Jennifer Ricklin has been named director of the CMU Software Engineering Institute software solutions division.
Anna Hardesty (Ryder Scott) is now a director of the Society of Petroleum Evaluation Engineers.
Brent Vyvial has been appointed Stress Engineering Services principal and lead of its midstream practice.
Teradata has named Oliver Ratzesberger COO.
Grace Bochenek is to retire as Director of the National Energy Technology Laboratory.
Death
Flotek reports the death of Jerry Dumas, Sr., the company’s leader and longtime executive.
AspenTech has acquired Apex Optimisation, developer of the GDOT software that aligns processes, planning and scheduling in refineries and petrochemical plants.
3esi-Enersight has acquired Aclaro Softworks, creator of the PetroLook reserves management system.
Element Analytics has raised $19.5 million in series A funding from the VC arms of GE, Honeywell, ABB, Mitsui and others.
Gryphon Oilfield Solutions has received a ‘significant’ investment from Saudi Aramco Energy Ventures and CSL Capital Management. Funds will be used to expand Gryphon’s dissolvable tool portfolio. Cash committed to date exceeds $1.5 billion!
Maana has raised $28M from China International Capital, Accenture and others. Investments to date come to $68 million. At the same time, Accenture has partnered with Maana in a ‘strategic alliance’ targeting oil and gas.
Petroleum Experts has acquired Midland Valley Exploration, developer of the Move structural geology tool.
Kongsberg Digital has taken a 34% stake in NSG Digital. The deal is set to ‘digitalize’ the oil and gas and offshore wind supply chain. A new logistics system, NSG End-to-End has been developed on top of Kongsberg’s Kognifai platform.
AspenTech has acquired RTech’s ‘Cipher’ IIoT cloud-based software and edge connectivity assets.
French geoscience service provider Georex was wound up in mid-2017. Its assets are now held by Groupe CVA.
Tieto has acquired Petrostreamz, whose PipeIt software will be integrated with Tieto’s Energy Components offering in a new integrated hydrocarbon management solution.
Ex-Shell employees Dick Nijen Twilhaar and Willem Peuscher have created the Safety Leaders Foundation to leverage IT in HSE. The pair have developed the ‘iLife-Saving Rules’ game, based on the 12 rules they developed while in Shell.
The V8 release of Phast, DNV GL’s software for process hazard analysis, introduces a new dispersion model to increase the accuracy of predicting the movement of short-duration toxic clouds in a windy environment. Phast 8 also adds support for modelling releases from buried pipelines and more realistic modelling of the true nature of fireballs. The ‘along wind diffusion’ model was co-developed with BP.
Lloyd’s Register has kicked-off a global program to accelerate safety innovation. The LR Safety Accelerator will test innovative digital solutions to critical safety and risk challenges. Technology businesses can apply for funding to trial their digital solutions in an industrial environment. The first theme is safety on-board. Applications open in summer 2018.
The US Chemical Safety Board is investigating a fatal gas well explosion near Quinton, Oklahoma. Lease holder Red Mountain Operating and contractor Patterson-UTI Drilling are collaborating with the CSB on the investigation into the incident which occurred on the Pryor Trust 0718 1H-9 well.
The CSB also recently provided an update on its investigation into the fires which occurred at the Arkema plant in Crosby, Texas, as a result of Hurricane Harvey. Preliminary results of the CSB’s findings are summarized in a video.
The IOGP has released Report 432, ‘Managing HSE in a geophysical contract.’ The report provides a framework of best practices and standards for geophysical operations. A supplement (IOGP Report 432-01) covers the management of subcontractors and temporary workforce in geophysical operations and another (IOGP Report 432-02) covers risk management in geophysical operations.
The World Wide Web Consortium and the Open Data Institute recently carried out a survey of practices and tooling for Web data standardization. The results of the study offer a useful summary of the history of data on the web with its roots going back to Tim Berners-Lee’s original 1989 design pattern for the web and subsequent initiatives to promote sharable, open data such as the semantic web and the linked (open) data movement.
The study enumerates a long list of W3 groups with an ‘interest in web data standards’ but warns that they ‘vary considerably in how active they are.’ This is an understatement in respect of the now defunct Oil & Gas W3 group which peaked at half a dozen individual members before it closed.
The study includes some interesting editorial content. For instance, the W3 considers that the use of ‘complex ontologies’ could be avoided through the use of machine learning algorithms applied to a training corpus. ‘Cognitive architectures’ like John Anderson’s ACT-R ‘have proven themselves in terms of replicating common characteristics of human memory and learning.’
The W3 provides statistics on the number of downloads it has seen for various technical documents. Top of the list are the Semantic sensor network ontology (vocab-ssn) and the Time ontology in OWL (owl-time). JSON-LD is more popular than other formats, reflecting the ‘popularity of JSON amongst web developers, superseding the previous high levels of interest in XML.’
Developers often express negative sentiments about the semantic web due to a ‘them and us’ attitude with regard to their linked data colleagues, compounded by the complexity of OWL ontologies and the esoteric focus of much published work.
The report is something of a mea culpa for the W3C whose ‘Web of Data’ ‘needs greater visibility both within the W3C Team, W3C Members and the public at large.’ There has been a lack of guidance for communities interested in developing standards, ‘new approaches are needed to sustain the level of resources needed.’ The study received financial support from Innovate UK’s Emerging and Enabling Technologies program and the Open Data Institute.
France’s IFP Energies Nouvelles R&D organization has teamed with the prestigious Collège de France (founded in 1530) on a machine learning challenge. The challenge involves predicting residual oil saturation in a porous medium from a 500 sample labeled core dataset provided by IFPen.
Data science students are invited to demonstrate the application of statistical methods that best predict residual oil from the three-dimensional microstructure of a core. Sign up for the Challenge on the Collège de France website and/or watch the video (both in French).
The IFPen 2018 Data Challenge is run by newly-elected professor of data science at the Collège de France, Stéphane Mallat.
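As stated, the challenge is a regression from a 3D microstructure to a scalar saturation. A minimal baseline sketch follows; the synthetic voxel data and the crude porosity/surface features are our own illustrative stand-ins, not the IFPen dataset nor a recommended recipe.

```python
# Minimal sketch of the regression task the challenge poses: predict a scalar
# residual oil saturation from a 3D binary pore-space image.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
cubes = rng.random((500, 32, 32, 32)) < 0.3          # True = pore voxel

def features(vol):
    porosity = vol.mean()
    # crude specific surface: pore/solid transitions along one axis
    surface = np.abs(np.diff(vol.astype(int), axis=0)).mean()
    return [porosity, surface]

X = np.array([features(v) for v in cubes])
y = 0.9 - 1.5 * X[:, 0] + rng.normal(0, 0.02, 500)   # toy Sor label
model = RandomForestRegressor(n_estimators=200, random_state=0)
print("CV R^2:", cross_val_score(model, X, y, cv=5).mean())
```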
Amros Corp. and QuantumPro have signed a technology alliance to address reservoir and production challenges in shale plays.
Amalto and ConsenSys have launched Ondiflo, an Ethereum blockchain-based solution to ‘revolutionize’ ticketing-based processes in the oil and gas industry.
Refiner Saras is to implement the Aspen Mtell big data/ML package to drive reliability across its refinery operations.
Cognitive Geology has secured a $1.2 million contract from Shell to develop its geological mapping software. In another deal, of undisclosed value, Baker Hughes/GE is to bundle the product with its JewelSuite modelling software.
Cegal is to establish a common datacenter, delivered as infrastructure as a service, for Norway’s EPIM data organization.
IFPen’s PumaFlow simulator will be fully integrated into Kappa Engineering’s Workstation by 2021 and will share numerous features with Rubis, Kappa’s full field numerical model.
ExxonMobil has signed a joint development agreement with MagnaBond to develop technology for through-tubing cement evaluation with a single tool, ‘prior to the arrival of a costly rig or workover unit.’
Exxon has also teamed with nine other oil and gas companies to form the Plugging and Abandonment Collaborative Environment, an industry network to promote innovative plugging and abandonment technology. PACE is run by OTM Consulting.
Archer has upgraded to IFS Applications 9 for Offshore Services to enhance organizational transparency and provide secure access to information across the supply chain.
Kwantis and Pakistan Petroleum are jointly evaluating drilling risks and performance. Kwantis has also signed a technical and commercial partnership with Eftech to distribute and support its solutions in South-East Asia.
Miracle Software Systems is now a PIDX member.
PrismTech is now Adlink’s IoT solutions and technology group.
myQuorum Land on Demand has been selected by Dallas-based Arcadia Operating to modernize its land management operations.
SNC-Lavalin is to provide Shell with pre-feasibility and feasibility studies of modular options for its projects. The scopes of work under this agreement will be executed from three ‘centers of excellence’ in Singapore, Houston and Dubai. SNC-Lavalin and Saudi Aramco have also signed a MoU to create and accelerate opportunities for the local workforce. The agreement is a component of the In-Kingdom total value add program to strengthen and diversify the Saudi economy.
Bureau Veritas and GE’s Avitas Systems unit have formed a strategic alliance to bring to market cross-industry, analytics-based inspection. Bureau Veritas will leverage the Avitas cloud-based platform, combining automated data collection and artificial intelligence, and offer continuous industrial risk management for asset owners.
Yokogawa has received an order from Gasco, the state-owned Egyptian Natural Gas Company, for its Enterprise Pipeline Management Solution and Fast/Tools scada software.
BSI/ISO have released a draft Knowledge management systems standard, ISO/DIS 30401.
Dexpi, the P&ID data standards initiative, and Cfihos, the capital facilities information handover spec, have announced future ‘intense cooperation’ on alignment of their respective standards.
The IOGP has published Geomatics Guidance Note 24, a 30 page instruction manual for the use of vertical data in oil and gas applications.
The OMG’s Cloud standards customer council has published V2.0 of its guide to interoperability and portability for cloud computing. The release reflects the new ISO/IEC 19941 CCIP standard and addresses the issues of containers and automation.
PODS, the Pipeline Open Data Standards organization has published a short note to clarify terminology used across its Next Gen, Lite and V7.0 releases and their relationship with the Esri APR model.
PPDM is collecting information about how industry feels about standards. Visit the PPDM survey here.
The EY 20th survey of Global Information Security (2017–18) finds that the oil and gas sector is ‘more worried than ever about the breadth and complexity of the threat landscape.’
Cyber deception specialist Cymmetria’s MazeHunter is legal ‘Hack Back’ technology that counters and contains advanced threats as they happen. The tool is compliant with the US Computer Fraud & Abuse Act.
Leidos has partnered with Nozomi Networks, Claroty and Security Matters to add passive monitoring of scada systems to its Industrial Defender portfolio.
The Canadian Gas Association has joined the Downstream Natural Gas Information Sharing and Analysis Centre. The Centre provides physical and cyber-threat information and monitors industry-affecting events.
The Carnegie Mellon Software Engineering Institute has produced its 2017 Emerging Technology Domains Risk Survey, a 28 page investigation into the security aspects of blockchain, IoT, AI/ML and robotics.
CGI has opened a security operations center in Germany to provide commercial and public sector clients with IT security services.
The EU has established a Computer Emergency Response Team (CERT-EU) to protect against cyber attacks on EU institutions.
The NATO Cooperative Cyber Defence Centre of Excellence has published ‘Frankenstack: toward real-time Red Team feedback.’ Also of note is the CCDCOE’s 2017 Tallinn Manual 2.0 on the international law covering cyber operations.
Siemens has teamed with Tenable to offer utilities and oil and gas companies a new solution for industrial asset discovery and vulnerability management.
Crystal Group’s RCS5516FW Rugged network firewall for harsh environments provides a 1.8 Gbps bandwidth and 250,000 concurrent sessions.
Tanker, provider of encryption and key management as a service, reports that thousands of passwords and security codes in plain text were discovered on Amazon S3 servers, including Accenture’s and its clients’ keys. The incident ‘illustrates the importance of end to end encryption.’
A Honeywell-sponsored survey of industrial cybersecurity by LNS Research found that a lamentable two-thirds of respondents did not monitor for suspicious cyber behavior, and this despite the fact that over half had already been breached.
Schneider Electric has partnered with Cylance on AI-powered protection for industrial control systems.
Oildex has provided useful advice on responding to the Spectre and Meltdown CPU security flaws.
The OPC Foundation and the FieldComm Group have announced a new standard information model for process automation devices. The standard targets ‘multi-vendor interoperability and simplified integration.’ A working group is to leverage the FieldComm Group’s experience with the HART and Foundation Fieldbus’ field device integration (FDI) protocol. The 2007 FDI protocol will be aligned with the OPC UA base information model and companion device information specification.
FieldComm Group President and CEO Ted Masters said, ‘FDI provides the new standard (sic) for device integration to deliver a protocol independent path to configuration, diagnostics and runtime operation for process devices. Our partnership with the OPC Foundation further builds on the common information model of both to deliver process automation data in context.’
Thomas Burke, OPC Foundation president, added, ‘OPC and FieldComm are working on this important initiative, and will be partnering with other organizations, end-users and suppliers to make the dream of a standardized process automation device information model a reality.’
Comment – the announcement sounds strikingly familiar to another process industry standardization initiative, The Open Group/ExxonMobil’s ‘Open Process Automation Forum’ of which OPC is a member!
PSE has signed an agreement that makes its gPROMS process modeling software globally available throughout ExxonMobil. The agreement also provides for co-development of advanced automation and control applications as part of PSE’s gPROMS operational excellence solutions. The deal complements ExxonMobil’s long-standing relationship with PSE for detailed engineering design and simulation tools as well as expert consulting services.
gPROMS is a ‘unified, equation-oriented’ process modeling environment for downstream applications, from complex catalytic reaction and separation to wastewater treatment. Mathematical optimization and system analysis, driven by high-fidelity process models, are said to accelerate innovation, optimize process design and operation and manage technology risk. More from PSE.
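‘Equation-oriented’ means that all of a flowsheet’s balance equations are assembled into one system and solved simultaneously, rather than unit by unit as in sequential-modular simulators. A minimal sketch of the pattern on a toy reactor/separator flowsheet follows, using scipy; this is not gPROMS code, and the equations are illustrative.

```python
# Minimal sketch of the equation-oriented pattern: every balance equation of
# a toy flowsheet goes into one residual vector, solved simultaneously.
import numpy as np
from scipy.optimize import fsolve

F_IN, X_IN, K, V = 100.0, 0.5, 0.8, 10.0   # feed rate, feed fraction, rate const, volume

def residuals(u):
    """Unknowns: reactor outlet flow/composition and separator overhead flow."""
    f_out, x_out, f_top = u
    return [
        F_IN - f_out,                                   # total mass balance
        F_IN * X_IN - f_out * x_out - K * V * x_out,    # component balance w/ reaction
        f_top - 0.9 * f_out * x_out,                    # separator recovers 90%
    ]

solution = fsolve(residuals, x0=[90.0, 0.3, 20.0])
print(dict(zip(["F_out", "x_out", "F_top"], np.round(solution, 3))))
```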
You probably see self-driving vehicles as somewhat futuristic. Not at all: they are here now, working in the oil patch. Oil sands miner Suncor has commenced a phased implementation of autonomous haulage systems (AHS) at its company-operated mines, starting at North Steepbank. Over the next six years, 150 AHS trucks will be deployed. The technology offers many advantages over existing truck and shovel operations, including enhanced safety performance, better operating efficiency and lower operating costs.
Initial deployment is for Caterpillar’s 400 ton capacity 797F, retrofitted for autonomous operations with Cat Command software. A smaller unit, the 793F, used in Australian mining operations for three years, has achieved a 20% productivity advantage over standard trucks. Caterpillar also provides its Command package to other vendors, notably to retrofit Komatsu’s 930E truck.
Suncor COO Mark Little said, ‘Autonomous trucks operate predictably and employ a suite of safety features like prescribed route mapping and obstacle detection systems. AHS decreases incident rates and injury potential.’ Implementation is ‘changing roles and skill sets,’ and Suncor is working with its union to ‘minimize workforce impacts.’
Muscat, Oman-headquartered Bahwan CyberTek (BCT) has rolled out ‘Geodatafy*,’ a database management solution for subsurface data. Geodatafy uses a web GUI and a flexible Lucene/Solr search engine to find structured and unstructured data across the enterprise. Selected data can be viewed or exported to other geoscience and engineering applications. David Barnett, one-time technical architect upstream with BP, is Geodatafy product manager and technical architect.
Barnett explained the rationale behind Geodatafy thus: ‘For years, organizations have been dealing with locked-down proprietary data sources, making it very hard to extract the full potential from their data. Digitalization and transformation programs attempt to get to grips with what is a huge issue for many. A drive to get technical applications and data into the cloud has spawned great thinking and open-mindedness on what was once seen as ‘too difficult’ or ‘too specialized.’ Geodatafy facilitates organizing these disparate data sources and provides a perfect integration platform for all things oil and gas.’ BCT Oil and Gas is a division of BCT’s software, services and consulting business. BCT’s CTO is Clay Harter, formerly with Tibco/OpenSpirit.
* a verbification of geo-data!
Proliferating process control protocols (Exxon/TOG and OPC/Fieldcom, page 11, to name but two) got us thinking about the OSIsoft message format, OMF. A casual glance at an OMF message revealed that units of measure are handled in a cavalier fashion, as free text annotations in a ‘description’ field. In a short email exchange, we asked why, say, OPC-UA was not considered. We also asked how robustly OMF handles metadata like units of measure. A spokesperson told us ‘We do offer an OPC UA connection, if you want to use it, it’s all yours. But there are situations where someone may not want to reformat into OPC just to send data to the PI System. Here you can use OMF as an alternative.’ OSIsoft did not rise to our UoM bait!
On the metadata front, there are good reasons to go with OPC UA since it handles units rigorously with the EUInformation DataType, whose unit codes derive from UNECE Recommendation N° 20. Of course, the availability of a data field does not mean that it will necessarily contain any data! ‘Vanilla’ OPC, by the way, did not handle units very well. We will be continuing our investigation into metadata over the next few months. Your opinions and technical input are welcome – info@oilit.com.
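To illustrate the difference, here are the two approaches side by side, as Python dicts standing in for the JSON payloads. The OMF-style message is our own mock-up of the free-text handling described above; the second mirrors OPC UA’s EUInformation structure.

```python
# Illustrative contrast: free-text units versus structured engineering units.
omf_style = {
    "id": "PT-101",
    "value": 42.7,
    "description": "pressure, bar gauge",   # unit buried in free text
}

opcua_style = {
    "id": "PT-101",
    "value": 42.7,
    "engineeringUnits": {                    # OPC UA EUInformation
        "namespaceUri": "http://www.opcfoundation.org/UA/units/un/cefact",
        "unitId": int.from_bytes(b"BAR", "big"),  # UNECE Rec 20 code, packed
        "displayName": "bar",
        "description": "bar (gauge vs absolute handled separately)",
    },
}
```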
NIST, the US National Institute of Standards and Technology, has released a very informative draft Blockchain Technology Overview (Nistir 8202).
A new MIT study investigates the factors that determine the breakeven points of tight oil production.
Speaking at a recent Credit Suisse gathering, Schlumberger’s Patrick Schorn provided a bit more (but not a whole lot more) information on the company’s ‘cognitive’ E&P environment, Delfi.
Tendeka reports successful field trials of its cloud-connected wireless intelligent completion system.
Total has released a rather lyrical imagining of the Plant of the Future where ‘digital is ubiquitous, including 3D printing, drones, sensors and big data. Yet people remain the heart and soul of the new plant. Multidisciplinary, connected people interact easily with one another and with the outside world. The plant of the future is responsible, efficient and sustainable.’ Check out the PoF video; the PoF is, seemingly, ‘already on the drawing board.’
IPCOS has just published a white paper explaining its philosophy of tuning interacting PID loops, heralding ‘the end of the era of trial and error.’