Oil IT Journal: Volume 21 Number 9


World of Watson

Woodside struts its cognitive stuff at IBM’s Las Vegas spectacular. Watson’s Engagement Advisor and Explorer are changing how Woodside works. ‘Willow,’ an intelligent oilfield avatar unveiled.

Woodside’s Russell Potapinski has given a resounding endorsement to IBM’s Watson as a means of leveraging its engineering document and data resource. Woodside’s cognitive capabilities build on IBM’s Jeopardy-winning artificial intelligence technology. Following implementation of the Watson Engagement Advisor for linguistic analytics, Woodside deployed Watson Explorer to expose its document corpus as a body of accessible knowledge for the enterprise. Explorer extracts insights from decades of engineering and geological knowledge.

Speaking at the 2016 World of Watson event in Las Vegas last month, Potapinski reported on three of the twelve Watson instances that Woodside has deployed. First up was Watson for Projects, where some 33,000 engineering and drilling documents have been ingested. These were analyzed in a human-machine reinforcement learning process conducted by Woodside’s subject matter experts, some of whom have postponed their retirement to train the system.

The system can field high-level questions such as ‘what are the lessons learned from the Vincent phase III drilling campaign?’ In training, the system returns multiple answers which are ranked by the SMEs and used to ‘teach’ Watson.

The Explorer now provides pertinent insights for engineers and is used to onboard new employees. As project teams disband and reform, Watson’s ‘memory’ means that lessons learned really get applied. Incidentally, it would take five years to read all the documents in the projects instance.

Watson for Drilling (W4D) has likewise evaluated all prior art, chiefly the well completion reports. The system now understands the relationship between inflows, kicks, stuck pipe and other drilling upsets. It used to take Woodside’s drillers weeks to go through all relevant reports from its own and partner-operated wells. W4D’s GIS interface lets well planners zoom in on an area of interest to show geohazards immediately and to analyze kick probability ‘in seconds.’

Watson does not replace humans. But while humans are ‘lousy’ at reading thousands of reports, machines are lousy (for now at least!) at figuring out what to do with the information. Putting the two together has changed Woodside’s way of working.

But how do you interact with dozens of Watson instances that blend information from multiple systems? The answer is Potapinski’s pièce de résistance, ‘Willow,’ a bi-directional natural language interface to Woodside’s Watson instances. Willow not only answers questions but displays pages of relevant documentation and even a graph of ‘the evidence,’ tracing the logic and semantics from its response back to the original information sources. Pretty nifty stuff! Watch the video here.


IoT action!

Internet of Things wheeling and dealing as AspenTech buys MTell in $37 million deal while GE adds Bit Stew to Predix data platform.

AspenTech has acquired Mtelligence Corp. (MTell) for $37 million, comprising $32 million in cash plus up to $5.5 million held as security for ‘certain MTell obligations.’ MTell’s ‘predictive and prescriptive’ maintenance technology, Previse, is an ‘end-to-end’ machine learning-based solution that monitors equipment health 24/7. Previse detects early indicators of failure, diagnoses root causes and prescribes corrective action. MTell also develops and markets ‘Reservoir,’ a high-performance, scalable, big data repository for large volumes of time series, event and asset data from multiple sources.

~

In a separate announcement, we learn that GE Digital has acquired Bit Stew Systems, along with its MIx Core solution for complex data integration and analysis across OT and IT systems*. MIx Core automates data integration by applying machine intelligence to the process.

MIx Core will join GE’s Predix platform, extending its capability for integrating ‘data in motion’ from the edge right into the cloud. GE is a long-time backer of Bit Stew, having led a Series B investment round in 2015.

* operations technology, information technology.


In praise of Open Spirit

Neil McNaughton follows up last month’s promise to investigate commerciality and openness with a potted history of interoperability initiatives. COM for Energy, Project Synergy, POSC Business Objects are long forgotten. But Open Spirit is still alive, almost 20 years on. How come?

Industry software development falls along a spectrum from standards bodies and consortia, through in-house and outsourced development, to commercial vendors. There is, and has always been, a yin-and-yang tension between the idea that software, especially software concerned with data access, should somehow be open, and the view that a vendor tool with the best data access in the world is missing something if it lacks an ‘open’ API. Folks blow hot and cold about the different approaches.

Looking back through the first few years of Oil IT Journal (then Petroleum Data Manager) I was reminded how lively the interoperability debate was back in the day, from 1996 to the end of the millennium. The interop issue was lively and complex with many facets and contributions from the two major standards bodies, PPDM and POSC (now Energistics), the two major vendors Landmark and Schlumberger and the oil and gas majors, particularly Chevron and Shell. These majors have, over the years, divested themselves of bits and pieces of intellectual property, either to standards bodies or to consortia, in the hope that they might be ‘taken up.’

The year 1999 was something of a high-water mark for IT at large, with great expectations and investment before the dot com bubble finally burst. The upstream shared in the general enthusiasm and there were several ‘initiatives’ targeting interoperability between upstream data stores. COM for Energy, Project Synergy and the POSC Business Objects are history, albeit interesting history, but OpenSpirit is still standing. Let’s try and see why…

The OpenSpirit Alliance was announced at the December 1997 SEG conference. OSA built on Shell’s in-house Spirit II development, an application-independent software platform that promised ‘plug and play’ upstream application interoperability. Chevron was also a backer of the OSA, bringing its own ‘object integration server’ to the table. Chevron’s Clay Harter was a strong advocate of the technology and the possibility of a move from ‘bloated applications’ to a ‘more modular computing environment with slimmed-down apps talking to data stores through an OSA middleware layer.’ The Shell spin-out PrismTech was named development contractor for the project.

Throughout these early years there was some overlap and confusion between the respective interoperability initiatives. COM for Energy was eventually replaced by the vaporware of Microsoft’s ‘upstream reference architecture,’ now also off the radar. POSC’s business objects and Open Spirit evolved in parallel and were, to a degree, in competition with each other. I say to a degree because in a 1999 interview with the Oil and Gas Journal, CEO Keith Steele described Open Spirit as ‘the first commercial implementation of the POSC business objects standard.’ A careful trawl through Oil IT Journal’s record of events suggests that this was an oversimplified characterization of the relationship between POSC and OSA. Whatever.

With the bursting of the dot com bubble and the general disenchantment with things IT, those holding the purse strings decided that two initiatives was one too many. Thus, in 2000, the OSA was incorporated as Open Spirit Corp. with Shell, Chevron and Schlumberger as stakeholders and Harter as CEO. Open Spirit V2.0, the first commercial release, shipped in the same year. The ‘standard business object’ ideal was downplayed in favor of a more pragmatic approach. Open Spirit was to devote its resources to adding interoperable functionality to the major vendors’ software platforms, rather than realizing the vision of building a stand-alone componentized platform. In an interview with Oil IT Journal, Harter explained, ‘Standard is a bit misleading. OpenSpirit is rather an interoperability solution although we would love it to become the de-facto standard for upstream interoperability.’

The next chapter in the OS saga was writ in 2010 when the company was acquired by Tibco which, a couple of years earlier, had acquired Spotfire. Combining Spotfire’s analytics with data access was a smart move. Tibco has raised Spotfire’s profile in oil and gas into a strong position, as our short summary of presentations made at the 2016 Spotfire Energy conference (page 5) shows. Spotfire, rather like Esri, has attained the enviable position of providing an integration platform that is also an application. Both Spotfire and Esri fill niches in the portfolio (analytics for Spotfire, GIS for Esri) without competing directly with the mainstream E&P vendors. OpenSpirit provides the connectivity from Spotfire (and in some cases from Esri) into the E&P platforms.

As the only game in town, OS has its detractors. Some hanker after the days of an ‘open’ standard and bemoan the fact that OS is owned by a single vendor, Tibco. One criticism that is levelled at OS is that it suffers from the deployment issues that come with any enterprise level software that crosses operating system boundaries and software releases.

There is a feeling that there ought to be something less vendor-dependent, more open and easier to maintain. Such a desire for a ‘true’ business objects-based solution was expressed at this year’s PNEC in the surprise announcement of ‘yet another’ business objects proposal from EnergyIQ.

But stepping back from the fray, it is almost twenty years since the Open Spirit Alliance launched. Attendees at tradeshows, the SEG in particular, will have witnessed Clay Harter’s countless demos of the technology. His indefatigable, hands-on approach to promoting the technology contrasts with many who delegate such activity. Harter has seen the technology through from inception to widespread deployment. Hats off to Clay for this technology’s long-term success.

@neilmcn


Oil IT Journal Interview - Bert Beals, Cray

Cray’s global head of energy talks to Oil IT Journal about machine learning in seismic imaging, Halliburton/Landmark’s iEnergy community initiative, the merits of CPU vs. GPU computing and PGS' in-memory processed Gulf of Mexico Triton mega survey. Cray’s initial support for iEnergy centers on seismic imaging but there are plans to add support for reservoir modeling and interpretation.

Cray stole the show at last month’s Society of Exploration Geophysicists annual meeting in Houston with the unveiling of PGS’ work on machine learning-based seismic imaging (see page 4). Cray is also involved in Landmark’s iEnergy community. Oil IT Journal asked Bert Beals, Cray’s global head of energy, how the initiative was going and what machine learning is bringing to seismic imaging.

Cray has engaged with Landmark for some years, notably with Steve Angelovich, SeisSpace product manager, on seismic processing workflows. Recently we have been collaborating on how to migrate software to current technology, with the resulting announcement of SeisSpace/ProMax being certified to run on our CS400 cluster supercomputer*.

What is your focus with iEnergy?

Many Landmark clients have not upgraded their hardware, especially processors, for a few years. It can be hard for a software company to advise on this. Which is where we at Cray can help, by removing hardware impediments to running the current versions of SeisSpace processing software and getting the full benefit of the latest technology.

What particular technologies are we talking about?

This depends on the client, but it might be a migration to the latest operating system, Red Hat Enterprise Linux 7, and/or the new Omni-Path architecture, Intel’s latest on-processor embedded interconnect.

You mean as an alternative to OpenMP?

No. This is down at the hardware level, more of an alternative to Ethernet. Interconnect has been a longstanding bottleneck in high performance computing. We are also helping clients analyze the potential benefits from new processors. If their current processors are 2-3 years old, what performance hike can they expect by moving to the current generation, say from Intel’s Ivy Bridge to Broadwell?

Migrating a cluster in the current climate seems a bit improbable!

OK, the current downturn means that budgets have been ripped apart. But Cray takes a long term view. I’ve been through half a dozen downturns since the 1970s. The key thing is to innovate your way out of a downturn with smart technology. You can’t cost-cut your way to profitability indefinitely.

We reported on PGS’ Abel supercomputer recently as being CPU-based (as opposed to GPU) for ease of programmability. Is that a fair categorization?

Yes it is. Abel turned seismic processing inside out. Instead of the old Beowulf cluster and the massive/embarrassingly parallel approach which involves significant data movement and reorganization, PGS has refactored its code to load everything into memory and process data in situ. The only problem was that the huge Gulf of Mexico Triton survey needed 600 terabytes of memory, which is why PGS came to Cray.

And what is new with Galois?

Galois adds more capability, allowing for different workflows from a separate system image that shares storage with Abel. It too is a CPU machine.

This is all very different to Cray’s GPU-based machines we hear so much about!

It depends on which press release you read! Yes, we do have a lot of GPU-based machines, but we also do a lot of CPU-based systems. We are also one of the largest Knights Landing shops. KNL is a self-hosted multi-core processor.

The one Intel has been talking about for years as an Nvidia killer**?

No comment.

All of this is for on-premises deployment. Elsewhere iEnergy seems to have quite a cloud focus?

We do not have a cloud focus currently. iEnergy runs Landmark’s application suites on whatever current technology you choose.

Does your work encompass interpretation? DecisionSpace as well as SeisSpace?

All this is in our game plan. First SeisSpace, then the Nexus reservoir simulator and DecisionSpace interpretation. We are also working on remote visualization which we demoed at SEG along with PGS. Here we use Nice Software’s*** DCV high-end remote graphics.

Remote graphics! The philosopher’s stone of upstream IT from 20 years back!

Yes! Actually, I used to be with Sun Microsystems. What has changed is that 20 years ago, the internet was not up to it. Now remoting the desktop can be done. Another thing, GPUs are getting a lot of traction in deep learning. Our new XC50 supercomputer features the latest Nvidia Pascal P100 accelerators. Also, big GPU-based machines are not just for computing; you can do really high-end visualization of models as they progress, in true interactive simulation workflows.

~

* This month Cray joined the Landmark iEnergy community. iEnergy members can now run Landmark’s SeisSpace seismic processing software on a Cray CS400 cluster supercomputer. Cray claims ‘substantial improvements’ in run-time performance over other clustered infrastructures. The CS400 can scale to over 27,000 compute nodes and 46 peak petaflops.

** The earlier ‘Knights Corner’ edition was reported as an Nvidia Tesla killer back in October 2011. Both are still doing fine!

*** Nice was recently acquired by Amazon Web Services.


BGS’ semantic geo-hackathon

British Geological Survey team investigates semantic pathways into multiple geological databases.

The British Geological Survey (BGS) recently organized an internal hackathon to investigate ‘semantic pathways’ that would make BGS resources more accessible. BGS’ Rachel Heaven led ‘team semantic search’ that set out to implement a ‘semantically and spatially intelligent search service’ and to ‘cut through the tangle of complex geoscience terminology and improve navigation between information resources.’

BGS’ textual information includes observations and interpretations that accompany the hardcopy 2D geological map. Current digital technology struggles to search across such data sets, which may also be ‘divorced’ from the documentary evidence they are based on because of a lack of ‘provenance’ information.

Vanilla search simply matches a sequence of letters. Semantic search adds domain-specific terminology and an ‘ontology’ of relationships between terms and concepts. Here BGS leverages its collaboration with the W3C’s Data Activity initiative. In fact BGS has been working to codify geoscience terminology since before the semantic web, notably via the IUGS-CGI Geoscience Terminology Working Group.

The hackathon team used ElasticSearch to index its textual material and then used PL/SQL to create a gazetteer and query expansion tool. The BGS’ chrono-stratigraphic ontology was thrown into the blender along with scripts to retrieve terms from web pages. The team demonstrated text-matched results from the indexed document collection and also manufactured a virtual cross section from disparate BGS resources including the Vale of York 3D model. Read Heaven’s blog here.
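To make the query expansion idea concrete, here is a minimal Python sketch of ontology-driven search over an Elasticsearch index. The index name, field names and the expansion table are illustrative assumptions, not BGS’ actual implementation (which used PL/SQL for the expansion step).

    # Minimal sketch of ontology-driven query expansion over an Elasticsearch index.
    # Index name, field names and the expansion table are hypothetical.
    from elasticsearch import Elasticsearch

    # Toy fragment of a chrono-stratigraphic vocabulary: each term maps to
    # narrower or synonymous terms that should also match.
    EXPANSIONS = {
        "jurassic": ["lias", "oxfordian", "kimmeridgian"],
        "carboniferous": ["westphalian", "namurian", "dinantian"],
    }

    def expand(term):
        """Return the search term plus any ontology-derived equivalents."""
        return [term] + EXPANSIONS.get(term.lower(), [])

    def semantic_search(es, index, term, size=10):
        """Match documents containing the term or any of its expansions."""
        query = {"bool": {"should": [{"match": {"text": t}} for t in expand(term)]}}
        return es.search(index=index, body={"query": query, "size": size})

    if __name__ == "__main__":
        es = Elasticsearch("http://localhost:9200")
        results = semantic_search(es, "bgs-reports", "Jurassic")
        for hit in results["hits"]["hits"]:
            print(hit["_id"], hit["_score"])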


Petroleum Geo-Services - machine learning for FWI

PGS’s Abel supercomputer finds ‘hidden and unexpected insights’ in seismic data.

A blog post by Geert Wenes, senior practice leader at Cray, lifts the lid on PGS’ use of machine learning in full waveform inversion (FWI) for seismic imaging. PGS’ Abel supercomputer has been used to ‘find hidden and unexpected insights in complex data’ with minimal programming effort. The test was run on the 2004 SEG/BP velocity estimation benchmark model. Previous approaches take the initial velocity model and use ‘some sort of a least squares fit’ to seek convergence to a ‘true’ velocity picture, with ‘some level of success.’

The novel PGS/Cray approach uses ‘constrained minimization of a regularized and steered misfit function’ a.k.a. machine learning, to provide what is claimed to ‘dramatically improve’ the quality of the resulting model. The supercomputer ‘learns’ the velocities for a substantially clearer final image without the artefacts of conventional FWI.
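Neither the blog nor PGS gives the exact formulation, but a generic regularized FWI objective of the kind described above can be written (in LaTeX notation) as

    J(\mathbf{m}) = \tfrac{1}{2}\,\lVert \mathbf{d}_{\mathrm{obs}} - F(\mathbf{m}) \rVert_2^2 + \lambda\, R(\mathbf{m})

where \mathbf{m} is the velocity model, F the forward-modeling (wave propagation) operator, \mathbf{d}_{\mathrm{obs}} the recorded data, R a regularization or ‘steering’ term and \lambda its weight. The model is updated iteratively, e.g. \mathbf{m}_{k+1} = \mathbf{m}_k - \alpha_k \nabla J(\mathbf{m}_k), subject to constraints on \mathbf{m}; the choice of R, the constraints and the step sizes are where the ‘learned’ behavior claimed above would enter.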

For more on machine learning in seismic processing, check out the Sinbad consortium at the University of British Columbia. Sinbad (seismic imaging by next-generation basis function decomposition) is working to adapt recent developments from compressive sensing and machine learning to seismic imaging. Sinbad members include Chevron, Conoco, Schlumberger and … PGS.


FuseIM readies for geophysical micro-services

FuseIM reports win for master data repository, envisages move to Docker-style microservices.

Following its acquisition by Meera/Target, FuseIM has won its first major contract for its PPDM 3.9-based master data store (MDS). Oil IT Journal caught up with FuseIM CTO Jamie Cruise at ECIM last month. Cruise began by showing a tablet-based GUI for MDS with the backend running in the AWS cloud. The idea is to break the dependency between application software and the data platform.

Fuse is now looking to further leverage the cloud with an Amazon Web Services S3 tape interface for SEG-Y data manipulation. This is in anticipation of a new Docker-style microservices architecture. Cruise opined, ‘the old guard is wedded to classic IT but today’s cost pressures are making the industry look at these new solutions.’

The microservices architecture will be supported with GraphQL* and .NET-based APIs for connectivity with standard industry packages such as Petrel. JSON-style data access makes for efficient query of data stored in Oracle or SQL Server instances and for bulk data retrieval from the S3 cloud. More from FuseIM.
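For readers unfamiliar with GraphQL, the sketch below shows what querying such an API from Python might look like. The endpoint URL, authentication scheme and well field names are hypothetical illustrations, not FuseIM’s published schema.

    # Illustrative only: posting a GraphQL query from Python with the requests library.
    # The endpoint and the wells/uwi/name/spudDate fields are hypothetical.
    import requests

    QUERY = """
    query WellsByCountry($country: String!) {
      wells(country: $country) {
        uwi
        name
        spudDate
      }
    }
    """

    def fetch_wells(endpoint, token, country):
        """POST a GraphQL query and return the decoded JSON result."""
        resp = requests.post(
            endpoint,
            json={"query": QUERY, "variables": {"country": country}},
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["data"]["wells"]

    # Example call against a hypothetical endpoint:
    # wells = fetch_wells("https://mds.example.com/graphql", "API_TOKEN", "NO")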

* As used by Facebook.


ROV-mounted Lidar survey used to 3D print hot tap connector

Fugro and 3D at Depth team on digital manufacturing for Australian well abandonment program.

Fugro and 3D at Depth report a ‘world’s first,’ the manufacture of a subsea well part using Lidar data and 3D printing. During an Australian well abandonment project, it emerged that accurate data on the wellheads was no longer available. Fugro called on 3D at Depth to perform a subsea laser scan of the wells, in 110 meters of water. 3D@D’s subsea LiDAR SL 2 acquisition technology was deployed aboard two of Fugro’s FCV work class remotely operated vehicles (ROVs). Physical measurements were also collected by the ROVs using V-gauges and rulers and this data was cross-referenced with the 44-million-point cloud data set.

The data was then used to 3D print a full scale model of a damaged well part to help design a ‘hot tap’ connector in ABS thermoplastic. Other parts were made on a CNC machine tool in acetal and high-molecular-weight polyethylene. 3D@D MD Adam Lowry said, ‘Wide area point clouds avoid costly surprises which may only be discovered when the intervention vessel is on location.’ More from 3D at Depth.


Tibco/OpenSpirit/Spotfire Energy Conference

Hess - from data lake to self service analytics. Nexen - Hadoop, R and Spotfire align unconventional reporting with SPEE Monograph 3. Spotfire ‘Insight’ big data analysis platform unveiled. OpenSpirit and Voyager team on subsurface/GIS data mashup. LSG delivers GIS data widgets to BP. OpenSpirit to see further Esri ‘widgetization.' OpenSpirit central to ConocoPhillips’ data framework.

Today, big data and analytics (BDA) are frequently presented as the ‘next big thing.’ At the Tibco/Spotfire energy forum, BDA is already core business, with a good measure of high-end, Excel-killing analytics and visualization, and domain-specific connectivity provided by OpenSpirit.

Hess is something of a poster child for Spotfire as David McConkey showed, with application across the enterprise, from engineering to finance. The last year has seen ‘tremendous’ growth of self-service reporting, bringing value, insight and an impetus for changing behavior. Along with the benefits, growth has created challenges in report, people, and environment management.

One use is in exception-based production surveillance with an HTML5/JavaScript data mash-up of live plant schematics, production profiles and statistics. Hess has deployed a data lake along with an enterprise data model and is standardizing its usage patterns to encourage best practices. Leankit’s Kanban board also ran. McConkey observed that today’s engineers are more savvy and demanding of IT.

Nexen’s Tobi Adefidipe showed how its data lake (Hadoop, R and Spotfire) is used to align unconventional reserves reporting with SPEE Monograph 3. This presents a workflow for proven undeveloped reserves (PUD) that involves statistical and spatial analysis. The process used to take many hours across multiple software applications. It has now been streamlined into a single, interactive Spotfire dashboard with statistical analysis running in Tibco enterprise runtime for R (Terr). A machine learning genetic algorithm with 25 variables has been used to estimate reserves for some 14,000 Eagle Ford wells.

Kelby Reding (Chevron) showed how Spotfire’s automation services can be used to distribute visualization workflows to users and provide a consistent look and feel across the organization.

Gwyn Thorn (Shell), with help from Dan King (WG Consulting), has developed a finely tunable access control system for Shell’s Spotfire setup using IronPython, JavaScript, HTML and Spotfire controls.

Tibco’s Michael O’Connell and Catalina Herrera demoed new Spotfire functionality for data discovery, graphics, maps and predictive analytics, notably with connectors for SAP Hana and Spark certification. Collaboration is supported with threaded, searchable ‘conversations,’ annotations and guided analytics. Mobile use has been enhanced and the API updated for simple web deployment. The Tibco Insight BDA platform* combines data integration, analytics, collaboration, and complex event processing into a ‘closed-loop system of insight.’ Insight comprises Spotfire and Tibco StreamBase. Again, OpenSpirit provides connectivity with upstream data sources while Tibbr adds collaboration.

Tibco’s Clay Harter teamed with Sam Smith (VoyagerGIS) to demo search across a mash-up of subsurface and GIS data. The approach is to crawl and index internal and external data sources and expose them via various Apache Solr servers including ‘Toes,’ Tibco OpenSpirit enterprise search, and OpenSpirit Connect web services. Voyager exposes vendor data sets from Woodmac, IHS and Tellus inter alia. In a follow-up presentation, Harter demonstrated access to Tellus and CGG data sets retrieved via OpenSpirit. Harter also demoed rule-based tools for matching, blending and data QA to ease the data manager’s task and ensure consistency by automatically propagating validated data across systems. OpenSpirit also offers several device-independent HTML5-based subsurface viewers developed in conjunction with INT and built into a ‘customizable platform for viewing and analyzing E&P data.’

Todd Buehlman (Logic Solutions Group, LSG) has leveraged Esri’s ArcGIS Portal technology to let users construct custom web map applications using widgets without programming. Oil and gas clients like BP can download widgets from GitHub and customize them. LSG has used the Esri WebApp Builder along with OpenSpirit’s web services to build widgets for selecting and manipulating seismic lines and surveys and locating child objects. Watch the OpenSpirit widgets video here. LSG and Tibco’s plans include the further ‘widgetization’ of OpenSpirit functionality within the Esri framework.

Shah Zaman (ConocoPhillips) showed how OpenSpirit copy manager and rules manager have saved ‘dozens of person-hours per week’ for geologists and geology techs. ConocoPhillips uses OpenSpirit to integrate weekly IHS Enerdeq updates with Landmark OpenWorks, IHS Petra, and Schlumberger Petrel Studio applications. The same technology also pushes interpretation data back to the well master database daily. More from the Tibco Energy Community.

* Tibco’s own ‘next big thing,’ the ‘Tibco Insight Platform (TIP)’ is described as a ‘digital nervous system’ for real-time data management, visualization and analysis. TIP comprises Spotfire, Stream Insights and Jaspersoft, Tibco’s open source business intelligence suite. Also of note in Tibco’s burgeoning product line-up is the ‘Fast data platform.’ FDP is Tibco’s solution for internet of things initiatives and its support for big data in the cloud. Download the free eBook for more.


2016 ECIM Data Management conference, Haugesund

Teradata on internet of things. NPD on delivering data in the downturn. Sirius students ‘skeptical’ of industry. ConocoPhillips ‘cuts down the hedge’ between data managers and business. Statoil’s ‘Gold Finder.’ Diskos gets Whereoil API. Cegal/Iron Mountain’s big data cloud solution. Schlumberger’s Studio World Map for Diskos. TNO on North Sea Data Management Forum. ILAP, more ...

The 2016 ECIM data management conference held up well in the face of the severe industry downturn, with 220 attendees and some 70 presentations. In his introductory keynote, Teradata’s Niall O’Doherty asked whether the ‘internet of things’ was something that matters or just a ‘big eye roll.’ IT is terrible at naming things, something that leads to confusion for users and especially for management. For some, marketing literature has been revisited with a search and replace of ‘big data’ for ‘IoT.’ Sensors have been around for a very long time but their data has not been very accessible. For the IoT to be transformational we need to look at systems as a whole, both IT and operations technology, although IT/OT cultural differences make this hard. IoT success in other industries comes more from good data curation than from fancy new math. One Teradata poster child is Siemens, whose ‘Sinalytics’ IoT initiative was the keynote feature of the 2016 Teradata Universe. The buy not build issue is also germane to the IoT. In other words, as O’Doherty asked, ‘are you an oil company or a data integration company?’

Bente Nyland, head of the Norwegian Petroleum Directorate, noted that cost cutting means that many projects and new technology developments have been halted, although she wondered if all the cost cutting was really necessary. It is now particularly important to manage business information through the hiatus in activity. Norway has an unparalleled record in the fifty years since its first licensing round. Data management is a cornerstone of the Norwegian model, which promotes competition on use of, rather than access to, data. The NPD’s data deliverables are the Fact Pages, Fact Maps and the data delivered via the Diskos consortium. NPD also produces regular Resources Reports covering major undiscovered oil and gas resources.

The falling oil price has hit Norway hard. Recently there has been a slight recovery, leading to a hope for stable prices ‘a little higher than today.’ The sudden shift from growth to cost cutting has meant that projects, people and whole companies disappear in acquisitions and mergers. This means a loss of expertise, stiff competition for funds and more short-termism. Data re-use is likely to be important in the future and we need to avoid losing focus. Today we take the availability of reliable data for granted.

The NPD has laid the ground rules to encourage good data management. The authorities have a legal right to ask for data in a usable form and to mandate long term reporting and data retention. Despite ‘how to’ guidelines, the NPD is concerned with under-reporting, reporting errors, data loss and degradation, and declining competency. We need to prioritize the retention of old data to be in a position to reap its future value. So, ‘clean up your data stores and work for the long term.’ Data management underpins long term value creation and the NPD is maintaining its focus on ‘simple, smart solutions’ based on open standards under the Diskos umbrella.

David Cameron of the Sirius SSI at the University of Oslo observed that in 2020, the oil price could be anything between $20 and $100. The world may even ‘turn away from oil.’ But in all events, we need to retain expertise in oil and gas, ‘even if it is just for maintenance, safety and decommissioning!’ Unfortunately, oil and gas has shot itself in the foot, again! In 2013, there were 420 applicants for the petroleum technology course at Norway’s NTNU; applications have since dropped to 31. The University of Stavanger has only 12 applicants for 25 places, reflecting a ‘profound skepticism to oil, gas and heavy industry.’

Universities have a hard time keeping two masters happy. On the one hand, academia measures researchers’ output on the volume of ‘papers with fancy set algebra’; on the other hand, industry sponsors often doubt the business value of such efforts. Sirius is the ‘center for scalable data access in the oil and gas domain.’ Collaborative R&D initiatives include the Trondheim Integrated Operations Center and DrillWell in Stavanger. Today much time is wasted finding and accessing data. The answer lies in part in cloud computing, ‘where oil and gas computing is going.’ Cameron ended with a reprise of his Optique presentation at Intelligent Energy, demonstrating natural language access to various commercial and public data sources for ‘quick and easy’ ad-hoc requests. Components include ABS, a modeling language for distributed software systems, and the EU Envisage/HATS project, ‘a potential game changer for the cloud and a deliberate EU attempt to counter Google and to provide users with the tools to keep Google honest.’ One Sirius finding of note is that computers find ‘negation detection’ in natural language hard, let alone the double negative, ‘when no means yes!’

Kristine Karoliussen (ConocoPhillips) has ‘jumped over the hedge’ from the geoscience business into the data management department. Data managers are expected to provide easy access to data, but the reality is more complex. Data management does not ‘do itself.’ Changes in corporate tools (a move to OpenWorks 5000, the introduction of Studio) may overlook prior data management art. It’s all very well to say ‘just load the well data’ but this implies locating surveys, metadata and context. The business also tends to overlook reporting, data QC, maintaining the ‘gold’ databases and keeping OpenSpirit up and running during critical well operations. ConocoPhillips has put together a data management advisory team to handle such issues. Procedures and standards are complex. It would be great to have a common standard, a Google-like portal for data access shared by operators and contractors. Times are challenging but this may be an opportune moment to rethink how we do things and to ‘cut down the hedge,’ providing geoscientists with more insight and interaction with data managers.

A presentation from Frode Jansen Lande on Statoil’s ‘Gold Finder’ data discovery system demonstrated that effective solutions can be developed in-house with minimal (400 hours) effort. Statoil, like most large companies, has data scattered across many systems and information silos. Gold Finder mimics a file browser with single sign-on and provides a user-configurable results tree view. Master data means that the retrieved results are ‘richer than any single system.’ The web app leverages a single flat database table and runs on Oracle and AngularJS. The system was well received and went straight into production. Users of, say, Recall are even finding stuff in their own systems! Gold Finder has revealed data busts, strange names and duplicates. Statoil is now working to improve connectivity with document systems and to add Solr/Lucene for full text search and indexing. The ensuing discussion revolved around the relative merits of buy vs. build, with a representative of the Norwegian vendor community pointing out that this type of functionality was already available in the marketplace, with map-based search too. Statoil is nonetheless well pleased with Gold Finder, which has ‘succeeded where many earlier portal projects have failed.’

Eric Toogood and Elin Aabo Lorentzen of the Norwegian Petroleum Directorate gave an update on Norway’s Diskos national data bank. Diskos technology is provided by CGG, Kadme and Evry. The low oil price means that Diskos is losing members to mergers, acquisitions and leavers, which makes for rising costs for those that remain. The industry is also losing skilled data managers. Legacy data remains a problem. Much data is on now-obsolete 9-track tapes in storage. Remastering these is a complex process requiring investment and specialist knowledge. Public data in Diskos has been a success and NPD is working with CGG to make public data from relinquished areas available at release date. A programming interface (API) for Diskos is finally seeing the light of day, based on the Kadme Whereoil REST API. The ‘teething troubles’ with the Diskos seismic module have now been fixed and a lot of the data has been loaded. The trade module is proving more challenging. Production data is working well thanks to the Epim Mprml machine-to-machine reporting with its Schematron-based data validation. Currently data is still submitted on physical media. The plan is to move to online data load, possibly leveraging an open source technology stack from Netflix. Currently SEG-D 3.1 is mandatory for reporting. A new SEG-U ‘unknown’ format was mooted, ‘something that blends SEG-D with SEG-Y into a common shared format for acquisition and processing.’

A joint presentation from Stein Sigbjornsen (ConocoPhillips) and Arve Osmundsen (Cegal) introduced a ‘big data’ seismic archive solution built on the new Iron Mountain Cloud. The new tool features a catalog and map interface with editing functions and plugins for Petrel and ArcGIS. Data is stored on EMC Isilon with replication across dual sites. Data can be cropped to an area of interest for workstation loading. In the Q&A, Sigbjornsen revealed that after migrating to the archive, the original tapes were destroyed, ‘a world first’ according to session chair David Holmes (EMC).

Odd Inge Thorkildsen unveiled Schlumberger’s Studio WorldMap for Diskos. Diskos holds some 322TB of public data that is not necessarily ‘application ready.’ The new offering builds on Schlumberger’s Petrel Studio/WorldMap products, now plumbed into Diskos for transparent access to data in the repository or in-house. Search now spans all public NPD data and data hosted by Schlumberger. SWM for Diskos is refreshed with monthly scripts that index and spatially enable Diskos data into the Schlumberger cloud in Stavanger.

Statoil’s Robert Skaar asked why oils accept paying twice for data, once to acquire it, and again to extract it from proprietary software for re-use. Skaar heads the Integrated lifecycle asset planning (Ilap) steering committee. Ilap, a.k.a. ISO 15926 Part 13, sets out to standardize scheduling data and terminology for megaprojects. The idea is to avoid having to re-key data from Primavera into SAP or between engineering contractors’ software tools. So far, some 800 terms have been standardized and packaged into an XML data transfer standard. Adapters allow users to leverage in-house software tools, which are not changed. The idea is to offer users asset tracking functionality along the lines of Amazon’s ‘where’s my stuff.’ Ilap has support from ConocoPhillips, ENI, Statoil, EPIM and PoscCaesar. An Ilap draft international standard has been submitted to the IOGP ISSC.

TNO’s Stephan Gruijters provided an update on the North Sea Data Management Forum, a low-key international collaboration that is working to ‘make reporting companies’ lives easier’ by providing common reporting standards across all circum-North Sea nations. TNO has surveyed North Sea regulators and found a desire for cooperation but also many subtle differences in data definitions. The participating regulators are interested in building a business case and possibly moving to a more formal arrangement. Gruijters wound up with a demo of the ‘North Sea Data Portal’ combining web services from four countries. This currently leverages Esri technology. In the Q&A, Gruijters acknowledged that an open data/Inspire-based approach would be possible in the future.

Pernille Hammernes and Darren Kearney (Statoil) presented on the new digital reality in our industry. Statoil’s big data and automation effort has been underway for some years. Lately ‘digitalization’ has landed on the top table and has sparked a year-long ‘sprint’ for ownership of the analytics function. Statoil is evolving local initiatives into corporate opportunities for generating competitive advantage. This is achieved by bringing all stakeholders and discipline advisors together in a global calibration effort. The aim is to ‘use all the data.’

Petrel Studio and Documentum were highlighted as key components of Statoil’s digitalization effort. So far, some fifty improvement initiatives have been proposed and are being costed and prioritized. The initiative is driven from an ‘analytical factory.’ Here an innovation lab finds out what’s new and promising and tests it on business use cases. If a test works, the new solution is productized and handed over to IT for deployment. Statoil is also working to plug the data science/analytics skills gap. Digitalization represents the golden age of IT and data management and an opportunity to understand new technology, take more bets and build a digital competency in phase with the business.

One foundational project involves making data available for analytics by aggregating and consolidating diverse data sources. Statoil adopts a ‘cloud first’ approach. NoSQL and data virtualization technology herald a move ‘from silo to the enterprise,’ leveraging Statoil’s long established data management organizations in both business and IT. Current initiatives in the IoT, big data, automation and robotics feed into the ‘stepping up’ phase before ‘transforming industry.’

Shell’s Lars Gaseby wound up proceedings with a couple of quotes from industry thought leaders. Michele Goetze (Forrester): the ‘demand for data and expectations are high but what the business wants is ambiguous.’ Jelani Harper (Dataversity): ‘trends in data management emerge rapidly but mature slowly.’ Gaseby added that some changes may be almost invisible; cloud computing is used in Shell without being seen. Mark your diaries for next year’s ECIM on 11-13 September 2017 in Haugesund.


Folks, facts, orgs ...

API, Atwell, BP, GGR International, ClearEdge, ClearStream Energy Services, Deloitte, EQT, GE Digital, PG&E, Hexagon, Intsok, IOGCC, Mitsubishi, MOL Group, Altair, AGA, Marathon, Tendeka, P2ES, Piper Jaffray, Viking Venture, Navigant, Petrofac, SCA, Seanic, SHIP, Statoil.

Marty Durbin has been promoted to executive VP and chief strategy officer of the American Petroleum Institute. Louis Finkel is leaving API.

Jim Curry is project manager at Atwell.

Nils Andersen has joined BP’s Audit Committee and board of directors as non-executive director.

Chris Cottam has been appointed VP of Sales at GGR International.

Kelly Cone is now Verity product manager at ClearEdge. She hails from Beck Group’s VDC unit.

ClearStream Energy Services has appointed John Cooper as CEO and member of the board. Gary Summach is the new CFO.

Deloitte’s John England is to lead the energy and resources industry practice in the US. Scott Smith heads-up the power and utilities sector.

EQT president, Steve Schlotterbeck will succeed David Porges as CEO. Porges becomes chairman of the board.

Patrick Franklin is VP and general manager, applications at GE Digital. He hails from Google. Steve Martin is VP and chief digital officer at GE Energy Connections. He hails from Microsoft’s Azure unit.

PG&E’s president, gas, Nick Stavropoulos is now a member of the Gas Technology Institute board.

Melker Schörling is to resign as Hexagon chairman next year due to health issues.

As of January 1st, 2017, Intsok will change its name to Norwegian Energy Partners.

Asa Hutchinson is now chairman of the Interstate Oil & Gas Conservation Commission. Michael Teague joins IOGCC as vice chairman and Matt Lepore as second vice chairman.

Kiyoshi Okazoe is the new president of Mitsubishi Heavy Industries America, succeeding Kenji Ando who assumes the position of president and CEO, Energy & Environment.

MOL Group has appointed Berislav Gaso as upstream executive VP and member of its executive board.

Trace Harris and Oren Michels have joined the Altair board.

The American Gas Association has elected Pierce Norton as chairman of the board for 2017. Norton is president and CEO at ONE Gas.

Saudi Aramco retiree Abdulaziz Alkhayyal is now a member of the Marathon Petroleum board.

Peter Soroka heads-up Tendeka’s new production unit. He hails from OMV UK.

Ray Hood has been appointed as President and CEO at P2 Energy Solutions.

Kashy Harrison is now Piper Jaffray’s senior analyst in the energy space. He hails from Deloitte & Touche.

Magnus Willumsen has joined Viking Venture as investment associate.

Navigant has appointed Mehrdod Mohseni as MD, John Donleavy as senior executive advisor, Trina Horner and Stephen Pearce as directors.

Alaistair Cochran succeeds Tim Weller as Petrofac CFO and executive director. Jane Sadowsky joins the Board as nonexecutive director following Kathleen Hogenson’s resignation.

Susan Howes is now VP engineering at Subsurface Consultants & Associates.

Ray Maza has joined Seanic’s business development team.

‘Ship,’ the EU shale gas information platform has ‘docked,’ transforming itself into an inactive shale gas information archive.

Jannicke Nilsson is now executive VP and COO at Statoil, succeeding Anders Opedal who becomes Statoil’s Brazilian country manager.

Deaths

Subsurface Consultants & Associates has announced the death of Joe Brewton, mentor, instructor and ‘extremely knowledgeable geoscientist.’


Done deals

Emerson acquires Permasense. Lloyd’s Register has acquired Rtamo and is to hook-up with Greenfence. Optime Subsea Services to merge with Telemark Technologies. Drillinginfo buys Globalview and Ponderosa Energy’s information assets. Circor acquires Critical Flow Solutions.

Emerson is to acquire UK-based Permasense, a provider of non-intrusive corrosion monitoring technologies for the oil and gas and other verticals. Permasense will join Emerson’s Rosemount portfolio and will complement the Roxar sand management system.

Lloyd’s Register has acquired Rtamo Ltd. (real time adaptive maintenance optimization), an Aberdeen-based software consultancy. Rtamo clients include Maersk, BG Group and Shell. Earlier this year, LR announced a collaboration with Silicon Valley-based Greenfence, a testing, inspection and certification specialist.

Norway’s Optime Subsea Services is to merge with engineer Telemark Technologies. Concomitant with the merger, Norwegian VC Holta Invest will take a ‘significant share’ in Optime. The merged company will be located at the Telemark Technology Park at Notodden in Norway’s ‘Subsea Valley’ industry cluster.

Drillinginfo is in acquisition mode and has just bought Globalview along with Ponderosa Energy’s product lines and services. Ponderosa Energy’s databases and analytical tools (The Fundamental Edge and ProdCast) will join the Drillinginfo portfolio, as will GlobalView’s risk and data management solutions.

A Sun Capital Partners affiliate has sold Critical Flow Solutions to Circor International for some $210 million. Critical Flow incorporated three Curtiss-Wright units, DeltaValve, TapcoEnpro and Groquip.


2016 Pipeline Week Conference, Houston

Good turnout for Pennwell/PODS/GITA-backed event. Chevron on compliance-driven maintenance and inspection. Geomorphic on remote control boat inspection. Rosen on geospatial simulation. Summit/BSD on 'mega gas’ rule. Infosys on process maturity. BSD on the ’summer of incidents.'

The upstream may be struggling through the downturn but, judging from the 700 plus attendees and forty or so presentations made at last month’s Pennwell/PODS/GITA* Pipeline Week conference, the midstream segment is keeping busy. What’s hot in Pipeline? Drone-based mapping and surveillance and compliance.

Chevron’s Paul Herrmann presented on ‘compliance-driven pipeline maintenance and inspection data collection.’ Chevron needed an end-to-end field data collection process to meet regulatory and operational requirements.

The solution manages pipeline repair approvals, in-field workflows and data capture to the PODS system of record. This supports ongoing operations, asset integrity and regulatory compliance activities. New Century provided the PODS database and API, and CartoPac the field data workflow management and data validation solution.

Jeff Barry (Geomorphic Solutions) demonstrated the use of unmanned vehicles (both aerial drones and remote control vessels) in pipeline integrity management. Remote control boats equipped with GPS, depth and sonar instrumentation are cheaper and safer than manned boats and divers.

A typical use case is to evaluate the flood risk at a water crossing before deciding on a costly replacement or potential shut-in. In addition, a buried hydrophone array can be deployed to monitor real-time changes in pipeline acoustics during a flood event. Geomorphic’s technology has been tested at the Idaho Center for EcoHydraulics Research.

Otto Huisman showed how Rosen’s geospatial simulation is used in ‘consequence assessment,’ i.e. what happens if a pipeline springs a leak. Rosen’s approach builds on the ISO 31000 standard for risk management.

Model functionality includes high consequence area (HCA) mapping of spills and gas dispersion and explosion risk. The latter should incorporate real time weather information so that potential hazard zones take account of wind speed and direction. Risk is presented in summary ‘linear risk integral’ plots.

Although some speak of ‘burdensome’ regulations, for Matthew Stratmann (Summit Midstream) and Nichole Killingsworth (BSD Consulting), ‘Compliance is king!’ And the key to compliance is streamlining data from multiple sources into PODS Spatial. The US Phmsa regulator is proposing a new ‘mega gas rule’ that will encompass testing, corrosion inspection, HCA and integrity verification in a ‘traceable, verifiable and complete’ process. A similar new rule is proposed for liquid pipelines. Summit is migrating its legacy data into PODS in its safety and compliance push.

Dipayan Mitra (Infosys) observed that Phmsa reporting requirements have rocketed in the last decade, and that this has contributed to a gradual decrease in serious incidents over the last two decades. Infosys’ analysis of Phmsa orders to operators shows that the greatest risk comes from outside the pipe and lies in ‘process maturity’ and records management. Infosys proposes a pipeline integrity business risk metric along with a conceptual architecture for risk management. Infosys has a patent (US 8510147) describing a ‘method and system for calculating pipeline integrity business risk score for a pipeline network… The method includes a step of first calculating a structural risk score, an operational risk score and a commercial risk score for each pipeline segment in a pipeline network.’
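As a rough illustration of how such per-segment sub-scores might roll up into a single metric, here is a toy Python sketch. The weights, scales and field names are invented for illustration and do not reproduce Infosys’ patented method.

    # Toy illustration of a per-segment composite risk metric built from three
    # sub-scores. Weights, scales and field names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Segment:
        segment_id: str
        structural: float   # e.g. 0-100, from corrosion and inline inspection data
        operational: float  # e.g. 0-100, from pressure cycling and leak history
        commercial: float   # e.g. 0-100, from throughput and HCA exposure

    def business_risk_score(seg, w_struct=0.5, w_oper=0.3, w_comm=0.2):
        """Weighted combination of the three sub-scores for one segment."""
        return w_struct * seg.structural + w_oper * seg.operational + w_comm * seg.commercial

    def rank_segments(segments):
        """Return segments ordered by descending composite risk."""
        return sorted(segments, key=business_risk_score, reverse=True)

    if __name__ == "__main__":
        network = [Segment("A-01", 80, 40, 30), Segment("A-02", 20, 70, 90)]
        for seg in rank_segments(network):
            print(seg.segment_id, round(business_risk_score(seg), 1))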

Rene Ramirez (BSD) offered a less sanguine analysis of the current situation, recalling the 2016 ‘summer of incidents’ and an aging pipeline infrastructure that has seen an incident or fine in eight of the last ten weeks. Non-compliance is often evidenced during the construction phase, in part due to poor data acquisition and traceability to source documentation. Audits are now more frequent and are lasting longer. A major issue is the mismatch between the GIS-focused systems of the operator and the CAD and document systems used in construction. BSD advocates a ‘simple change in delivery specifications’ to make as-built data reliable, traceable and complete. More from the GITA/Pipeline Week home page.

* Pipeline open data standards/Geospatial information and technology association.


Upstream crowdsourcing innovators

Crudefunders, Draupner Energy solicit investment, ideas.

If you fancy a punt on oil and gas, you have two new options. In the US, Crudefunders has launched its equity crowdfunding project, claimed to be the first portal offering direct investment in oil well drilling projects to both non-accredited and accredited investors. Crowdfunding oil and gas has been made possible by the passing of Reg CF under the US JOBS Act. US citizens can invest from $1,000 upwards.

On the other side of the pond, Stockholm headquartered Draupner Energy proposes a different take on oil and gas crowdsourcing. Draupner doesn’t want your money, just your ideas. ‘Creative and curious industry professionals, academics and passionate amateurs’ are invited to join the community and get access to Draupner’s systems and data. If the lucky geoscientists’ ideas are accepted, they receive ‘cash prizes and bonuses.’


Sales, deployments, partnerships ...

Archeio, Accenture, P2 ES, SAP, Geophysical Insights, AASPI, Schlumberger, Adrok, Aker Solutions, Ansys, GE, Emerson, Ipres, IFS, Industry Task Force, Society of Petroleum Engineers, Department of Energy, Leidos, Amphora, US Bureau of Economic Geology, Ikon Science, Narrative Science, Deloitte, Paradigm, Petrobras, Total, CGI, Verifone, ATIO Group, Voyager Search, WellDog, WEX.

Archeio’s machine learning-based well file solution has been deployed at Arcadia Operating to evaluate wells and data from newly acquired assets.

Hess is to use Accenture’s cloud transformation services and partner ecosystem to realize its strategic ‘as–a–service’ IT vision.

P2 Energy Solutions is to offer P2 Land on SAP Hana.

Geophysical Insights has joined the University of Oklahoma AASPI (attribute assisted seismic processing and interpretation) consortium and is to incorporate AASPI in its Paradise software platform.

Schlumberger has outsourced its finance and accounting to Accenture in a five-year deal.

Adrok and IGas have received £10 million in funding from Innovate UK and the EPSRC for a feasibility study on ‘innovative remote sensing’ of UK onshore gas.

BP has awarded Aker Solutions five-year framework agreements for subsea production system EPC and equipment servicing.

Ansys’ simulation technology is to be combined with GE’s Predix platform.

Emerson is to market Ipres’ Iprisk and Ipresource solutions as complements to its Roxar ‘big loop’ reservoir management solution.

PetroWell is to implement IFS Applications 9 to streamline its operations. The solution was sold and will be implemented on Microsoft Azure by IFS’s Scandinavian partner, Addovation.

The Industry Task Force and Society of Petroleum Engineers are to ‘share and enhance’ technology development in oil and gas. ITF is to get detailed information from the SPE’s applied technical workshops while the SPE will use ITF’s roadmap mechanism for technology implementation.

The US Department of Energy has awarded Leidos a five-year, $285 million contract for program management, technical expertise and operations support to the Energy Information Administration.

Circle K has selected Amphora’s Symphony Oil software platform to manage commodity trading and risk for its European operations.

Shell and the US Bureau of Economic Geology have launched a new phase of the Shell–UT unconventional research project. Sutur-II will investigate water issues related to tight oil production in the Permian Basin.

Ikon Science’s RokDoc Ji-Fi software has won the UK Institute of Physics 2016 award for technology.

Narrative Science has joined the Deloitte Catalyst program, a network of start-ups working on artificial intelligence in business solutions. Deloitte is to implement Narrative Science’s Quill natural language generation solution.

Task Fronterra is to base its borehole data analysis services on Paradigm’s Geolog.

Petrobras and Total have formed a strategic alliance covering upstream and downstream activities in Brazil and in potential international opportunities.

Solvay has selected CGI to modernize the IT applications that support its operations.

Verifone and ATIO Group are to provide Mexico’s petroleum retailers with ATIO Control Gas POS and Verifone Commander Site Controller for at-the-pump and in-store payment acceptance, fueling operations, and back office control.

Voyager Search has been selected as one of the 15 finalists in the National Geospatial-Intelligence Agency’s first ‘disparate data challenge,’ a search for innovative ways of retrieving and analyzing data from different locations and in different formats.

WellDog is to form a joint venture with the Shaanxi provincial institute of energy resources to assess coal bed methane development in Shaanxi province.

WEX has secured a 10-year extension of its North American fleet card contract with ExxonMobil and Imperial Oil.


Standards stuff

OPC-UA selected for Statoil’s OneIMS initiative. IOGP reviews ECIM ILAP standard. US Metric Association’s 100th birthday. EU interop reference architecture. CFA endorses XBRL financials.

Statoil is to leverage the OPC-UA communications standard in its ‘OneIMS’ initiative. OneIMS represents a unified way for enterprise systems to access operational data from different assets, with a standard protocol and standard data semantics. Prediktor AS’s Apis platform will act as the OneIMS OPC-UA gateway, with standards such as S95, Prodml and Witsml mapped to the OPC-UA semantic model. First deployment is on Statoil’s Johan Sverdrup field. Statoil has an option for a ‘complete rollout’ to all Statoil assets.
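For orientation, the sketch below shows how a client might read a single tag over OPC-UA using the open-source python-opcua library. The endpoint address and node id are placeholders, not details of Statoil’s or Prediktor’s actual deployment.

    # Minimal sketch: reading one tag over OPC-UA with the open-source python-opcua client.
    # The endpoint address and node id are hypothetical placeholders.
    from opcua import Client

    ENDPOINT = "opc.tcp://gateway.example.com:4840"   # hypothetical OneIMS-style gateway
    NODE_ID = "ns=2;s=Asset1.Separator.PressurePV"    # hypothetical tag address

    def read_tag(endpoint, node_id):
        """Connect to the server, read one value and its browse name, then disconnect."""
        client = Client(endpoint)
        client.connect()
        try:
            node = client.get_node(node_id)
            return node.get_browse_name().Name, node.get_value()
        finally:
            client.disconnect()

    if __name__ == "__main__":
        name, value = read_tag(ENDPOINT, NODE_ID)
        print(name, value)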

The International Association of Oil & Gas Producers (IOGP) is currently reviewing ECIM’s draft international integrated lifecycle asset planning standard (Ilap). Ilap terminologies are published along with the model documentation.

We missed Metric Week (again!). Moreover, 2016 marks the 100th anniversary of the US Metric Association that is helping the US ‘complete’ its transition to the metric system.

Proceeding at a comparable speed is the EU’s drive for an ‘interoperability reference architecture’ for e-government. Read The Open Group’s interview with program manager Raul Abril.

The CFA Institute has published a report on the transformation of the financial landscape by data and technology. The report provides a strong endorsement to structured financial reporting in XBRL as opposed to PDF documents and argues for additional company-specific extensions to the protocol. XBRL International has formed an entity-specific disclosures task force to work on the issues.


Wolfram/Mathematica Data Summit

AGU on data management. Mathematica ‘one liners’ impress. Version 11 embeds graph database.

Speaking at the recent Wolfram/Mathematica Data Summit, Shelley Stall of the American Geophysical Union provided a progress report on the AGU’s collaboration with the CMMI Institute. The AGU’s position statement on data affirms that ‘Earth and space sciences data are a world heritage.’ The AGU’s data effort is likewise aligned with the President’s 2013 OSTP memorandum advocating that ‘the results of federally funded scientific research are made available to the public, industry, and the scientific community.’ The AGU’s data program is developed in partnership with the Coalition for publishing data in the earth and space sciences, and the AGU has adapted the CMMI’s data management maturity framework to its requirements.

The USGS, an AGU member organization, is likewise aligning its data policies with federal open access/open data memoranda, with ScienceBase evolving into a ‘fully functional’ repository for distributing USGS data. The ISO 16363 standard for trustworthy digital repositories also ran.

Of less immediate relevance to industry, but very interesting, were the entries in the Wolfram one-liner competition, a showcase for developers’ ingenuity and for the power of Mathematica. Unlike one-liners in low-level languages, Mathematica one-liners are easy to read. First prize went to a ‘fully functional game of Pong in a single tweet.’ Of some relevance to the GIS community was ‘Projections,’ a smooth transition between map projections. Check out the other winners here.

Earlier this year Wolfram released Version 11 of Mathematica. Stephen Wolfram, founder, CEO and co-developer, produced an informative blog post about the release, which adds new functionality for 3D printing. New machine learning and neural network functions include automated image identification with a built-in library of 10,000 objects and graph database functionality. Also new are word frequency analytical tools, including links with Wikipedia data, weather forecast data and mathematical function data. Wolfram also offers a plethora of ‘geo’ functionality, projections, geodesy and data, notably from OpenStreetMap. OSM’s inventor Steve Coast also presented at the summit. Stephen Wolfram claims that Mathematica is ‘arguably the highest-level language that’s ever been created.’ Much more from his blog.


Wireless world

‘Fiber-equivalent’ satellite. VSAT for Aramco. ExactEarth AIS. Navigant’s research. Telemar sold.

O3b Networks, RigNet and Modec are to provide ‘fiber-equivalent’ satellite connectivity to Brazilian FPSOs. The broadband communications solution will connect Modec’s FPSOs with onshore ‘advanced collaborative environments.’ O3b’s medium earth orbit satellite network replaces a previous solution based on geostationary satellites and has resulted in a ‘75% reduction in latency.’ O3b is a wholly owned subsidiary of SES*.

Saudi Telecommunication Company (STC) is to use Intelsat’s satellite solutions to support oil and gas operations throughout Saudi Arabia. The multi-year agreement provides VSAT connectivity via Intelsat’s 10-02 geostationary satellite located at 1° West. More from Intelsat.

ExactEarth has partnered with Harris Corp. to deliver a real-time ship tracking solution by optimizing a global satellite constellation for AIS. The ExactView RT solution provides global satellite coverage from 58 hosted payloads in the Iridium Next constellation, inter-satellite communication for real time AIS positioning, patented message processing and a customizable maritime VHF receiver architecture. The first launch is scheduled for early 2016 and the constellation is to be completed in 2017. More from Iridium Next.

A recent report from Navigant Research** puts the market for communications subscription services and equipment for upstream oil and gas at ‘nearly $2.3 billion’ in 2025, driven by the ‘potential for real savings and efficiency gains thanks to digital oilfield applications and connectivity.’ Most new drilling and production sites will have modern communications installed at the outset.

Apax Partners is to acquire Telemar from its current shareholders. The unit will combine with Marlink, acquired earlier this year, to create the ‘world’s leading maritime communications, digital solutions and servicing specialist’ for all offshore segments.

* Formerly Société Européenne des Satellites.

** Communications Technologies for Digital Oilfields.


Yokogawa’s process data analytics

Process quality issues anticipated with controversial pattern recognition technology.

Yokogawa has announced ‘Process data analytics’ (PDA), a software solution for the early detection of production quality issues in process industries including oil and gas. The solution builds on Yokogawa’s analytical services offering, which has delivered over 100 such service contracts since 2008.

PDA embeds the controversial ‘Mahalanobis-Taguchi’ pattern-recognition technique for multivariate analysis. PDA analyzes production operations using temperature, pressure, flow rate, liquid level and other data and equipment information available in the plant information management system.

If this happens to be Yokogawa’s Exaquantum PIM, data is read natively. Other systems will need conversion to CSV format. PDA is compatible with MathWorks’ Matlab for ad hoc/bespoke calculations. The M-T pattern-recognition technology is used under license from AngleTry. PDA will be released for sale in March 2017. More from Yokogawa.
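The Mahalanobis distance at the heart of the M-T method measures how far a multivariate sample lies from a reference (‘healthy’) population, scaled by that population’s covariance. Below is a generic numpy sketch of the distance calculation on made-up data; it is not Yokogawa’s or AngleTry’s implementation.

```python
# Illustrative only: Mahalanobis distance for multivariate anomaly detection.
# Data and values are made up; this is not the licensed M-T implementation.
import numpy as np

# Reference ('unit') space: rows are samples of normal operation,
# columns are process variables (e.g. temperature, pressure, flow rate).
reference = np.random.default_rng(0).normal(size=(500, 3))
mean = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def mahalanobis_sq(sample):
    """Squared Mahalanobis distance of a sample from the reference space."""
    d = sample - mean
    return float(d @ cov_inv @ d)

# A sample far outside the reference space scores a much larger distance.
print(mahalanobis_sq(np.array([0.1, -0.2, 0.3])))   # small: normal operation
print(mahalanobis_sq(np.array([4.0, 5.0, -6.0])))   # large: potential upset
```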


Where is a well?

Fugro study of Shell’s Iraq asset considers tectonic plate motion in high-accuracy positioning effort.

You may think that geomatics is hard enough, what with geographic projections, datums and such. But, as a recent presentation by Fugro’s Jean-Louis Carme at France’s GeoPos event showed, it can be harder than you think to pin down an accurate location. The reason? Plate motion!

Various models of global plate motion have been proposed, but there is little consensus as to which to use. This can be problematic in geodesy, and also when using historical maps, especially in areas of rapid plate motion such as New Zealand or the Middle East.

Working for Shell on Iraq’s Majnoon oilfield, Fugro studied some 13 geodetic reference points and integrated them with the ITRF2005 plate velocity field for future accurate positioning of assets. The study showed that for assets with a large extent (such as pipelines) plate motion needs to be considered. Check out your local plate speed on the ITRF here.
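To get a feel for the size of the effect, here is a back-of-envelope sketch. The velocity components are illustrative values of the same order as Arabian plate motion, not the actual ITRF2005 solution used in the Fugro study.

```python
# Back-of-envelope: horizontal position drift due to plate motion.
# Velocities are illustrative, roughly the order of Arabian plate motion.
import math

v_east, v_north = 0.030, 0.028   # assumed plate velocity in metres/year
years = 20.0                     # time elapsed since the survey epoch

d_east, d_north = v_east * years, v_north * years
drift = math.hypot(d_east, d_north)
print(f"Drift after {years:.0f} years: {drift:.2f} m "
      f"({d_east:.2f} m E, {d_north:.2f} m N)")
```

At a few centimetres per year, a coordinate fixed twenty years ago can be off by the better part of a metre today, which matters for long assets such as pipelines.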


Honeywell instruments Statoil’s Valemon remote operations

Human factors study and Experion technology enable operator teams to relocate shoreside.

A recent presentation by Gjert Gjertsen on human factors and remote operations focused on Statoil’s North Sea Valemon platform. The gas and condensate field has been in production since 2015 with ongoing drilling and is currently operated from a control room on the platform. Statoil plans to move control to a shore-based replicated control center and to operate the field remotely from Sandsli, near Bergen.

Gjertsen’s presentation described the planned ‘periodically unmanned’ operations and the fail-safe mechanisms that have been implemented. These leverage Honeywell’s automation technology, notably the Experion process knowledge system.

Honeywell is the EPC automation contractor for Valemon and will provide a range of control and safety technologies for the project including new operator stations and critical alarm panels at the onshore center. Honeywell claims the system will simplify remote operations and reduce operating costs. More on Valemon here.


Range Resources finds 86,000 lost Marcellus wells

Esri, Geocortex and boots on the ground combine in historic map making effort.

Range Resources’ Lacey Selvoski, writing in the Fall 2016 issue of Esri News for Petroleum, describes how Range, with help from Esri partner Latitude Geographics/Geocortex, has produced comprehensive maps of old wells and leases. These have been used by Range’s land department in the dash for acreage in the Marcellus shale play.

The land department needed to know who owned what and whether prior lease transfers were legal. The geology and drilling departments wanted historic well location data to avoid collisions and minimize environmental impact.

Range studied records and maps showing landownership as far back as the early 1900s. Field visits were made to validate historic well locations, looking for clues such as rusty casing, holes or small depressions and collecting GPS coordinates. Range digitized maps and field data for over 86,000 historic wells throughout the area.

Today, Range’s land department has web-based GIS access to wells, maps and other cultural information. The historic land and well map provides a starting point for title clearance. Users can click on a historic well to see its production status and retrieve the associated lease status.
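As a purely hypothetical illustration of the kind of query such a layer supports, the sketch below filters a GeoJSON export of digitized historic wells by production status. The file name and attribute names are invented and do not reflect Range’s schema.

```python
# Hypothetical example: filter digitized historic wells by production status.
# File name and property names are invented, not Range's actual data model.
import json

with open("historic_wells.geojson") as f:
    wells = json.load(f)["features"]

producing = [w for w in wells
             if w["properties"].get("production_status") == "producing"]

for w in producing:
    lon, lat = w["geometry"]["coordinates"]
    print(w["properties"].get("well_name", "unknown"), lat, lon)
```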


Clariant’s digital chemicals management

Veritrax couples real time field data with the chemicals supply chain.

Switzerland-headquartered chemicals specialist Clariant has announced an upgrade to ‘Veritrax,’ its ‘intelligent’ chemicals management system. Veritrax is a chemical control, monitoring and ordering system that streamlines chemical management tasks and labor-intensive processes. A ‘significant’ reduction in total operating costs is claimed for the platform, which deploys ‘industry-leading’ automation and cloud-based technologies to optimize the chemical supply chain, ensure maximum production uptime, and give operators more control over their chemical spend.

Veritrax can be integrated with existing Scada and other systems to provide real-time data to a laptop or smartphone. Users can monitor multiple data streams, such as well production and chemical injection rates, and optimize inventory management, with new chemical deliveries coordinated automatically. Additionally, continuous system monitoring alerts the operator to potential problems, allowing rapid intervention to minimize production losses.

The system provides chemical pump control and data analysis to help ensure that the correct chemical is delivered at the wellsite, at optimum dosage levels. Online surveillance aligns production rate data streams with analytical results uploaded from the field or lab, allowing for trending analysis and the troubleshooting of problematic wells. Clariant Oil Services head Jon Rogers said, ‘The priorities for the oil and gas industry are risk management, production efficiency and cost optimization. IT is playing an increasingly important role for oil and gas companies as they look for ways to optimize all aspects of their operations.’ More from Clariant.
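As a hypothetical illustration of the kind of inventory check such a system might automate, the sketch below estimates days of chemical remaining from tank level and injection rate and flags wells for reorder. It is an assumption-laden toy, not Veritrax’s actual API or logic.

```python
# Hypothetical sketch: flag wells whose chemical inventory will run out within
# the delivery lead time. Field names and numbers are invented for illustration.
from dataclasses import dataclass

@dataclass
class WellChemical:
    well: str
    tank_level_gal: float      # current tank inventory (gallons)
    injection_rate_gpd: float  # injection rate from Scada (gallons/day)
    reorder_lead_days: float   # delivery lead time (days)

def needs_reorder(wc: WellChemical) -> bool:
    days_remaining = wc.tank_level_gal / max(wc.injection_rate_gpd, 1e-9)
    return days_remaining <= wc.reorder_lead_days

wells = [
    WellChemical("A-12", tank_level_gal=80.0, injection_rate_gpd=18.0, reorder_lead_days=5),
    WellChemical("B-07", tank_level_gal=400.0, injection_rate_gpd=10.0, reorder_lead_days=5),
]
for wc in wells:
    if needs_reorder(wc):
        print(f"Reorder chemical for well {wc.well}")
```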

