Oil IT Journal: Volume 21 Number 6


ExxonMobil floats OSAF

Open, secure automation forum to launch next November. The objective is a new vendor-independent, standards-based process control framework for oil, gas and other verticals.

ExxonMobil Research and Engineering and The Open Group held an ‘Industry day’ event last month to raise awareness of Exxon’s ‘next-generation,’ open, interoperable process control framework. Exxon floated the framework in a presentation at last year’s Arc industry forum. As some 40% of existing control equipment will need to be replaced by 2025, Exxon has set out to explore technology replacement options and future requirements.

The aim is a distributed, low cost, modular architecture, both to replace today’s control systems and HMIs and to add new functionality. The ‘interoperable, open architecture’ has been inspired by the platform developed by Real Time Innovations for the military and by Face, The Open Group’s ‘future airborne capability environment’ consortium.

Four years of internal R&D resulted in ‘case for action’ and ‘core characteristics’ documentation that formed the basis of a tender for further development. This was awarded to Lockheed Martin last December.

The June 2016 Industry day event saw the official publication of the initiative’s functional characteristics documents. Vijay Swarup, VP R&D at ExxonMobil Research and Engineering Co. said, ‘We continue to challenge ourselves by looking at existing processes and finding innovative ways of working using both internal and external ideas. This breakthrough initiative could help transform refining and chemical manufacturing through high-speed computing, modular software, open standards and autonomous tools.’

The need for an open standard has been driven by the high cost of technology refresh of existing commercial DCS solutions that limits the roll out of leading edge performance. The expense of integrating best-in-class third party components and the absence of intrinsic security were also cited. Current solutions are restraining innovation in the application software market.

Speaking after the event The Open Group’s Jim Hietala told Oil IT Journal, ‘The industry day was well attended with people from many end user organizations and communities, not just oil and gas. The initiative, provisionally called the Open secure automation forum, is a stand-alone Open Group project to create a standard of standards. As with Face, we will not duplicate work where useful standards already exist. These will be embraced and referenced as we plug any gaps with new standards.’

Exxon and TOG are planning a second Industry day in late August and the first Osaf member meeting will take place during the 2016 Aiche annual meeting in San Francisco next November. The ultimate goal is a ‘ready for deployment’ system by year end 2020.


Technip, FMC combine

Cashless transaction consolidates prior Forsys joint venture into engineering behemoth with $20 bn backlog and potential for $400 million/year cost savings.

FMC Technologies and Technip are to merge into ‘TechnipFMC.’ The new surface/subsurface and onshore/offshore engineering behemoth had combined 2015 revenue of $20 billion, ebitda of $2.4 billion and a $20 billion backlog in March 2016. In an all-stock transaction, Technip shareholders will receive two shares of the combined company for each Technip share while FMC’s shareholders get one share of the combined unit for each of their shares. The transaction builds on a prior joint venture, Forsys Subsea.

The combined company has some 49,000 employees in 45 countries, and some $400 million in annual cost synergies have been announced. Technip chairman and CEO Thierry Pilenko becomes executive chairman of TechnipFMC while FMC Technologies’ president and COO Doug Pferdehirt becomes CEO. The new company will have three ‘operational headquarters,’ in Paris under Pilenko, in Houston under Pferdehirt and in London, where the new corporation will be domiciled. A Technip spokesperson told Reuters, ‘There is no reason why Brexit should impact the deal.’ A global integrated R&D center will be located in France.


'Big data and analytics?’ Pick one!

Neil McNaughton observes that much technological progress is concerned with dull infrastructure and scaling issues. To sell, these need spicing up with a neat idea. Examples? Google’s PageRank and the (apocryphal) beer and diapers tale of business intelligence success. Buzz at this year’s PNEC data management conference suggests that Hadoop’s promise remains elusive.

Way back, a major vendor released an update to its flagship database, let’s call it ‘Founder,’ with the promise of a major new feature. This got a lukewarm reception from the community since the ‘new’ feature was the reason that they had bought the software in the first place. IT has always been great at promising a lot and delivering... rather less, rather late, or as they say in the trade, ‘real soon now.’

There are good reasons for this. Much of what appears to be straightforward in the data and IT space is, in fact, rather hard to achieve. This is frequently due to the problem of scale. If you have a table of production values on your desktop, a simple click is all you need to order them. If you have data coming into different systems all over the world things are rather different. The whole history of major enterprise systems like ERP is like a long game of catching up with users’ prior expectations.

Meanwhile, industry has to maintain excitement and interest while it gets on with the grunt work of scaling and debugging the ‘next big thing.’

I’m not 100% sure of this, but I suspect that Google adopted this strategy back in the early days of search. At the time there were a few search engines about, Google, AltaVista and others, and for the end user there was not a great deal to separate them in terms of results. As the web grew though, the problem for the search engines became the boring old one of scale. Keeping up the humongous indexes and tracking the clicks became an arms race.

As I said, scale is not a strong selling point. Google never said, ‘We will be the biggest and therefore the best.’ Instead it came up with a nice story, PageRank, and some cute math to back it up. Google may have used PageRank in the very early days but after a few months it must have amassed its own, much more accurate click stream data for its recommendations. This has never stopped PageRank being duly trotted out by Wikipedia and others ever since to explain Google’s genius. Success was more likely due to having mastered the tricky task of drinking from the clickstream firehose and, of course, to monetizing the results. Google’s approach to its big data is arguably more about doing dumb stuff at very large scale than smarty-pants data driven analytics.
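For readers who have never seen the ‘cute math,’ here is a minimal, purely illustrative sketch of the PageRank power iteration in Python. The toy link graph and variable names are ours, not Google’s.

```python
# Illustrative sketch of the PageRank power iteration (not Google's code).
import numpy as np

def pagerank(adjacency, damping=0.85, iterations=50):
    """adjacency[i][j] = 1 if page i links to page j."""
    A = np.array(adjacency, dtype=float)
    n = A.shape[0]
    out_degree = A.sum(axis=1, keepdims=True)
    out_degree[out_degree == 0] = 1.0     # avoid division by zero for dangling pages
    transition = A / out_degree           # each page shares its rank among its out-links
    rank = np.full(n, 1.0 / n)
    for _ in range(iterations):
        rank = (1 - damping) / n + damping * rank @ transition
    return rank / rank.sum()

# Three pages: 0 and 1 both link to 2, which links back to 0.
print(pagerank([[0, 0, 1], [0, 0, 1], [1, 0, 0]]))
```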

The marketing ploy at work here is to camouflage what’s boring with a great story. The big data movement today has two sides to it. The first, the boring one, is monitoring. The second, exciting one, is ‘analytics.’ Analytics is exciting because of the notion that today, our data has gotten so big that it just has to contain hidden gems of information that have so far eluded us. A great story indeed.

The relative importance of these two facets of big data is frequently misunderstood by end users and is misrepresented, probably quite innocently, by the marketing folks. In fact in the literature, both scientific and marketing, ‘big data’ is almost always conflated with ‘analytics’ and sometimes acronymized into ‘BDA.’

The first time we reported on what later became ‘big data’ was at PNEC in 2006 when we heard Nancy Stewart on how WalMart used powerful computing to track in-store sales in near real-time. I confess that when I thought back to this talk I ‘remembered’ the old retail business intelligence chestnut, the tale of nappies/diapers and beer. This has it that WalMart’s business intelligence ‘discovered’ that men who buy nappies also buy beer. This oft-cited, unexpected retail information gem is a cute tale, but is it true?

I checked our Technology Watch report from the 2006 PNEC. Stewart made no mention of this story, which is almost certainly an urban legend. I don’t doubt that analytics play a role in WalMart’s use of high end computing but, just as with the major oils’ use of SAP, the main use case is monitoring and situational awareness. It is as hard to track worldwide sales in real time as it is to track a major’s worldwide oil and gas production.

If, like many, you are embarking on a big data project (as pretty well any IT project seems to be these days) you might like to reflect on the fact that most of the trendy technology deployed around big data was developed for monitoring and situational awareness rather than for analytics.

At PNEC (see page 6) I caught the following snippets from different speakers. ‘The big data movement has left people behind in the quest for some machine learning future.’ ‘80-90% of our data scientists’ time is spent on data prep.’ ‘There are pockets of what passes as analytics but this is actually reporting.’ ‘We have Hadoop and we are trying to figure out what to do with it!’ ‘Today we have data lakes which are a lot like the old data warehouse!’

The good thing about the big data movement is its focus on data. Maybe ‘big data’ will succeed where ‘data management’ has failed. The whole data management problem is essentially one of scale and a lot of, but probably not all, the new stuff will undoubtedly help.

This is not going to happen, though, if it is being sold and bought with a predominantly analytical focus. Companies pouring money into analytics-driven initiatives are likely to be disappointed as a) the grunt work of data management is neglected and b) the ‘analytics’ fail to turn up much that we did not already know, for instance in their application to a field like imaging ‘big’ seismic data. There again, maybe you know something that I don’t. If you have a true version of the beer and nappies story I’m all ears.

@neilmcn


Oil IT Journal interview - Donald Thompson, Maana

Maana founder and CTO Donald Thompson tells Oil IT Journal how enterprise semantics (no, not the ‘failed technology’ of the semantic web), category theory and Maana’s patented graph technology are used in operations, engineering, accounting and more!

What’s your background?

I was fifteen years with Microsoft working on semantic search, knowledge management and reasoning, on Satori and Cortana. At Maana we are now working on the application of semantics in the enterprise, using it to ‘understand’ data, both big and small.

When you say semantics, do you mean the semantic web?

No, the semantic web is essentially a failed technology, an academic effort which had some limited success in pharma and healthcare. Its capacity for machine readability and inference was promising but the effort required to prep data for such applications was insurmountable. We tried it at Microsoft but it is not practicable for an unknown, ad hoc domain. At Maana, semantics means a blend of data structure and statistics. In healthcare, you might know that a certain drug treats a specific disorder. But what is of real interest is how often it is taken, who’s taking it and what the outcomes are. Here, machine learning is a great way of leveraging the statistics along with the structured data.

You use the Accumulo graph database.

The graph database is key to our approach but we have moved on from Accumulo. We now work with a proprietary store that embeds our patented graph-based ‘dynamic semantic model.’ We still use the Hadoop file system and we support Spark. Our system provides a fluid data representation that builds on category theory, the math that underpins our technology.

So this is a shift from open source to proprietary software?

Yes, for storage and for scale and efficiency. We still have extensive support for Spark and R, in fact we contribute to these projects and we also work with Hortonworks and MapR.

So how does it work?

Our flexible representation of data allows us to remap information into ‘kinds.’ Thus we repurpose and link data say from well to geoscience or from well to finance, changing the perspective of the model in a computationally efficient manner. Maana also allows ad-hoc filtering and data structures and indices that are tuned to different data types. This includes text with substring matching and time based data. This might connect a kick event in real-time data with text recorded within a certain time frame, rolling-in Bayesian inference.
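By way of illustration only (this is not Maana’s code), the kind of time-window matching described above can be sketched with pandas’ merge_asof, linking a real-time event to the nearest text record within a tolerance.

```python
# Hedged sketch: link time-stamped events to text records within a window.
import pandas as pd

events = pd.DataFrame({
    "time": pd.to_datetime(["2016-05-01 03:15", "2016-05-02 11:40"]),
    "event": ["kick detected", "pressure spike"],
})
reports = pd.DataFrame({
    "time": pd.to_datetime(["2016-05-01 03:05", "2016-05-02 11:55"]),
    "note": ["mud weight adjusted", "choke opened"],
})

# Both frames must be sorted on the key; match each event to the nearest
# report within a two-hour window.
linked = pd.merge_asof(
    events.sort_values("time"),
    reports.sort_values("time"),
    on="time",
    direction="nearest",
    tolerance=pd.Timedelta("2h"),
)
print(linked)
```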

‘Kinds’ sounds like business objects?

Sure, or ‘views.’ For us a ‘kind’ is our version of concept variation.

Accessing different data sources sometimes comes unstuck on issues like well names in different source databases. How do you handle this?

There is no magic involved! You need to align terms in different databases. Our machine assistance can help here, guiding and learning from users as they map field names across different systems.
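By way of illustration only, field name alignment of this kind can be sketched with simple string similarity from Python’s standard library; the field names and the scoring below are hypothetical, not Maana’s algorithm.

```python
# Hedged sketch of machine-assisted field name mapping across two systems.
from difflib import SequenceMatcher

source_fields = ["WELL_NAME", "SPUD_DT", "TD_MD"]
target_fields = ["WellName", "SpudDate", "TotalDepthMD", "Operator"]

def best_match(field, candidates):
    # Score each candidate by string similarity; a human confirms the suggestion.
    scored = [(SequenceMatcher(None, field.lower().replace("_", ""),
                               c.lower()).ratio(), c) for c in candidates]
    return max(scored)

for f in source_fields:
    score, match = best_match(f, target_fields)
    print(f"{f:12s} -> {match:14s} (score {score:.2f})")
```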

Can you give some use cases?

The front end takes raw data from various APIs into line of business apps for, say, field services, such as part ordering by field teams to minimize returns. Maana’s enterprise architecture is integrated with the business. You can type in device/error codes or symptoms and the system advises on remediation. We call this ‘data driven device decision support.’ Another tool provides predictions on accounts receivable, predicting when an invoice will be paid. Others ingest well data sets, match on the well name and see what happened following stimulation, rotating views and traversing the graph.

GE (an investor in Maana) talks about smart preventative maintenance. Has Maana displaced GE’s internal tools?

No. But we are working with Predix and with GE Digital on this kind of thing.

What about upstream data sources? Companies have spent years building connectors for hundreds of data sources.

Database toolbox connectors and ODBC do fine for most of our stuff. We outsource connector development on an as-needed basis.

Do you work downstream of the historian?

Certainly, some clients have large elaborate data lakes. We support the business ecosystem, Tableau, Spotfire… We elevate these systems to the level of a canonical source of knowledge.

What does Maana mean?

It is Urdu for ‘meaning.’

More from Maana.


Statoil’s IoT at Carnegie Mellon Saturn 2016

At the heart of the Internet of Things lies ... the ‘Thing!’ What else?

Speaking at the Carnegie Mellon software engineering institute’s 2016 Saturn conference earlier this year, Jørn Ølmheim presented Statoil’s experience of the devices, challenges and opportunities of the internet of things. Statoil’s poster child is its Ocean Observatory, a window on multi sensor data for environmental monitoring with video of the sea floor and echo sounder tracking of ‘biomass’ a.k.a. fish. Workers too are plugged into the network to monitor exposure to workplace sound levels. Statoil proposes a seven layer IoT architecture, from fog computing at the edge, into the cloud and through to big data and business applications.

IoT spans the operations/IT boundary. Ølmheim sees OT as maturing and assuming functionality that is today in the IT camp. One thing unlikely to change is the role of the historian, which is seen as acting as a buffer between OT and IT for some time to come. An enigmatic slide concluded the presentation, with the anatomy of a ‘thing,’ a semantic web-derived object that models anything! Oil IT Journal readers will hear echoes of earlier initiatives which live on in the EU-backed Optique project. Progress on this €14 million R&D program will be reported during the SPE Intelligent Energy event in Aberdeen this September. More presentations on the SEI architecture technology user network.


EAGE workshop on open source software in geosciences

Academics revel in open source software for (mostly) geophysics. dGB shows how a ‘freemium’ business model can work. But oils are wary and seem to prefer proprietary consortia.

Introducing the 2016 EAGE workshop on open source software in geoscience, geophysicist Filippo Broggini (ETH Zurich) observed that open source software (OSS) is ‘taking over,’ on internet browsers, on mobile phones (with Android) and especially in high performance computing. The ensuing workshop showed that while there is enthusiasm for OSS in academia, its penetration into the upstream is so far rather limited.

dGB Earth Sciences’ Kristofer Tingdahl was the only presenter to address the issue of creating a viable business around OSS. Users are interested in vendor independence and object to the lock-in and high maintenance costs of commercial software. While dGB’s OpendTect seismic interpretation package is open source, there is little code coming back from users, who are not in general developers. dGB is working to expand its community of developers with a Python interface. The trick, for dGB and others, is to strike a balance between OSS purism and commerciality. Tingdahl’s dream is of an open cloud infrastructure supporting a DDS-based (see below) storage layer with hooks into JavaSeis and Madagascar for seismic processing, OpendTect for interpretation and other OSS software for geology and mapping.

Sergey Fomel (U TX Austin) traced ten years of Madagascar’s development. The OSS seismic processing package was rolled out at the June 2006 EAGE. The package has exceeded expectations and is now in release 2.0, with 40,000 downloads and 260 ‘reproducible’ papers published. Madagascar is comparable to Matlab but with better handling of array data on disk. Madagascar benefits from OSS-style governance: all contributors have equal control and there is no decision tree hierarchy. The ‘CircleCI’ continuous integration platform is used to perform thousands of tests prior to a new commit. Fomel concluded with a pointer to a new ‘Geophysics papers of the future’ initiative to kick off in 2017.
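For the curious, a Madagascar workflow is typically captured in an SConstruct file so that figures (and hence papers) can be rebuilt with a single ‘scons’ command. The following minimal sketch assumes the standard rsf.proj interface; the programs and parameters shown are illustrative only.

```python
# Minimal sketch of a Madagascar (M8R) reproducible workflow, assuming the
# standard SCons interface; parameters are illustrative, not from a real paper.
from rsf.proj import *

# Build a synthetic trace, band-pass filter it and register the plot so the
# figure can be rebuilt with 'scons'.
Flow('spike', None, 'spike n1=1000 d1=0.004 k1=300')
Flow('filtered', 'spike', 'bandpass flo=5 fhi=40')
Result('filtered', 'graph title="Band-passed spike"')

End()
```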

John Stockwell (Colorado School of Mines) presented OpenSeaSeis, again with a Matlab analogy. The package allows sophomores to process seismics and provides a data viewer and QC package. The package is well written but ‘incomplete.’ It provides trace by trace processing and interactive workflows. 3D is under development.

Jeffrey Schragge (University of Western Australia) has deployed Madagascar in the cloud, specifically on Australia’s Nectar research cloud. Nectar offers ‘SME* computing’ to smaller processing shops and universities without big clusters. Again Madagascar (a.k.a. M8R) is available along with the SCons python interface that recognizes parallel parts of code. A ‘challenging’ 3D full wavefield model of the Australian NW shelf required 100 million core compute hours! ‘Burst cloud computing’ and dynamic pricing made this possible. But Schragge warned of the risk of bidding wars for compute resources. Cython, HDF5, ZeroMQ and NumPy also ran. Cuda accelerators are also an option. The toolset is being bundled as an M8R extension, ‘mycloud.py.’

In the concluding discussion, the perennial issue of researching and reading the plethora of seismic data formats was raised, with a suggestion for an industry-backed open source API. For John Stockwell, this already exists, ‘it’s called DDS.’ While we need to avoid creating new data formats, conversion is a problem that might be amenable to a common API. It was suggested that the academic community might ‘lead the charge’ here as industry was unlikely to get involved. HDF5 was suggested as an ‘ideal universal format,’ as was Apache Arrow, an open source format for big data. The requirement (and difficulty) of involving the SEG standards committee was also an issue. Another problem for the OSS community is the availability of good real world (not Marmousi) data. There is a need for an OSS data repository. In fact, this is in the process of being created as a component of the SEG Wiki. Finally, the issue of how upstream research is funded was raised. OSS code is a poor fit with the sponsored consortia paradigm. Consortia are ‘sold’ to oils by researchers and OSS ‘is not a good business reason.’
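To illustrate the ‘universal format’ argument, the following sketch stores a toy trace gather and a header attribute in HDF5 using h5py. The layout is our own assumption for illustration, not a proposed standard.

```python
# Hedged sketch of HDF5 as a container for seismic traces; layout is illustrative.
import numpy as np
import h5py

n_traces, n_samples = 100, 1500
traces = np.random.randn(n_traces, n_samples).astype("float32")

with h5py.File("survey.h5", "w") as f:
    d = f.create_dataset("traces", data=traces, compression="gzip")
    d.attrs["sample_interval_ms"] = 2.0
    f.create_dataset("cdp", data=np.arange(n_traces, dtype="int32"))

with h5py.File("survey.h5", "r") as f:
    print(f["traces"].shape, f["traces"].attrs["sample_interval_ms"])
```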

Comment - While the OSS geoscience movement exists (in geophysics at least), it is mostly constrained to academia and even there, suffers from a degree of fragmentation as long term projects proceed in parallel. The meeting concluded with a suggestion that another gathering could be organized ‘in five years.’ This movement proceeds at a snail’s pace. But the SEG open seismic processing workshop (August 17-20, Houston) is another step in the right direction.

* Small and medium enterprise.


Eurotech bundles Nutanix, Nvidia, VMWare for Paradigm

3D virtual desktop infrastructure bundle supports interpretation suite on hyper converged hardware.

Eurotech has just published a position paper on the use of 3D ‘virtual desktop infrastructure’ (VDI) technology in seismic processing and interpretation. The Eurotech solution, developed in collaboration with Paradigm, provides a cloud-based, ‘easy to manage,’ high performance, scalable platform for remote delivery of a 3D seismic processing and interpretation capability.

To date, costly, high powered desktops have been required for such functionality. Today, geographically dispersed users expect remote access to data, creating an ‘unprecedented’ technical challenge for IT. Eurotech’s answer: move the whole shebang to the cloud and leverage modern VDI to deliver 3D graphics to a local terminal. VDI enables a server in a remote data center to support multiple virtual desktops. Moreover these can deliver a 3D seismic application to any device, across almost any connection. VDI can be delivered as a private or public cloud, eliminating the need for businesses to make costly infrastructure investments or to manage complexity that doesn’t add direct value.

The Paradigm/Eurotech cloud solution is based on the ‘hyper-converged’ Nutanix 7000 series nodes that are designed specifically for 3D VDI workloads. High end graphics performance comes from the latest generation of Nvidia GRID graphics cards and VMware’s GPU virtualization technology.


Software, hardware short takes

Baker Hughes, CGG, AeroLift, American Reliance, Ikon, Aspen Tech, Bit Stew, Blue Marble, BPT, Drillinginfo, EIA, FaultSeal, Gexcon, Geocomputing, Petrosys, Quorum, Energy Navigator.

Baker Hughes’ WellLink performance service helps drillers minimize ‘invisible’ lost time by transforming data into ‘instantly actionable’ information.

CGG GeoSoftware’s Jason 9.5 adds depth inversion, broadband reservoir characterization and anisotropic inversion that improves model well ties.

Valmie Resources and AeroLift eXpress are to unveil drone-based delivery for offshore operations. AeroLift’s ‘military grade’ drone technology delivers payloads of up to 14lbs out to 250 miles.

American Reliance and Airbus have released Gator, a rugged, ‘battle-grade’ laptop for geospatial applications in the field. The hardware/geospatial data server bundle targets inter alia oil and gas field crews, first responders and exploration.

Ikon Science’s RokDoc 6.3.3 update includes usability and visualization enhancements, a variogram analyzer for geostatistical analysis of wells, maps and volumes and improvements to Ji-Fi and to Ikon’s geomechanics and pore pressure prediction solutions.

The new V9.0 release of Aspen Tech’s AspenOne brings improvements to the Excel interface and HySys’ case study tools. A new DMC3 Builder speeds APC controller deployment. Schedules can be tracked in real time with the new map monitor in Fleet Optimizer.

Bit Stew Systems has announced MIx Core V10, a ‘self-service solution’ for industrial internet data management. MIx Core applies machine intelligence to automate data integration, eliminating the time and cost associated with ‘continuous data wrangling.’

Blue Marble’s Geographic Calculator 2016 now supports Petrel database files. The latest 2016 SP1 release adds updates from a new online GeoCalc geodetic registry and access to coordinate system definitions in a Petrel CTL database. Seismic survey conversion now supports SEG-Y data and is synchronized with the EPSG Registry.

Billington Process Technology’s FSG flare modeler detects and stores scenario candidates for further analysis, avoiding over-engineering of flare networks. BPT-FSG can triage model results and identify those sources that are likely to generate peak flows.

Drillinginfo has added new functionality to its E&P intelligence and decision support tool DI Transform 5.1. Users can leverage predictive and prescriptive analytics to gather insights into potential M&A targets and to establish best practices for drilling and completion. Clients also have access to a new probabilistic decline curve analysis tool providing a range of possible EURs.

The US Energy Information Administration has released a free software tool that allows users to import energy data from its online API into Google Sheets. The EIA API exposes 1.2 million data series in its open data program. The previously released Excel add-in generates some 100,000 requests/month. Google’s spreadsheet is available through the Chrome Webstore.

FaultSeal has released Prospect VX, a ‘fast, easy-to-use volumetrics’ package designed for ‘21st Century’ reservoir types. The tool models uncertainty in geologic processes with a range of methods for generating gross rock volume distributions. The package is available for Windows, Linux, and Mac OSX.

Gexcon’s Flacs V10.5 supports native import of Aveva RVM files, digital terrain model import and a new utility for managing and reporting geometry objects and counting volumes.

Geocomputing Group has announced RiVA, the ‘world’s first’ fully integrated geocomputing environment, delivered as a hardware appliance, certified for a wide range of commercial geoscientific applications. RiVA comes in a 42U rack cabinet with each rack supporting 8-40 virtual workstations with 480TB of storage. Each workstation has access to 7-28 processor cores, is allocated between 256GB and 1TB of memory, has a dedicated GPU and a 56Gb/s InfiniBand connection to the shared parallel file system.

Petrosys has embedded its mapping application into the Schlumberger Petrel ecosystem. The ‘Petrel mapping module by Petrosys’ is available in the 2016.1 Petrel release.

Quorum’s new ‘myQuorum’ pipeline manager improves operational efficiency by collecting and automating disparate workflows and key business processes.

Energy Navigator’s Val Nav 2016 engineering and reserves software promises enhanced data management, data editing and auditing features along with over 60 user-requested enhancements and a new spreadsheet import tool and grid viewer.


IGas Energy rolls out M-Files

Document management system supports UK explorer’s compliance and information sharing.

Onshore UK oil producer IGas Energy has deployed M-Files’ eponymous document management solution to meet ISO 9001 certification and for general purpose document management and quality assurance throughout the organization.

Dallas-based M-Files uses a ‘metadata-based’ approach to document management that is claimed to eliminate information silos and provide rapid access to content from any core business system and device. M-Files manages information by ‘what’ it is as opposed to ‘where’ it is stored, with on-premises, cloud and hybrid options.

IGas CIO Chris Holly said, ‘M-Files was originally deployed for compliance purposes. But now, its use has extended across the business as a comprehensive platform that manages as much of our company information and related processes as possible. M-Files was easily configured to meet our needs and has enabled better collaboration and secure information sharing.’

M-Files is integrated with Esri’s ArcGIS in a well management portal that exposes production, land and operations data. The solution provides traceability and versioning of documents and related information. M-Files is also used to manage authorizations for expenditure and license applications. More from M-Files or watch the video.


PNEC E&P Data conference 2016, Houston

ConocoPhillips on AI. Chevron’s IM in the cloud. KOC’s 'FDQOS,’ Fico-based optimization. Geosiris’ RESQML-2 validator. EnergyIQ drops standards bombshell. EnergySys on 10 years of digital oil. Noah/Repsol’s IM benchmark. EP Energy and ’small data.’ More from Petrolink, Shell and CLTech.

Last year we reported a shift in PNEC’s focus from data management per se to a ‘broader, holistic rendering of the business-data-IT triangle.’ The 19th Pennwell/PNEC data integration conference held in Houston earlier this year consolidated the transformation, under the oversight of GeoComputing’s Joel Allard, chairman of the PNEC advisory board.

ConocoPhillips’ Richard Barclay’s keynote offered a ‘drill down into analytics,’ tracing the history of artificial intelligence from the development of neural nets, expert systems and genetic algorithms in the mid to late 20th Century. Around 1980 things stalled with the onset of the ‘AI winter*’ as computers were not up to the task. Things changed with IBM’s Deep Blue chess playing computer, the DARPA driving grand challenge, Watson, Siri and now AlphaGo. Hardware and software performance has now caught up and, also, ‘all of human knowledge’ is available over the internet to train expert systems. In oil and gas there is much public domain information available that can be ‘scraped’ from websites and PDFs. Today, any job that entails analyzing data is a potential target for AI. ‘Your company is full of people doing these jobs.’ Barclay divvies up the problem set into ‘easy modeling,’ where the truth is measurable and amenable to AI, and more subjective questions such as a well top. ‘Data thinking’ goes beyond traditional physics models. Companies that rely only on physical models are leaving money on the table. Hess’ work on frac fluid and proppant is the poster child for the data-driven approach. In the 19th Century, the steam engine brought about the industrial revolution, putting a lot of horses out of work. Analytics will likely change some things for the better but it will also eliminate jobs. In the Q&A it emerged that AI has yet to impact the upstream. ConocoPhillips has tried IBM Watson on well data, but found that it relied on background information which was not available. Analytics teams are plagued by ‘traditional’ data management issues and spend ‘80-90% of their time on the data problem.’ Questioned as to whether data driven models will replace the simulator, Barclay replied that today, AI struggles with new data and with how generalizable its models really are. We are in a transition but AI will overtake a lot of these physical models. It would be nice to experiment more, but in the current climate there are not a lot of R&D dollars to try wild things.

Hakan Sarbanoglu (Chevron) offered a more nuanced take on upstream information management. The challenges of data diversity, long life cycle, real time and cross-silo working were well known before the arrival of ‘big data’ a couple of years back. Core applications keep subject matter experts happy, but they are distributed around the place and tend to have non-system-of-record storage. They are tied together with behind the scenes data integration and point-to-point ‘back door’ interfaces. More data is moving to the cloud. But this brings its own problems, like harmonizing multiple applications before the move. ETL and other tools don’t work in the cloud with its new paradigm of APIs and managed services. Hadoop, at 10 years, is old fashioned. Today we have the data lake which is ‘a lot like the old data warehouse!’ Chevron is piloting various access methods to the data lake, leveraging a logical data warehouse architecture, a canonical taxonomy and cross-silo master data. Echoing Barclay, Sarbanoglu concluded that the current low oil price is limiting deployment of the exciting new technology. Still, now is a good time to refresh your reference architecture and leap forward when things pick up.

Kuwait Oil Co.’s Khawar Qureshey, with help from Eudoxus Systems, has built a field data quality and optimization system (Fdqos) around Fico’s mathematical programming technology. The system helps KOC satisfy data quality requirements and maintain peak production by optimizing across wells, gathering centers and the export terminal. Data from KOC’s Schlumberger Finder database is optimized in a PostgreSQL optimization database along with Fico optimization modeler 4.4.0. Fico’s Xpress Insight rapid application development environment is used to produce web-based end-user tools. French software house Artelys provides support to the Fdqos team.
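As a much simplified illustration of the kind of allocation problem Fdqos addresses (and not KOC’s FICO Xpress model), the following sketch maximizes production from three hypothetical wells subject to a gathering center capacity, using scipy’s linear programming solver.

```python
# Illustrative production allocation as a linear program (not the Fdqos model).
from scipy.optimize import linprog

# Three wells, maximum rates in kbd; one gathering center capped at 180 kbd.
max_rates = [100, 80, 60]
center_capacity = 180

# linprog minimizes, so negate the objective to maximize total production.
c = [-1, -1, -1]
A_ub = [[1, 1, 1]]
b_ub = [center_capacity]
bounds = [(0, r) for r in max_rates]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("Well rates:", result.x, "Total:", -result.fun)
```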

Jay Hollingsworth provided an update on developments in Energistics’ standards line-up, with which Oil IT Journal readers should be familiar. Beiting Zhu-Colas (Geosiris) presented a recent development in the standards space, a public domain tool to explore and validate data in Energistics’ Resqml V2 format. Resqml is used to exchange reservoir models between different software vendors’ tools. Resqml uses Energistics’ packaging convention (EPC) to bundle standards-based reservoir data into an XML file with bulk data stored in HDF5. The Resqml Explorer checks data against the official schema and adds configurable business rules. Geosiris builds on the open standards theme with an implementation that leverages the Eclipse Foundation’s modeling framework and ‘Zest’ viewer to view the EPC package content as an interactive graph. Eclipse’s object constraint language is used for business rule validation.
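Since an .epc package is a zip archive of XML parts (per the Open Packaging Conventions) with bulk arrays in a companion HDF5 file, its content can be inspected with a few lines of Python. A hedged sketch, with placeholder file names, follows.

```python
# Hedged sketch: peek inside a RESQML package. File names are placeholders.
import zipfile
import h5py

with zipfile.ZipFile("model.epc") as epc:
    for part in epc.namelist():
        if part.endswith(".xml"):
            print(part)              # e.g. the grid and property objects

with h5py.File("model.h5", "r") as bulk:
    bulk.visit(print)                # list the HDF5 datasets holding bulk arrays
```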

EnergyIQ’s Steve Cooper dropped something of a standards bombshell with his proposal for new ‘data objects’ as a foundation for effective data management. Data objects are not new to the upstream, but are currently ‘hidden in applications.’ To support interoperability we need to abandon ‘point to point’ solutions. A data object should contain metadata, attributes, quality and governance information along with accepted values and ranges and an audit history. ‘We are not yet able to deliver this degree of granularity.’ The EnergyIQ team is working with a consortium of operators and vendors on attribute definition. Initial focus is on the well hierarchy, leveraging PPDM’s ‘what is a well’ work. The aim is for an ‘implementation-agnostic’ standard. ‘Folks like to talk about JSON, XML. We don’t want to get dragged down into the reeds on this.’ Ultimately, object definitions will be transferred to a standards organization. Cooper was joined by Matt Huber who, notwithstanding the technology ‘agnostic’ claim, described an open source environment comprising RESTful web services, NiFi, Kafka and NoSQL. ‘Finance and healthcare are already doing this.’ Huber showed a data exchange manager widget for transfer between GeoGraphix and OpenWorks along with services for data validation and coordinate transform. The standards community reacted vigorously to this encroachment into its bailiwick. Cooper argued, ‘We have no illusions. Previous attempts have gone nowhere. Agreement is hard. But look at WIAW, there is hope.’ For Energistics all this is ‘reinventing the wheel.’ More cross-examination from Oil IT Journal ascertained that the initiative a) has no name, b) is not close to PPDM and c) that the names of consortium partners are not available. Not a great start for an ‘open’ initiative?
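To make the proposal more concrete, here is a hypothetical sketch of such a self-describing data object; the field names and the validation rule are our illustration, not the consortium’s definition.

```python
# Hypothetical sketch of a self-describing 'data object' (illustrative only).
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class DataObject:
    name: str                                    # e.g. a well header attribute
    value: Any
    metadata: Dict[str, str] = field(default_factory=dict)   # units, source system
    accepted_range: tuple = (None, None)         # quality rule: valid bounds
    quality_flags: List[str] = field(default_factory=list)
    audit_history: List[Dict[str, str]] = field(default_factory=list)

    def validate(self):
        lo, hi = self.accepted_range
        if lo is not None and self.value < lo:
            self.quality_flags.append("below accepted range")
        if hi is not None and self.value > hi:
            self.quality_flags.append("above accepted range")

td = DataObject("total_depth", 35000.0, {"units": "ft", "source": "master db"},
                accepted_range=(0, 30000))
td.validate()
print(td.quality_flags)    # ['above accepted range']
```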

EnergySys’ Peter Black has reviewed ten years’ worth of digital oilfield writings. Of the 57 papers, ‘two were quite interesting!’ His quest for ‘definition and recipes’ for digital oil met with disappointment and led him to wonder if the digital oilfield existed at all. Most papers came from suppliers or vendors and, of the large oils, BP and Shell dominated. Some digital oilfield tools and techniques are not really being used. Others are ‘useful but not transformational.’ Was CERA’s DOF push a mistake? Black thinks it’s better to focus on an ‘efficient and productive oil field.’ Black disses big data as ‘a solution in search of a problem,’ although the cloud does get his seal of approval as ‘most important and impactful.’ Integration is easier in the cloud, as witnessed by Zapier, where there are over 500 cloud-based apps that can be linked together to create your own workflows. The digital oilfield was a bad idea because it put technology before the business. In the Q&A some skepticism was expressed as to the role of the cloud as an enabler of interoperability. Already there are competing cloud platforms in the oil and gas space. ‘Big oil and gas operators are hostages to proprietary clouds. Until this problem is solved you’ll never capture and use data on your own terms.’

Noah’s Fred Kunzinger presented the results of an E&P information management maturity benchmark study conducted for Repsol Exploration. Kunzinger observed that ‘industry is strange, it partners with its worst enemies in joint ventures. Execs all think that everybody else is ahead of them.’ In fact there are ‘pockets of brilliance and terror in every company.’ All have ‘hybrid’ IM organizational models, with 1/3 reporting to IT, 1/3 to the business and 1/3 to a technical services group. Data and information management often equates to geotechs on a mostly low to mid-level career path. While one of the 11 companies studied is doing enterprise level IM planning, most are ‘tactical’ and project based. Automation is ‘not really there’ and there is a lot of ‘gimme the data and I’ll do it myself!’ Many users only trust what they control. This is ‘kind of sad,’ and one in the eye for developers of corporate repositories. Businesses do not seem to care whether or not their local information architecture ties into the enterprise! Only a couple had formal governance in place, others may do it on a case by case/project basis, depending on individuals’ influence and preference. Data ownership fared better. Kunzinger despairs of exaggerated vendor claims, but yes, some 30-70% of time is still spent accessing the data. Some have full time data hunter gatherers. Data is not shared across assets. There are pockets of what passes as ‘analytics’ but this is actually reporting. Many have big data initiatives as in ‘we have Hadoop and we are trying to figure out what to do with it!’ A lot of dabbling is going on. On the final summary report card most score around 3 out of 5. There is however a consistent desire for improvement. Changing the culture is tied to the degree of executive involvement and support. The ROI of data management is hard to sell. ‘We all need to work on this.’ A tentative plot of IM maturity and corporate return on capital shows a good correlation.

David Johnson is not a big data dabbler! Petrolink’s cloud-based infrastructure drives drillers’ efficiency and scalability, in particular with a flagship deployment for a Middle East NOC. Petrolink’s technology targets larger oils and NOCs with big legacy databases. The NOC selected the solution to distribute a country-wide, 7.5 terabyte real time database. The ability to offer triple geographical redundancy and replication was a key consideration. Petrolink has also introduced process-oriented data quality, security and ownership and an alignment of terminology. The client’s main aim was cost reduction, which is hard when data management is considered a ‘cost of doing business.’ The trick is to figure the cost of ‘corporate non-productive time.’ The new solution represents a move forward from the sneakernet era, enabling analytics and avoiding mis-calibrated and unused sensors.

Patricia Herrera Torres told how back in 2004 an audit found that Shell’s geoscientists spent 50% of their time searching for and gathering data. This was down to the absence of technical data management personnel. Since then, Shell has defined new roles and positions, for instance a project data manager who understands IT, knows about data management principles and subsurface data types, and possesses soft skills. While the discipline is now established, it has a long way to go before it is embedded across Shell. One reason is that it is hard to attract graduates or to dislodge staff from other disciplines.

EP Energy’s Chris Josefy suggested a renewed focus on solving the ‘small data challenge.’ Josefy takes inspiration from the work of the Small Data Group, which aims for ‘timely, meaningful insights organized and packaged to be accessible, understandable and actionable for everyday tasks.’ The suspicion is that the big data movement has left people behind in the quest for some machine learning future. For EP Energy this boiled down to using the PPDM ‘what is a well’ pamphlet to align API numbers across geoscience, production and operations data sources. SAP Business Objects was then used to connect the company’s different data sources into a dashboard, a.k.a. an ‘analytical workbench.’ Josefy describes the approach as ‘the somewhat unified theory of people doing the right thing.’
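A hedged sketch of the underlying plumbing (not EP Energy’s implementation): normalize API numbers to a common 14-digit form and join the sources on the result.

```python
# Hedged sketch: align data sources on a normalized API well number.
import pandas as pd

def normalize_api(api):
    """Strip punctuation and pad to a 14-digit API number."""
    digits = "".join(ch for ch in str(api) if ch.isdigit())
    return digits.ljust(14, "0")

geoscience = pd.DataFrame({"api": ["42-501-20130", "4250120145"],
                           "formation": ["Wolfcamp", "Bone Spring"]})
production = pd.DataFrame({"api": ["42501201300000", "42501201450000"],
                           "oil_bbl": [1200, 950]})

for df in (geoscience, production):
    df["api14"] = df["api"].map(normalize_api)

print(geoscience.merge(production, on="api14")[["api14", "formation", "oil_bbl"]])
```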

Jess Kozman (CLTech Consulting) introduced the reference information model (RIM), a UML-based approach to analyzing the business. Kozman has used the RIM to make a graphical map of the CEO’s mind, a holistic, big picture of the business. The methodology combines Energistics’ earlier work on a business process reference model and PPDM’s ‘what is a well.’ CLTech has used this to help find the answer to questions like ‘what are our finding, development and lifting costs?’ The approach allows different stakeholders, who may be looking at data in different systems, to understand each other and reach agreement. In one instance this involved protracted analysis of exactly where to place the data custody division between subsurface and production. The exercise helps with workflow and data mapping and supports business decisions and change management.

More from PNEC.

* Wikipedia has it that there have been several AI winters and springs since then.


Folks, facts, orgs ...

AccessESP, AIS, Atwell, Audubon, Burns & McDonnell, Cartasite, Energistics, Enstoa, ESIA, GE Oil & Gas, Geoscience BC, GSE Systems, Halliburton, iRODS, Ikon Science, Laney, MUES, Oniqua, OPC Foundation, Paradigm, Enable Midstream, Rand Group, Texas Railroad Commission, Tecplot, Schneider Electric, Simmons Edeco, SNC-Lavalin, TRC, Trican, UltiSat, Warburg Pincus, NSI Tech.

AccessESP has appointed Anwar Assal as MENA region manager and Ed Sheridan as Asia Pacific region manager.

Joe Fijak is now executive VP and COO at American Industrial Systems.

Makram Musharrafieh is Middle East director of Atkins’ new ‘end-to-end’ consulting business, Atkins Acuity.

Engineer Atwell has appointed Donna Jakubowicz as director of marketing.

Audubon has named Stafford Menard VP deepwater development.

Steve Adcock, Jeremy Wilkerson, Kirsten Glesne and Chris McFarland have joined Burns & McDonnell.

Dan O’Neil is now VP business development at Cartasite. Heather Wiegand is marketing manager and David Taggart, director of sales.

Tommy Husvaeg (Accenture) and Bryan Pate (ExxonMobil) have been elected to Energistics’ board of directors.

Michael Matthews heads-up Enstoa’s new strategy and consulting business unit.

Energy Software Intelligence Analytics has appointed Douglas Montgomery as a non-executive director.

GE Oil & Gas has promoted Neil Saunders to VP subsea systems and Kishore Sundararajan to CTO and VP engineering.

Geoscience British Columbia has appointed Bas Brusche as VP, external relations and Ron Prasad as GIS specialist.

Emmett Pepe is GSE Systems’ new CFO. He joins from MicroStrategy.

Halliburton has reappointed Mark McCollum as CFO. Bill Albrecht has been named to the company’s board.

iRODS CTO Jason Coposky has been named interim executive director.

Ikon Science has promoted Julio Gomez to VP global sales.

Alan Snider has been named president and COO with Laney Directional Drilling.

Javan Meinwald has just created Marketing Upstream Energy Services to advise service companies.

Oniqua Intelligent MRO has appointed Joe Berti as CEO replacing retiree and co-founder Andy Hill.

Microsoft’s Matt Vasey has been named to the OPC Foundation’s board of directors.

Shiv Singh heads-up Paradigm’s new center of excellence in Mumbai, India.

Kenneth Greer is retiring from Enable Midstream Partners and from the PODS Association board.

Rand Group has signed Joe Eldridge as VP and CTO. He hails from Microsoft.

Katie McKee is director of public affairs at the Texas Railroad Commission.

Tecplot co-founders Don Roberts and Mike Peery are retiring. Tom Chan has been named president.

Schneider Electric has appointed Annette Clayton as North America president and CEO.

Simmons Edeco has appointed Gavin Sherwood as business development manager at its EU unit.

Martin Adler is to join SNC-Lavalin as president, oil & gas.

Jeff Wiese has joined TRC as VP pipeline integrity services. He was previously with the PHMSA.

Trican Well Service has appointed Deborah Stein as director.

Chris Hetmanski is now CTO at UltiSat.

Former Alberta Premier Jim Prentice has joined Warburg Pincus as oil and gas industry advisor.

In Memoriam

NSI Tech reports the death of its founder Ken Nolte, hydraulic fracturing pioneer and co-developer of the Nolte/Smith plot for interpreting net-pressure behavior.


Done deals

AFGlobal, Managed Pressure Ops., Ennoconn, AIS, AspenTech, Fidelis Group, Barco, MTT Innovation, Bruker, Yingsheng, C&J Energy Services, ETL Solutions, Merlin, S&P Global Platts, RigData, Schlumberger, Omron Oilfield and Marine, Xtreme Drilling, Trimble, GeoSpatial Innovations.

AFGlobal is to acquire Managed Pressure Operations.

Ennoconn has acquired 60% of American Industrial Systems.

AspenTech has acquired Fidelis Group, developer of asset reliability software products, Fidelis Titan 2 and Fidelis WST.

Barco has acquired MTT Innovation, developer of ‘next-generation’ projection technology and high dynamic range, applied imaging algorithms, advanced color science and hardware.

Bruker has acquired assets of Yingsheng Technology of Brisbane, Australia. The deal covers the ‘Amics,’ advanced mineral identification and characterization system.

C&J Energy Services has entered a restructuring agreement with its lenders. The deal includes Chapter 11 reorganization to ‘eliminate’ some $1.4 billion debt.

Employees at UK-based oil country data transformation specialist ETL Solutions have bought the company from its shareholders.

UK-based geoscience and data consultancy Merlin Energy Resources is now an employee owned company.

S&P Global Platts has acquired RigData, a provider of daily information on rig activity for the natural gas and oil markets across North America.

Schlumberger has acquired Houston-based well automation control systems specialist Omron Oilfield and Marine. Schlumberger has also acquired Calgary-based Xtreme Drilling and Coil Services for C$205 million.

Trimble has sold its energy transmission solutions unit to GeoSpatial Innovations.


Schneider Electric security for remote facilities

AccessXpert, co-developed with Mercury and Feenics, an open security add-on to SmartStruxure.

Schneider Electric has announced AccessXpert, a web-based security management solution that uses the cloud to provide facility owners and security personnel with remote access to video surveillance, intrusion detection and access control systems.

AccessXpert was developed in collaboration with Mercury Security and Feenics. Mercury’s ‘open’ hardware platforms provide standardized integration with third-party security devices such as door hardware or intrusion detection. Feenics’ access control software enabled the move into the cloud.

The solution is capable of operating in situations of limited connectivity such as remote oil and gas sites. Sites communicate securely into the cloud with no need for a local server. Access rights can be tweaked remotely to approve site visitors and remote validation of visitors is facilitated by a QR code scanner or IP-based card reader.

AccessXpert interoperates with Schneider’s SmartStruxure system for enhanced building efficiency and performance with support for the BACnet building automation and control protocol. Despite its cloud credentials, the solution is currently only available in North America.


Wireless World

Global tank levels. Subsea Ethernet. Broadband satellite for the oilfield. Managed comms for drillers.

Orange Business Services has provided worldwide connectivity to Sensile Technologies’ remote monitoring services, Netris and GasLink. These monitor oil and propane tanks and meters in the oil and gas industry. Sensile currently monitors some 60,000 tanks across 60 countries. The deal brings 3G/4G coverage with an initial deployment of 25,000 SIM cards.

Siemens’ ‘Advanced converter and switch,’ (ACS) subsea Ethernet communications system claims higher data rates over greater distances than current electrical connections. The ACS has demonstrated communication up to 84 km at 3000 meters water depth.

EMC is to provide broadband satellite communications services to Schlumberger under a multi-year contract. The deal covers equipment, global bandwidth, services and support for seismic vessels and service vessels. The solution includes EMC’s ‘multiprotocol label switching’ platform, C- and Ku-band antennas and automatic beam switching technology.

Harris CapRock is to install its CapRock One system on driller Transocean’s fleet. The multi-year contract renewal provides satellite communications services to current vessels and pending new builds.

Intellian’s 2.4 meter product line will provide C-band VSAT communications to Petronas’ new ‘next generation’ floating LNG vessel.


LBC Artificial lift congress - Pipe Fractional Flow

New analytical model for life-of-well and h-well artificial lift targets complex, multiphase flow.

Speaking at the 2016 LBC Artificial lift congress North America held in Houston earlier this year, Anand Nagoo (Pipe Fractional Flow) presented his analytical multiphase flow model for horizontal well planning and artificial lift. PFF’s approach differs from earlier empirical and mechanistic techniques which have failed to adapt to modern well trajectories and complex liquid loading and transients.

Various equipment combinations and operating conditions can change flow behavior. Slug control, bubbly flow and recirculating gas need to be considered along with well intervention costs and timing. The downturn represents an opportunity to innovate and to replace wrong paradigms with validated solutions. ‘Get efficient in a sub-$50 world or game over!’ To correct earlier ‘unreasonable arguments and practices’ Nagoo has established the global pipe flow database, Anna, billed as a publicly-accessible, cross-referenced source of data spanning over 65 years of peer-reviewed archival journal publications, conference proceedings, and theses from over 110 different multiphase flow loop labs.


Ontology summit Communiqué

Semantic interoperability ecosystems explained.

Semantics and ontologies crop up here and there in oil and gas as witnessed by our report on Statoil’s IoT on page 3. But it can be hard trying to figure out exactly what is meant when folks throw around terms like ‘semantics’ and ‘ontology.’

A recent Communiqué from the 2016 Ontology summit may help those struggling with the black art. The 24 page document describes the use of ontologies in ‘semantic interoperability ecosystems’ such as the internet of things, the smart grid and machine learning. The venerable ISO 15926 gets a mention, as does Nasa’s ‘Sweet’ earth sciences ontology. A section compares the different (competing?) upper ontologies including the BFO we reviewed recently (OITJ 2016 N°3). Download the Communiqué here.


Sales, deployments, partnerships ...

IHS, Wood Group, Librestream, Altair, Maplesoft, eLynx, PODS, RPSEA, Aker Solutions, Allegro, Atos, Berkana Resources, Tory Technologies, Ikon Science, Fairfield Energy, InApril, Geo Energy Group, JP3 Measurement, Motive, OneSubsea, Ortec, Rock Solid Images, AspenOne, Virtalis.

Following a performance-based assessment, Statoil has selected the IHS Kingdom interpretation suite as its primary geoscience platform for its US onshore operations.

Wood Group has signed a collaboration agreement with Librestream Technologies, adding the latter’s video collaboration application to its oil and gas operations, maintenance and integrity services.

Altair is to embed Maplesoft’s Modelica engine in its multi-physics simulators.

eLynx Technologies has ported its ScadaLynx software to run in the Microsoft Azure internet of things cloud.

PODS, the Pipeline open data standards association has engaged Image Matters to develop its ‘next generation’ standard.

RPSEA, the research partnership to secure energy for America has engaged lobbyist HBW Resources to identify and manage new energy research projects.

Idemitsu Oil and Gas has awarded Aker Solutions a front end engineering design contract for its Vietnamese developments.

Japanese LNG importer JERA has selected Allegro’s software to manage its global trading operations.

Atos is to provide a new digital services platform to engineer Subsea 7. The platform will be implemented on Atos’ Canopy cloud.

Berkana Resources has added Tory Technologies’ MaCRoM control room management solution to its portfolio of scada solutions.

Ikon Science, with help from Fairfield Energy is offering a 6,000 well database of the North Sea to help companies manage abandonment costs and subsurface risks during decommissioning.

Norwegian InApril has signed with Geo Energy Group to market its ‘Venator’ node-based seabed seismic acquisition system in Kazakhstan.

JP3 Measurement has announced a new hydrocarbon composition and physical property analysis service. Available packages include vapor pressure, gas and liquid composition, BTU and natural gas custody transfer.

Quintana Energy Services, Phoenix Technology Services, and Scientific Drilling International are to offer Motive’s bit guidance system for directional drilling through a channel partner program.

Woodside has awarded Schlumberger’s OneSubsea unit an integrated EPC contract for subsea production and boosting technology at its Greater Enfield project, offshore northwest Australia.

Analytics specialist Ortec is collaborating with Microsoft on an Azure cloud-based big data portal combining data storage, processing, analysis and advanced visualization. Tools include AIMMS, Spark, Hadoop, Spotfire and R. BP uses Ortec’s route optimization solution.

Rock Solid Images has licensed its rock-physics driven seismic and CSEM integration workflows to PGS for close integration of the technology with seismics.

Statoil has selected AspenOne’s MES, a ‘zero footprint’ HTML5 platform, as its corporate standard for ‘information manufacturing systems’ (sic) for offshore fields.

Granite Energy is working with Virtalis to promote use of the latter’s virtual reality technology in the oil and gas sector.


Standards stuff

PODS’ InfoGraphic. OPC Foundation, Energistics cooperate. IOGP on site survey. XBRL glossary. CEN, CENELEC and ‘single standard’ policy. US NIST’s million pound-force calibration machine.

PODS, the pipeline open standards association has published an ‘Info Graphic’ schematic of its next generation pipeline data model.

The OPC Foundation is working with Energistics to enhance interoperability between Witsml/Prodml and OPC UA. The collaboration will result in a ‘companion specification’ mapping the Energistics protocols to the UA information model. OPC UA servers will be able to consume or produce standard Witsml or Prodml documents in drilling and production workflows. Jay Hollingsworth (OPC and Energistics) chairs the workgroup. Other UA companion specs have been developed for the ISA-95 and PLCopen process control standards. Another upstream companion is under development for the MCS-DCS interface for subsea to topside integration.

IOGP has published the technical notes (IOGP No. 373-18-2) to its guidelines for the conduct of offshore drilling hazard site surveys.

The XBRL best practices board has published a glossary of ‘clear and simplified terminology’ for XBRL concepts, aimed at business users rather than software developers. XBRL also reports the first use of its Inline XBRL standard in a form 10-Q SEC filing.

The American Petroleum Institute has published a new onshore safety standard for tank measurement of crude oil. API MPMS Chapter 18.2 covers safe and accurate options for custody transfer from production lease tanks. In particular, how measurements made without opening the tank hatch protect workers from gas and hazardous vapors.

The EU CEN and CENELEC standards bodies welcome the EU Commission’s commitment to a ‘single standardization policy.’ This ‘should encompass all economic fields of activity including digital technologies, which until now have been addressed separately.’

A little off topic but… The US NIST has just completed a 16-month overhaul of its one million pounds-force deadweight machine, the largest in the world. The machine is used to calibrate load cells used to measure large forces such as the thrust of a rocket or (maybe) hook load sensors. Watch the fascinating video of the overhaul.


ALRDC New Technology seminar, Houston

Artificial lift R&D council highlights - Wansco’s Pump Reporter, ALRDC’s h-well dynamometer JIP.

Web/embedded systems and artificial lift geek Walter Phillips (Wansco), speaking at the 2016 Artificial lift R&D council’s new technology seminar, presented his work on Pump Reporter, a web-enabled add-on hardware module for pumpoff controllers (POCs). Pump Reporter is a Linux-based single board computer that reads data from the POC’s serial port and formats it for display as a web page. The Reporter offers a user-friendly interface, connectivity options and plug-and-play deployment.

The system scans the POC’s Modbus registers and maintains a copy of the POC data on its internal webserver for data management and remote access. Deployment is analogous to today’s Roku/Apple TV/Amazon Fire upgrade paths for existing ‘dumb’ TVs, all of which offer superior UIs compared to ‘smart’ TVs.
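As a minimal sketch (not Wansco’s code) of the pattern described above, the following Python script polls a set of Modbus registers in a background thread, caches the latest values and serves them from an embedded web server. The register map and the read_registers() stub are hypothetical placeholders; a real deployment would use a serial Modbus RTU library.

```python
# Sketch of the Pump Reporter pattern: poll POC registers, serve the cache as a web page.
import json
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical register map for illustration only.
REGISTER_MAP = {"strokes_per_minute": 100, "pump_fillage_pct": 101, "runtime_hours": 102}
latest = {}                 # shared cache of the most recent POC readings
lock = threading.Lock()

def read_registers(addresses):
    """Stand-in for a serial Modbus read; replace with a real RTU call in practice."""
    return {addr: 0 for addr in addresses}   # dummy data

def poll_poc(interval_s=5):
    """Background thread: scan the POC registers and refresh the cache."""
    while True:
        raw = read_registers(REGISTER_MAP.values())
        with lock:
            for name, addr in REGISTER_MAP.items():
                latest[name] = raw[addr]
            latest["timestamp"] = time.time()
        time.sleep(interval_s)

class PumpPage(BaseHTTPRequestHandler):
    def do_GET(self):
        with lock:
            body = json.dumps(latest).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    threading.Thread(target=poll_poc, daemon=True).start()
    HTTPServer(("", 8080), PumpPage).serve_forever()
```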

Phillips also demoed his 3D wellbore viewer, said to help avoid ‘potentially misleading’ 2D perspective views, particularly when investigating side loads on the rod string in ‘corkscrew’ wellbores. The system was presented at the 2015 SPE western regional meeting as SPE paper #174024.
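Phillips’ viewer itself is proprietary, but as a hedged sketch of the kind of computation behind any 3D wellbore display, the minimum-curvature method below converts a deviation survey (measured depth, inclination, azimuth) into 3D coordinates. The survey stations are made up for illustration.

```python
# Minimum-curvature conversion of a deviation survey to (north, east, tvd) coordinates.
import math

def minimum_curvature(survey):
    """Convert (MD m, inclination deg, azimuth deg) stations to 3D points."""
    pts = [(0.0, 0.0, 0.0)]
    for (md1, i1, a1), (md2, i2, a2) in zip(survey, survey[1:]):
        i1, a1, i2, a2 = map(math.radians, (i1, a1, i2, a2))
        dmd = md2 - md1
        # dogleg angle between the two station directions
        cos_dl = math.cos(i2 - i1) - math.sin(i1) * math.sin(i2) * (1 - math.cos(a2 - a1))
        dl = math.acos(max(-1.0, min(1.0, cos_dl)))
        rf = 1.0 if dl < 1e-9 else (2 / dl) * math.tan(dl / 2)   # ratio factor
        n, e, z = pts[-1]
        pts.append((
            n + dmd / 2 * (math.sin(i1) * math.cos(a1) + math.sin(i2) * math.cos(a2)) * rf,
            e + dmd / 2 * (math.sin(i1) * math.sin(a1) + math.sin(i2) * math.sin(a2)) * rf,
            z + dmd / 2 * (math.cos(i1) + math.cos(i2)) * rf,
        ))
    return pts

# Kick-off from vertical, building angle to horizontal, heading due east.
stations = [(0, 0, 0), (500, 0, 0), (1000, 30, 90), (1500, 60, 90), (2000, 90, 90)]
for p in minimum_curvature(stations):
    print([round(c, 1) for c in p])
```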

Weatherford’s Victoria Pons presented work performed in collaboration with Marathon on the ALRDC’s horizontal well downhole dynamometer data acquisition and analysis project. The Hwdda also uses 3D wellbore graphics of measured well bore paths and dynagraph data to improve understanding of the effect of deviated wells on rod pump diagnostics. The project, which is also backed by Occidental, Shell and several contractors, sets out to investigate pitfalls in current diagnostics (many of which were developed for vertical wells) with the development of a downhole hardware tool to measure in situ rod load and stress. New project partners are welcome. More presentations on the ALRDC website.


Going, going... green

ARPA-E’s Internet of Energy. CarbonSAFE funding. IFPen, Global CCS Institute on CCS take-up.

DNV GL has been engaged by the US DoE’s ARPA-E unit to produce and operate an ‘Internet of energy’ (IoEn) platform integrating up to 100 distributed energy resources in Texas. The project uses Geli’s ‘Internet of energy’ software.

The DNV GL-led ‘Win-win’ (wind-powered water injection) joint industry project reports that, in suitable circumstances, wind-powered water injection into oil and gas fields is feasible and can be competitive with conventional solutions. JIP partners include Nexen, ExxonMobil, ENI Norge, and Statoil.

The DoE reports that its Regional carbon sequestration partnerships have injected over 12 million tonnes of CO2 to date. The DoE also announced a further $68.4 million of CCS funding under a new CarbonSAFE initiative.

A new publication from France’s IFPen organization analyzes US incentives for CCS adoption and their implications for the EU. The study pinpoints failings in an earlier proposed EU subsidy.

Other ‘going green’ publications of note: an Introduction to Industrial CCS from the Global CCS Institute and a Summary of results from the UK’s CO2 storage appraisal project. Finally, the US President’s 2017 budget includes $260 million for CCS and a further $59 million for ‘crosscutting’ energy systems R&D.


Interica, Cegal team on Petrel data management

PARS project archiver integrated with Blueback Project Tracker.

Interica has partnered with Cegal to integrate the latter’s Blueback Project Tracker with Interica’s PARS E&P ‘application aware’ project archiving and retrieval solution. The move enhances Interica’s support for Schlumberger’s Petrel. Results from Blueback’s project scans can be captured into a PARS archive and stored as metadata along with a Petrel project. The combined solution is said to provide data managers with tools to maintain, order and structure Petrel project files and data across the network.

Interica CEO Simon Kendall said, ‘PARS interfaces with all the leading E&P applications. Integration with Project Tracker provides clients with efficient ways to manage their Petrel landscape.’ Cegal’s Ketil Waagbø added, ‘This integration allows users to get more value from their investment in the Project Tracker. These complementary products now assure full life cycle control of Petrel projects.’ More from Interica.


C3 IoT for Engie’s digital transformation

Three-year program is ‘one of the largest internet of things initiatives in the world.’

Engie (formerly GdF Suez) has selected C3 IoT’s internet of things platform as the technology foundation for its enterprise-wide digital transformation. C3 IoT will support Engie’s AWS/elastic cloud computing, big data, analytics, machine learning and internet of things initiatives.

Engie CEO Isabelle Kocher said, ‘C3 IoT is an essential partner in our transformation and will place us at the vanguard of the IoT revolution.’ At the heart of the initiative is Engie’s ‘digital factory,’ a group of 100 analytics and data scientists. C3 IoT, an enterprise-scale application development platform, combines cloud computing with ubiquitous access to sensor data, enterprise systems and external sources. Amazon’s IoT cloud services for connected devices also ran. The three year transformation is said to be one of the largest IoT programs in the world. More from C3 IoT.


GE’s Predix meets NOV’s MAX

Platforms combine to ‘bring FPSOs into the digital age.'

GE Oil & Gas and National Oilwell Varco are to collaborate on ‘comprehensive, data-led engineering solutions’ for floating production units that will ‘bring FPSOs into the digital age.’ The companies will deliver enhanced designs, technology and an industrialized supply chain that is expected to improve the economics of offshore development. Both parties are independently well down the road with cloud-based digital platforms, GE with Predix and NOV with MAX.

New digital solutions will optimize performance and provide predictive analytics through the life of the vessels, enabling FPSOs to efficiently adapt to a wider array of operating parameters. An ‘industrialized’ digital supply chain, global service and aftermarket capabilities will drive down the cost of offshore oilfield development. The packaged solutions will be available early in 2017.


Battelle’s predictive modeling toolset and hydrocarbon forensics

CO2 injection model repurposed for production optimization. New biomarker service announced.

The Battelle R&D organization has announced a new Tibco Spotfire-based predictive modeling toolset to help operators maximize production. The Battelle model leverages sensor data to forecast production, identify potential issues and recommend corrective action to avoid shut-ins. Real-time data is analyzed using sophisticated statistical modeling methods similar to those used by the financial, healthcare and intelligence/defense sectors. An anomaly detection algorithm monitors incoming data in real time, flagging divergences between forecast and observed values that indicate situations not accounted for in the model. The first application was developed during Battelle’s work on CO2 injection at the AEP Mountaineer power plant.
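Battelle has not published the detector’s internals, but the minimal Python sketch below illustrates the general forecast-versus-observation approach described above. The EWMA forecast, 3-sigma threshold and sample production rates are illustrative assumptions, not Battelle’s method.

```python
# Flag observations that diverge too far from a running (EWMA) forecast.
def detect_anomalies(series, alpha=0.3, k=3.0):
    """Flag points whose deviation from an EWMA forecast exceeds k residual st-devs."""
    forecast = series[0]
    resid_var = 0.0
    flags = []
    for i, obs in enumerate(series):
        resid = obs - forecast
        # flag once some history has accumulated and the residual is extreme
        flags.append(i > 5 and resid_var > 0 and abs(resid) > k * resid_var ** 0.5)
        # update the running residual variance and the EWMA forecast
        resid_var = (1 - alpha) * resid_var + alpha * resid * resid
        forecast = (1 - alpha) * forecast + alpha * obs
    return flags

# Example: steady production rates with one sudden drop (a possible shut-in precursor).
rates = [1000, 1005, 998, 1002, 1001, 999, 1003, 600, 1000]
print(detect_anomalies(rates))   # only the 600 reading is flagged
```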

Battelle has also announced a ‘comprehensive hydrocarbon analysis and characterization’ service using its hydrocarbon forensic toolkit. This now includes new classes of biomarkers such as sesquiterpanes, adamantanes and selected alkyl cyclohexanes. Battelle is in the process of building a reference library of hydrocarbon chemical fingerprints that can be used for source attribution. To date, some 79 samples from crude oil, petroleum distillates, coal, gas and tar have been analyzed.


Heriot-Watt researcher grows ‘smart’ rocks

€3 million in funding for rocks that talk back!

Mercedes Maroto-Valer of Edinburgh’s Heriot-Watt University has been awarded a €3 million EU Research council ‘advanced award’ for ‘frontier research.’ Maroto-Valer’s team is to ‘grow’ smart rocks that will ‘talk about what actually goes on deep underground.’ The plan is to 3D print porous rocks with incorporated micro sensors. The rocky avatars will then be subjected to reservoir conditions and the sensors will chatter away, telling the researchers how they are doing.

According to Maroto-Valer, ‘Work over the years has given us some idea about how liquids and gases move through porous rocks at a large scale, but we still don’t understand how the process works at the pore scale and how this differs according to rock type.’

Maroto-Valer told Oil IT Journal, ‘We are planning to use tools, including virtual reality to deal with the complexity of big data related to the porous networks, both to manufacture the replicas as well as for the subsequent treatment and data analysis.’

The research promises advances in the security of water, food and energy supplies, the efficient extraction of oil and gas and the potential for storing captured CO2. Funding comes as grant agreement N° 695070 from the EU Research council’s Horizon 2020 research and innovation program. At least it did before Brexit. Now all bets are off!


Bend it like ... Mentor

Computational fluid dynamics model shows how a soccer ball swings while a beach ball wobbles.

This is seriously off-topic but, as the Copa America and the Euro have been transfixing millions in front of their TVs, this timely research may be of interest. Previously we’ve heard ‘scientific’ analyses of how and why a football’s trajectory bends in flight which, err, fly in the face of reality. A new computational fluid dynamics model developed by Mentor Graphics’ Sergio Antioquia explains in detail, with embedded graphics, the physical phenomena that make the ball swing.

Besides gravity, three different aerodynamic forces act on a moving ball: drag, Magnus and wobble. You will have to read the paper to find out what these mean, but they combine in various measures to produce both the erratic (and unsatisfactory) trajectory of a beach ball and the elegant curveball of the well-struck free kick that just dips under the bar.
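For readers who prefer code to prose, here is a minimal Python sketch, unrelated to the FloEFD model itself, that integrates a kicked ball under gravity, drag and the Magnus force. The drag and lift coefficients, ball properties and kick are assumed values for illustration, and wobble is omitted.

```python
# Drag + Magnus ('swing') forces on a spinning ball, integrated with simple Euler steps.
import math

RHO, R, M, G = 1.2, 0.11, 0.43, 9.81         # air density, ball radius/mass, gravity
AREA, CD, CL = math.pi * R**2, 0.25, 0.25    # cross-section, assumed drag & lift coefficients

def cross(a, b):
    return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

def simulate(v, spin_axis, dt=0.001, t_max=2.0):
    """Return the trajectory of a spinning ball kicked with velocity v (m/s)."""
    pos, path = [0.0, 0.0, 0.0], []
    for _ in range(int(t_max / dt)):
        speed = math.sqrt(sum(c * c for c in v))
        drag = [-0.5 * RHO * CD * AREA * speed * c for c in v]      # opposes motion
        magnus = [0.5 * RHO * CL * AREA * speed * c                 # perpendicular to spin and v
                  for c in cross(spin_axis, v)]
        acc = [(drag[i] + magnus[i]) / M - (G if i == 2 else 0.0) for i in range(3)]
        v = [v[i] + acc[i] * dt for i in range(3)]
        pos = [pos[i] + v[i] * dt for i in range(3)]
        path.append(tuple(pos))
        if pos[2] < 0:                                              # ball has landed
            break
    return path

# 25 m/s kick, slightly lofted, with sidespin: the lateral (y) drift is the 'bend'.
traj = simulate([25.0, 0.0, 5.0], spin_axis=(0.0, 0.0, 1.0))
print(f"landed {traj[-1][0]:.1f} m downrange, bent {traj[-1][1]:.1f} m sideways")
```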

Antioquia produces streamlines and velocity contours from Mentor’s FloEFD application to illustrate the different effects and backs these up with real world YouTube videos of great goals of history. Ronaldo’s folha seca strike that leaves the goalie smiling in admiration from a seated position is worth a look. Download the gem of a paper from Mentor (registration required).


© 1996-2021 The Data Room SARL. All rights reserved. Web user only - no LAN/WAN Intranet use allowed.