Oil IT Journal: Volume 22 Number 7


Blue/Red clouds

Schlumberger’s ‘Delfi’ cognitive E&P compute environment trials in the Google cloud. Halliburton signs with Microsoft to leverage Azure’s ‘hyperscale, hybrid and global platform.'

At its annual user group meeting in Paris, Schlumberger announced ‘Delfi,’ a ‘secure, cloud-based, cognitive*’ E&P environment. Delfi combines domain expertise and digital technologies to enable a ‘new approach’ to planning and operating E&P assets.

First Delfi application out of the starting blocks is DrillPlan, a well planning solution spanning project management, construction and design validation. DrillPlan was developed for North American shale operations where it has been used to generate ‘hundreds of well plans’ in a year-long trial. DrillPlan will be available across North American land operations from Q4 2017, with global roll-out in 2018.

Schlumberger’s EVP technology Ashok Belani said, ‘With the launch of Delfi, we deployed an E&P data lake on the Google Cloud with over a thousand 3D seismic surveys, 5 million wells, a million well logs and 400 million production records from around the world, demonstrating a step change in scalability and performance.’

The scalable Google high performance computing (HPC) cloud has also been used in a seismic depth imaging context where Schlumberger has processed a 100,000 sq. km high-resolution, wide/full-azimuth Gulf of Mexico survey using reverse time migration and full waveform inversion.

Other Schlumberger applications, including Petrel, will leverage Delfi’s cognitive smarts ‘real soon now.’

Meanwhile, on the other side of the blue/red divide, Halliburton has announced its move into the cloud in a joint venture with Microsoft. Microsoft Azure VP Jason Zander spoke of ‘intelligent solutions that will drive the next generation of oil and gas exploration and production, leveraging Azure’s hyperscale, hybrid and global cloud platform technologies.’

The offering includes machine learning, augmented reality, internet of things and HPC. Use cases include deep learning for reservoir characterization, modeling and simulation, mixed reality and more.

First app in the Azure cloud is DecisionSpace365, which now pipes real-time data from IoT edge devices into deep-learning models and predictive analytics.

Other components of the alliance include voice and image recognition, video processing and digital twins for inspection with Microsoft’s HoloLens and Surface devices. Oil wells and pumps ‘at the edge’ will connect through the Landmark Field Appliance into the Azure stack. Azure is also Halliburton’s preferred public cloud provider for iEnergy.

* According to Wikipedia, ‘There is no widely agreed definition for cognitive computing in either academia or industry. It is most frequently used as marketing jargon.’


OPC’s pitch for OPAF

OPC-UA touted as preferred route to Exxon/Open Group’s Open process automation forum (OPAF) standard. UA ticks all OPAF boxes.

There was some scratching of heads when ExxonMobil teamed with The Open Group last year, announcing the Open Process Automation Forum (OPAF) with the goal of establishing a new standard for process control. After all, OPC, the Open Process Control standards body, had been touting its Unified Architecture (OPC UA) as the interoperability standard for industrial automation for some years. Moreover, OPC UA targets specific use cases in the Exxon initiative, including the internet of things and PLC systems, across industries including oil and gas.

Spotting an opportunity, OPC has produced a position paper, ‘OPC Unified Architecture: The Interoperability Standard for Industrial Automation,’ arguing that OPC UA ticks all the boxes in the OPAF spec. OPC argues that oil and chemical companies are already developing standard object models that plug directly into the OPC UA architecture.

Moreover ‘these companies have formed a community of end-users and suppliers closely working together under the umbrella of standards organizations.’ To further its case, the OPC Foundation has joined The Open Group to ‘actively participate’ in OPAF.


Game over for France’s E&P!

France’s upstream oil and gas industry is officially shut down for ever. ‘Environment’ has replaced ‘Industry’ as former TV star Nicolas Hulot plays the role of Louis XIV’s finance minister Jean-Baptiste Colbert. Ex-Paris Basin explorer Neil McNaughton takes a stroll down memory lane and regrets the passing of a cottage industry.

In the 1980s, I got head hunted from my then major employer and went to work for a small, adventurous American independent that had set its sights on the Paris Basin. The area’s potential had been spotted by some savvy consultants who had translated and tarted-up the vast amounts of publicly-available literature on France’s geology. The company noted that a) the number of wells drilled per acre was a fraction of those drilled in Texas, b) a small but promising oil boom in the 1970s had proved production and c) the oil price was back on the rise. But what clinched the deal was the opportunity for great food and wine and traveling around some nice countryside saying ‘Parlay-voo?’

My responsibility at the time was organizing seismic surveys. This was extremely easy* since all ‘industrial’ activity was regulated and policed centrally. Louis XIV’s finance minister Colbert was partly responsible for this and, 300 years on, ‘Colbertism’ was still going strong. A central authority had the responsibility for issuing exploration and production permits. Regional authorities supervised operations and (best of all for us explorers) just about every action, from surveying a seismic line to drilling a well was codified such that negotiations with landowners were straightforward. Compensation was determined according to a scale (depth of ruts, width of crop damage) that had been negotiated between the state and the farmers’ unions. These regulations were not unique to oil and gas. I remember one farmer sighing as he saw us coming. He had seen first a freeway and then the TGV come through his land. We were a relatively minor nuisance.

Mining (and oil and gas) rights were held, not by the landowner, but by the state. There were no payments to be negotiated in respect of future production, although a small royalty was paid to local communes, which seemed to keep them happy. It was a goldilocks environment for operators. The Paris Basin, by the way, was a well established oil province even in the 1980s. One field, Shell’s Saint Martin de Bossenay, was a proving ground for low cost operations and for environmental protection. An aging operator-owned workover rig trundled around cleaning up the wells. Local labor was used and a large fish tank filled with surface water from downstream of the production facility was visible proof of clean operations.

After a year or two, our efforts (and those of other operators) proved that there was indeed life left in the Paris Basin. With the oil price riding high at $40/barrel ($120 in today’s money), companies of all sizes rushed to get in on the action. It was kind of the Permian Basin of its day.

Two happenings conspired to end the boom. The oil price dropped from $40 to $10 and the exchange rate shifted from 10 Francs/Dollar to 4. The price of a barrel in local currency dropped from 400 to 40 francs per barrel! Just about everyone shut up shop and left. I was left holding the baby of ‘one of France’s largest acreage positions.’ A bag full of buggy whips, as I explained in an earlier editorial. My title changed from CEO to Liquidator and I resolved to find a more stable form of employment that was not subject to the vagaries of the oil price. That didn’t happen!

In the mid noughties I was enticed back into France’s exploration scene as a part-time local contact for another operator. Boy, had the music changed! Colbertism was still around, in that the central government called the shots. But in the interim, the Ministry of Industry had been conflated with the Ministry of the Environment. Ministers from both the left and the right were far less supportive of our kind of industry. Some were not very supportive of the environment either. One, Ségolène Royal of COP21 fame, was notably against any environmental measures that might be construed as ‘punitive.’ (Read, do nothing that might be unpopular!)

The effect on the ground was that the strong support that explorers had from regulations and officials evaporated. Our efforts fell into two categories. When we operated in ‘oil country’ we managed to drill. But elsewhere the lack of support from the authorities and the redoubtable efficiency of the green movement in generating protest at the local and national level made drilling impossible. When I say ‘efficiency’ I am being charitable. The internet was getting going as a means of organizing protest and Gasland-induced fury was rife. Fake news about bribes and kickbacks circulated. I bowed out from the scene for the second time. There was just too much grief. Since then, I’ve kept tabs on French exploration, which has carried on bravely. I even got hold of the franceoilandgas.com domain name with the vague intention of doing some blogging. That didn’t happen either.

I must say that I had hoped that the new Macron government would treat oil and gas a little more kindly than its predecessors. But no. The new environment minister, one time TV star Nicolas Hulot, has given official backing to the environmental movement. I am tempted** to say that the ‘no punitive environmental measures’ policy has been abandoned in favor of ‘unless the victims are small and unlikely to cause too much trouble.’ I am further tempted** to observe that the brigade of protesters that travels around the country in diesel-powered vehicles is just about as likely to protest nuclear power stations, windfarms and just about anything else in their back yards. I confess that if it was my back yard, I, like Rex Tillerson, would probably be alongside them. Oh for Colbert!

* it was even easier since CGG (now in Chapter 11) actually did the work!

** temptation got the better of me!

@neilmcn


Interview - Derek Middlemas, Digatex

Former Aveva data specialist explains rationale behind consulting startup. The ‘digital twin’ should be built on a sustainable 'digital asset.’ But current information management practices fall short.

Why Digatex?

I was 20 years in engineering data, first with Intergraph, then with Aveva. Last year I left to start up Digatex when I saw an opportunity to help operators improve their use of asset data, an activity that is often bogged-down by vested interests and legacy baggage. Digatex’ aim is to help owner-operators realize the promise of the digital asset. This is less a technology problem, more of a business process issue. Companies need to change their way of working and apply technology pragmatically. Here we advise a two step process. First define a digital asset concept and strategy, then transition from a document-centric approach to the digital asset.

Engineering systems today may not follow IT best practice but they do work. These complex interconnected systems rely on documents. Going to a database would require a great upheaval.

Sure, no project starts without document control in place. You do need a document department, without which there would be anarchy. But you can still have digital technology with new functionality. The transition from document to digital is at the core of Digatex’ hybrid consulting technology and digital services offering.

One issue that we try to track is the influence of standards for engineering data capture and handover. Are these on the wane as companies go for shrink-wrapped solutions such as Aveva?

Owner-operators get a lot of management pull in different directions. Standards may get agreed upon but they are not always applied in the heat of the battle! I was involved with ISO 15926 and I was on the Fiatech board. But sometimes folks get too involved with taxonomies and ‘upper ontologies,’ which turns people off! Having said that, ISO 15926 offers a considered approach to information management. Vendors are now more in line with the information modeling approach, which is quite an achievement. But again, this is essentially a business problem, not a technology problem. Retailers and car manufacturers all know which supplier provides a given part. Not so in oil and gas. Owner operators don’t understand or value their data, mainly because they have outsourced their supply chain management. Data management is not generally in the right budget, or may not be budgeted for at all.

Data standards help with the digital transition but in oil and gas these tend to be developed by small groups. Building information management used to be in the dark ages until the BIM agenda was captured by the UK government. A mandatory standard gets much better traction! Non-mandatory standards like ISO 15926 have been useful in bringing the problems to light and have provided a framework for vendors. AvevaNet leverages the reference data library (RDL) concept of Part IV.

But not the semantic technology of RDF.

It’s like pure and applied math. Companies like Datum360, Intergraph and Bentley all have tools to manage the RDL, but is there a market for them? There are no takers among the operators. There is just not enough pain in the industry!

We were surprised a while back to learn that in the refinery, the simulator is just used for training. Why has this not been extended across the digital asset?

Because the data is not good enough to drive the plant. Plant businesses are good at making lots of point apps that use little bits of data. But plant data doesn’t make its way into these systems and soon they are out of phase with reality. Things are changing, slowly, which is where we come in, advising on digital transformation strategy, working with clients to evolve contracts and to stop wasting money on technology that doesn’t do what they need.

So the offering is mostly an advisory?

Partly, but also practical. For instance, we offer Digital Intelligence for automating document control and classification. This uses machine learning to achieve a quantum leap in drawing and document control, not only saving a significant amount of time and money but also vastly improving the accuracy and availability of information. One important aspect of automating document control is to audit and remediate the document classification and metadata. More from Digatex.
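Digatex does not publish its pipeline, but the general technique is well understood. A minimal sketch, assuming a bag-of-words model in Python’s scikit-learn (the training texts and class names below are invented), classifies incoming documents and surfaces a confidence score that can drive the audit and remediation step:

# Hypothetical sketch of ML document classification, not Digatex' code.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training set: text extracted from already-classified documents.
docs = ["isometric piping drawing rev 2", "pump datasheet 50Hz motor",
        "instrument loop cable schedule", "P&ID process flow sheet"]
labels = ["drawing", "datasheet", "schedule", "drawing"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(docs, labels)

# Classify an incoming document; a low maximum probability flags it
# for human review, i.e. audit and remediation of the classification.
text = "discharge pump datasheet"
proba = clf.predict_proba([text]).max()
print(clf.predict([text])[0], round(proba, 2))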


New developments in natural language processing

Lease Analytics, Finch benchmark, IBM deprecates Alchemy, OpenText Magellan, Voyager Search.

Lease Analytics has established the Human intelligence and language technologies (HiLT) fund at the University of North Texas at Denton. Director Eduardo Blanco said, ‘We will find new ways to tag oil and gas leases and contracts with categories of interest, and improve classification, search and clustering beyond word presence.’

In an internal benchmark, QBase unit Finch Computing tested Finch for Text against competing text analytics solutions including API vendors Rosette, Lexalytics, Alchemy (now part of IBM Watson), Google, Microsoft and NetOwl. Finch ‘won nearly every head-to-head comparison, across every entity type.’ Parent company QBase markets MetaCarta, the geographic search tool previously marketed to E&P by Schlumberger. Finch is working to convert MetaCarta customers to Finch ‘because of its higher fidelity and utility.’ The 15-page benchmark whitepaper is available for download (registration required).

Speaking of Alchemy, IBM has just released a 130-page RedBook, ‘Building Cognitive Applications with IBM Watson,’ where we learn that the Alchemy language API is deprecated in favor of ‘Watson Natural Language Understanding.’ Watson NLU analyzes semantic features of input text and extracts metadata from content such as ‘categories, concepts, emotion, entities, keywords, relations, semantic roles, and sentiment.’ Potential Watson users might like to check out this critical review from analyst James Kisner which concludes that while Watson is a mature platform, deployments rely on ‘expensive service and consulting engagements with IBM, limiting broader adoption.’

Other recent developments from the text analytics front include OpenText’s Magellan AI platform and new natural language processing functionality in Voyager Search.


EAGE Workshop on data science in geoscience

Total group data officer on ‘learning more from less data.' Luchelan/Ovation ‘don't use Hadoop!’ Shell’s LoRaWAN wireless IoT standard. Teradata on the IT/OT fork. Agile on the data revolution.

Michel Lutz, group data officer with Total, provided the keynote to the 2017 EAGE workshop on data science for geosciences. Before joining Total, Lutz was a researcher at France’s LIMOS research unit and author of a book* on data science. Lutz traced the history of artificial intelligence and machine learning from its origins in 19th century statistics to the current buzz driven by its use by Amazon, Google and others. Reinforcement learning was illustrated by Google’s DeepMind learning to play Breakout!

Much of AI was developed in the 1960s and 70s. Today techniques such as neural nets and decision trees are being ‘democratized’ with open source software and accessible big data and compute resources that are ‘revealing the power of these methods.’ The GAFA** web giants have put these techniques at the heart of their business, boosting scientific and technology development. The challenge today is to ‘learn more from less data,’ extending these techniques into fields like geosciences, where labeled data may be scarce. In parallel there is a shift from learning to ‘understanding,’ as exemplified by Stanford’s Karpathy ‘deep visual-semantic alignments for generating image descriptions.’ But today these require considerable data preparation and model tuning. Learning is limited to a specific task. More autonomy is required of ML. Also, caution is required as statistical learning may embed biases present in the input data.

AI is a broad field, based on old statistical foundations. Machine learning on big data sets is something new. Potential areas of application include textual competitor analysis, integrated reporting, imagery (cores, thin sections), real time and static structured data. Total’s own efforts include a ‘competitors cruncher’ that mashes data from IHS, Rigzone and DrillingInfo. Another app predicts production in shale wells from decline curve analysis, augmented with data-driven analytics of well parameters and location. This has demonstrated predictions with an R2*** of over 0.8, said to be ‘exceptional’ for shale. Other trials include semantic analysis of technical documentation, seismic trace classification, biomarkers, HSE incident analysis, real time analytics of blowout/kicks and rotating machinery. IBM Watson was trialed on an analysis of biomarker reports.
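Total’s decline-curve app is proprietary, but the baseline computation is easy to illustrate. A minimal sketch, fitting an Arps hyperbolic decline to synthetic monthly rates with scipy and reporting R2 (all values invented):

import numpy as np
from scipy.optimize import curve_fit

def arps(t, qi, di, b):
    # Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)**(1/b)
    return qi / (1.0 + b * di * t) ** (1.0 / b)

t = np.arange(1, 37)                                  # months on production
q = arps(t, 1000, 0.15, 0.9) * np.random.normal(1, 0.05, t.size)

popt, _ = curve_fit(arps, t, q, p0=(1000.0, 0.1, 1.0))
pred = arps(t, *popt)
r2 = 1 - np.sum((q - pred) ** 2) / np.sum((q - q.mean()) ** 2)
print(f"qi={popt[0]:.0f}, di={popt[1]:.3f}/month, b={popt[2]:.2f}, R2={r2:.2f}")

Augmenting this with well parameters and location, as Total describes, would add those variables as regression features; the sketch covers only the decline-curve baseline.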

Total’s data science ecosystem is a smorgasbord of open source acronyms (RStudio, Python, Grafana, Bootstrap, Kibana, Hadoop, Spark...) alongside tools like Power BI, Spotfire and Tableau. The next challenge is to put algorithms into production and make the data science hype ‘disappear.’ On a closing note Lutz observed that ‘a high level of expertise in data management and governance is required to enable data science across the business.’

In the Q&A, Lutz was quizzed on the use of open source software, specifically on the likelihood of Total contributing open data to the community. Lutz admitted that today, Total is a consumer but recognizes the importance of giving back. The problem of convincing users of the validity of non physics-based models was also raised. Lutz agreed, this is a big challenge, ‘not just in oil and gas.’

Alan Smith (Luchelan/Ovation) related a failed attempt to use Hadoop for seismic data management (DM). Conventional seismic DM holds an index in Oracle and data on disk. This causes problems when extracting data to build multi-client data sets, which requires manual intervention, ‘tape monkeys’ and so on. Hadoop has been presented as a panacea for manipulating data. Tests performed at Ovation Data found that data could be recovered fast, if one could accept unordered traces. When data order is important, ‘everything slows down.’ The conclusion, ‘don’t use Hadoop, it is not wonderful!’ Instead, Smith advocates putting the data into a NoSQL database and ‘using the principles behind Hadoop’ to speed up retrieval by a couple of orders of magnitude. Parallel workflows leverage multiple 10GB links into the cluster. The study was funded by InnovateUK; ICL provided Hadoop support.
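Smith did not detail Ovation’s implementation. The principle, an index keyed on trace headers so that ordered retrieval becomes a key lookup rather than a tape-order scan, can be shown in a toy sketch where a Python dict stands in for the NoSQL store:

index = {}  # (inline, crossline) -> (file_id, byte_offset)

def ingest(inline, xline, file_id, offset):
    index[(inline, xline)] = (file_id, offset)

def extract_ordered(inlines, xlines):
    # Yield trace locations in survey order, whatever the storage order.
    for il in inlines:
        for xl in xlines:
            loc = index.get((il, xl))
            if loc is not None:
                yield (il, xl), loc

ingest(100, 205, "tape_017.sgy", 240_000)
ingest(100, 204, "tape_003.sgy", 96_000)
for key, loc in extract_ordered(range(100, 101), range(204, 206)):
    print(key, loc)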

Turning to the sharp end of the big data spectrum, Hadi Jamali-Rad presented Shell’s wireless internet of things (IoT) applications. Shell has standardized its land seismics and pipeline sensor networks on LoRaWAN, a cheap, low power, long range solution that connects into the cloud for ‘world wide accessibility.’ LoRa has undergone extensive testing on the 40x40 km Groningen microseismic interferometer network. One test sent LoRa data via a balloon and on through four service providers over a 354km range.

Teradata’s Duncan Irving traced the split, a couple of decades ago, between IT and operations technology. Upstream users live in the OT field which is characterized by point solutions, big data silos and ‘the rise of application users as opposed to scientists.’ Meanwhile business-at-large has captured IT with at-scale analytics on Hadoop. Geosciences’ HPC architectures were designed for physics, not analytics and there is a cultural resistance to big data. Formats like SEG-Y/D are ‘hard to crack open and use by the data scientist.’ Jane McConnell took over to advocate a subsurface data lake providing access to geoscience data. But, ‘you still need a data model,’ leveraging PPDM and Energistics concepts. Another litany of open source tools was proposed (NiFi, Kylo, PostgreSQL, MySQL, MariaDB…) along with the Teradata ‘Think Big’ data science lab used at the hackathon (page 9).

Matt Hall (Agile Geosciences) is ‘fostering a high impact ML ecosystem for subsurface and engineering.’ Hall forecasts a data revolution ‘akin to stacking or seismic stratigraphy.’ DeepMind/AlphaGo have shown the way forward, but the bandwidth and scale of geoscience data is ‘unusual’ and ‘expensive.’ Figuring the 4D history of the planet ‘is harder than predicting a film preference on Netflix.’ Another caveat, AlphaGo is ‘not open source, not reproducible.’ Hall plugged the Cornell arXiv.org as the preferred outlet for open source/reproducible publishing. Big data also has ‘endemic problems’ with data quality and units of measure. In Canada there is ‘unpleasant litigation’ over ownership of seismic data and what is/should be in the public domain. Elsewhere, the MNIST/IRIS datasets are ‘very useful.’

More in our next issue.

*Data Science: Fondamentaux et études de cas, Eyrolles.

** Google, Amazon, Facebook, Apple.

*** Coefficient of determination.


Software, hardware short takes ...

Ansys Discovery Live, Baker Hughes Circa, CMG CoFlow, ESI Pipeline Manager 4.3, Enertia FDC, Entero Mosaic 2017, ETL on AWS, FracMan 7.6, INT GeoToolkit, GreaseBook ‘Summer 17,’ Sony/IBM tape record, IHS Kingdom 2017, Petrosys PRO, Exprosoft Wellmaster IMS.

Ansys’ new Discovery Live technology leverages Nvidia GPUs to provide internet of things developers and engineers with real time interactive visualization of design changes.

GE unit Baker Hughes has rolled-out Circa, an application to simulate coiled tubing operations. The software validates theoretical models against data from the field. Applications include tubing force analysis, wellbore hydraulics, solids transport modeling and real-time job optimization from inside the control cab.

CMG and partner Shell are seeking customers for trial modelling with CoFlow, an integrated asset modelling solution spanning reservoir and production networks.

Energy Solutions’ PipelineManager 4.3 now includes support for RuptureSentinel, a separately-licensed tool that provides ‘robust’ detection of ruptures and large leaks. An OPC DA mirror server adds new, customizable connectivity options.

Enertia Software has released an iPad version of FDC, its field data collection package. An Android version is to follow.

Entero Mosaic V2017 targets forecasting unconventional resource play production and reserves. The software covers reserves management, budget and planning, capital management, decline analysis, and petroleum economics. New features include extended volume and cost streams for production, injection and primary and secondary products and ratios along with advanced sensitivity analysis.

ETL Solutions’ DataHub data integration service now connects to Amazon Web Services (AWS), enabling client data to be processed in the cloud. The connection leverages Amazon’s S3 cloud storage and Redshift database.

FracMan 7.6 introduces new tools for geomechanics, geostatistics and geophysical data integration. Windows 10 is now also supported.

INT has released GeoToolkit.JS 2.4 with multi-well correlation, improved performance for large datasets and better data sharing and printing. INT has also launched its Developer Community.

The ‘Summer 17’ release of GreaseBook’s eponymous app now runs natively on Android, providing pumpers with on-site access to historical commentary, production graphs, and well history.

Researchers from Sony and IBM Research have demonstrated magnetic tape storage technology with an areal recording density of 201 Gb/in². When commercially available, the technology will provide 330TB/tape cartridge. Today’s tapes hold a meagre 15TB.

IHS Kingdom 2017 includes a new well query dialog, a new frac treatment module and a connector to IHS Markit data.

Along with the name change to Petrosys ‘PRO,’ the 2017.1 release includes new map templates, DecisionSpace Geoscience integration and improved surface modeling. A module originally developed for Origin Energy improves drilling data collection with version control and approval workflows.

Exprosoft WellMaster IMS adds flexible reporting and trend analysis for preventative maintenance and compliance reporting. A graphical interface flags overdue reporting and prioritizes behind-schedule wells.


NDR 2017, Stavanger

National Data Repositories 2017. NPD on Diskos and open data. North Sea Data Management Forum. Africa Petroleum Data Management Forum. CDA on E&P DM-BOK and training program. Alberta’s International Centre of Regulatory Excellence.

Some 165 delegates from 30 countries met earlier this year for National Data Repository 2017 (NDR) in Stavanger, Norway under the auspices of Energistics and the Norwegian Petroleum Directorate. NPD director Bente Nyland underlined the benefits that data access (through Diskos) has brought to Norway’s oil and gas industry. ‘Companies compete on the use of the data, not by limiting access to it. This open data policy gives us a competitive advantage in attracting global investment.’

Other conference sessions underlined the benefits of standards including software interoperability, lower cost of reporting, and access to high value data, ‘even decades later.’ However, delegates at the concurrent ‘science and analytics workshop’ expressed frustration that so little subsurface oil and gas data is ‘analytics-ready.’ Much E&P data is stored as scanned images or in inconsistent legacy formats that are hard or impossible to access. The NDR Data Quality Project, launched in 2012, is approaching conclusion, with three ‘substantial’ documents to be published by Energistics covering the business case for data quality; core data quality concepts, business rules and metrics; and implementation guidance.

Other presentations included an update on the North Sea Data Management Forum which has seen a memorandum of understanding between countries to agree an annual program of work, ‘making the North Sea a simpler, and more cost-effective place to work.’ In a similar vein, the Africa Petroleum Data Management Forum has been established with help from Norway’s Oil for Development program.

Delegates expressed enthusiasm for CDA’s proposal for a published ‘body of knowledge’ for oil and gas data to include know-how and best practices for professional data managers. On the training front, CDA reported on the data management educational programs in the UK and the Alberta Energy Regulator announced the creation of ICore, the International Centre of Regulatory Excellence, a joint venture with Mexico, to offer broad based training to future energy regulators.

Thanks to Energistics for help with this report.


EAGE 2017, Paris

Doom and gloom as plenary hears lower for longer mantra. Seismic industry in peril! Spectrum Exploration ‘now is the time for ultra-deep water exploration.’ BP’s automated life-of-field workflows. Ikon on geophysics in drilling. DMAKS and sedimentological metadata. Total’s Metis fantasy seismics and Alternative Subsurface Data wireline testing facility. Schlumberger models in the cloud. Efficient, portable code on GPUs. Inside track on Halliburton’s digital transformation.

The plenary EAGE Forum’s theme ‘new collaboration models in exploration’ was a tough one since exploration has already tried pretty well all conceivable forms of collaboration. The panel therefore focused on the more pressing issue, survival, in what is now considered to be a lower for longer oil price scenario.

ENI’s Luca Bertelli, while warning that ‘price forecasts are always wrong,’ opined that ‘low’ would likely last through 2018 and force a complete industry ‘reset.’ Bertelli is skeptical of unconventionals. A few shale operators are ‘making skinny margins in the best basins.’ Much has been made of efficiency gains but these have now plateaued and there is little room for further improvement. In fact, costs are creeping up again. It is getting harder to justify all this to shareholders. Conventionals remain a better bet, especially when an unconventional mindset is applied to shorten exploration-to-production cycle times. Companies should select simple, conventional assets where infrastructure is in place, cooperate with geophysical contractors on multi-client surveys and ‘respect roles without pampering each other.’ ENI is working towards integration across all geoscience disciplines, leveraging its in-house 10 petaflop compute power to mitigate risk with proprietary algorithms. Poster child for the near-field approach is the Nooros Egyptian gas development.

Jean-Georges Malcor could probably have done with a bit more ‘pampering’ from CGG’s clients. Malcor complained of unfair competition from companies with lax HSE standards. Massive cost reductions (50, 60, even 80%!) are all very good but are they sustainable? Geophysics is a very capital intensive industry and contractors can’t invest with a six month horizon. Senior-level talk of collaboration is not translating into procurement, which is all about lower price rather than value for money. Malcor is a relative newcomer to oil and gas. He initially expected a dynamic industry, but ‘not at all, ideas take too long to adopt!’ CGG now has 60 petaflops of compute power and ‘proposals for 100PF are on my desk!’

Eric Oswald (ExxonMobil) offered a tale of non-cooperation at Shell’s expense! ExxonMobil’s Liza oilfield in Guyana was the result of plugging ahead on a ‘super high risk’ play that partner Shell decided to exit. The discovery well was followed by a ‘rush’ 17,000 sq. km. survey, performed by CGG, the largest proprietary survey Exxon had ever done. More speedy decision making made for first oil in under four years from the final investment decision. ‘We wondered why it took so long before!’

EAGE has made it hard to interact with speakers. Questions are now posed through a clunky app. This proved a waste of time anyhow, as our questions, a) why does ENI use in-house petaflops instead of CGG’s excess capacity? and b) how much of the Liza ‘risk’ did Exxon manage to offload to CGG and other suppliers? were not deemed fit to ask.

Malcor observed that while geophysics was an important part of the ‘food chain’ it was nowhere in the ‘value chain’. ‘You buy the survey and after, we disappear!’ He further wondered if it might be possible to ‘contribute a survey in return for an equity position in a future discovery.’

Schlumberger’s Ashok Belani confirmed that neither they nor CGG had made any money from seismics for a long time, ‘land seismics is essentially defunct!’ Multi-client surveys are sustaining the industry to some extent, but current levels of competition mean that ‘even multi-client will be unsustainable.’

Oswald observed that in 2040, oil and gas will still be the dominant energy source. But this will be hard to fulfil. We need new commercial models, innovation and capital. There are almost 10 million sq. km of ultra-deepwater (>3000m) areas with over 3km of sediment. We need to figure out how to make this viable.

Returning to the collaboration topic, Belani described the depressing offshore situation. Of the 70 vessels in 2013, 1/3 are retired/junked, 1/3 idle/stacked. The remaining 1/3 are in the water with 70% working on in-house multi-client ‘for survival’ and the remainder on proprietary surveys. There is no real sign of anything changing ‘before 2020 or beyond.’ ‘No business can survive 13 years down.’

There is no call for further investment in boats or equipment, or R&D. ‘Deepwater has a role, but it is not a pretty picture.’ On the positive side, Belani sees a ‘world of innovation opening-up with data,’ with inspiration coming from ‘GAFA*’ which are ‘doing unbelievable things.’ The cloud brings efficient, elastic HPC opportunities to seismic processing. Great things are happening in data science and machine learning. Seismic data in libraries is to undergo ‘innovative high-performance processing’ and reduce cycle times.

Recent activity in Mexico shows how collaboration with the regulator has made very advanced surveys available and shortened cycle times. With the cloud it will ‘technically be possible to process seismics in real time.’ This will enable ‘interpretation-based acquisition’ and make more deepwater viable. In the US Permian basin, seismics is now used to find sweetspots, although questions as to economic viability remain.

Howard Leach agreed, seismic technology has been a huge enabler for BP. But exploration is now shifting from the frontiers and the next big resource back to ‘basins that we know.’ This requires a shift from towed streamer to niche processing. On the plus side, rig rates are down so we can drill more. All BP seismic is acquired by contractors, who also do 90% of the processing.

Neil Hodgson’s (Spectrum Exploration) paper was an impassioned and counter-intuitive argument in favor of ultra-deepwater exploration. The received view is that water depths of over 3,000 m are too deep to drill, that they are too expensive, high risk, and anyway there are no reservoirs, traps or mature source rock. For Hodgson these are ‘all lies that are poisoning our industry’s future.’ Total’s Raya 1, drilled in 3,411m of water proved that this is technically feasible. Today’s $200,000 drill ship day rates mean that these wells are ‘getting really cheap’ and ‘we will be drilling in 4,000 m of water by 2020.’ In the Atlantic, source rocks are ‘just about everywhere.’ But best of all, when you look at a cross section of the Atlantic basin in depth (as opposed to time) the whole ocean becomes a huge turbidite stratigraphic trap with a ‘zero chance of failure.’ $40-50 oil represents a great time to go looking for these massive prospects. ‘If you give engineers a billion barrels in 3,000 m of water they will find a way of developing it cheaply.’

Tom Hance described BP’s automated workflow for the integration of 4D seismics with history matching. 4D AHM (assisted history match) performs cell by cell computation of 4D properties using a batch process to compare synthetic seismograms with 4D data. Using data from BP’s Life of Field permanent array on the West of Shetlands Clair field Hance showed how the approach is used to tweak parameters such as fault transmissibility for a decent match. The functionality was implemented in BP’s go-to integration tool, DGI’s CoViz 4D.
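CoViz 4D’s internals are not published; stripped to its essence, assisted history matching is a misfit-minimization loop over model parameters, as in this toy sketch (the ‘simulator’ is a stand-in function and all numbers are invented):

import numpy as np

observed_4d = np.array([0.12, 0.34, 0.29, 0.18])   # made-up 4D attribute

def synthetic_4d(transmissibility):
    # Stand-in for a flow simulation plus synthetic seismogram run.
    return observed_4d * (0.5 + transmissibility)

# Sweep fault transmissibility and keep the value minimizing the
# synthetic-vs-observed misfit, here a simple sum of squares.
best = min((np.sum((synthetic_4d(t) - observed_4d) ** 2), t)
           for t in np.linspace(0.0, 1.0, 21))
print(f"best-fit transmissibility multiplier: {best[1]:.2f}")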

On the afternoon of the first day of the conference there was a total electricity failure in the lecture area. The lectures were rescheduled for the following day, but not all speakers showed up. A big fail for Paris’ main exhibition center and the EAGE, unfortunately.

An audience of half a dozen or so showed up for Sophie Cullis’ (Leeds University) talk on metadata approaches and their effects on deep-marine system analysis. Unfortunately, this was less about sharable metadata and more a quasi-commercial plug for the use of DMAKS, the ‘Deep-Marine Architectural Knowledge Store.’ DMAKS is a relational database with controls to ensure ‘consistent data entry.’ This is said to overcome barriers to research deriving from the variety of classification schemas, interpretations and a ‘terminological minefield.’ Leeds has standardized and captured facies types from peer reviewed literature which are stored in hierarchical and spatial context.

As the seismic business withers, some geophysicists are applying their know-how to help drillers work in difficult terrain such as Myanmar. Here, as Ikon’s Alex Edwards related, Ophir Energy was trying to get a better understanding of pore pressures. Earlier wells in Myanmar had gone ‘horrendously wrong,’ with shallow gas kicks and lost wells. Paradoxically, other deep wells were drilled with no overpressure at all. Ikon has modeled the basin’s evolution through time, rolling in drillers’ reports, geothermal measurements and seismic velocities. These are delivered as pore pressure gradient forecasts for use in well planning.

In these days of doom and gloom it was nice to see that someone is looking to the future of land seismics. Total’s METIS concept targets foothills acquisition with a swarm of drones delivering ‘biodegradable’ geophone darts. Robotic self-driving vehicles do the shooting and data is transmitted wirelessly back to camp, leveraging a communications airship (what else!). Partners in the Metis concept are Flying Whales and Wireless Seismic. When the Metis crew is through, the camp HQ, built entirely with edible material, is cooked up for a big feast (only kidding!).

More prosaically, Total has opened a real-world test center in Pau, France for calibrating logging tools. The Alternative Subsurface Data facility is a three story building housing a 9 m high silo where large octagonal standard rock slabs can be superimposed to provide a testbed for third party logging devices. Logs can be certified to ISO 17025/Cofrac standards. The ASD was developed in collaboration with SEMM Logging to provide Total with its own independent verification of logging contractors’ claims and to open up the market to smaller regional logging companies.

In the high-performance computing session, James Hobro outlined Schlumberger’s deployment of models to the cloud (an inside track on Delfi perhaps - see page 1). Cloud computing (mainly seismic modelling) brings resource elasticity. A 1,000 hour job running on a single core can be run in an hour with the simple expedient of using 1,000 cores in the cloud! In seismic imaging, however, issues arise from contention (of cores for main memory or blades for the network) and ‘incoherency’ (delays as parts of the system wait for each other). Tests of finite difference wave equation modelling leveraged C++14 code, Intel’s threaded building blocks and MPI. Hobro spoke of the need to change from sequential to ‘actor-based’ programming, an asynchronous approach comparable to ‘kids programming in Scratch.’
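Hobro’s code is C++14 with threaded building blocks and MPI; a toy Python analogue of the asynchronous, actor-style scheduling he described (the tile decomposition and workload are made up) is:

from concurrent.futures import ThreadPoolExecutor, as_completed

def model_tile(tile_id):
    # Stand-in for a finite-difference wavefield update on one subdomain.
    return tile_id, sum(i * i for i in range(100_000))

# Each "actor" owns a tile and reports as soon as it finishes; results
# are handled as they arrive, with no global synchronization barrier.
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(model_tile, t) for t in range(32)]
    for fut in as_completed(futures):
        tile_id, _ = fut.result()
        print(f"tile {tile_id} done")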

Tilman Steinweg (Karlsruhe Institute of Technology) described the challenge of writing efficient seismic code for the GPU. While many leverage Nvidia’s proprietary CUDA, Steinweg was looking for a hardware-independent solution. Enter LAMA, the Fraunhofer Institute’s library for accelerated math applications, an open source solution that runs across CPU, Nvidia GPU and Xeon Phi. A benchmark performed on the Jülich Supercomputing Centre’s JURECA Nvidia Tesla-based supercomputer showed good speedup over a comparable CPU-only machine.

We got a one-on-one with Landmark in their very modest booth, where we heard an exposé on Halliburton’s ‘digital transformation.’ The company claims to have transformed itself, and is now offering advice and services to clients. There is a difference between digitization and digitalization. Digitization has been an ongoing development since the 1970s. Digitalization is more recent and implies business transformation. In the USA, you can take a photo of a check with your iPhone and the money is in your account. Likewise the Amazon store has ‘no lines, no checkout, no cash, seriously.’ In Norway, there are now no fences, goats are corralled with geofencing and electric collars, ‘doing sheepdogs out of business.’

How will this translate to oil and gas? It will be through the widespread application of stuff like computer vision, data science and machine learning. Digital E&P ‘aligns the field with the boardroom’ and helps ‘translate decisions into actions.’ Landmark’s OpenEarth is a poster child for the transformation where DevOps best practices and DecisionSpace code have been contributed. The idea is to allow users to share code (e.g. for automated fault identification) with the community but to retain their intellectual property. This is ‘not to be confused with a marketplace’ (read Schlumberger’s Ocean!). Poster child for OpenEarth is Anadarko. In a production context, the Landmark Field Appliance provides connectivity from IoT/field sensors into the cloud (see the lead in this issue). Like GE, Halliburton is now proposing ‘utilization-based’ maintenance on its pumps and downhole tools, leveraging a digital twin. Elsewhere, Halliburton’s IoT cameras and sensors deployed at the well site detect fugitive emissions and intruders.

* Google, Amazon, Facebook, Apple.


Folks, facts, orgs ...

Inpex, IFP, Target, Aereon, Aqualis, BCCK, Braun Intertec, Cogent, Coreworx, EcoStim, Energistics, Enservco, EPIM, EQT Corporation, FreeWave, HydraWell, IFRS Foundation, Jacobs, CH2M, Legacy, Lord Corp., Management Controls, Nera, Noble Corp., Premier Oilfield Labs, PUG, Quorum, Red Hat, MORE, Probe, RISC Advisory, Ryder Scott, Seeq, Venture Global, WolfePak Software.

Takayuki Ueda is now senior executive VP and Hideki Kurimura is VP with Inpex. Takeo Itano is senior coordinator, gas.

Christine Travers is now dean of IFP School. Gérard Momplot is director of international relations at IFPen.

Matthias Hartung has joined Target as president, digital transformation. He was previously with Shell.

Aereon has named Saeid Rahimian CEO.

Jasper Bergsma is to head up Aqualis Offshore’s new office in the Netherlands. He hails from Mokum Offshore.

Matt Amilian is Director of Business Development at BCCK. He hails from Condit Company.

Keith Linton has joined Braun Intertec as Senior Environmental Consultant.

Chris Simmons has been promoted to COO at Cogent Energy Services.

Coreworx CEO Ray Simonson is retiring. He is succeeded by John Gillberry.

Raoul Jacquand replaces retiree Rick Moignard as CEO at Dassault Systèmes’ Geovia unit.

EcoStim has promoted Barry Ekstrand to COO. He succeeds Bobby Chapman who steps down but remains VP corporate development and M&A.

Phil Neri is now marketing and communications manager at Energistics.

Enservco has named Tucker Franciscus as CFO, replacing Bob Devers, who is leaving the Company.

Tor-Inge Gran is domain manager and deputy director at EPIM. New departmental heads are Marit Bjordal (JVM), Tormod Tønnensen (SCM), Roar Vika (Project Management) and Morten Helgaland (Business Support).

Jeremiah Ashcroft is senior VP and president, midstream of EQT Corporation, replacing retiree Lisa Hyland.

FreeWave Technologies has hired Jim Bolotin as CFO. He hails from Truven Health Analytics.

Mark Sørheim is the new CEO of HydraWell Intervention, succeeding Odd Engelsgjerd who remains business advisor, board member and shareholder. Sørheim hails from Schlumberger.

The IFRS Foundation has appointed Nili Shah as Executive Technical Director.

Gary Mander leads the new integration management office to oversee the integration of Jacobs and CH2M. Vinayak Pai is interim head of global petroleum and chemicals.

Legacy Measurement Solutions has named Joseph Compofelice as chairman and CEO.

Lord Corporation has named Charmaine Riggins EAME president. She succeeds Joel Rood, who continues as president, oil and gas industrial equipment.

Management Controls founder Bob Harrell has retired. Vincent Broady is CEO, Ken Naughton is president and Mike Wangso is CTO.

Julie Carey has joined Nera Economic Consulting as director. Carey is an adjunct professor at Georgetown University.

Robert Eifler has been promoted to VP and general manager marketing and contracts with Noble Corp. He succeeds Simon Johnson who has left the company.

Matt Bell is CEO at Premier Oilfield Laboratories.

Andrew Norris, Chris Shanks, Andy Bohnhoff, Cristina Rouse, Susan Horvath, Rob Clark and Vance Hefly have been elected members of the 2017-2018 Global PUG Steering Committee.

Roy Queener is to lead Quorum Canada.

Naren Gupta has been named chairman of Red Hat’s Board of directors succeeding retiree Hugh Shelton.

Gary Cresswell is to serve as executive chairman of Probe.

Geoff Salter has rejoined RISC Advisory’s London office to support EAME growth.

Ryder Scott has hired senior petroleum geologist, Luisa Rolon. She hails from Eni.

Helen Bradley is Senior VP HR North America at Schneider Electric.

Seeq has hired Todd Amy as sales executive in Houston.

Tom Earl is chief commercial officer at Venture Global LNG. He hails from Total.

Charlie Wolfe is stepping down as the CEO of WolfePak Software. James Colony is interim CEO.


Done deals

Aveva/Schneider, Ansys/Computational Engineering, CB&I, LMKR, Jacobs/CH2M, Intl. Matex/Epic Midstream, Master Rig/IEC, Peloton, Premier Oilfield Labs, Probe, Simmons Edco, Cordax.

The 2015 aborted alliance between Aveva and Schneider/Invensys is back on the table with the £3 billion ‘reverse takeover’ of Aveva by Schneider.

Ansys has acquired Computational Engineering International.

CB&I intends to sell its technology business and hopes to negotiate a long-term strategic alliance with a buyer that will benefit both parties.

LMKR (Geographix) has bought out its minority shareholder Actis. Atif and Shabana Khan now hold 100% of the company’s preferred and common stock.

Jacobs is acquiring CH2M in a $2.85 billion cash and paper transaction.

International-Matex is to buy Epic Midstream from White Deer Energy and Blue Water Energy. The transaction values Epic at $171.5 million.

Master Rig International has purchased IEC Systems’ intellectual property.

Peloton has received a strategic investment from Silver Lake, TriWest Capital Partners and HarbourVest Partners.

Premier Oilfield Laboratories has acquired NSI Technologies.

Probe Technologies has received a cash injection from Turnbridge Capital.

Simmons Edco has taken a stake in Cordax. The companies are to ‘join forces.’


Le Hackathon

Agile Scientific’s machine learning hackathon - cutting out the ‘science’ in geoscience.

We checked into the grande finale of Agile Scientific’s machine learning hackathon, hosted along with the EAGE chez Total. Around 60 attendees heard from a dozen teams’ attempts to develop a demonstrator application in two days of coding. Some were plausible applications, like using supervised machine learning to identify mineralogical constituents of thin section microscope imagery. Others were far more ambitious.

Seismic modeling is a labor and machine-intensive process. Team GANsters (generative adversarial networks) decided to cut out the ‘science’ and use ML. The system was trained on 20k image pairs of models and their seismic representation. A trial on the EAGE’s Marmousi model was deemed to have produced ‘amazing’ results. While ‘this is black box mapping,’ the team plans to unpick the GAN results and see how they can inform processing, or at least, speed things up. More from Model2Seismic.

One team set out to automate well correlations, combining ML with geology, using a set of images and interpretations. The process worked reasonably well on low dip conformable geologies, but an attempt to ‘ground truth’ a section with a salt dome produced an outcome that was ‘a bit different.’

The ‘ANother’ team was more successful in developing an ML-based interpretation for mineralogy. Some 200GB of microscope imagery (polarized, unpolarized, fluorescence) were segmented into ‘superpixel’ clusters which were then tagged by a geologist.
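Neither the team’s code nor its toolchain was published. A sketch of the superpixel step, assuming scikit-image’s SLIC segmentation and a random stand-in image (real input would be the microscope imagery):

import numpy as np
from skimage.segmentation import slic

img = np.random.rand(256, 256, 3)     # stand-in for a thin-section image
segments = slic(img, n_segments=500, compactness=10)

# Each superpixel becomes one sample: mean color as a crude feature
# vector, to be tagged by a geologist and fed to a supervised classifier.
features = [img[segments == s].mean(axis=0) for s in np.unique(segments)]
print(len(features), "superpixels to tag")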

A ‘Classy’ team set out to develop a seismic shot gather interpreter. This used a radon transform to pixelize the traces and develop a labeled training data set. An SVM classifier was trained on the labeled data; a test data set without labels was then run through the classifier to identify events as ground roll, NMO or multiples, scoring ‘21/23!’
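Only the outline of the ‘Classy’ pipeline was presented, so this sketch is heavily hedged: radon-transform a windowed gather into a feature vector, then train an SVM on labeled examples (random noise stands in for real gathers):

import numpy as np
from skimage.transform import radon
from sklearn.svm import SVC

def gather_features(gather):
    # Map a 2D gather (time x offset) into radon-domain features, where
    # linear and hyperbolic events separate better than in t-x space.
    sinogram = radon(gather, theta=np.linspace(0.0, 180.0, 60), circle=False)
    return sinogram.flatten()

rng = np.random.default_rng(0)
X = [gather_features(rng.normal(size=(64, 64))) for _ in range(20)]
y = ["ground roll", "NMO", "multiple", "ground roll"] * 5

clf = SVC().fit(X, y)
print(clf.predict([gather_features(rng.normal(size=(64, 64)))]))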

PickPickLog observed that stratigraphic interpretation and lithology identification from well logs is done by experts. It could be done by ML. Using an Alberta data set of gamma ray logs from 2,000 wells, lithology was determined using a logistic regression classifier. Comparison of expert and ML-derived lithologies was ‘quite good,’ with a 75% match. The plan is now to replace expert supervision of the learning process with clustering.
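PickPickLog’s feature set was not given; a minimal sketch of the idea trains a logistic regression on gamma ray values (synthetic, two-lithology data for illustration):

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
gr = np.concatenate([rng.normal(35, 10, 500),    # sand-like gamma ray
                     rng.normal(110, 15, 500)])  # shale-like gamma ray
litho = np.array(["sand"] * 500 + ["shale"] * 500)
X = np.column_stack([gr, np.convolve(gr, np.ones(5) / 5, mode="same")])

X_tr, X_te, y_tr, y_te = train_test_split(X, litho, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"match with held-out labels: {clf.score(X_te, y_te):.0%}")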

In a similar vein, the LogFix team analyzed triple combo logs from the Athabasca basin using a Markov chain to calculate the probability of different lithology stacks e.g. a change from sand to silt. The approach could be used to create a resistivity log from nearest neighbor wells. Or to replace log reruns, perform data QC and patch washed-out sections.
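The LogFix code was not shown; estimating the transition probabilities of a first-order Markov chain reduces to counting adjacent lithology pairs, as in this sketch over an invented sequence:

from collections import Counter, defaultdict

sequence = ["sand", "sand", "silt", "shale", "silt", "sand", "shale"]
counts = defaultdict(Counter)
for a, b in zip(sequence, sequence[1:]):
    counts[a][b] += 1          # tally each observed transition a -> b

transitions = {a: {b: n / sum(c.values()) for b, n in c.items()}
               for a, c in counts.items()}
print(transitions["sand"])     # P(next lithology | current = sand)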

The LogsOnTheRocks team set out to identify various E&P objects (lithological column, VSP, observers log…) in a large scanned image dataset from the UK’s OGA. A neural net was trained with tagged images. On the downside, there are fewer tagged images of logs on the internet than there are of cats. But the approach works, ‘at 70% accuracy.’
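The team’s architecture was not disclosed; a small convolutional classifier over scanned document images, sketched here in Keras (class names are illustrative, not the OGA taxonomy), might look like:

from tensorflow.keras import layers, models

classes = ["litho_column", "VSP", "observers_log", "other"]
model = models.Sequential([
    layers.Input(shape=(128, 128, 1)),       # grayscale document scans
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(len(classes), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(images, labels, epochs=10)       # given a tagged training set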

All in all, we were more impressed by the enthusiasm of the participants than by the results of their efforts. Read the Agile report from the hackathon here.


GIS novelties from Agile and Blue Marble

Agile’s Python Shapefile tutorial. Time-based NATRF2022 framework - GIS meets plate tectonics.

Agile Scientific’s Matt Hall has published a really useful utility for reading and writing Esri shapefiles which doubles as an excellent tutorial on the subject. Shapefiles encode points, lines and polygons along with their attributes in what Hall describes as a ‘slightly weird’ format which is in fact a collection of files, one of which is the SHP file. Hall’s post walks through reading a shape file, changing its CRS*, tweaking some attributes and writing it back.
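A comparable round trip, sketched here with geopandas rather than (necessarily) the library Hall’s utility builds on, reads a shapefile, changes its CRS, tweaks an attribute and writes it back (file and column names are hypothetical):

import geopandas as gpd

gdf = gpd.read_file("wells.shp")           # reads .shp plus its siblings
gdf = gdf.to_crs(epsg=4326)                # reproject to WGS84 lat/lon
gdf["STATUS"] = gdf["STATUS"].str.upper()  # tweak an attribute column
gdf.to_file("wells_wgs84.shp")             # .shx/.dbf etc. are written too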

Speaking of CRSs, a Blue Marble blog post introduces the four new geographic reference frames that the US National Geodetic Survey (NGS) is to introduce starting in 2022. NATRF2022, the North American terrestrial reference frame, covers the continental US, Canada and Mexico; companion frames cover the Mariana, Pacific and Caribbean tectonic plates. NATRF2022 will replace NAD83 which, it appears, is not as ‘geocentric’ as intended. It is actually off-center by about two meters. Moreover, plate motion since NAD83’s definition has aggravated the situation. The new frames will be time-based to accommodate future plate motion. As Blue Marble says, ‘This is going to require a new mindset for a lot of GIS users.’ Incidentally, a similar issue was reported at the NDR2017 Stavanger meet from the New Zealand NDR, following the 2011 Christchurch earthquake that moved parts of the country by tens of meters!
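A back-of-envelope sketch shows why the epoch matters; the plate velocities below are invented for illustration:

def propagate(x_m, y_m, vx_m_per_yr, vy_m_per_yr, epoch0, epoch):
    # Shift a coordinate from its reference epoch using a constant
    # plate velocity, the essence of a time-based reference frame.
    dt = epoch - epoch0
    return x_m + vx_m_per_yr * dt, y_m + vy_m_per_yr * dt

# ~2 cm/yr of plate motion accumulates to ~0.3 m over 15 years, well
# above survey-grade GNSS accuracy.
print(propagate(0.0, 0.0, 0.015, 0.012, 2022.0, 2037.0))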

* Coordinate reference system.


Schneider Electric/APC’s local edge configurator

LEC validates remote IT deployment architectures.

Schneider Electric’s APC unit has introduced a Local Edge Configurator (LEC) for designing ‘edge’ internet of things (IoT) solutions.

The LEC enables users to design a comprehensive, validated edge architecture, including rack, uninterruptible power supply, security, power distribution and software, for plug-and-play field deployment. A Schneider rep told Oil IT Journal, ‘The LEC is relevant to any industry that deploys a distributed IT infrastructure. In the oil and gas vertical, this could be any IT room in the office or oilfield, anywhere an IT stack processes sensor data.’

The LEC includes an automatically updated library of APC’s infrastructure solutions along with leading storage, networking, converged and hyper-converged systems from Cisco, Nutanix, Dell EMC and others.


Sales, partnerships, deployments ...

EnergySys, Accord Energy Services, EPIM, Cegal, Datum360, Expro, Getech/ERCL, Total, Breakthrough Energy Ventures, ILandMan, TGS, Jacobs Engineering, KBR, Infor, LMKR, Landmark, iEnergy, Total Safety, OFS Portal, EDrilling, Salcon, Rock Flow Dynamics, Nice, Quorum, SimSci, Cainkade, Voyager Search, Exprosoft.

Chrysaor is now using EnergySys’ cloud for hydrocarbon accounting and reporting from the ‘SEA’ assets it acquired from Shell in a £2.5 billion deal earlier this year. The system will be implemented by Accord Energy Services.

Norway’s E&P Information Management Association (EPIM) selected Cegal to provide its infrastructure operation service (IOS) in a five-year, 15 MNOK deal. The IOS comprises standardized operations, a common service desk and cost efficiencies leveraging ITIL/ISMS processes. Cegal is to host and operate the platform.

Maersk Oil has awarded Datum360 a global framework contract for the provision of PIM360, its cloud-based engineering information management system. PIM360 is deployed on Maersk’s Danish North Sea Tyra development.

Expro has secured a £5 million, five-year renewable master services agreement with Repsol Sinopec Resources UK for well services across its North Sea assets.

Getech’s ERCL unit has signed an agreement with the Petroleum Directorate of Sierra Leone to provide advisory and technical support services.

Total is now a strategic partner to Bill Gates’ Breakthrough Energy Ventures, a billion dollar investment fund financing cleaner, low-carbon energy.

ILandMan and TGS have partnered to integrate land and well data systems, providing clients with streamlined access to well information and status data from TGS’s database.

Jacobs Engineering has renewed its global enterprise framework agreement with Shell. The deal covers concept, front-end engineering, detailed design, procurement, project management, and construction services for Shell projects.

KBR and Saudi Aramco have signed a memorandum of understanding to expand and develop KBR’s services for Saudi Aramco. The agreement is in line with Saudi Aramco’s in-Kingdom total value add initiative to double the percentage of locally produced energy-related goods and services by 2021.

Koch Industries is to deploy Infor CloudSuite to meet its financial management, procurement, and human capital management needs.

LMKR is now a member of Landmark’s iEnergy platform and is showcasing the integration capabilities of Gverse and GeoGraphix.

Total Safety has joined OFS Portal as a supplier member.

EDrilling and Salcon Petroleum Services are teaming up to service and support the digital transformation initiative of upstream oil and gas companies in Malaysia.

Rock Flow Dynamics along with Nice are to offer reservoir simulations in the Amazon AWS HPC cloud.

Staghorn Petroleum has implemented the latest release of the myQuorum ‘Land on demand’ cloud application.

Showa Shell Sekiyu is to deploy Schneider Electric’s SimSci Spiral Suite. The unified supply chain management solution offers a cloud-based working environment and a common data model spanning planning and scheduling.

Cainkade is now part of the Voyager Search strategic partner program. Cainkade will deliver customized user interfaces for the Voyager platform and provide solutions for brand integration, custom home pages, and portal integration.

Launched in 2007, utilized by ‘a number of operators in Europe, Middle East and Asia Pacific,’ Exprosoft WellMaster IMS is now available as a fully cloud-based solution. It was recently implemented by two operators in Australia and New Zealand.


Standards stuff

Industrial Internet/Edge Computing consortia to partner. IOGP SSDM V2/P6/11 V1.1. Energistics/BP release RESQML introductory video. CFA Institute backs SEC’s switch to Inline XBRL. NISO reports on ‘Issues in vocabulary management.’

The Industrial Internet Consortium and the Edge Computing Consortium have signed a memorandum of understanding to partner on the advancement of the industrial internet and edge computing. The organizations are to cooperate to maximize interoperability and portability of the industrial internet by sharing best practices, test beds and R&D.

The IOGP has released V2.0 of its seabed survey data model (SSDM), notably including a UML model in Enterprise Architect covering both ArcGIS implementation and GML encoding. SSDM V2 also now complies with the EPSG registry, the central reference for coordinate reference systems and units of measure. The IOGP has also released version 1.1 of the P6/11 seismic bin grid data exchange format.
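By way of illustration, an EPSG code resolves to a complete coordinate reference system definition in standard tooling. A minimal Python sketch using the open source pyproj library follows; the choice of library and of EPSG code 23031 (common in North Sea survey work) is ours, not IOGP’s.

```python
from pyproj import CRS

# EPSG codes are the registry keys for coordinate reference systems;
# 23031 is 'ED50 / UTM zone 31N', widely used in North Sea surveys.
crs = CRS.from_epsg(23031)
print(crs.name)                    # ED50 / UTM zone 31N
print(crs.axis_info[0].unit_name)  # metre -- units also come from the registry
```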

Energistics has put an introductory video on its RESQML reservoir data exchange standard online. Guest presenter Lisa Towery (BP) describes RESQML from an operator’s perspective and how it addresses challenges such as data and knowledge loss, inflexible workflows, uncertainty and non-productive time. More webinars from Energistics.

The CFA Institute has come out strongly in support of the US SEC’s proposed switch to Inline XBRL, arguing against the possibility that smaller companies might be exempted.

The US National Information Standards Organization (NISO) has issued a draft technical report, ‘Issues in Vocabulary Management.’ Although its primary focus is bibliographic, the report addresses interoperability in the linked open data environment.


GBC IIoT/digital solutions in oil and gas, Amsterdam (part II)

Passive tags, Jovix and smart strategy for Fluor. AGR’s iQX and GeologiQ for AkerBP’s Ivar Aasen. Silicon Microgravity’s MEMS accelerometer. Accenture-moderated IoT debate.

Pedro Tavares (Fluor) and Paul Mitchell (Atlas RFID) described a ‘fit-for-purpose’ deployment of RFID* technology to provide material traceability during the construction of a Middle East refinery. The project involved some 150 suppliers in a joint venture spanning locations in China, the Middle East and beyond. RFID technology has evolved from simple passive tags into sophisticated geolocational sensors that provide accurate equipment location. But while passive tags are cheap (a SmarTrac tag costs 50 cents), the sophisticated sensors were prohibitively expensive at the scale of a million equipment items. Instead, Fluor went for passive tags and Atlas’ ‘Jovix’ tracking software. Equipment in the yard was surveyed by a scanner mounted on the security car during its nightly rounds and tied in with the transaction log. The operation resulted in a 90% reduction in location time and a three-month reduction in ‘schedule risk.’ When all the equipment was on-site, the EPC was asked if it wanted to keep the tags. It said ‘no thanks’ and cut all the tags off! ‘A big lost opportunity for a world’s first, innovative tag-based construction project!’

Petter Mathisen (AGR Software) presented work for AkerBP on digital collaboration on Norway’s Ivar Aasen field. Most operator reports are delivered as PDF documents that ‘make structured information unstructured!’ Enter AGR’s Intelligent Well Data Management tool, iQX, a reporting database, and its GeologiQ application that ‘plugs a gap’ in current Norwegian reporting and data capture. An ‘open source’ database also ran, providing ‘basic analytics’ that uncovered a lot of incomplete data that would otherwise probably not have been touched before decommissioning.

Paul Vickery presented Silicon Microgravity’s microchip-based gravity meter, co-developed with BP. SMG’s MEMS accelerometer measures gravity to one part in a billion and targets borehole applications. Vickery wondered aloud whether his commercialization strategy should be to sell sensors or to sell data, concluding that the latter was preferable. ‘It’s hard to generate billions by selling hardware. Better to target high value opportunities and sell the data!’ SMG was formed in 2014. Field trials scheduled for 2017 have slipped into 2018.

In an Accenture-led group discussion it emerged that none of the operators present had committed to a full-scale IoT platform. Total is looking at GE Predix, Azure and Schneider Electric solutions, but there may not be a single platform across the company. BP is using Predix for analytics on facilities but not necessarily elsewhere in the organization. OSIsoft observed that some companies are backtracking on the IoT platform concept, which has proved harder than previously thought. Operators may deploy Predix on GE equipment but use an in-house ‘R’ environment for analytics. One experienced user opined that many IoT trials fail because ‘people don’t have a good underlying use case.’

More from Global Business Club.

* radio frequency ID tags.


The well-connected well pad...

Rockwell Automation blog advocates shift from legacy RTUs to modern PLC controllers.

Rockwell Automation’s Zack Munk recently blogged on the relative merits of RTUs and PLCs at the well site. Some operators have as many as 50 wells on a single pad, placing ‘much greater demands on control systems,’ so the time is ripe for a reevaluation of technology options. For decades, the remote terminal unit (RTU) was the go-to technology in the upstream: RTUs are rugged, low-power and able to handle Scada systems’ low-bandwidth needs.

But the modern multi-well pad is pushing the limits of RTU technology and producers need a more performant solution such as the programmable logic controller (PLC). PLCs originated in the plant/factory environment rather than the ‘inhospitable and harsh’ environments of the oil field. But increasingly, well pads have environmentally controlled buildings or enclosures, along with ping and power, the perfect environment for PLCs.

Munk believes that a ‘modular and scalable’ PLC control architecture addresses the challenges of legacy RTUs. PLCs can be configured in many ways, allowing operators to monitor and control a wide variety of field instruments. PLCs support communications across different networks while libraries of pre-developed, documented code allow for rapid onsite configuration without specialist programming skills.
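To illustrate the ‘library of pre-developed blocks’ idea, here is a minimal sketch, in Python rather than an IEC 61131-3 PLC language, with invented block and tag names: pre-built, documented blocks reduce on-site work to supplying configuration data.

```python
# Hypothetical sketch: reusable, documented control blocks are
# instantiated from site configuration data rather than programmed
# on-site. Block, tag and threshold names are invented.

class WellheadMonitor:
    """Pre-built block: alarms when a pressure reading leaves its band."""
    def __init__(self, tag, low, high):
        self.tag, self.low, self.high = tag, low, high

    def evaluate(self, reading):
        if not (self.low <= reading <= self.high):
            return f"ALARM {self.tag}: {reading} outside [{self.low}, {self.high}]"
        return None

# On-site 'configuration' is just data -- no specialist programming.
SITE_CONFIG = [
    {"tag": "WELL-01-THP", "low": 50.0, "high": 250.0},
    {"tag": "WELL-02-THP", "low": 50.0, "high": 250.0},
]

blocks = [WellheadMonitor(**cfg) for cfg in SITE_CONFIG]
for block, reading in zip(blocks, [120.0, 310.0]):
    alarm = block.evaluate(reading)
    if alarm:
        print(alarm)  # only WELL-02-THP trips
```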

Multi-well pads have made data and application requirements in upstream operations greater than ever. RTUs remain a feasible option, but their memory limitations, added maintenance requirements and overall higher production costs provide a strong incentive for operators to consider a better alternative.


Open source stuff ...

Madagascar seismic processing V2.0. gvSIG - free open source GIS online course.

A new major release (V2.0) of the open source Madagascar seismic processing package features 25 new reproducible papers and other significant enhancements, including complete examples of seismic field data processing and improvements to parallel computing. The Madagascar source repository is now on GitHub. New reproducible documents cover seislet-based component analysis, structure-constrained acoustic impedance, dip estimation and more. The previous 1.7 distribution was downloaded nearly 12,000 times.
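Madagascar’s reproducibility rests on SCons: each result is declared as a Flow from its inputs, so re-running ‘scons’ rebuilds figures from raw data. A minimal sketch of an SConstruct file follows; it uses Madagascar’s rsf.proj conventions, but the dataset name, server directory and parameter values are invented for illustration.

```python
# Minimal Madagascar SConstruct sketch (dataset and parameters invented).
from rsf.proj import *

# Fetch a demo dataset from a data server (directory name hypothetical)
Fetch('shots.HH', 'demo')

# Convert to native RSF format
Flow('shots', 'shots.HH', 'dd form=native')

# Bandpass filter; the command line is recorded, making the result
# reproducible from the raw input
Flow('filtered', 'shots', 'bandpass flo=5 fhi=50')

# Declare the figure that appears in the reproducible paper
Result('filtered', 'grey title="Bandpass-filtered shots"')

End()
```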

The gvSIG association has published a free online course on learning GIS with open source software. The course covers GIS tools, concepts and terminology along with map making, working with projections, vector and raster data.


PLM comes to oil and gas

Siemens’ plant lifecycle management for DNV GL. Veracity trials for Lundin on Edvard Grieg.

DNV GL has teamed with Siemens on a digital asset model for oil and gas, combining its knowledge of projects and operations with Siemens’ Teamcenter product lifecycle management (PLM) technology. Today, projects suffer from disparate formats and the lack of a single source of asset information for operators, designers, yards and manufacturers. DNV GL CEO Elisabeth Tørstad said, ‘Users will have access to an online, self-service portal that offers automated compliance-checking, benchmarking and data mining capabilities.’ A key component of the collaboration is Siemens’ ‘Active Workspace’ for Teamcenter, a PLM GUI that supports change management, workflow, requirements management and visualization.

DNV GL’s Veracity has been deployed by Lundin Norway in what is said to be one of a ‘few practical use cases’ of digitalization in oil and gas to date. Veracity is currently monitoring energy use from 2,000 sensors installed on Lundin’s Edvard Grieg offshore hydrocarbon processing facility. The expectation is that Veracity will help anticipate a shutdown situation before it happens. The data science application was developed by four student interns.


Toys for the mobile workforce

New solutions from Emerson and Parsable connect workers to the internet of things.

Emerson’s AMS Trex device communicator connects asset management experts with field devices. Previously, information on changes in the field was often incomplete, inaccurate or delayed. Now, changes are automatically synchronized to the facility’s device manager database, logged and timestamped as they occur.

Changes to a ‘stranded’ (non-connected) device are cached locally on the communicator and uploaded when the handheld reconnects. AMS Trex is a component of Emerson’s ‘always mobile’ portfolio, itself part of the Plantweb digital ecosystem. Visit the interactive demo.
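The cache-and-forward pattern described here is easy to sketch. The following Python is a hypothetical illustration of the concept only; the class, methods and file layout are invented, not Emerson’s implementation or API.

```python
import json
import time
from pathlib import Path

class FieldChangeCache:
    """Hypothetical sketch of cache-and-forward: changes made while
    offline are logged locally, then replayed on reconnect."""

    def __init__(self, cache_file="pending_changes.jsonl"):
        self.cache = Path(cache_file)

    def record_change(self, device_id, parameter, value):
        # Timestamp and append the change locally while offline.
        entry = {"ts": time.time(), "device": device_id,
                 "param": parameter, "value": value}
        with self.cache.open("a") as f:
            f.write(json.dumps(entry) + "\n")

    def sync(self, upload):
        # On reconnect, replay cached entries to the device manager
        # database (the 'upload' callable), then clear the cache.
        if not self.cache.exists():
            return 0
        entries = [json.loads(line) for line in self.cache.read_text().splitlines()]
        for e in entries:
            upload(e)
        self.cache.unlink()
        return len(entries)

# Usage: record a change offline, sync it when back on the network.
cache = FieldChangeCache()
cache.record_change("PT-101", "upper_range", 150.0)
cache.sync(upload=lambda e: print("synced:", e))
```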

Parsable reports a successful deployment of its ‘Industry 4.0’ connected worker platform at a Unilever plant. Parsable’s SaaS platform transforms paper-based procedures into digital business processes, reducing downtime and providing real-time collaboration and communications. Parsable claims to ‘allow humans to provide the same continuous signal as IoT devices.’ At Unilever, the deployment has added some 30,000 new data points daily and brought a 50% decrease in startup, shutdown, and changeover times. The plant is now 85% paperless after only four months of operations. San Francisco-headquartered Parsable’s clients include Schlumberger, Scientific Drilling and TechnipFMC.


EnergySys’ vision, an ‘oil company in a box'

OData and web services API proposed as route to a patchwork of interoperable upstream applications.

EnergySys’ ‘Oil and gas company in a box’ (OGCB) is a vision for a ‘patchwork’ of cloud applications from which companies can pick and choose, building workflows for activities such as drilling, reservoir simulation and production data management. OGCB builds on a platform of office productivity applications, document management, finance and reserves management, to which are added historian data access, analytics, asset tracking and CRM software for marketing and partner management.

Behind the vision is a ‘checklist for the cloud’ that ensures applications are both data storage and infrastructure agnostic. Boxes to tick include a web services API and OData-conformant storage. Unfortunately, ‘there is a shortage of quality cloud applications for oil and gas.’ EnergySys is hoping to change this situation; already, a ‘significant percentage’ of North Sea production data is stored in the EnergySys cloud. More from EnergySys.
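To make the ‘boxes’ concrete: an OData-conformant store can be queried by any client using standard query options, with no vendor-specific code. A minimal Python sketch follows; the endpoint URL, entity set and field names are invented for illustration and are not EnergySys’ actual API.

```python
import requests

# Hypothetical OData v4 endpoint; entity set and field names are
# illustrative only, not a real EnergySys service.
BASE = "https://example-energysys-cloud.com/odata"

# Standard OData query options ($filter, $select, $top) mean any
# conformant client can consume the data.
resp = requests.get(
    f"{BASE}/DailyProduction",
    params={
        "$filter": "Asset eq 'SEA' and Date ge 2017-09-01",
        "$select": "Date,Asset,OilVolume,GasVolume",
        "$top": "100",
    },
    headers={"Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["value"]:  # OData v4 wraps results in 'value'
    print(row["Date"], row["Asset"], row["OilVolume"])
```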


High performance computing news

TACC Stampede2 upgrade. Ansys for Aramco/Kaust. Linpack benchmark challenged.

The NSF has completed a $30 million upgrade to its Stampede2 supercomputer at TACC, the Texas Advanced Computing Center at the University of Texas at Austin. The system now peaks at 18 petaflops and comprises 4,200 Intel MIC-based Knights Landing nodes and 1,736 Intel Xeon Skylake nodes. Access is through NSF’s XSEDE, the Extreme Science and Engineering Discovery Environment. TACC’s Aaron Dubrow told Oil IT Journal, ‘It’s early for Stampede2 results but our oil and gas partners will have some really good data around on Stampede2 by the year end.’

Saudi Aramco, Ansys and KAUST claim to have shattered a supercomputing record, scaling Ansys Fluent across nearly 200,000 processor cores, a 5x increase over the record set three years ago. Simulation of a multiphase gravity separation vessel was performed on the Shaheen II Cray XC40 supercomputer hosted at the KAUST Supercomputing Core Lab. More from Ansys.

A recent presentation by Jack Dongarra (Oak Ridge National Lab) at the EU DG Connect symposium addressed the global race to exascale (over 1,000 petaflops) HPC. The race is taking place in three ‘swim lanes:’ CPU, GPU and ‘lightweight’ cores*. The exaflop target is expected to be hit some time around 2020. Today, China hosts 36% of the TOP500.org systems and has begun making its own processors. Dongarra concluded with a critique of the ubiquitous Linpack benchmark, which no longer accurately reflects real-world HPC performance. The high-performance conjugate gradient (HPCG) benchmark is a candidate replacement.

* Like the ARM chips used in cellphones.
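For context, the conjugate gradient iteration at the heart of HPCG is dominated by (typically sparse) matrix-vector products, a memory-bandwidth-bound pattern closer to real applications than Linpack’s dense factorizations. A minimal Python sketch of the method, on a dense toy system for readability:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A.

    Each iteration is dominated by a matrix-vector product -- in HPCG
    a sparse one, which stresses memory bandwidth rather than the
    dense floating-point throughput that Linpack measures."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Tiny SPD test system: exact solution is [1/11, 7/11].
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))  # ~ [0.0909, 0.6364]
```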


© 1996-2021 The Data Room SARL. All rights reserved.