May 2014


Teradata at Statoil

Statoil deploys a Teradata EDW 6690 data warehouse appliance and a comprehensive all-relational database to automate analysis of 4D time lapse seismic surveys.

Speaking at the 2014 PNEC Conference in Houston this month, Jane McConnell presented the results of a joint development project with Statoil to test the Teradata data warehouse appliance in an oil and gas context. Teradata’s first appearance at PNEC goes back to 2006 with Nancy Stewart’s seminal presentation of the use of the technology chez Wal-Mart. In the Statoil case, Teradata is used to store georeferenced seismic data from successive surveys alongside other data sets such as pressures and fluid content. Interpretation and simulator results are also loaded to the ‘reservoir data warehouse.’

While conventional interpretation technology can be used to demonstrate, for instance, how the time shift between successive surveys can be used as a proxy for reservoir pressure, the data warehouse approach allows for more in-depth analysis. Standard business intelligence/big data queries can be used to investigate a multi-dimensional data space and provide a ranking of different correlations. The approach is claimed to speed processing of new time lapse data which is now acquired every six months, or in some cases continuously, allowing for insights obtained from the data to be turned into actionable information for production operations in a timely fashion.

The test involved a Teradata EDW 6690 data warehouse appliance and tiered storage combining ‘hot’ solid-state drives and ‘cold’ hard disks with automated data management. An online analytical processing database optimized for fast access is claimed to provide more flexibility and performance than traditional applications such as Oracle database-driven environments. Data is stored in a ‘fit-for-purpose’ monolithic database that holds seismic traces and other domain-specific data as a user-defined binary data type similar to SEG-Y. This ensures the seismic data fits in an acceptable amount of space. All data types are exposed via an extended SQL vocabulary that includes spatial query.
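
Neither the paper nor this report gives Statoil’s actual schema or Teradata’s spatial SQL syntax, but the flavor of such a query can be sketched in a few lines of Python over a generic ODBC connection—the DSN, table and column names below are entirely hypothetical.

    import pyodbc  # assumes an ODBC data source for the warehouse has been configured

    conn = pyodbc.connect("DSN=ReservoirDW;UID=analyst;PWD=secret")
    cur = conn.cursor()

    # Rank cells by 4D time shift against measured pressure for a given survey pair.
    sql = """
    SELECT s.cell_id, s.timeshift_ms, p.pressure_bar
    FROM   seismic_4d_attributes s
    JOIN   reservoir_pressure    p ON p.cell_id = s.cell_id
    WHERE  s.survey_pair = '2012_vs_2013'
    ORDER  BY s.timeshift_ms DESC
    """
    for cell_id, shift, pressure in cur.execute(sql):
        print(cell_id, shift, pressure)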

In a reservoir monitoring proof of concept, a large number of queries were run automatically against the data to derive a meaningful subset of correlations. Further investigation of possible root causes of seismic time shifts between surveys can be carried out with business intelligence applications such as Spotfire. The approach is claimed to provide insights into the data that would otherwise require much interpreter ‘grunt-work’ using conventional workflows. Current research is expanding the project’s scope to pre-stack seismic and history matching. More from PNEC and Teradata.


Wood Group bags Cape

Cape’s VP Link safety systems validator to complement MSi Kenny’s Virtuoso process control systems training simulator offering.

Aberdeen, UK-headquartered Wood Group is buying simulation software boutique Cape Software. Cape provides simulation software and services for operator training and logic validation for industrial control systems used in the oil and gas and other process industries.
Cape’s VP Link simulates a process control network in an operator training context. The toolset complements Wood Group’s own Virtuoso simulation modelling technology. Previous collaboration between the companies has resulted in the delivery of integrated operator training systems for upstream offshore oil and gas facilities, enabling critical unit operations to be rigorously modeled.
Wood Group’s Michael Mai said, ‘VP Link broadens our existing operator training simulator to include a full-featured, hardware-based solution. Operators can now train on the same type of equipment they will operate in the control room. We intend to expand Cape’s track record in the SCADA/DCS and safety systems verification market through our global footprint.’ Cape will become part of the Wood Group’s MSi Kenny business line. Cape’s revenue for 2013 was around $5 million. More from Wood Group.


How big is big data? And how different is it from ‘small’ data?

Neil McNaughton editorializes over a serendipitously special ‘big data’ issue of Oil IT Journal. He offers a sideways look at the Hadoop movement and brings together some loose ‘big data’ strands from Teradata, SAS and OSIsoft, as well as new terabyte fiber and computed tomography datasets.

People often ask us for our ‘editorial calendar.’ I find this a curious request in that it suggests that news and views can be predicted months ahead of time. Of course if your publication is more interested in carrying the marketing department’s message than getting at the truth, well, the news at least, then your position may be different. This month though we seem to have stumbled into a thematic issue of sorts, that of ‘big data in oil and gas.’ Being a believer in the ‘nothing new under the sun’ theory of IT (and for that matter the world), my sketchy understanding of ‘big data’ is as follows.

As the likes of Google and Yahoo came into being they were faced with a problem of analyzing very large volumes of ‘click stream’ data coming into their server farms. This kind of problem was not exactly new. It is very similar to any stream of data like, for instance, banking transactions. These arrive in such volumes that they are likely stored in a system that may or may not be amenable to the kind of analytics that the business may later need. Thus we have had data warehouses, online analytical processing (OLAP) systems and what have you.

A good example of prior ‘big data’ art is evidenced in our 2006 report from an earlier PNEC on Wal-Mart’s IT. Here time series data from cash registers around the world is captured and then flipped into a format that is more amenable to analysis. This likely involves turning it into a ‘data cube’ (a matrix to us scientists) that can be queried and analyzed statistically with ‘R’ or toolsets like Spotfire and SAS.
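
For the code-minded, the ‘flip’ can be sketched in a few lines of Python/pandas—the till feed below is, of course, invented.

    import pandas as pd

    # Hypothetical till feed: one row per transaction.
    sales = pd.DataFrame({
        "store":  ["Bentonville", "Bentonville", "Leeds", "Leeds"],
        "week":   [1, 2, 1, 2],
        "amount": [120.0, 95.5, 80.0, 110.0],
    })
    # 'Flip' the transaction stream into a store-by-week cube, ready for statistical tools.
    cube = sales.pivot_table(index="store", columns="week", values="amount", aggfunc="sum")
    print(cube)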

Getting back to the Googles and Yahoos, their next obvious step would have been to buy an enormous data warehouse. Except that this is not what happened for an obvious reason. Just think of the bill if Google was running Teradata! No, if you are fresh out of school and faced with such a problem you roll up your sleeves and code. BTW do not take this too literally, for every 1000 that do this, probably 999 are still sitting in front of their computers surrounded by discarded pizza boxes struggling to make that killer app. But I digress.

Natural selection being what it is, some of this intense coding effort will produce results. A savvy entrepreneurial type can then further leverage such by a) giving the code base away so that they get a more or less free ride from the open source community and b) working hard to consolidate the business. The latter may involve building massive server farms, diverting rivers and so on (another digression). Google’s data warehouse saved it from paying for a zillion commercial licenses and begat the ‘big data’ movement, a suite of technologies centered on Hadoop. So far so good. But what does all this mean to a vertical like oil and gas?

Reordering time series data and performing matrix operations sounds a lot like seismic processing. Indeed this is one of the putative use cases (page 12) from Hortonworks, the company that has been anointed, just as Red Hat was for Linux, as the torch bearer for ‘commercial’ Hadoop. Seismic data is indeed ‘big,’ but decades of R&D and relatively little pressure to save on costs mean that any new technology is going to have a hard time battling the highly tuned installed base. In a short chat with some Hortonworks representatives I was assured that the combination of a Hadoop ‘data lake’ and schema-on-query (as opposed to schema-on-load) were real differentiators. I was also told that Hadoop is already deployed in a production environment at oil and gas companies. Which companies these are could not be revealed because of the ‘commercial advantage’ it bestows. I intimated that I had ‘heard that one before’ and was accused of cynicism and ‘cautioned’ although I’m not sure against what. I guess I’ll find out now!
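
By way of illustration—and with an invented record layout standing in for whatever lands in the ‘lake’—schema-on-query amounts to parsing raw records only when a question is asked, rather than at load time.

    # Raw records sit in the 'lake' untouched; a schema is imposed only at query time.
    raw = ["W-1|2100|0.18", "W-2|1450|0.22", "W-3|2600|0.15"]

    def wells_deeper_than(lines, min_depth):
        for line in lines:                           # schema applied here, on query
            well, depth, porosity = line.split("|")
            if float(depth) >= min_depth:
                yield well, float(porosity)

    print(list(wells_deeper_than(raw, 2000)))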

Keith Holdaway’s book on ‘big data’ (review on page 3) is interesting in that it makes no reference to Hadoop at all. Holdaway’s focus is on data mining with statistical tools. The omission reminds me of a conversation I had with one practitioner who was wide-eyed when I asked if the compute resources were stretched. I was thinking of a big cluster. He was thinking pattern matching and fuzzy logic running on a PC. Much key oil and gas data (especially historical production) is actually quite small!
Our lead this month on Teradata at Statoil is another interesting ‘big data’ use case. Also sans Hadoop incidentally, and actually more like the Wal-Mart deployment above.

Our ‘overflow’ report from Intelligent Energy (page 7) includes Maersk’s trials of running Schlumberger’s Eclipse reservoir simulator in the Amazon cloud. OK, it’s not Hadoop either, but it does show that big data in the broadest sense can be shifted around and processed sans infrastructure. Another presentation at IE described how a 150 terabyte data set could be acquired in a couple of days of digital temperature sensing observation. That’s ‘big’ alright! As indeed are the multi terabyte data sets that BP’s digital rocks effort generates in a matter of hours (page 12). Another use case put forward for big data is all that stuff that is (or soon will be) streaming in from the digital oilfield. Here again, the technology space is pretty well occupied by domain specific tools notably from OSIsoft (see our report on page 6).

If your problem set involves fault tolerant clusters and the sort of re-ordering-before-query operations that the data warehouse provides, and if you are looking, in an entrepreneurial fashion, to deploy such economically, then maybe the Hadoop approach is the way forward. Perhaps in a couple of years we will all be thinking ‘map reduce’ instead of SQL. If you would like to hear more on big data in oil and gas and you are in London in July, you might like to consider attending the SMi ‘Big data in oil and gas’ conference.

Follow @neilmcn


Book review—Big data in oil and gas

Keith Holdaway’s new book, ‘Harness oil and gas big data with analytics’ offers plethoric advice and opinion on the technology’s potential but falls short on compelling use cases.

Harness oil and gas big data with analytics,* (Harness) is a large well-produced book replete with comments and admonitions from a recognized industry expert. Holdaway has co-authored several SPE papers and is a member of the advisory council of the SPE petroleum data driven analytics technical section. To give you a flavor of where Holdaway is coming from, a few sample quotes from the introduction. ‘Traditional deterministic and interpretive studies are no longer viable as monolithic approaches to garnering maximum value from big data across the E&P value chain,’ ‘The digital oilfield [..] generates a plethora of data that, when mined, surfaces hidden patterns that enhance conventional studies.’ ‘It behooves the upstream geoscientist to implement in-memory analytics technology.’ ‘Geophysicists are entrenched in their upstream bailiwicks.’

First a word as to what Harness is not about. A quick scan of the index reveals that there is no Hadoop, NoSql, MapReduce or any of the other trendy technologies that the ‘big data’ movement of recent years has spawned. Harness, as befits its SAS credentials, is about the collection of statistical and fuzzy techniques that make up data mining. To these, Holdaway adds lashings of oil and gas industry focus, use cases and terminology.

Holdaway has it that the big data problem set comprises three prominent issues: data management, uncertainty quantification and risk assessment. He argues for an approach that combines data discovery with ‘first principles.’ It should be possible to explain what is causing an observed correlation.

Holdaway is verbose and repetitive but not without humor. The oil and gas industry is ‘moving toward adoption of data mining at a speed [that would have been] appreciated by Alfred Wegener.’ His explanation of 3D seismic acquisition as being like ‘dropping a bag of ping pong balls into a room’ is funny too, although perhaps not intentionally. Harness covers a lot of ground in its 300 pages. Aside from a short mention of SAS’ Semma approach, the techniques described are generic—and there are a lot of them, more than can be covered in a short review.

An introductory chapter on data management introduces the big data buzzwords of volume, variety and velocity. Holdaway walks through the notions of data quality, governance and master data to introduce a four-tiered data architecture and a production data quality framework.

The seismic use case is clearly dear to Holdaway’s geophysical heart. He has it that ‘soft computing methodologies that map seismic attributes to reservoir properties are incredibly important as a means to define more credible and reliable reservoir characterization definitions for field development.’ Harness offers its own ‘plethora’ of techniques for seismic trace analysis—PCA, various transforms, clustering and more. He warns against ‘overtraining’ of algorithms on sparse data and states intriguingly that ‘the majority of common [seismic] attributes are redundant.’

Holdaway refers to the ‘top down’ and surrogate modeling approach of Intelligent Solutions Inc., used to address nonproductive time and stuck pipe, and to optimize production. Another section covers unconventional reserves estimation.

All in all there is a lot of material here. Our main criticism is that the use cases fall frustratingly short of compelling. While Holdaway’s advice and commentary should help practitioners practice, Harness fails to provide hard evidence (that could be presented to management) that such techniques really work, or which of the many described, work best.

* Wiley/SAS Institute—ISBN 978-1-118-77931-6.


Oil IT Journal interview—Garrett Leahy

Emerson/Roxar is developing its RMS flagship geomodeler into a fully-fledged interpretation tool.

Last time we talked it wasn’t clear if RMS was really ‘doing’ seismic interpretation...

Yes, the seismic workstation market does seem saturated. But the other tools still leverage a pencil and paper paradigm. Our new technology lets you model on the fly. You just put in a few control points and the guided interpretation tool does the rest, producing a set of 3D geological surfaces.

But ‘modeling while interpreting’ is not entirely new…

No. But what RMS adds is a simultaneous quantification of uncertainties in the interpretation. The uncertainty ‘brush’ produces multiple realizations and a fully risked structural picture and rock volumes.

How did this go down at the AAPG? The American market is perhaps not so receptive to ‘uncertainty’?

There is probably less time for modeling and analysis in the North American context, but this is where a tool that speeds up the process comes into its own. In fact reporting requirements are pushing North American majors to risk their reserves.

Is the ability to leverage compute clusters a selling point?

Yes. And we advocate their use early in the workflow. Parallel computing is enabled by RMS’ multi-thread/multi-core architecture. This is again a selling point as most E&P software is single threaded despite some marketing claims.

You still have some way to go to evolve into a ‘true’ interpretation package?

Yes. Our vision in the past has been somewhat muddled. We did not want to make waves. But the extraordinary effort that has gone into the technology means that we are now pushing it to the next level. We haven’t got all the small bells and whistles yet. But already we are adding value to client workflows with the fast interpretation and simultaneous modeling where we achieve up to a 10 x speedup. The tool is already fully functional and even if we don’t do horizon flattening yet and some other features, these gaps are being filled. More from rss.marketing@emerson.com.

For a sample use case of modeling while interpreting and uncertainty quantification check out the paper ‘A new approach to quantify gross rock volume uncertainty—application to a carbonate reservoir, offshore Abu Dhabi,’ presented by Khalil Al Hosani et al at the GEO 2014 Conference held earlier this year in Bahrain.


Schlumberger leverages Ansys for bottom hole assembly design

Technip automates jumper design runs. Schlumberger studies drill string buckling.

A special oil and gas issue of Ansys’ Advantage in-house magazine includes case histories from Technip and Schlumberger that show how the engineering toolset addresses complex problems confronted by upstream equipment designers. Technip has automated design runs to perform statistical analysis using Ansys’ parametric exploration and optimization tools to check the structural performance and integrity of subsea jumper pipe structures. Ansys’ DesignXplorer ‘design of experiment’ tool allowed a huge parameter space to be investigated rigorously with ‘only’ 20,000 simulation runs.

Schlumberger’s engineers have used full physics finite element analysis to model buckling in various drill string and bottom hole assemblies. The technique has allowed investigation of hydraulic forces that can damage drill pipe in operations such as hydraulic fracturing and acidizing. Modeling complex interactions between the drill string, borehole and fluids has proved challenging using a conventional analytical approach. Ansys Mechanical’s Beam188 element provided an accurate means of simulating helical buckling in a well test string sealed in a packer. The tool predicted higher bending stresses than the analytical solution, allowing for an improved BHA and safer deepwater operations. Ansys’ attractively produced 32 page magazine is a free download.


Core Lab—drill closer, frac more!

Core reports record first quarter and ‘robust’ demand for FlowProfiler oil tracer service.

Core Laboratories has posted its most profitable first quarter in its history driven by improvements in all operating segments. First quarter net income increased 10% to $62,280,000 and operating income increased 7% to $84,427,000 (figures exclude foreign exchange adjustments). Core is benefitting from its position in the tight-oil plays of North America where secondary and tertiary recovery projects are being conducted to increase production from shale plays outside the proven sweet spots of the Eagle Ford and Wolfcamp.

Core reports ‘robust demand’ for its FlowProfiler service. Here an oil-soluble tracer is injected during fracking. When a well flows, oil is analyzed with a gas chromatograph allowing producing zones to be identified. The technique also pinpoints stages that do not flow, ‘providing valuable insights for future wells.’ The results mean that Core is now recommending closer well spacings, longer laterals with more and shorter stages, and pumping proppant to ‘screen-out’ for all stages. Core suggests that the extra costs of such an approach, some 20% of a well’s total, are ‘clearly offset’ by the potential for a 40% to 60% increase in ultimate recovery. The approach, if widely adopted, would likely give Core a few more record quarters too. More from Core.
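
A back-of-the-envelope check of the arithmetic, with purely illustrative well cost and recovery figures that are not Core’s, runs as follows.

    # Illustrative numbers only: an assumed $8 million well and 500,000 bbl
    # estimated ultimate recovery (EUR), then Core's 20% cost / 40-60% EUR trade-off.
    base_cost, base_eur = 8.0e6, 500_000
    print(f"baseline: ${base_cost / base_eur:.2f}/bbl")
    for uplift in (0.40, 0.60):
        cost_per_bbl = base_cost * 1.20 / (base_eur * (1 + uplift))
        print(f"+20% cost, +{uplift:.0%} EUR: ${cost_per_bbl:.2f}/bbl")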


Don’t flare that gas!

Well Power and Blaise Energy offer alternatives to wasteful flaring of well head gas.

As we reported back in February 2013, much North American shale gas is flared off. The by-product of more valuable liquids production is uneconomical to produce, especially in more remote locations such as the Bakken where flaring lights up the night sky. Recent developments in micro gas gathering technology are set to change this situation. Houston-based Well Power is to deploy a test unit of the Micro-Refinery Unit (MRU) later this year. MRUs are deployed at the wellhead to process natural gas into liquid fuels and clean power. Well Power operates the MRU in Texas under license from Calgary-based ME Resource Corp. and is currently negotiating to extend the deal to other states. More from Well Power and ME Resource.

Blaise Energy of Bismarck, North Dakota already has technology that can compress well head gas for use in electricity generation or as truck fuel. One major Bakken operator is using five Blaise natural gas powered generators to provide one megawatt of power to the operator’s CNG compressor. Using CNG as a fuel at the well site reduces diesel costs and site emissions. The unit automatically adjusts power output to demand load. The system can scale to over a million cubic feet per day, more than the largest of the 3,000 Bakken flares currently burning. Almost 30% of North Dakota’s gas production is currently flared despite the fact that half of the flared gas comes from sites that are connected to a gathering system. More from Blaise.


Paradigm rolls-out 14.0

Enhancements to geosciences toolset address seismic imaging, data loading and connectivity.

Paradigm has released a new version of its geological and geophysical interpretation suite, Paradigm 14, to its worldwide user base. The toolset covers the upstream workflow, from processing, interpretation and modeling through drilling, reservoir characterization and engineering. Enhancements in the current release focus on seismic data processing and interpretation.

Chief product officer Somesh Singh said, ‘With the 14 release we are delivering advanced science for everyone, enhancing the user experience with convenient workflows and easier installation and making these technologies available to the entire user community.’

Specific enhancements include mathematical regularization of seismic sources and sensors into a desirable geometry with ‘5D’ data reconstruction, enhanced fracture determination from seismic data and improved velocity determination. The new release also bolsters Paradigm’s Epos data infrastructure with streamlined data loading and a new GUI and tools for IT project management. Connectivity is extended with a JavaSeis plug-in and a direct connection to ESRI shapefiles. More from Paradigm.


Software, hardware short takes

Baker Hughes, Sitfal, dGB Earth Sciences, Massachusetts Institute of Technology, Eliis, Midland Valley Exploration, Sercel/Metrolog, TransAct Technologies, Walls, Oilfield Camo.

Baker Hughes has announced a mobile application for its WellLink Vision monitoring platform for oil and gas production operations. The new mobile app enables technicians to use a mobile device to access and enter data, actively monitor well information and receive alerts while out in the field.

Sitfal’s log plotting software is now available as a free plugin for dGB Earth Sciences’ OpendTect users. Sitfal’s ‘Clas’ computer log analysis software enables log viewing in the free version of OpendTect. For interactive curve editing, multi-well, multi-log curve crossplots and histograms, a commercial license is required. More from dGB and Sitfal.

The Massachusetts Institute of Technology has announced CnC, a programming toolset that offers general purpose parallelization of computer code. According to the release, ‘parallel programming is difficult for anyone, especially for the domain expert (as opposed to the computer scientist).’ Conventional programming requires developers to think about what code needs to be parallelized which is both hard and architecture dependent. CnC frees developers to concentrate on computational data flows in terms of producers and consumers, and code controllers and controlees. CnC is claimed to produce more parallelizable and efficient code. CnC was developed under the US Department of Energy’s Exascale software stack (S-Stack) project.
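
The following is not CnC’s actual API, just a sketch of the dataflow idea it promotes—declare producers and consumers of data items and leave the scheduling to a runtime (here a humble thread pool stands in).

    from concurrent.futures import ThreadPoolExecutor

    def produce_traces():                   # producer: emits independent data items
        return [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]

    def trace_energy(trace):                # consumer: no shared state between items,
        return sum(x * x for x in trace)    # so the runtime is free to run them in parallel

    with ThreadPoolExecutor() as pool:
        print(list(pool.map(trace_energy, produce_traces())))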

The 1.5 release of Eliis’ Paleoscan increases the size of interpretable seismic volumes and reduces the time required to process and interpret large data volumes. The new release introduces parallel computation on multi-core processors and ‘seamless connection’ with Schlumberger’s Petrel. Other new functionality addresses seismic to well ties with check shots/sonic calibration, wavelet extraction and an interactive well tie window. Synthetic seismogram generation and correlation tools provide a ‘new way to compare geologic and seismic interpretation.’

Midland Valley Exploration has released FieldMove Clino (FMC), a free app for iOS and Android smartphones that acts as a digital clinometer and field geological mapping helper. FMC uses the phone’s GPS location and orientation sensors to capture the orientation of planar and linear features in the field. Notes and photographs can be registered in context. Maps can be imported and data exported as .csv files for spreadsheet use or as .mve files for use in FieldMove.

Sercel’s Metrolog unit has launched iScope, a downhole monitoring system for fast, efficient and frequent retrieval of pressure and temperature data during well testing operations. iScope provides real time information during a build-up test leveraging Metrolog’s iQuartz pressure and temperature memory gauges. iScope can be deployed as a component of a standard drill stem test assembly. Gauge data is sent via wireless to an iScope shuttle at the surface.

At the Houston Offshore Technology Conference this month, TransAct Technologies introduced its Printrex color printer range to the oil and gas industry. The Printrex 920 was developed specifically for use in the oil field and uses thermal technology to produce ‘clear and detailed’ prints for decision-making at the well site. The printer was designed for use in harsh environments and remote locations. The Printrex 980 color is an inkjet office printer that prints continuous well logs at the rate of eight inches per second. The Printrex 1242 12-inch seismic and well-logging printer is designed for desktop use and boasts a printing speed of 4 inches per second.

Sartorially-inclined oilfield workers may be interested in Walls’ new line of apparel from Oilfield Camo. The line includes men’s workwear and hunting items (!), youth (!!) and toddler selections (!!!). Oilfield Camo is said to ‘reflect the pride of the oil and gas industry and provide effective camouflage for the outdoors.’ Presumably the camouflage is required to smuggle the toddlers onto the rig.


API completes divestment of e-commerce standards to PIDX Intl.

PIDX International has satisfied the API’s criteria for standards sustainability.

The American petroleum institute (API) has finalized the transfer of the PIDX trademarks and standards publications to PIDX International which is working to extend these to worldwide standards for the global oil and gas business. PIDX president and CEO, Fadi Kanafani, said, ‘The transfer of intellectual property is a key milestone in our mission to provide effective electronic collaboration between oil companies and their suppliers.’

PIDX International originally received an exclusive license from the API in 2011 but was required to demonstrate its ability to complete a full periodic maintenance cycle of all the PIDX standards and publish them for public review. This was achieved last year to the satisfaction of the API which has determined that PIDX International ‘is now eligible to become the sole owner of the PIDX name and standards publications.’

‘In the past few months, API management and the PIDX International executives worked diligently to bring a close to this PIDX transition,’ said David Miller, API’s standards director. ‘PIDX standards are now the sole responsibility of PIDX International, an organization that has proven capable of serving the eBusiness standards’ needs of our industry.’

We have reported on the bad press that the API has had in recent years (especially after Macondo) stemming from its dual role of industry cheerleader and standards-setter. The transfer has clarified the situation for the relatively uncontentious e-business standards. More from PIDX.


OSIsoft 2014 User Conference, San Francisco

Petronas expands PI System infrastructure. Chevron upgrades to PI web parts. More from Hess and cloud-data provider Industrial Evolution. MaraDrill adds Coresight and Spotfire. PG&E super-sizes video wall. Suncor’s journey ‘from historian to infrastructure.’ Pemex Refining’s game changing PI AF proof of concept. Pemex on pipeline PI deployment.

Musreen Azwan outlined Petronas’ migration from a ‘stagnant’ data architecture reliant on phone calls and manual data entry into spreadsheets to a secure, PI system-based real time data infrastructure. The new system provides early warning of equipment failure and safe operating limits displays with PI web parts and the Excel data link. Comparing real time data with historical records has enabled Petronas to avoid five unplanned shutdowns in 2013. PI scope is expanding rapidly and, by 2015, should cover 46 assets and a grand total of 800 thousand tags.

Ernest Garner and Tara Willis presented recent upgrades to Chevron’s Gulf of Mexico PI System. With the legacy PI infrastructure it was hard to keep high resolution (5-second) data in the PI archive. Various in-house developed workarounds proved hard to maintain. The new system is designed around a PI Asset Framework (AF) server acting as a data consolidator with client applications built from PI SharePoint web parts. The new installation was an opportunity to standardize on data governance, tag names and graphic conventions. Tag names include asset code and well name and are carried through to the SharePoint clients, allowing for consistent information display and drill down. The data is also integrated with other Chevron well data sets such as the equipment master and work orders. A comprehensive PI installation manual and operating procedures have also been delivered. On the lessons learned front, latency issues between PI and SharePoint mean it is a good idea to co-locate systems. The project required significant time and resources to identify and standardize control systems’ data. Finally, PI WebParts needs its own dedicated server farm; it ‘does not work well in shared SharePoint environments.’

A presentation from Hess’ Tony Goodreau and Industrial Evolution’s Simon Wright showed how IE’s PI-based Gulf of Mexico data hub is used to share process data between joint venture partners and third parties. The IE hub provides connectivity with other operators’ assets, which may be running PI or, in the case of data from an Anadarko facility, a CygNet-based data feed. Hess is in the process of migrating from a VPN-based PI-to-PI connection on the downstream side of the IE data center to a solution leveraging PI cloud connect (Picc), a web-based customer portal managed by OSIsoft. The Picc has passed most of the trial’s tests but a decision on roll-out is pending further work on data sharing security and scalability. Hess will continue to use IE’s data hub which has been deemed to ‘speed deployment in a PI to PI context.’

Ken Startz provided an update on Marathon’s MaraDrill PI-system based drilling optimization solution. MaraDrill provides weight on bit, RPM and other parameters to remote stakeholders, leveraging PI data connectivity to populate spreadsheets at the head office with real time data from factory drilling operations in the Eagle Ford and Bakken shale. The system supports 1-second data rates which show details, of stick slip for instance, that are below the resolution of conventional drill floor displays. Stick slip mitigation has brought a ‘sustained’ 40% improvement in rate of penetration. PI Coresight provides integration of WellView data and Spotfire analytics. After drilling, further Spotfire studies can be used to model rock strength and optimize future wells and to identify production sweet spots. Drilling optimization has halved Marathon’s drilling time, down from 24 days in 2011 to 12 in 2013. MaraDrill is being rolled out across 25 rigs and is now considered ‘strategic.’
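
Marathon did not describe its algorithms, but a toy calculation shows why the 1-second rate matters—a synthetic stick-slip oscillation that is obvious in the high-rate feed all but vanishes once averaged to a typical display update rate.

    import math, statistics

    # Synthetic surface RPM with an 8-second stick-slip oscillation.
    rpm_1s = [120 + 40 * math.sin(2 * math.pi * t / 8) for t in range(60)]
    rpm_10s = [statistics.mean(rpm_1s[i:i + 10]) for i in range(0, 60, 10)]

    print("1-second std dev :", round(statistics.stdev(rpm_1s), 1))   # swing clearly visible
    print("10-second std dev:", round(statistics.stdev(rpm_10s), 1))  # largely averaged away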

Mel Christopher showed off PG&E’s ‘super-sized’ situational awareness video wall. The September 2010 San Bruno pipeline explosion and fire revealed serious limitations in PG&E’s contingency plans. Inadequacies in the Scada systems also made it hard to pinpoint the location of the rupture. This led to a dramatic change in the company’s gas control philosophy and the deployment of tools and systems that were designed for emergencies, not just normal operations. The new system has replaced a limited tabular view of operations with a real time display of pipeline inventory, color coded to show fill and capacity. Other screens show actual vs. forecast volumes and compressor status with PI AF. Hooks to ESRI’s mapping services allow customer calls to be geolocated. The massive video display was delivered by Barco. DST Controls also ran.

Tripti Somani presented Suncor’s PI journey ‘from historian to infrastructure’ supporting energy and environmental regulatory reporting, asset performance monitoring and safety management of its steam assisted gravity drainage heavy oil operations. The most recent development is the addition of PI AF, which has simplified and speeded Suncor’s software development. For instance, it is common engineering practice to bypass safety systems during special operations such as shutdowns. In a complex operation like a large SAGD development this can involve thousands of bypass tags with custom logic and reporting requirements. To minimize the risks, Suncor has monitored safety critical bypasses with a combination of PI Data Access Server and PI JDBC, along with some ETL code and business logic. The solution has improved the governance of critical interlock bypasses and allowed for the tracking and auditing of bypass history. The system is also used in risk mitigation planning.
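
Suncor’s code was not shown. In spirit, the bypass-tracking logic might resemble the following sketch, with a made-up tag list standing in for the PI/JDBC query result.

    from datetime import datetime, timedelta

    # Stand-in for the result of a PI/JDBC query: (tag, bypass active, active since).
    bypasses = [
        ("47-XV-1001.BYPASS",  True,  datetime(2014, 5, 1, 6, 0)),
        ("47-PSV-2002.BYPASS", False, None),
    ]
    now, limit = datetime(2014, 5, 1, 20, 0), timedelta(hours=12)

    for tag, active, since in bypasses:
        if active and now - since > limit:
            print("ALERT:", tag, "bypassed beyond the permitted window")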

Carlos Díez presented on Pemex Refining’s standard window on operations. Again, PI AF and WebParts have been used to transform an existing PI system into a platform for applications and reporting. The proof of concept system provides system wide KPIs and a window on operations in critical assets and refineries. Initial results show ‘game changing’ potential for improving asset reliability, safety, and yield.

Ruben Leo presented a more mature Pemex PI usage across its pipeline network. Here real time operations data flows through the PI System, supporting management of the transmission and distribution processes and flagging critical events such as rapid pressure or flow variations and unscheduled equipment downtime. Watch the conference videos and download presentations from OSIsoft.


More from Intelligent Energy

Maersk trials Eclipse in the Amazon cloud. Exxon ‘cyber security is hard and getting harder.’ Yokogawa on safety, humans and automation in oil and gas and aviation. BP’s ‘basket analysis’ of oil production. Molten on multi terabyte fiber monitoring. Idmog on novel ‘big data’ technology.

Morgan Eldred showed how Maersk has been trialing Schlumberger’s Eclipse fluid flow reservoir simulator in the Amazon web services cloud. A major issue, data transfer from the corporate network to the cloud, was fixed with Panzura’s Quicksilver technology. Inside the cloud, data transfers between simulated offices used OpenVPN on standard Linux Amazon machine images. Eclipse was run on a Linux C3 ‘xlarge’ instance with solid state drives. The proof of concept worked but challenges remain—notably licensing issues. The authors noted the ‘agility’ of working in the cloud, for instance benefitting from the latest technology refresh à la Intel Xeon E5.
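
For the curious, spinning up such a node takes a few lines of Python with the AWS SDK—a generic sketch, not Maersk’s actual configuration; the image and key pair names are placeholders.

    import boto3  # AWS SDK for Python

    ec2 = boto3.client("ec2", region_name="eu-west-1")
    response = ec2.run_instances(
        ImageId="ami-xxxxxxxx",     # placeholder Amazon Linux machine image
        InstanceType="c3.xlarge",   # SSD-backed instance class cited in the talk
        KeyName="my-keypair",
        MinCount=1, MaxCount=1,
    )
    print(response["Instances"][0]["InstanceId"])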

Darrell Pitzer (ExxonMobil) observed that cyber security is hard, and will get harder, but business goes on. Scada vulnerabilities are on the rise, ‘the bad guys have found us.’ Control systems are hard to secure because of the 10 to 20 year equipment lifetime and the fact that they are designed for transparency and efficiency rather than security. Also, while many experts are reaching retirement, hackers’ knowledge is ‘comprehensive and constantly building’ particularly with tools like Metasploit and the Shodan search engine. Companies need to reduce risks to a ‘comfortable’ level—e.g. by banning removable media and changing default accounts and passwords. Training is also important. Also, verify that procedures are followed, for instance by throwing some USB drives around the parking lot and seeing who plugs them in! You also need a disaster recovery plan in case a site gets wiped out by a virus. Oils are not the most respected and loved companies, ‘there may be folks out there trying to shut you down.’

Maurice Wilkins (Yokogawa) investigated human behavior in critical situations such as the Indonesia Airways Airbus A380 incident (where the pilots saved the day) and the Texas City refinery explosion where the operators were overwhelmed with contradictory information on the plant’s status. With refinery losses trending upwards, is more automation the answer to process safety? For Wilkins, ‘having humans in the picture at the time of an emergency can be beneficial.’ But we need to prepare for abnormal situations where lots of confusing data and warnings are created. The answer is better guidance and support for operators. In the Texas City example, a ‘procedure assistant’ could have triggered actions that might have saved the day.

Richard Bailey (BP) showed how market basket analysis, as used to analyze the contents of shoppers’ supermarket trolleys, can be repurposed for production data analytics. The oil and gas ‘basket’ has relatively few items—just start/stop events on injectors and observations at producers. These are analyzed with ‘directed pattern search’ to look for cause and effect. Bailey reports that this is very successful and has helped BP optimize its gas injection strategy and ‘sell high.’ The technique is now embedded in a user-oriented toolkit.
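
BP’s directed pattern search was not disclosed, but the ‘basket’ idea can be caricatured in a few lines of Python—count how often each producer responds to an injector start. The event log below is invented.

    from collections import Counter

    # Toy event log: each injector start and the producers that showed a response.
    events = [
        ("I-1", {"P-3", "P-5"}),
        ("I-1", {"P-3"}),
        ("I-2", {"P-7"}),
        ("I-1", {"P-3", "P-7"}),
    ]
    pairs  = Counter((inj, prod) for inj, prods in events for prod in prods)
    starts = Counter(inj for inj, _ in events)

    for (inj, prod), n in pairs.most_common():
        print(f"{inj} -> {prod}: responded in {n} of {starts[inj]} start events")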

Jeff Liddle (Molten) presented, on behalf of BP, work done on distributed temperature sensing with fiber for well surveillance. Continuous monitoring with fiber can produce very large amounts of data. A single comprehensive monitoring operation of, e.g., a frac job plus flowback, can generate 150 terabytes. Fiber is however easy to deploy downhole and behind casing, requires no power or comms, and works over long distances, up to 25 km. Interpretation tools are primitive and pertinent use cases are so far elusive.

Aymeric Preveral (Idmog) provided a round up of ‘big data’ technologies likely to impact oil and gas, making an analogy between the problems that Google and Amazon have and those of streaming data from the oilfield. Current SQL-based data stores and applications scale poorly and require a significant administrative burden to maintain. Data gets replicated and causes more work for IT. The answer is NoSql, as deployed in Google’s BigTable and Amazon’s SimpleDB. Preveral proposed a distributed architecture involving these technologies but observed that connectors for Prodml, Ppdm and OPC-UA will have to be built. The IE presentations can be acquired through the OnePetro website.


Fiatech 2014 Conference and technology showcase

Shell and Dassault’s recognition of need for standardization in construction echoes past conferences. Analysis of full cost of JORD reference data project comes up with some scary numbers.

The US Fiatech organization includes members from the engineering/construction community and spans plants (including oil and gas facilities) and regular large buildings. It is therefore home to a broad church that includes building information modeling (BIM) systems and standards that have relevance to major offshore construction projects. The need for standardization in construction was emphasized in presentations from Shell and Dassault Systèmes, although what is puzzling is that the arguments are indistinguishable from those made at plant information management conferences twenty years ago. Shell’s contribution of an ‘advanced work packaging information mapping’ standard is new but seemingly unrelated to prior art.

Julian Bourne provided a status report on the flagship joint operations reference data (Jord) project which addresses a perennial problem of standards initiatives, that of populating and maintaining reference data. The ‘large and diverse’ group that is Fiatech means that its constituent communities protect their own interests which leads to many tensions in terms of needs and priorities. ISO 15926 needs to accommodate as many needs as possible. Bourne walked through the work required to complete Jord and the ‘irreducible costs’ which are rather eye-watering. It costs about $5,000 for one complete definition. Further multiplication sees an overall investment of around $1.5 billion, spread over a 20 year period. Bourne dutifully ploughed on with proposals for workgroup organization and funding. But (in our opinion) the likelihood of the construction industry a) coming up with $75 million per year for one standard initiative and b) engaging collectively on a 20 year project is small to vanishing. More from Fiatech.


Folks, facts, orgs ...

Babcock & Wilcox, Chevron Phillips Chemical, Circulation Solutions, Coler & Colantonio, Charles River Associates, CrudeShares/Engota, Deep Down, Ensco, Gaffney Cline & Associates, Sigma3, Siemens, Neos GeoSolutions, Pine Brook, Gasfrac, PODS.

Elias Gedeon has joined Babcock & Wilcox as senior VP and chief business development officer. He was previously with Alstom.

Kate Holzhauser is to join Chevron Phillips Chemical as VP environment, health, safety and security, replacing Don Lycette, who was recently named senior vice president, research and technology. Holzhauser hails from Ineos Nitriles.

Circulation Solutions has appointed Jerry Beeson as VP sales.

Ronald Hinthorn is the new director of software product delivery at pipeline software house Coler & Colantonio. Hinthorn hails from consultancy DNV GL.

James McMahon has re-joined Charles River Associates’ energy practice as vice president. Before his return, McMahon was with Black & Veatch.

CrudeShares, a wholly owned subsidiary of Engota, LLC, has announced an eponymous platform for the purchase and sale of privately held shares in oil and gas companies. VP Chuck Kowalski said, ‘CrudeShares gives investors access to oil companies that were once inaccessible. Shareholders of listed companies can offer their shares for sale, creating a means to exit an otherwise illiquid position.’ The offering targets shares in corporate ownership and those owned by officers and employees. More from CrudeShares.

Mark Carden has joined the board of Deep Down Inc. as independent director. He hails from Coopers & Lybrand.

Carl Trowell has been named CEO and president of Ensco plc. He succeeds retiree Dan Rabun. Trowell was previously with Schlumberger.

Nick Fulford is now head of LNG and natural gas at Gaffney Cline. He was previously with Direct Energy. Maurice Cardano heads-up the company’s new office in Bogota, Colombia and Doug Peacock has been appointed to the SPE reserves committee. Susan Eaton is general manager of the new office in Calgary, Alberta. Eaton, whose career began with Esso Resources Canada, is also an intrepid polar explorer.

Janet McGuire has been named Denver operations manager for microseismic and borehole seismic imaging with Sigma3. She was previously with Summit Geophysical.

Siemens has appointed Lisa Davis, currently with Royal Dutch Shell, to its managing board. Siegfried Russwurm is now CTO and Klaus Helmrich takes the lead in the digital factory and process division. Later this year the company will reorganize into nine divisions with oil and gas inside the new process industries and drives unit headed up by Peter Herweck. The corporate services, IT and supply chain unit will be managed by Hannes Apitzsch.

Neos GeoSolutions has appointed Larry Scott as VP of global sales. Scott hails from Global Geophysical Service.

Michael McMahon heads-up the new Houston location of New York-based investment firm Pine Brook. Dick Stoneburner, a geologist who joined the firm last year from Petrohawk, and Claire Harvey, formerly with TPH Partners, are also staffing the Houston location. More on Pine Brook in our report from the GE Oil and Gas conference held earlier this year in Florence (OITJ February 2014).

Gasfrac’s seven incumbent directors are to step down and be replaced by Julien Balkany and Pierre Jungels, Dale Tremblay, James Hill, Larry Lindholm and Mark Williamson. Balkany explained, ‘Our goal is to [..] eliminate the disconnect between Gasfrac’s current share price and the underlying value of its ground breaking environmentally-friendly waterless fracking technology.’

The Pipeline open data standards association, PODS, has announced that its executive director, Janet Sinclair, is leaving to pursue other opportunities.


Done deals

Tibco Software, Jaspersoft, Aker Solutions, Akastor, Atos, Bull, National Oilwell Varco, NOW.

Tibco Software has acquired San Francisco headquartered business intelligence specialist Jaspersoft Corp. The Jaspersoft name will be retained and the unit will operate as a product group within Tibco.

Aker Solutions is to split in two with one new company handling subsea, engineering and maintenance, operating under the Aker Solutions name. The other, named ‘Akastor’ includes drilling technologies, oilfield services and process systems. Both will be listed on the Oslo stock exchange.

Atos is to acquire French high performance computing specialist Bull for €620 million, a 30% premium on the average share price over the last three months.

National Oilwell Varco is to spin-off its distribution business into a new company, NOW Inc., headquartered in Houston. Now, NOW will be an independent, publicly traded company, and NOV will retain no ownership interest in NOW. NOW expects to receive approval soon for the listing of its common stock on the New York Stock Exchange under the symbol DNOW.


Supplier management white paper

OFS Portal’s Peter Smith has it that suppliers are not just a ‘necessary evil.’

A new white paper by Peter Smith of OFS Portal*, the supply side oil and gas e-business group, advocates giving more attention to supplier management which should be ‘central to your procurement thinking.’

Smith traces the history of procurement—starting with Samuel Pepys’ job supplying the English Navy’s provisions through to early formalization of the purchasing function, notably with Marshall Kirkman’s 1887 book, The Handling of Railway Supplies, and German sociologist Max Weber’s early work of around 1920. Fast forward to the late 20th century and the concept of ‘category management’ which brought a rigorous approach to procurement, and saw specialist buyers of items such as software, drilling equipment and services. While acknowledging that there are many definitions for supplier management, Smith homes in on key aspects as follows—vendor onboarding, qualification, performance management, risk and compliance management, relationship management and collaboration. Smith cites a 2013 report from Proxima titled ‘Corporate Virtualization: A global study of cost externalization and its implications on profitability’ which analyzed data from some 2,000 companies to find that 79% of revenues were spent with suppliers against only 12.5% on staff costs. Oil and gas had the third highest proportion of external spend at 77%, and the lowest labor cost of all at 5%.

While the importance of the supply chain is broadly recognized, today’s software tools do not fully support an integrated supplier lifecycle management approach. Companies have to integrate point solutions to address different needs such as risk management and onboarding. Smith advocates a ‘behavioral shift’ that sees suppliers as contributing to organizations’ future success rather than a ‘necessary evil.’ A renewed look at the big picture of supplier management is also needed—particularly in risk and performance management. Technology can help, even though the ‘perfect solution’ remains elusive. More from ofs-portal.com.

*Smith is also MD of UK-based Spend Matters.


Life Cycle Engineering teams with OSIsoft on asset management

Algorithmic predictive analytics to leverage new ISO 55000 standard.

Charleston, S. Carolina-headquartered Life Cycle Engineering (LCE) has partnered with OSIsoft to offer predictive analytics and intelligent asset management solutions to the oil and gas industry. The new customized risk-based asset management system leverages ‘algorithmic prediction analytics’ for real-time evaluation of drilling rig vulnerabilities that may lead to blowouts and other risks. The solution includes a dashboard displaying current oil rig status and location. The system was developed for a ‘large US drilling contractor.’

Alongside the situational awareness function, the system addresses compliance issues driven by new regulations emanating from the National offshore petroleum safety and environmental management authority and the US Bureau of safety and environmental enforcement. In response to these, companies are embarking on major asset management strategy revamps to protect the environment and safety of employees. ISO 55000, a recently released asset management standard, underscores the importance of planning for and mitigating catastrophic failures. LCE worked with the drilling company and OSIsoft to create risk-based equipment maintenance plans to be deployed with real-time asset health monitoring of drilling rigs. More from LCE.


Fieldpro 8 to release real soon now...

Resources Engineering Systems—‘An order of magnitude more shale wells are needed.’

In a short position paper/blog, Resources Engineering Systems’ (RES) Mike Cleary argues that developing shale reservoirs may require an order of magnitude hike in the number of wells drilled. The required investment may force a shift from today’s independent operators to IOCs and NOCs. Today’s shale models are ‘mostly wrong or irrelevant.’ Enter RES’ Fieldpro with fracture modelling technology that has been ‘confirmed with realistic production matching.’ RES is to release Fieldpro V8.0 later this year. Fieldpro is a scalable, integrated system for oilfield operations management and subsurface engineering, ‘from spud to plug.’ V8.0 is the first commercial version of the software following ‘years of testing with select clients who have helped develop the product and bring it to maturity.’ More from RES.


University of Houston professors indicted

Integrated Micro Sensors founders alleged to have withheld fees.

Abdelhak Bensaoula and David Starikov, professors at the University of Houston, have been charged with making false statements in obtaining federal funds for research. The charges relate to the defendants’ startup, Integrated Micro Sensors, which received research grants from the small business innovation research program, NASA and the Department of Energy. The indictment alleges that the defendants failed to pay a required fee to the University of Houston on four of five contracts. The release from the US Attorney General’s office observes that ‘an indictment is a formal accusation of criminal conduct, not evidence’ and that ‘a defendant is presumed innocent unless convicted.’


Sales, deployments, partnerships …

Aker Solutions, Baker Hughes, Amplidata, Avere, Aveva, Arcadis, Exprodat, TeachMeGIS, FMC, Pemex, GE, Pemex, Kalibrate, Honeywell, Petrotechnics, ION, Weatherford, CurTran, OTI Petro-Smart, Ventyx, InStep Software, VeriFone Systems, First Data, Weir Oil & Gas, Rolls-Royce/MTU.

Aker Solutions and Baker Hughes are teaming to develop production solutions to boost output, increase recovery rates and reduce costs for subsea fields. The deal combines Aker’s subsea production and processing systems with Baker Hughes’ well completions and artificial-lift technology.

Amplidata and Avere Systems are to offer high performance, scalable storage solutions for data centers leveraging Amplidata’s object storage technology. A three-node FXT 3800 cluster achieved 180,229 ops/sec throughput and 0.95ms latency in trials. More from Avere.

Initec Plantas Industriales, the oil and gas unit of Tecnicas Reunidas, is to deploy Aveva Electrical on its major capital projects. Aveva also announced that Total E&P Nigeria has adopted its engineering and design solutions across its joint venture with the Nigerian National Petroleum Corporation.

Engineering design and consultancy specialist Arcadis has been appointed by BP North America to provide project management, program management and contractor management services in a three-year master services agreement. The deal is reportedly worth a minimum of $5 million per year and will be run through a new contractor management program model at BP’s West Lake campus in Houston.

Exprodat has extended its partnership with GIS training specialist TeachMeGIS to provide a series of geosciences courses in Houston.

C-Innovation has ordered six UHD-III Remotely Operated Vehicle systems from FMC Technologies. The units include FMC’s new ISOL-8 Pump which can close BOP rams in under 45 seconds, enabling compliance with API Standard 53.

GE, Petróleos Mexicanos (Pemex) and the Mexico Institute of Petroleum (IMP) have signed a technology collaboration agreement focused on the oil and gas sector. The agreement covers productivity and efficiency in mature onshore fields and deep and ultra-deep water projects.

Kalibrate Technologies (formerly KSS Fuels) reports sales of its PriceNet retail fuels pricing system to Finn St1 Oy, Deutsche Tamoil and an extension of its contract with Hess to its newly acquired WilcoHess sites. Hess has been using PriceNet since 2010.

Honeywell’s Experion Process Knowledge System and Safety Manager have been selected for the third phase of the Asia Trans Gas pipeline. The pipeline is operated by a joint venture of China National Petroleum Corp. and Uzbek-NefteGas.

GDF Suez E&P UK has selected Petrotechnics’ Proscient operational performance and predictive risk platform for its North Sea Cygnus gas field.

Pemex has awarded ION Geophysical’s GXT unit a multi-year contract for the provision of seismic data processing services.

Weatherford has entered into an exclusive agreement with CurTran for use and sales of its ‘LiteWire’ carbon nanotube technology in oil and gas.

OTI PetroSmart has been awarded a 15 year contract from a government ministry in Southern Africa for its EasyFuel Plus management system. The contactless solution will enable automated refuelling at the ministry’s national network for its vehicle fleet.

ABB company Ventyx has teamed with Chicago-based InStep Software to bring advanced industrial equipment diagnostics to Ventyx’s Asset Performance Management solution.

VeriFone Systems and First Data Corp. have launched a VeriFone edition of First Data’s TransArmor solution, an end-to-end encryption and tokenization solution for secure retail payment systems.

Weir Oil & Gas and Rolls-Royce Power Systems company MTU are to develop power systems engineered for hydraulic fracturing. Weir is the leading manufacturer of hydraulic fracturing pumps, and MTU is a market leader in heavy-duty industrial diesel engines. The deal combines Weir’s SPM pumping technology with MTU’s new ‘Frac Pack’ unit.


Standards stuff

‘SAM,’ Energistics’ shared asset model. OGC new ‘big data’ workgroup. New EPSG minisite. New release of OSGF geospatial data access library. Prodml V1.3. Energistics ISO 19115-1 EIP V 1.0.

Speaking at Intelligent Energy earlier this year, Bill McKenzie (Chevron) reported progress on ‘SAM’ a standard for a ‘shared asset model.’ SAM sets out to mitigate proliferating acquisition in the digital oilfield with a ‘single source of truth.’ SAM can be implemented as a database or as a ‘facade,’ or as logical data on top of existing stores. Energistics is planning an open source SAM reference implementation but, ‘vendor adoption will be critical.’

The Open Geospatial Consortium is seeking comments on a new big data domain working group—check out the charter.

The EPSG geodetics dataset is now available at an OGP-hosted minisite on www.epsg.org and the geodetic parameter registry on www.epsg-registry.org.

The Open Source Geospatial Foundation has announced a new release of GDAL/OGR, a C++ geospatial data access library for raster and vector file formats, databases and web services. GDAL includes bindings for several languages, command line tools and the latest EPSG 8.2 database.
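
By way of example, the Python bindings make raster access and EPSG look-ups a few lines of code—the file name below is a placeholder.

    from osgeo import gdal, osr

    ds = gdal.Open("survey_outline.tif")        # any GDAL-supported raster format
    print(ds.RasterXSize, ds.RasterYSize, ds.GetProjection())

    srs = osr.SpatialReference()
    srs.ImportFromEPSG(23031)                   # code resolved from the bundled EPSG database
    print(srs.ExportToProj4())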

Energistics’ Prodml V1.3 standard release candidate is now available for public review. The release covers additional functionality in the distributed temperature sensing (DTS) implementation.

Energistics has also published the Energy Industry Profile (EIP) Version 1.0 of ISO 19115-1. EIP is an open metadata exchange standard for structured and unstructured geographical information of importance to the energy community. Energistics’ Geoportal (qu’est-ce que c’est?) is a reference implementation of a searchable catalog compliant with the new standard.


Nvidia GPU Technology Conference

Multiple high performance computing presentations from majors and researchers in reservoir engineering and seismic imaging show 10x and greater speed-ups over boring old CPUs.

Nvidia’s GPU technology conference, held early this year in San Jose, California, is a showcase for high performance computing (as opposed to graphics) using Nvidia’s parallel processing technology. The subtext of almost all presentations is that the GPU is a practical route to HPC, providing more flops per dollar, and more speed, than the conventional CPU of the Intel variety. While one should not expect balance from this gathering of enthusiasts, the conference hosts an impressive line-up of technologists from a wide range of industries. Here are some highlights from the oil and gas track.

Chris Leader’s (Stanford) PowerPoint spectacular showed how GPUs can ‘greatly accelerate’ seismic imaging, with potential for an order of magnitude speedup.

Thor Johnsen and Alex Loddoch (Chevron) showed that running high resolution (120Hz) computations on very large synthetic seismic models such as the SEG SEAM II implies eye-watering amounts of RAM (768GB!). Using smart data management across host and disk memory makes this feasible. Sixteen Kepler GPUs achieve 20-30x better throughput than highly optimized CPU code running on a dual-socket Sandy Bridge. The approach is amenable to cloud-deployed GPU nodes.
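
The following Python fragment is not the Chevron code, just a minimal sketch of the underlying idea: keep the full volume on disk (here a memory-mapped file, assumed to exist) and stream device-sized tiles to the GPU for processing. Dimensions, file name and the stand-in kernel are illustrative.

    import numpy as np
    import cupy as cp                     # GPU arrays; assumes CuPy and an Nvidia GPU

    nx, ny, nz = 1024, 1024, 512          # full model dimensions (illustrative)
    model = np.memmap("velocity_model.f32", dtype=np.float32,
                      mode="r", shape=(nx, ny, nz))

    tile = 128                            # tile size chosen to fit device memory
    result = np.empty(nx, dtype=np.float64)

    for i in range(0, nx, tile):
        chunk = cp.asarray(model[i:i + tile])          # host -> device copy
        # stand-in for the real kernel: a reduction over each x-slice
        result[i:i + tile] = cp.asnumpy(chunk.sum(axis=(1, 2)))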

FEI’s Nicolas Combaret described a Stokes equations solver for absolute permeability (see also our article on BP’s digital rocks on page 12) as deployed in Avizo Fire from FEI’s Visualization Sciences Group. A Stokes solver coded in Cuda (Nvidia’s GPU programming language), running on a Quadro K6000, showed a 10x speedup over the same calculation on a dual 4 core CPU machine.

Massimo Bernaschi of the National Research Council of Italy has used Cuda Fortran90 and C kernels to model hydrocarbon generation and expulsion. A novel data structure holds all 200 variables used in the calculations and is accessible from both C and Fortran. The authors conclude that using a cluster of GPUs as a farm of serial processors offers both high performance and strict compatibility with legacy codes.

David Wade presented Statoil’s ‘end-to-end’ implementation of reverse time migration running on the latest generation of Kepler GPUs.

Phillip Jong revealed that Shell has been working with Nvidia on its in-house interpretation system (GeoSigns). Nvidia IndeX, a parallel rendering and computation framework, is a key component of Shell’s toolset.

Hicham Lahlou (Xcelerit) is modeling reverse time migration applications as ‘dataflow graphs’ to expose parallelism, memory locality and optimization opportunities. The code generated is portable across different hardware platforms.
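
As a toy illustration of the dataflow-graph idea (this is not Xcelerit’s API), the Python below declares each node’s inputs explicitly, so independent branches, here the forward and backward wavefields, are visibly parallelizable and data locality is explicit. Node names and the trivial ‘kernels’ are invented.

    graph = {
        "read_shot": ([],                       lambda: 1.0),
        "fwd_wave":  (["read_shot"],            lambda s: s * 2.0),
        "bwd_wave":  (["read_shot"],            lambda s: s * 3.0),
        "image":     (["fwd_wave", "bwd_wave"], lambda f, b: f * b),
    }

    cache = {}
    def evaluate(node):
        # Depth-first evaluation; independent nodes could be dispatched concurrently
        if node not in cache:
            deps, fn = graph[node]
            cache[node] = fn(*(evaluate(d) for d in deps))
        return cache[node]

    print(evaluate("image"))   # 6.0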

Garfield Bowen (Ridgeway Kite Software) showed how GPU memory limitations in large scale reservoir simulations can be overcome with a simple scale-out strategy. The solution was demonstrated running a 32 million cell case on 32 Tesla GPUs. More, much more from the GPU technology conference homepage.
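
Apropos Bowen’s talk, the following is not Ridgeway Kite’s code, merely a sketch of the scale-out idea under the assumption that CuPy and several GPUs are available: partition the cell array across devices and let each work on its own sub-domain, with a trivial reduction standing in for the local solve.

    import numpy as np
    import cupy as cp

    n_cells = 32_000_000
    n_gpus = cp.cuda.runtime.getDeviceCount()
    cells = np.random.rand(n_cells).astype(np.float32)
    parts = np.array_split(cells, n_gpus)

    partials = []
    for dev, part in enumerate(parts):
        with cp.cuda.Device(dev):                 # each sub-domain on its own GPU
            local = cp.asarray(part)
            partials.append(float(local.sum()))   # stand-in for the local solve

    print("global result:", sum(partials))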


iShare@Sea—condition-based maintenance standard for offshore

Creative commons standard promises ‘plug and play’ while eschewing prior art.

As promised in last month’s issue we bring you some chapter and verse on the iShare@Sea open standard for marine equipment maintenance information exchange. iShare started out in 2013 as a TNO-led project funded under the EU 2020 R&D program. Use cases include offshore production facilities, supply ships and lifting vessels. The project set out to address industry wide challenges such as reducing downtime and maintaining equipment in the face of ongoing reductions in people on board. The standard works across different systems and protocols to enable ‘pro-active’ maintenance. Initial scope is the exchange of condition based maintenance data from propulsion and energy generation systems.

The iShare team promises cheaper plug-and-play integration, a platform for innovation and the use of multiple vendors’ ‘best of breed’ solutions rather than ‘proprietary standards’ and vendor lock-in. While these noble sentiments have been expressed in many other contexts, iShare brings one novelty in that its standards set is delivered under an open ‘Creative Commons’ license. Curiously, it appears to have been developed independently of existing O&M standards such as Mimosa. Neither does the work refer to ISO 15926, but rather to ISO 14224, used to log downtime. iShare documentation is available on request from iShare@Sea.


Bichsel—standardize and roll your own!

‘Rejuvenate now,’ dogma busters, technology thrusts and CNPC joint venture on factory drilling.

Speaking at the GEO 2014 conference in Bahrain earlier this year, Shell CTO Matthias Bichsel looked to the future of geoscience in oil and gas in his keynote, ‘Taking geoscience beyond the conventional.’ Bichsel spoke of Shell’s ‘rejuvenate opportunity now’ internal workshops on hydrocarbon formation, where old-timers meet the new generation of explorers. Coming up with new plays involves dogma busting, and is done ‘old school’ with no iPads, just coloured pencils and maps.

Elsewhere, a focus on standard, scalable technology has reduced the time to first oil. Shell currently has seven technology ‘thrusts’ scheduled for worldwide deployment. The company appears to eschew ‘buy not build’ and continues with GeoSigns, its own-brand interpretation platform. GeoSigns incorporates 29 proprietary seismic interpretation technologies, used by some 1,200 Shell employees to analyze time-lapse surveys and locate shale sweet spots. Bichsel also mentioned a joint venture with China’s CNPC to develop a ‘conveyor belt’ approach to factory drilling.


BP’s digital rockery and the Australian National University

One micron resolution X-Ray scanner enables virtual core analysis.

In an online video, BP petrophysicist Dmitry Lakshtanov explains how BP obtains numerical rocks for use in reservoir studies. In a laid-back but informative presentation, Lakshtanov shows how rock samples are characterized so that ‘experiments’ can be conducted on a digital proxy model. Conventional analysis of a real core can take six months or more. With a supercomputer, digital flow experiments can be carried out ‘in weeks.’ But first you need your digital rock. This is obtained with BP’s in-house ‘proprietary’ scanning technology which, according to Lakshtanov, is ‘among the best in the business.’

While the accompanying article from Upstream Technology acknowledges that a ‘strategic partner’ helped develop the custom micro CT imaging system, BP is not saying who this was. However, the roll call of the Australian National University’s DigiCore consortium includes BP, and the Heliscan CT scanner looks suspiciously like the one in the BP video, with an identical 1 micron resolution. The university sold its digital rock technology earlier this year to FEI for $68 million, in what was described as ‘one of the most significant commercialisation outcomes of the University.’ Watch the BP video. More from the consortium and from FEI.
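
To give a flavour of what a ‘digital rock’ enables (this is emphatically not BP’s workflow), once a micro-CT volume has been segmented into pore and grain voxels, basic properties fall straight out of the array, as in the Python sketch below; the ‘rock’ here is a random synthetic stand-in.

    import numpy as np

    # Synthetic segmented volume: 1 = pore, 0 = grain (stand-in for real CT data)
    rng = np.random.default_rng(0)
    volume = (rng.random((200, 200, 200)) < 0.18).astype(np.uint8)

    porosity = volume.mean()      # pore voxels / total voxels
    print(f"porosity ~ {porosity:.3f}")

Flow properties such as permeability require a solver (compare FEI’s Stokes solver above), but they start from the same segmented volume.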


Energid provides software smarts to robotic driller

Integrated development environment for standard operating procedures authorship.

Cambridge, Mass.-headquartered robotics specialist Energid Technologies has partnered with Norway’s Robotic Drilling Systems AS (RDS) to tailor its robotics software toolkit, ‘Actin,’ to enable RDS’ automated drilling proof of concept. Energid’s technology has been used by Nasa to build simulators for testing its robotic technologies without the need for field trials.
For RDS, Actin monitors and controls the robotic drilling system. Energid claims its software provides an unparalleled level of autonomy, flexibility and safety. Developers can create complex motion sequences for the robots, and multi-robot tasks can be built from smaller, ‘human-understandable’ tasks.
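
The following toy Python fragment is not Actin’s API, only an illustration of the compositional idea: a multi-robot hand-off expressed as a sequence of small, human-readable steps. Robot and step names are invented.

    def move_to(robot, position):
        return f"{robot}: move to {position}"

    def grip(robot, item):
        return f"{robot}: grip {item}"

    def release(robot, item):
        return f"{robot}: release {item}"

    # A 'hand-off' composed from primitive, human-understandable steps
    hand_off_pipe = [
        move_to("roughneck", "pipe rack"),
        grip("roughneck", "drill pipe"),
        move_to("roughneck", "transfer point"),
        move_to("pipe_handler", "transfer point"),
        grip("pipe_handler", "drill pipe"),
        release("roughneck", "drill pipe"),
    ]

    for step in hand_off_pipe:
        print(step)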

RDS’ Roald Valen exclaimed, ‘I never thought it would be possible to create sophisticated multi-robot hand-off procedures using a simple drag-and-drop interface, especially when dealing with as many degrees of freedom as we do.’ Actin 4.0 will be released later this year and will incorporate many of the tools being used by RDS. More from Energid.

TracLabs ‘Pride,’ space station software for oil and gas

Joint development addresses drilling costs with dashboard for project planning and support.

TracLabs is repurposing its ‘procedure integrated development environment’ (Pride) for use in the oil and gas vertical. Pride, originally developed under a contract with Nasa for use on the International Space Station, is a tool for authoring and auditing standard operating procedures used in complex operations.

Pride replaces ‘cumbersome and often incomplete’ paper manuals with a computer-based system providing access to the information needed to operate or maintain equipment. An intuitive drag-and-drop interface helps subject matter experts assemble components such as the equipment required, procedural steps and questions that must be answered prior to carrying out the work.

The toolset targets operators and technicians at rig sites and maintenance facilities and assures that all required actions are taken. If an operator deviates from the instructions, the software can alert other stakeholders so that remedial action can be taken. During operations, activity is logged to a secure database for tracking, quality control and reporting. Sensor data can be brought into the software to record activity during, for example, a blowout preventer test. Nasa has endorsed the toolset, which is now available from TracLabs’ Pride Automation unit.
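
To make the idea concrete, here is a hypothetical Python sketch, not TracLabs’ data model, of an electronic procedure: steps with expected ranges, an execution log, and an alert when a reading falls outside its pre-defined range. Step names and limits are invented.

    import datetime

    procedure = [
        {"step": "Close annular preventer", "check": None},
        {"step": "Record test pressure (psi)", "check": (245, 255)},
    ]

    log = []

    def execute(step, reading=None):
        ok = True
        if step["check"] and reading is not None:
            lo, hi = step["check"]
            ok = lo <= reading <= hi
        log.append({"time": datetime.datetime.utcnow().isoformat(),
                    "step": step["step"], "reading": reading, "ok": ok})
        if not ok:
            print(f"ALERT: '{step['step']}' out of range ({reading})")

    execute(procedure[0])
    execute(procedure[1], reading=230)    # triggers the alert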


Hortonworks’ Hadoop for oil and gas

Grand claims for the big data lake in seismic, production optimization and HSE reporting.

A rather overblown marketing document from Hortonworks sets out the stall for the use of innovative ‘big data’ technology in oil and gas. The release makes grandiose claims for Hadoop’s contribution to US energy independence and to mitigating declining world oil production, before proposing three use cases: seismic, lease bidding and compliance with health, safety and environmental reporting.

Machine learning algorithms running against massive volumes of sensor data from multiple wells can be used to optimize production and extend a well’s life. Once an optimization strategy has been obtained, optimal set points can be maintained with Apache Storm’s fault-tolerant, real-time analytics and alerts. Storm, running in Hadoop, can monitor variables like pump pressures, RPM, flow rates and temperatures, and take corrective action if any of these deviates from its pre-determined range.
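
The monitoring logic itself is simple enough to sketch in a few lines of plain Python (this is not Apache Storm code; in production the same check would run inside a Storm topology). Sensor names and ranges are illustrative.

    SET_POINTS = {
        "pump_pressure_psi": (1800, 2200),
        "flow_rate_bpd":     (900, 1100),
        "temperature_degF":  (150, 210),
    }

    def check(reading):
        # reading: dict of sensor name -> value for one time step
        for sensor, value in reading.items():
            lo, hi = SET_POINTS[sensor]
            if not lo <= value <= hi:
                yield sensor, value

    stream = [
        {"pump_pressure_psi": 2100, "flow_rate_bpd": 1000, "temperature_degF": 180},
        {"pump_pressure_psi": 2350, "flow_rate_bpd": 950,  "temperature_degF": 182},
    ]

    for reading in stream:
        for sensor, value in check(reading):
            print(f"corrective action needed: {sensor} = {value}")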

In lease bidding, Hadoop is claimed to provide competitive advantage by efficiently storing image files, sensor data and seismic measurements, adding context to third-party surveys of a tract open for bidding. Apache Hadoop also offers a ‘secure data lake’ of compliance-related information. Improved data capture and retention make compliance reporting easier. Because Hadoop does not require ‘schema on load,’ data can be captured in native format, as pdf documents, videos, sensor data, or structured ERP data. More from Hortonworks.

