October 2014


The Hadoop historian

Pointcross completes Phase I of drilling data server and repository proof of concept for Chevron. Open source Hadoop/HBase stack provides flexible access to multi-terabyte drilling data.

Pointcross completed its drilling data server and repository (DDSR) proof of concept for Chevron earlier this year. The PoC demonstrated the use of a Hadoop distributed file system and HBase ‘big table’ data repository for storing and analyzing a multi-terabyte test data set from Chevron’s drillers.

Currently such data is spread across multiple data stores and search and retrieval is problematical. There is a ‘knowledge gap’ between data scientists and domain specialists. The former’s algorithms may be running on a limited data subset, while the latter tend to develop spreadsheet-based point solutions in isolation.

A big data solution however is not magic. Pointcross’ specialists have developed a classification schema for drilling documents, a drilling taxonomy and an Energistics-derived standard for units of measure. This has enabled documents and data sources such as LAS well logs, spreadsheets, text files and Access databases to be harmonized and captured to a single repository.
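
To give a flavor of what such harmonization involves, here is a minimal Python sketch of unit-of-measure normalization on load. The alias table and conversion factors are illustrative only, not Pointcross’ Energistics-derived standard.

    # Illustrative only: normalize depth values from mixed sources to a single
    # canonical unit (metres) before loading to the repository.
    UNIT_ALIASES = {"ft": "ft", "feet": "ft", "m": "m", "metres": "m", "meters": "m"}
    TO_METRES = {"ft": 0.3048, "m": 1.0}

    def normalize_depth(value, unit):
        """Return depth in metres, the assumed canonical unit of the repository."""
        canonical = UNIT_ALIASES.get(unit.strip().lower())
        if canonical is None:
            raise ValueError("unknown depth unit: %r" % unit)
        return value * TO_METRES[canonical]

    print(normalize_depth(8200, "FEET"))   # 2499.36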

One facet of the PoC was the ability to scan volumes of well log curve data to detect ‘patterns of interest,’ an artificial intelligence-type approach to the identification of log signatures. These can be used by drillers to pinpoint issues such as mud losses or stuck pipe. The technique can automate the identification of formation tops.
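
This is not Pointcross’ algorithm, but a minimal Python sketch shows one common approach: slide a known ‘signature’ template along a log curve and flag depths where the normalized correlation is high. The curve, template and threshold are toy values.

    import numpy as np

    def find_signatures(curve, template, threshold=0.9):
        """Return start indices where the curve correlates strongly with the template."""
        n = len(template)
        t = (template - template.mean()) / (template.std() + 1e-9)
        hits = []
        for i in range(len(curve) - n + 1):
            w = curve[i:i + n]
            w = (w - w.mean()) / (w.std() + 1e-9)
            if np.dot(w, t) / n > threshold:          # Pearson correlation of the window
                hits.append(i)                        # candidate formation top or drilling event
        return hits

    rng = np.random.default_rng(0)
    curve = np.concatenate([rng.random(100), np.linspace(0, 5, 20), rng.random(100)])
    template = np.linspace(0, 5, 20)
    print(find_signatures(curve, template))           # indices around the buried ramp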

The sample data set comprised some 6,000 wells with over three billion records and half a million ‘other’ documents. All in all around nine terabytes of data were loaded to the Hadoop cluster, set up in Chevron’s test facility in San Ramon.

A key component of the Pointcross IT stack is a ‘semantic data exchanger,’ that maps and connects disparate data sources to the DDSR. A significant effort was put into a mnemonic harmonization program. This was to compensate for the plethoric terminology and abbreviations that plague upstream data sets causing ‘misperceptions and increased complexity’ for data scientists.
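
Mnemonic harmonization itself is conceptually simple, even if building the dictionary is not. A minimal Python sketch (the alias table is illustrative, not Pointcross’ actual dictionary):

    # Map the many vendor spellings of a curve mnemonic onto one canonical name.
    CANONICAL = {
        "GR":   ["GR", "GAM", "GRGC"],     # gamma ray
        "RHOB": ["RHOB", "DEN", "ZDEN"],   # bulk density
        "DT":   ["DT", "AC", "DTCO"],      # sonic slowness
    }
    ALIAS_TO_CANONICAL = {a: name for name, aliases in CANONICAL.items() for a in aliases}

    def harmonize(mnemonic):
        return ALIAS_TO_CANONICAL.get(mnemonic.upper().strip(), "UNMAPPED:" + mnemonic)

    print([harmonize(m) for m in ["gam", "ZDEN", "dtco", "SP"]])
    # ['GR', 'RHOB', 'DT', 'UNMAPPED:SP']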

Pointcross is now pitching a second phase of the PoC with enhanced functionality, more data and use cases. These include more taxonomic harmonization, data curation and event extraction from PDF documents, daily reports, trip sheets and more. Phase two will investigate how other causes of nonproductive time can be attributed to lithology, bottom hole assemblies, and crew schedules.

Pointcross is also extending its big data offering, under the Omnia brand, into a ‘total business solution for exploration and production.’ Omnia includes solution sets for geophysical, shale, production asset management, big data and enterprise search. More from Pointcross.


60 petaflop HPC

SEAB study finds that on petaflops alone, at least one oil and gas high performance computing center bests China’s Tianhe-2 supercomputer.

A report from the task force on high performance computing from the US secretary of energy’s advisory board (SEAB) makes interesting reading for oil and gas HPC users. While the fastest computer in the world, the Tianhe-2 at the national supercomputer center in Guangzhou, China, boasts a 34 petaflop performance, according to the SEAB report, some oil and gas market players ‘already run data centers with over 60 petaflops of compute capacity.’

The authors however nuance oil and gas HPC’s prowess, ‘Oil exploration workflows are composed of many high-level data processing stages with little automated integration. Stages are rerun by hand, in an ad hoc fashion, when the output does not satisfy subjective criteria dictated by the experience of the operators.’

SEAB is advocating exaflop class machines that will allow geophysicists to interact dynamically with ‘thousands or millions’ of scenarios and to better integrate other business areas such as reservoir simulation, oil field management and the supply chain.

Plans are in place through the Coral program to deliver 200 petaflop systems with around 10 petabytes of memory in a ‘data centric’ architectural context.


The Cloud. How the IT world is slowing us all down!

It has been a tough month for editor Neil McNaughton. A move to the Office365 cloud was problematical. He is thrown out of a talk at the SEG Denver convention and loses hard-won sports data. But over breakfast in Denver, the penny drops on some egregious cloud marketing spin.

Another book-I-probably-will-never-write will be about why IT has failed to live up to its promise of increasing productivity. I plan to stretch out my March 2014 editorial to a suitable length. A big type face, large margins and widely spaced text should make this doable.

The thesis is simple. If IT is really about ‘productivity,’ then IT should be shrinking itself at a rate approximately equal to the rising clock speeds of Moore’s law. In fact, with no false modesty, I propose that this productivity-induced shrinkage could be termed ‘McNaughton’s law,’ and measured in units of … naughts. For indeed, the shrinkage is not happening and will never happen as long as the IT world stands to benefit from selling more stuff by constantly re-inventing itself with new operating systems, paradigms and what have you.

But first, a word on the differences between a very small company and a larger organization. In other words on the paradox of size. In a large company, everything is hard. Deploying software is hard. Training users is hard. Upgrading stuff is hard. Delivery of anything slows down, dragging users’ expectations along with it such that eventually folks get used to staring at an hour glass as stuff loads from the network. Systems are unresponsive and eventually users get to be happy with the smallest of achievements.

Compare and contrast with the beauty of the small office/home office and a good old fashioned stand-alone PC. Things going too slow? Get a faster machine for a thousand bucks or so. Not enough disk? Add another terabyte for $50. Some new software? As long as it’s Cots* it will be easy and intuitive.

That was until we signed up to the cloud a few weeks back. I am not talking anything fancy here. Not ‘big data’ or even Microsoft Azure, just a couple of licenses to Microsoft Office365. Getting started with the cloud proved quite tricky. Instead of ‘insert the CD and click on OK,’ installing Office365 needs quite a bit of the sort of IT knowledge that only a mis-spent middle age can provide. Ever tried to get your website’s mail server to point to a different location than the http site? It is not hard but I bet that less than one in a thousand users of Office would have a clue what all this is about, let alone navigate the arcane interfaces of the various hosting companies, nameservers and other gatekeepers of the web.

Once things are running (well actually things are not yet really running…) the fun starts. What is the difference between OneDrive and SharePoint? How does one find out? When I use my OneDrive, the navigation bar in my browser tells me I am on a SharePoint site. Help from the ‘community’ is err, unhelpful.

If you are not yet in the Office cloud you might like to know that Office 365 brings you two versions of everything. One to run in the browser and another on your desktop. The online version is clunky and idiosyncratic. Keyboard shortcuts? You may as well forget them. Even cutting and pasting from within the same email brings up a dialog along the lines of ‘are you sure you want to do this?’ Click to open a document or change folders and the cloud lets you know that it is ‘working on it…’ The web-based version appears to have been designed to slow usage down to a crawl. Perhaps that was the point.

~

I have been attending the Society of Exploration Geophysicists’ annual meeting as ‘press’ since I started writing Oil IT Journal/Petroleum Data Manager back in 1996. A key activity for me has always been to attend the technical presentations and to try to distill some of the novelties into my reports from the show. Well this year, while attending one not-very-interesting talk early in the morning, there was a kerfuffle in the room and a couple of SEG functionaries came for me. I was frog-marched out of the room in the middle of the talk and told that, as of 2014, press no longer get access to the talks! Meanwhile my colleagues in Amsterdam were getting the royal reception that the SPE reserves for its press attendees. What can I say? Hrmph!

While in Denver, I was sipping my early morning latté in one of the many excellent coffee shops when an ad for Microsoft’s Azure cloud and Office 365 came on screen. The gist was that the Formula 1 Lotus/Renault racing team uses the cloud to capture and store data from races for analysis.

I was sleepily puzzling over how the sluggish cloud could be of any use in the fast real time world of F1 when suddenly it dawned on me. This is an egregious example of ‘he man dual control’ marketing (see my October 2010 editorial). To understand how this works, put yourself in the position of an Azure marketing manager. The problem is that the cloud slows things down. So how do we counter this? By associating the cloud with something that goes very fast! Add in some spin and misrepresentation and off we go!

To wrap things up, and to explain why this edition is a tad late. As some of you may know I like doing longish runs. My latest was a 75km trail run with 4000m of total climb which I have been trying to complete for a while now. My first two attempts ended with me abandoning after 45km. This year I managed 68km, but just missed the time barrier. I was pretty pleased with this, but when I got back home and tried to upload the data from my various monitors to the cloud, the cloud was down and my upload was ‘pending.’ When Suunto’s whiz kids finally got their cloud back up and running, all that hard won data was a) deleted from my watch and b) not in the cloud either!

* Commercial off-the-shelf software.

Follow @neilmcn

Book review - The SALVO Process

John Woodhouse’s guide, ‘Strategic assets, lifecycle value optimization’ relates how the EU Macro project gave birth to a software toolset and methodology for managing oil and gas mega projects.

The Salvo Process* (TSP) is a 160 page guide to ‘strategic assets, lifecycle value optimization.’ TSP reports findings of the EU-funded Salvo project whose partners hail from London Underground, Scottish Water, Sasol Synfuels and others. TSP’s author John Woodhouse of the eponymous Partnership (TWPL) started his career with Shell and Kvaerner Engineering which means that there is ample material of interest to oil and gas.

If you are wondering what exactly is meant by ‘asset management,’ the ISO definition of ‘the realization of value from an asset’ may not help much. The strategic assets in question include large offshore oil and gas developments, refineries and such. The overarching aim of the process is to help keep plants running without accidents, as long and profitably as possible.

Salvo was a follow-on to an earlier EU project, the Macro (maintenance, cost/risk optimization) project (which involved Halliburton and Norske Shell) which produced notably the Asset performance tools (APT), a software package that was acquired by TWPL in 2008. TSP is largely an introduction to the APT toolset and the Salvo methodology.

Woodhouse is an experienced and opinionated author with a dim view of enterprise asset management (EAM) software and engineering data standards. ‘The whole subject of data, information and knowledge management is a mess for many organizations.’ ‘Some very expensive mistakes are made in the over-ambition and under-delivery of ‘solutions’ to the problem.’ The idea of an accurate, generic data standard is ‘naïve and counterproductive.’

So how are Salvo and the APT different from classic EAM? The answer is that the approach combines hard data collection (where reasonable) with ‘softer’ inputs, obtained by ‘asking the right people the right questions.’ In some cases, ‘structured speculation and expert judgment can compensate for a lack of hard data.’

It turns out that ‘80% of asset management decisions’ can be taken based on information that is already available inside the organization. The APT toolset is a way of gathering and structuring such a combination of hard and soft information into more familiar representations of key performance indicators and return on investment.

There are a few examples of how the software and methodology are used. One shows how Salvo triggered ‘lateral thinking and risk mitigation options’ and allowed an aging control system to outlive its apparent useful life. Elsewhere, major and minor maintenance activity was re-jigged ‘with significant reductions in costs, downtime and risk.’

Notwithstanding the above, TSP would be greatly improved if it provided a decent case history of how the tools have been used on a major project. This would probably have made the book twice as long, but more than twice as useful. Having said that, as it is, TSP is a good read and a valuable contribution to the subject.

* The SALVO Process, ISBN 978-0-9563934-7-0.


Schlumberger VP on factory drilling and the need for integration

Patrick Schorn advocates more oilfield integration to combat ‘uneconomic’ Eagle Ford wells. Another obstacle is the ‘silo’ mentality and narrow, domain-focused workflows.

In his keynote address to the Simmons & Co. European energy conference in Gleneagles, Scotland earlier this year, Schlumberger president of operations and integration Patrick Schorn advocated an integrated approach to field development embracing reservoir characterization, drilling and production. Schlumberger has been involved in ‘integration type’ projects for many years, but is now taking integration to the next level thanks to the key differentiators of technology integration, a matrix organization, integrated services and access to drilling rigs, a long time Schlumberger strength.

Schlumberger’s integrated project management (IPM) business started some 19 years ago and now represents a portfolio of multi-year, multi-rig contracts, some involving challenging work in deeper and tougher wells.

Integrated drilling services has been bolstered with the acquisition of Saxon drilling in March of this year. Saxon’s land drilling rigs make for an ‘excellent integration platform’ that is allowing Schlumberger to evolve the drilling engineering practice from a simple combination of discrete services to optimal systems, customized through extensive design and modeling capabilities.

A good example of this is the factory drilling paradigm of unconventional development such as Texas’ Eagle Ford shale where average production has plateaued, despite increased drilling intensity, longer laterals and more frac stages. A recent PFC study found that overall, 40% of all Eagle Ford wells are uneconomical (and that was with 2013 oil prices!). Schorn also observed that, ‘With less than 10% of today’s unconventional laterals logged, it is hard to optimize completions.’

Schorn also took a tilt at another obstacle to optimizing the completion, the ‘silo approach’ adopted by many customers and the narrow workflows used by domain experts. In most cases, the characterization, drilling and completion data and workflows are too independent of each other, making information sharing hard. Enter Schlumberger’s Petrel-enabled workflow that combines different petrotechnical disciplines into ‘one seamless workflow.’ Other key enablers cited by Schorn included LeanStim for efficient frac operations and factory drilling, now available from nine rigs.

Schorn concluded that the industry needs to work differently in the current ‘range-bound*’ price environment. A step change is needed in performance, with integration as a key enabler. ‘This will require a serious redesign of our current workflows and processes.’

Comment—IPM started 19 years ago, matrix organization began in 1998, drilling a ‘long time strength’ so... what took ’em so long?

* Well it was range-bound in August!


Divestco announces ‘Glass’ seismic interpretation system

New .NET implementation provides autopicking and iPad app for interpreters on the road.

Calgary-based Divestco has released ‘Glass’, a new seismic interpretation system. Glass was written from scratch using Microsoft’s .NET framework. Divestco started work on Glass several years ago, building on its legacy interpretation product, WinPics. Feedback from WinPics’ clients was used as a starting point for the Glass roadmap. But the new tool was re-written ‘from the ground up’ with a new user interface and redesigned workflows.

Glass offers automatic picking and a GIS map providing accurate coordinate conversions and interactive horizon and grid calculators. Glass also offers ‘iPad integration’ for interpreters on the move. We asked Divestco’s Shannon Bjarnason how iPad integration has been realized. ‘We didn’t use .NET for the iPad app. It is definitely an app approved through the iStore. We actually have a patent pending on our full implementation of the architecture for this so I can’t get into the details but we will be able to port this to other mobile devices if client feedback warrants it.’


Knowledge management - the return?

Cecily O'Neill outlines a new approach to KM, citing deployments from ConocoPhillips and Shell.

It’s been a while since we heard the words ‘knowledge management’ (KM) so it was refreshing to read through Cecily O’Neill’s (Velrada) presentation on social collaboration and next generation information management, given at the Quest Smart E&P conference in Perth, Australia last month.

Velrada is an IT consultancy and integrator that has worked on information management on major capital projects in Australia’s mining and energy verticals. O’Neill’s KM poster children include Rio Tinto, Chevron, ConocoPhillips and Shell. These companies have ‘fundamentally changed’ their approach to knowledge management for capital project development from a focus based entirely on engineering terms and tangible inputs, to look at how organizational learning can increase efficiency and reduce risk.

O’Neill presented two ‘real world’ case studies. ConocoPhillips now has some 100 knowledge networks and a ‘business 2.0’ approach with collaborative jams, visioning strategy sessions and ‘Wikithons,’ enabled by Oracle’s ConText search tool. The result was ‘purposeful collaboration’ across global functions, disciplines and networks that helped employees handle situations that do not fit into established processes and structures.

Shell’s solution leverages SharePoint, MatchPoint, TRIM, Project Server and Power Pivot and is integrated with SAP and business process modelling applications.

O’Neill told Oil IT Journal, ‘KM is definitely making a comeback now that we have the technology to enable the capture and management of tacit information and tools to support collaboration, but it is not just about the technology. We also see organizations in our region looking at ways to unlock the productivity benefits and to make it easier to on-board projects and teams. It is a very different approach to previous incarnations of KM and is definitely driven out of competitive demand.’ More from Quest.


Midland Valley teams with Swiss map agency

Move package enhanced for use in large organizations. Used to map cross-border geology.

Midland Valley Exploration (MVE) has teamed with SwissTopo, Switzerland’s Federal Office of Topography, to enhance MVE’s Move structural geology modeling software for use in large organizations. SwissTopo is working on a cross-border project to provide a seamless geological model of the North Alpine Molasse Basin.

MVE is working to link Move with the Geosciences in Space and Time (GST) 3D data store from GiGa InfoSystems. This will enable the sharing of large 3D data-sets across an organization without data duplication. Users will also be able to retrieve data subsets of areas of interest and check-out data for editing before returning edits to the database. The GST link will be available to MVE users early next year.


British Geological Survey rolls-out its Maps Portal

Russell Lawley tells Oil IT Journal how major digitization project was achieved.

The British Geological Survey (BGS) has launched the Maps Portal, a gateway to UK geology. The portal provides free online access to high resolution scans of 6,300 geological paper maps, some dating back to 1832. For those seeking more up to date information, the recent 1:50,000 maps and geochemical and geophysical maps are also online.

BGS’ Russell Lawley told Oil IT Journal how the scanning project was carried out. ‘We used Colortrac wide format scanners to capture the paper maps, mostly to 300 dpi, 24 bit color depth uncompressed Tiff master files. These were batch converted to Jpeg2000 files using the open source Kakadu package at ‘visually lossless’ 10:1 compression. Map metadata is held in an Oracle database with pointers to BGS lexicons of controlled terms. We are now working to enhance search with a master sheet ‘footprint’ for each map that links the scans to our GeoIndex which we hope will allow for the use of e.g. post codes, popular names and classic geology sites.’
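
For those curious about what such a batch conversion looks like, here is a minimal Python sketch driving Kakadu’s kdu_compress command line tool. The paths and the -rate value (2.4 bits per pixel, roughly 10:1 for 24-bit imagery) are illustrative assumptions, not BGS’ actual scripts.

    import subprocess
    from pathlib import Path

    SRC = Path("masters")      # 300 dpi, 24-bit uncompressed Tiff masters
    DST = Path("jp2")
    DST.mkdir(exist_ok=True)

    for tif in sorted(SRC.glob("*.tif")):
        jp2 = DST / (tif.stem + ".jp2")
        # approximately 10:1 'visually lossless' compression
        subprocess.run(["kdu_compress", "-i", str(tif), "-o", str(jp2), "-rate", "2.4"],
                       check=True)
        print("converted", tif.name)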

Paper copies of maps that are still in print will be available for purchase through the BGS online shop or through BGS map resellers. BGS will continue to reprint popular titles and offer a ‘Print-on-Demand’ service for out of print maps. Digital copies of the scanned maps and sections are also available for purchase.


Software, hardware short takes

SST, ICIS, Jereh, Workstrings, EssencePS, Beicip-Franlab, DGI, Exprodat, Praxair, Halliburton, HARC, LR Senergy, Paradigm, Roxar, Sciencesoft, Tecplot.

Silicon Space Technology has successfully tested temperature-hardened components for oil and gas downhole applications capable of operating at 250°C for over 250 hours.

Reed Business Information unit ICIS has released an app for access to its oil pricing intelligence and news service.

China’s Jereh Oilfield Services Group has announced a new coiled tubing unit and the Apollo Turbine Frac Pumper with a claimed 4,500 horsepower peak output.

Superior Energy Services’ Workstrings International unit has launched a free drill pipe specifications app.

EssencePS has announced a ‘shrink-wrap’ version of its EssRisk history matching tool. The new edition is a ‘validated commercial probabilistic forecasting tool’ with community support.

Beicip-Franlab has released PumaFlow 7.0, the latest version of its flagship reservoir simulator, with a new multi-ion salinity option, post processing for hydrocarbon tracking and shale functionality.

Dynamic Graphics has announced CoViz 4D 7.2 with improved time lapse visualization of large seismic volumes and support for Witsml 1.4.

The latest V221 release of Exprodat’s Exploration Analyst petroleum exploration and new ventures package delivers new analysis and interpretation tools for estimating yet-to-find resource volumes, mapping gross depositional environments, and analyzing historical activity.

Praxair has announced ‘DryFrac,’ a waterless frac technology that utilizes liquid CO2. The technique is claimed to reduce the environmental impact of water-based hydraulic fracturing.

Halliburton’s new CoreVault system offers a means of safely retrieving fluid-impregnated rock samples obtained during side wall coring, allowing for accurate measurement of hydrocarbons in place.

The Houston Advanced Research Center (HARC) has launched an interactive hydraulic fracturing ‘virtual site’ designed to share environmentally friendly best practices for fracking. The site was developed by the Epic Software Group with funding from the Ground Water Protection Council.

The latest (V2.2) edition of LR Senergy’s ‘Iris’ offshore project and data management web application has been enhanced with support for subsea pipeline and asset inspection data management. Features include simultaneous streaming of three or more video files, live camera positions on the web map, pipeline profiles and pipeline events.

Paradigm’s Sysdrill 10 brings integration with Peloton’s WellView/Masterview corporate database for drilling and well operations, and includes a new jar placement module based on technology acquired from Cougar Drilling Solutions.

Emerson’s Roxar unit has released Tempest 7.1 with improved uncertainty management and reservoir prediction capabilities. Enable now lets users create ‘ensemble-based’ prediction workflows and better quantify uncertainty in production forecasts.

The latest edition of Sciencesoft’s pre-processing package S3control offers a new view for non-neighbor connections, cumulative S-Curve plots, enhanced LET curve fitting and more.

New features in Tecplot RS 2014 R1 include support for Utchem (a chemical flood reservoir simulator developed at the University of Texas at Austin), a quick RFT grid plot, periodic production rates and binned integration.


PPDM meet hears from Devon/Noah, OGP

How Devon has transformed its data asset. Oil & Gas Producers’ new positioning formats.

Speaking at the PPDM Association’s Oklahoma City data management luncheon this month, Devon Energy’s Matt Harris, with help from Patrick Kelly of Noah Consulting, told how his company has turned its data into a ‘valuable corporate asset.’ The key software tool underpinning Devon’s well data is Peloton’s WellView suite along with the MasterView I/O SDK and Informatica’s data quality toolset (IDQ).

Devon is now a ‘data driven organization’ with an ‘accountable’ data management organization that assures quality, accessible data through standard definitions. Devon has over 200 IDQ rules running alongside audit checks to ‘address data issues before they occur.’ The IDQ rules add business process and data governance to WellView’s application logic. The Peloton suite is managed by a team based in IT helped by a data governance group with stakeholders from the business. Devon’s rules are embedded in a data work stream with most activities executed in a SharePoint workflow that leverages Informatica’s metadata manager and business glossary.
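
Devon’s rules live in Informatica IDQ, but the pattern is easy to illustrate. A minimal Python sketch of a rule-plus-audit check on a well header record, with hypothetical field names:

    RULES = [
        ("api_number",  lambda r: r.get("api_number", "").isdigit() and len(r["api_number"]) == 14),
        ("spud_date",   lambda r: bool(r.get("spud_date"))),
        ("surface_lat", lambda r: -90 <= r.get("surface_lat", 999) <= 90),
    ]

    def audit(record):
        """Return the names of the rules a record fails."""
        return [name for name, check in RULES if not check(record)]

    well = {"api_number": "35017212340000", "spud_date": "", "surface_lat": 35.6}
    print(audit(well))   # ['spud_date'] -- flagged before it becomes a downstream defect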

One example of the technique’s success is Devon’s greenhouse gas reporting which has reduced data defects from around a million in October 2013 to a few thousand by the end of the year. In 2014 serious defects have been virtually eliminated.

~

John Conner introduced the Oil and Gas Producers association’s (OGP) new positioning exchange formats, said to bring significant benefits over legacy positional data management. The OGP ‘P’ formats are direct replacements for the SEG-P1 seismic positional exchange format and the UKOOA P2/91/94 formats for marine acquisition. The most widely used format today is SEG-P1, which has outlived its usefulness. A modern 3D marine survey may record 30 or 40 position types but SEG-P1 only supports one per file. Some, like source, receiver and CDP, are not even identified. Headers are freeform and often unpopulated. Conner warns, ‘Every export to SEG-P1 is a potential loss of knowledge and accuracy.’

The new OGP formats have a robust, common machine-readable header that eliminates hand-entered metadata. They also offer more robust positional integrity on loading into the workstation and less overhead for data managers. Adoption will be a transitional process as application vendors and seismic data processors adapt their software to the new formats. The formats are a free download from OGP1104. More from PPDM.
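
A minimal Python sketch (using the pyproj library, not any OGP reference implementation) of what a machine-readable header buys the loader: the coordinate reference system is declared, so positions can be transformed explicitly rather than guessed from freeform text. The EPSG codes and coordinates are illustrative.

    from pyproj import Transformer

    header_crs = "EPSG:23031"        # e.g. ED50 / UTM zone 31N, as declared in the header
    project_crs = "EPSG:4326"        # WGS84 geographic, as required by the target project

    to_project = Transformer.from_crs(header_crs, project_crs, always_xy=True)
    easting, northing = 452000.0, 6781000.0              # illustrative receiver position
    lon, lat = to_project.transform(easting, northing)
    print("%.5f, %.5f" % (lon, lat))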


ECIM 2014, Haugesund, Norway

Shale ‘collapses the asset lifecycle.’ Shell’s enterprise architecture. Statoil’s BI take on 4D seismic. A professional society for data management? Interica on E&P data usage patterns. Landmark, Schlumberger and CGG’s takes on the E&P cloud. Noah on shale data management.

The 2014 edition of Norway’s ECIM E&P data management conference was held in Haugesund, Norway this month with 310 attendees. Out in the harbour, a large platform was being kitted out. Nothing unusual for this hub of Norway’s offshore industry. Except that the platform, the Dolwin Beta, is a gigawatt capacity electricity transformer and a key component of an offshore wind farm, part of Germany’s ‘energiewende’ green transition. But we digress.

Halliburton’s Joe King described how in North America, shale exploration is collapsing the asset lifecycle. Today, everyone is operating marginal fields with a renewed focus on profitability. While technology from both horizontal and vertical vendors is evolving, many challenges remain unsolved. There has been much talk of big data, but today, much of our work is too fragmented and siloed for the application of big data principles and analytics. While the landscape is evolving with Hadoop and its derivatives, data management remains labor intensive—thanks to a ‘complex, vast panorama of data.’ But there is great potential for exploratory data analytics and serendipitous discovery. Current approaches center on first principles/closed form solutions for, say, optimizing drilling parameters. But new techniques allow us to evaluate drilling data against thousands of proxy models in near real time. Statistical techniques such as principal component and Monte Carlo analysis can be leveraged to tune well spacing to shale sweet spots. Other use cases target drilling performance which King illustrated with an Open Street Map view of activity in the Eagle Ford of Texas. For big data implementers there is the problem of what technology to choose. There is a trade-off between using horizontal technology (read Hadoop) which may be cheap but low in domain savvy. On the other hand, for consultants, ‘every solution is a project.’ King advocates a ‘platform driven’ approach that offers a configurable environment and ‘seeded science.’ Our guess is that he was thinking of Halliburton’s recently acquired Zeta Analytics!
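
King gave no implementation detail, but the Monte Carlo idea is easily sketched in Python. The drainage distribution and economics below are toy assumptions, not Halliburton numbers.

    import numpy as np

    rng = np.random.default_rng(0)
    SECTION_M = 1609.0                                # one mile of lateral spacing to fill
    drainage = rng.normal(120.0, 30.0, 10_000)        # uncertain drainage half-width, m
    well_cost, value_per_m = 6e6, 9e4                 # illustrative $ figures

    def expected_value(spacing_m):
        """Expected value of developing the section at a given well spacing."""
        n_wells = SECTION_M // spacing_m
        drained = np.minimum(spacing_m, 2 * drainage)  # closer spacing wastes capital on overlap
        return n_wells * (drained.mean() * value_per_m - well_cost)

    for s in (150, 200, 250, 300, 400):
        print(s, "m spacing ->", "$%.0fM" % (expected_value(s) / 1e6))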

We have heard before from Johan Stockmann on Shell’s ambitious enterprise architecture! In his iconoclastic style, Stockmann railed against the upstream’s IT status quo and fragmented data landscape that is ‘costly and hard to maintain.’ The EA approach starts with meticulous design of business goals, processes and roles, then you can design IT solutions. ‘Do not listen to vendor sirens,’ many system components for data quality, governance, off-the-shelf software and data management procedures all need work. But, ‘you cannot manage your way out of a problem that has its roots in poor design.’ The architecture is the key. It is persistent, valuable and hardest to remediate if done wrong. While E&P data is different, upstream IT is similar to other industries and the same principles should be followed. What are these? Agree on goals, define (data) success and prepare for the journey up the ‘maturity staircase.’ Shell uses an ‘opportunity driven’ approach to its architecture. New business needs such as handling shale plays will kick off an update to the architecture—to ensure that the new data maps to the enterprise model. The aim is for seamless interoperability of IT systems across facility design to lab and reservoir. ‘Agile’ also ran, as in ‘smart data’ and in-memory processing.

We have also reported previously on tests of Teradata’s data appliance but it was good to hear Statoil’s Mark Thompson provide more on the use of business analytics on 4D seismic data. Statoil has been conducting seismics on the Gullfaks field and now records new surveys every six months from an installed 500km cable array. Data comes in faster than Statoil can process it—as do the ad-hoc requests from users for near, mid and far trace data sets with various time shifts applied, attribute computations and so on. Managing the hundreds of data volumes created is a major data pain point.

Enter the subsurface data warehouse—a home for all the 4D seismic data along with reservoir data, production data and well logs. All stored on a massively parallel processing system from Teradata. Data is staged according to use with ‘hot’ data on flash storage and cold on hard drives. Data movement is automated depending on usage patterns and data access is through SQL. Thompson got a good laugh for his claim that ‘geophysicists don’t know SQL, they know Fortran,’ although it’s not clear why. Work processes are stored as user defined functions allowing access from standard apps. 4D tools-of-the-trade, such as repeatability analytics, time shifts and correlations (of seismic deltas with pressure changes in the reservoir) have been coded in SQL and stored in the RDBMS. Thompson showed a few lines of, it has to be said, rather ugly SQL, that now run automatically. A Statoil committee is now working on the wider application of business intelligence pending a firm commitment to the technology. For Thompson, the foot dragging is a lost opportunity. ‘Why are we ignoring this? I really don’t know. We need to embrace the 21st century.’
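
Thompson’s examples were coded in SQL inside the warehouse. Purely as an illustration of the kind of 4D computation involved, here is a minimal numpy sketch correlating the amplitude change between two surveys with the pressure change in the reservoir, on toy data.

    import numpy as np

    rng = np.random.default_rng(1)
    base, monitor = rng.random(50), rng.random(50)     # amplitudes at 50 map locations
    dp = rng.random(50) * 20                           # pressure change, bar (toy data)

    damp = monitor - base                              # 4D seismic delta
    r = np.corrcoef(damp, dp)[0, 1]                    # Pearson correlation
    print("correlation of 4D amplitude change with pressure change: %.2f" % r)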

Lars Gåseby, speaking on behalf of Shell and ECIM floated the idea of a single professional data management society spanning ECIM, PPDM and CDA. The ambitious goal is for a professional society of oil and gas data managers along the lines of the SEG, AAPG or the SPE. A memorandum of understanding has been signed between the three organizations which are to fine tune their proposals for organization, governance and membership terms by December 2014. In Q1 2015, the societies will be seeking stakeholder agreement and, if this is forthcoming, the society will come into being. Gåseby is enthusiastic that this will happen.

Interica’s Simon Kendall provided an unusually detailed account of data usage patterns he has observed at a variety of client sites. Interica started out as PARS whose eponymous hardware and software bundle began life in 1995, archiving Seisworks data to D2 video tape. Today, the hardware has been dropped leaving the PARS/Project Resource Manager (PRM) archiving data from some 30 applications to LTO6 cartridges with a 2.5 TB native capacity. Kendall reports explosive data volume growth in the upstream. One client has 100PB of disk in its seismic processing center. Seg-Y files of post stack data easily exceed 50GB. And then there is unstructured and duplicate data. G&G makes up 5% of an E&P company but manages 80% of its data. A short PRM sales pitch ensued. The tool is used by 7 majors to track disk usage by application, over time and by user. This has allowed for some statistics—although Kendall warns against drawing hasty conclusions from this likely biased sample. A ‘typical’ large company may have around 100 file systems, 10s of PB of storage, about 30 main applications, 10-30 thousand (sic) projects and 10s of millions of files. Sometimes the oldest ‘live’ projects have not been touched in a decade—they are still on spinning disks. Oh, and archive volumes are increasing at an even faster rate!

The ‘average’ medium sized company likely has a mix of Linux and Windows applications and around 650 TB of storage (this figure has doubled in the last 5 years), 10 main apps, 8 thousand projects and perhaps 30TB of data that has not been touched in 4 years. A small company will likely have mostly Windows applications and maybe a million files, 15 TB of data, 100 thousand directories and 40-50% of data as duplicates. Kendall has seen popular SEG Y files duplicated up to 18 times in one shop. Around half of all apps (in this sample) are Schlumberger’s Petrel and make up one third of the data volume. Kendall puts the value of the interpretation IP effort for his top ten clients at $2.8bn.
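
How do such duplicate figures get estimated? Not with the little script below, but a minimal Python sketch conveys the principle: hash file contents and group identical files, for example the same SEG Y copied across many project directories. The root path is a placeholder.

    import hashlib
    from collections import defaultdict
    from pathlib import Path

    def file_hash(path, chunk=1 << 20):
        """MD5 of file contents, read in 1MB chunks (fine for dedup, not security)."""
        h = hashlib.md5()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    def find_duplicates(root):
        by_hash = defaultdict(list)
        for f in Path(root).rglob("*"):
            if f.is_file():
                by_hash[file_hash(f)].append(f)
        return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

    for h, paths in find_duplicates("/data/projects").items():   # illustrative path
        print(len(paths), "copies of", paths[0].name)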

In the Q&A Kendall opined that while keeping data on a local workstation may be deprecated, users still do it. In some companies, it would be highly desirable just to be able to generate a list of what data the company possesses. Smaller companies are often reliant on ‘tribal knowledge’ about what is there and where and there is a general lack of knowledge of what has already been done. The move to PC based tools is not making things easier. It is too easy to populate, copy and move stuff, actions that are hard for data managers to combat. ‘There is much more data sitting on C drives than we ever want to admit.’

Satyam Priyadarshy, chief data scientist with Halliburton/Landmark likes to distinguish between business intelligence, answering questions you are already asking like ‘what is selling?’ and big data/analytics (BDA) which is ‘mystery solving with data science.’ The latter is what is needed for innovation. BDA uses higher math and science to explore new patterns turning the raw data into the single source of the truth, working on data in situ. We were now back in Zeta Analytics territory. BDA will be highly applicable to shale exploration. Priyadarshy slid into acronymic mode enumerating Hadoop, Hive, Postgres, Tableau, Ploticus and more. An ‘in memory analytic server’ is claimed to cut seismic imaging time from 48 to 3 hrs. The key enablers of BDA are not commercial software (maybe we are not in Zetaland!).

An open source environment frees you up for BDA and lets you mix & match algorithms from different sources. You need a modular platform and a large number of ‘TAPS,’ technology algorithms products and solutions. Big data projects, like others, fail from a lack of leadership and from poor advisors/consultants who ‘sell technology not solutions.’ BDA and the TAPS explosion are the next industrial revolution. In the Q&A we asked Priyadarshy how you could set up a BDA team without knowing beforehand what problems you were going to solve and what ROI could be expected. He suggested ‘working under the radar using open source software.’ BDA may be an embarrassing topic for the C suite.

Not to be outdone, Schlumberger’s Stephen Warner also presented on the big data-cloud-analytics spectrum that is ‘redefining the information management landscape.’ Along the hydrocarbon ‘pathway,’ it is information management that drives decision making. Only a decade or so ago, few reservoirs were modeled and data managers had an easy life. Today we have terabytes of data and exotic new subsalt and unconventional plays. Analytics has application in drilling and targeting shale sweet spots. ‘Shale is a data play.’ Shale delivery is difficult to model and simulate—so derive your KPIs from the data. Schlumberger has partnered with big data experts on a proof of concept leveraging SAP’s Hana in-memory compute appliance, with Microsoft in real time analytics in the Azure cloud. Scaling up for big data requires new capabilities. Data managers need to be at the forefront of these changes to data driven workflows. There has never been a more exciting time in data management.

Big data enthusiasm is all very well but there is nothing that beats an actual test drive. This was provided by CGG’s Kerry Blinston who presented results of a trial on CGG’s UK upstream data set which has been ported to a Teradata (yes, them again) appliance. Teradata’s Aster technology was used to investigate a ‘hard data set, not a top level management story.’ The trial investigated occurrences of stuck pipe during North Sea drilling, a major cause of non-productive time. Along with conventional drilling parameters such as weight on bit and rate of penetration, other data sets of interest were added to the mix: a 1TB data set of some 200 thousand files. Hidden in text log files were three ‘tool stuck’ occurrences which were not visible in the completion log. Many such data connections are not routinely made just because there is too much data. Blinston offered a new definition for analytics as ‘finding what we didn’t know we had but didn’t know we didn’t know we had it!’ The 300 well pilot found ‘unexpected correlations and spatial distributions’ that were good enough for prediction and improved drilling models.
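
Without pretending to reproduce the Aster analysis, a minimal Python sketch shows the text-mining step that surfaced the hidden events: scan free-text daily report lines for stuck-pipe language that never made it into the structured completion log. The keywords and sample report are illustrative.

    import re

    STUCK = re.compile(r"stuck pipe|tool stuck|unable to rotate|overpull", re.I)

    def flag_stuck_events(report_lines):
        return [(i, line.strip()) for i, line in enumerate(report_lines, 1)
                if STUCK.search(line)]

    report = [
        "06:00 drilling ahead 12-1/4in hole, ROP 25 m/hr",
        "09:30 worked pipe, 40klbs overpull, tool stuck at 3,450 m",
        "11:00 circulated hi-vis pill, pipe free",
    ]
    print(flag_stuck_events(report))   # flags line 2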

Shale exploitation goes back a long way. Noah Consulting’s Fred Kunzinger cited a 1694 British patent to ‘extract and make great quantities of pitch, tarr, and oyle out of a sort of stone.’ Today shale is changing the balance of power in the energy market. The sheer speed of shale development is creating new problems with the management of the vast amount of data and information that the new processes depend on. There is a need to rethink ‘conventional’ data integration and here there is a renewed focus on master data management, the key to bringing all departments and disciplines together. Factory drilling requires just in-time data management, analytics and continuous improvement along the ‘new assembly line.’ A business transformation is needed to improve shale margins and transform portfolios.

~

Data management is a rather dry subject and our account fails completely to capture the magical couple of days spent in sunny Haugesund. We did our now traditional trail run in the Steinsfjellet park. The Oil IT Journal/ECIM runners club now has two members! Another treat this year was the Haugesund marching band which entertained the data managers at Tuesday’s gala dinner. Watch them on YouTube and visit ECIM.


Folks, facts, orgs ...

Strategy&, Structural Integrity Associates, Oiltanking Partners, Enable Midstream, Blue Ocean Brokerage, Cameron, Baker Hughes, CGG, Chevron, CoreLab, Computer Sciences Corp., EnergyNet, OGC, OFS Portal, Tiandi, Jones Energy, Sharecat, BG Group, Statoil, American Enterprise Institute.

Strategy& (formerly Booz & Co.) is launching an R&D ‘talent mobility survey’ to assess the attractiveness of various locations to host a world-class R&D center for the oil and gas industry.

Matt Freeman has joined Structural Integrity Associates as director, strategic business development. He was previously with Intertek.

Laurie Argo has been appointed president and CEO of Oiltanking Partners. She hails from Enterprise Products Partners. Donna Hymel has been named VP and CFO.

Deanna Farmer has been named executive VP and chief administrative officer of Enable Midstream Partners. She comes from Access Midstream.

Alan Germain has joined Blue Ocean Brokerage in a business development role. He hails from Hess.

Scott Rowe is now president and COO with Cameron. Doug Meikle succeeds Jim Wright as president of valves and measurement.

Kimberly Ross has been appointed senior VP and CFO of Baker Hughes.

Gregory Paleolog heads-up CGG’s new multi-physics business line.

Chevron has named Joe Naylor VP strategic planning. Wes Lohec is VP HSE.

Chris Hill is VP and chief accounting officer with CoreLab.

Diane Wilfong is VP, controller and principal accounting officer at Computer Sciences Corp. She hails from Caesars Entertainment.

Chris Atherton is now president of EnergyNet and Mike Baker and Ethan House are both VPs business development.

Ingo Simonis is now director, interoperability programs and science at the OGC.

California Resources Corporation has joined OFS Portal’s buyer community.

Tiandi Energy announces the addition of Rob Condina, Justin Laird, Kevin Rohr, and Pete Rullman as business development directors. Condina and Laird hail from Honeywell.

Jeff Tanner has joined Jones Energy as senior VP geosciences. He was previously with Southwestern Energy.

Jon Gjerde has resumed his position as CEO of Sharecat Solutions.

President and CEO Helge Lund has left Statoil to take up the CEO role at BG Group. Eldar Sætre is acting president and CEO.

The American Enterprise Institute has released a new movie, ‘Breaking Free: The Shale Rock Revolution’, covering the American energy ‘renaissance.’ Directed by Robin Bossert, Breaking Free seeks to ‘bridge the information gap between public perception and an industry that fuels our daily lives, our national economy, and our future.’


Done deals

Siemens, Dresser-Rand, Vepica, Edoxx, National Automation Services, Schneider Electric, InStep, Teradata, Think Big Analytics, Vista Equity Partners, Tibco, Swets, Enterprise Products.

Siemens is to acquire rotating equipment parts and service provider Dresser-Rand for approximately $7.6 billion in an all cash transaction. The $83 per share price represents a 37% premium. Siemens will operate Dresser-Rand as its oil and gas business line, retaining the executive team. Houston will be the Siemens’ oil and gas HQ.

Privately held multinational engineer Vepica Group has acquired ‘ownership interests’ in Houston-based 3D modeling, reverse engineering and laser data collection boutique Edoxx Technical Services. Edoxx provides conceptual and detailed engineering for onshore and offshore oil and gas facilities.

Following its acquisition of JD Field Services earlier this year, Las Vegas-based National Automation Services is in the process of acquiring Mon-Dak Tank and Devoe Contracting.

Schneider Electric is to acquire InStep Software, a provider of real-time performance management and predictive asset analytics software and solutions. InStep’s eDNA data historian and PRiSM predictive analytics software use AI-based techniques to monitor the health and performance of critical assets in real time.

Teradata has acquired Think Big Analytics, adding an open source, big data capability to its technology platform. Think Big’s expertise covers Hadoop, NoSQL databases including HBase, Cassandra and MongoDB, and Storm for real-time event processing.

Vista Equity Partners is to acquire Tibco in a $4.3 billion cash deal. The $24.00 per share price represents a 26.3% premium and is subject to approval by Tibco stockholders and regulatory approvals.

Swets Information Services B.V. was declared bankrupt last month and has entered a two month ‘cooling off’ period. Swets’ 110 employees in the Netherlands have been terminated. According to the release, the bankruptcy ‘does (for now) not affect its (foreign) subsidiaries.’

Enterprise Products has acquired the interests in its Oiltanking Partners unit from Oiltanking Holding Americas in an approx. $4.4 billion paper and cash transaction.


Siemens’ virtual data ‘explorers’ for wells, fracs and CBM

Partha Ray describes flagship deployments of Simatic IT XHQ at Saudi Aramco and shale drilling.

Speaking at the Quest Smart E&P conference in Perth, Australia last month, Partha Ray provided a whistle stop tour of Siemens’ flagship oil and gas deployments. Siemens Simatic IT XHQ underpins Saudi Aramco’s enterprise monitoring solution deployed on the giant Ghawar field. Here XHQ dashboards link real time data feeds from flow meters, tank levels and pump rates with financial information. Events likely to affect production are flagged with their implied monetary cost and XHQ allows drill-down to equipment tags from the data historian.

Siemens’ work in industrial processes has lent itself to the factory drilling paradigm of the shale industry where maintenance-repair-operations technology has been repurposed to ‘track the frac.’

Siemens’ claim is that data from a vast array of petrotechnical and financial sources (ERP, Petrel, Oracle, LIMS, Aspen, PI and more) can be blended into a single virtual data source, ‘isolating’ end users from the plethoric back ends.

The offering is served as a variety of ‘explorers,’ for wells, hydraulic fracs, coal seam development and even employee and vendor payables. Raw and derived data (KPIs) pass through the XHQ data services layer for subsequent analysis, simulation, discovery and other applications. One such target application is the ‘virtual well test’ that leverages machine learning, comparing real time tubing head pressure against historical well tests in Statistics & Control’s OptiRamp simulator. More from Quest.
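
Siemens did not publish the model behind the virtual well test, but the idea can be sketched in a few lines of Python: fit historical well test points and estimate the current rate from live tubing head pressure. The numbers and the linear proxy are toy assumptions, not the OptiRamp implementation.

    import numpy as np

    # historical well tests: tubing head pressure (bar) vs measured rate (m3/d)
    thp_hist = np.array([95.0, 88.0, 80.0, 72.0, 65.0])
    rate_hist = np.array([310.0, 380.0, 455.0, 530.0, 600.0])

    slope, intercept = np.polyfit(thp_hist, rate_hist, 1)   # simple linear proxy

    live_thp = 76.5                                          # from the real-time feed
    print("estimated rate: %.0f m3/d" % (slope * live_thp + intercept))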


Honeywell Digital Suites for oil and gas

Streaming data can hide problems from operators. Enter Honeywell’s safety-centric digital oilfield.

Honeywell Process Solutions has announced Digital Suites for oil and gas (DSOG), a digital oilfield offering that adds a safety focus to oilfield data management. Honeywell observes that the ‘unprecedented layers’ of data streaming from multiple sources in the field can hide what is happening from the operators’ view, particularly in the face of ‘unexpected events.’

DSOG comes as either a ‘fully integrated’ bundle or as separate components. The six suites are: operational data (including a secure data archive), process safety and reliable operations, equipment effectiveness (minimizing costs and maximizing uptime), production surveillance and asset health, production ‘excellence’ (planning and optimization) and operational performance improvement.

Honeywell’s Ali Raza observed, ‘Clients tell us they have access to more real-time data than ever, but that alone is not enough to improve performance—they need digital intelligence to make sense of all the data being collected. These new tools will improve productivity, uptime and remote operations with a return on investment in as little as six months.’

DSOG can be ‘easily integrated’ into any multi-vendor environment. The ‘field-proven’ technologies and new software and implementation methodologies are delivered from Honeywell’s newly established oil and gas center of excellence.


Hess Selects Virtustream for SAP HANA Cloud

Hess has migrated its SAP environment to a hybrid in-house/off premises cloud.

Hess Corporation has migrated its SAP Hana test, development and quality assurance environments to Washington-headquartered Virtustream’s xStream cloud. Hess was facing problems managing and maintaining its on-site Hana deployment and wanted to shift its expenditure from Capex to Opex.

Virtustream also assisted Hess with the deployment of its production SAP environment to its on-premise data center and is providing maintenance and management support for both the on-site production and cloud-based test environments. Hess is also using Virtustream cloud storage for backup and disaster recovery purposes.

xStream leverages Virtustream’s µVM resource management, a patented cloud resource manager offering standardized, fine-grained bundles of compute resources to provide accurate provisioning and measurement of resources in the cloud. Virtustream unveiled its SAP Hana ECC cloud last year. This is now available as a software subscription option through an arrangement with SAP America.


Quality before data automation

Troika data guru says clean up your data act!

Keith Capelin’s paper states that managing terabyte datasets requires adherence to standard formats. Capelin provides advice on physical media storage and on transcribing legacy data, a fraught and complex process where it is easy to degrade data. Automating seismic data management is problematical.

But there is hope. In the future, seismic data will benefit from more automation provided data quality can be preserved. To achieve this, Capelin advises making a small investment of time and money in the early stages of the data lifecycle, possibly leveraging some of Troika’s specialist tools and services.


Sales, deployments, partnerships ...

OSIsoft, P97 Networks, ZipLine, Intertek, Intellicheck Mobilisa, Fluor, Rock Solid Images, Hitachi, SK Engineering, Wood Group, Aveva, Capgemini, Schlumberger, Geospace, GSE Systems, Meridium, Vesta Partners, Sharecat, Tendeka, WellDog, Weatherford, Yokogawa, Dell, FMC Technologies.

OSIsoft continues with its enterprise agreement (EA) with Shell for the supply of operational intelligence infrastructure. The EA covers Shell’s process efficiency, quality improvement, asset health, energy management, safety and regulatory compliance. An EA licensee also gets priority access to OSIsoft’s technology transfer services.

P97 Networks (developer of PetroZone) and ZipLine have formed a strategic partnership to provide ‘bi-lateral certification’ of interfaces and collaboration on mobile e-commerce deployments to oil company and retail merchant customers. The partnership leverages the ‘Microsoft retail fuels and marketing reference architecture.’ P97 also announces that Sinclair Oil Co. is to deploy the PetroZone mobile commerce platform across its 1,400 US gas stations.

Intertek is providing testing, inspection, and measuring services to support the new Cravo, Lirio, Orquidea and Violeta fields’ FPSO located in deep-water Block-17 off northern Angola. Intertek also provided the flow measurement services for the 36” crude oil fiscal metering system.

Intellicheck Mobilisa has sold its IM2610 Transportation Worker Identification Credential (TWIC) Plus readers to a ‘leading US refinery’ located in California. The web-hosted system provides TWIC card authentication and a driver’s license reading capability.

Fluor Corp. has received a $1.3 billion engineering, procurement, fabrication and construction contract from Fort Hills Energy for the utilities scope of the Fort Hills oil sands mining project. The project is located about 90 kilometers north of Fort McMurray in Alberta, Canada. Fluor has booked the contract for 2014.

In collaboration with OMV Norge, Rock Solid Images has developed tailored workflows for lithology and fluid prediction, integrating seismic, well and controlled source electro-magnetic data with geology. The techniques were used in pre-drill analyses in the Barents Sea where the OMV-operated Hanssen well’s fluid type was ‘accurately predicted.’

Hitachi Solutions Canada has implemented its Microsoft Dynamics AX-based ERP solution at Strike Energy Services of Calgary. The project included Hitachi’s ‘Build//AX’ solution for the EPC/construction sector.

SK Engineering & Construction has implemented Intergraph’s SmartPlant Enterprise solution to optimize its engineering, procurement and construction (EPC) process systems for project control and execution.

Wood Group PSN has been awarded a contract by Woodside for engineering, procurement and construction management (EPCM) services on the Karratha Gas Plant Life Extension Program, Western Australia. Scope of work includes modification and refurbishment of all of Woodside’s onshore and offshore assets.

Aveva and Capgemini have signed a global alliance agreement for the joint provision of services to asset-intensive industries including oil and gas. The alliance targets increased efficiency in construction, revamp and modification projects. For new-build projects, Aveva’s integrated engineering and design solution promises rapid project setup, clash-free 3D design and real-time collaboration across different locations and disciplines.

Enhanced Oil Resources has signed a letter of intent with Schlumberger whereby the latter will conduct an in-depth technical evaluation of the potential redevelopment of the Milnesand and Chaveroo oil fields, New Mexico. The study is to be performed at Schlumberger’s expense and, if successful, will likely result in a ‘comprehensive services agreement to be negotiated by both parties.’

Geospace is to rent out 4,000 stations of its cableless OBX ocean bottom seismic acquisition system to a ‘major international seismic contractor’ for a 180 day contract expected to launch in January 2015.

GSE Systems reports Q4 2014 oil and gas industry contract awards totaling over $3.0 million. These include the supply of the EnVision tutorials and simulators to a new oil and gas training center in Kazakhstan, an engineering and operations company in Nigeria and a major oil refiner in Belgium.

Meridium has expanded its alliance with SAP service provider Vesta Partners to help customers extend their SAP deployments with Meridium’s asset performance management solutions.

Statoil has extended its corporate information management contract with Sharecat, for one more year, until the end of August 2015. Sharecat will provide Statoil with SAP Material Codification, Material Maintenance, Data Cleansing and other work with Material Master. This includes services across multiple and concurrent projects to collect and classify contractor equipment and parts information, ensuring quality of information required to optimize plant operations and maintenance.

Tendeka has entered an ‘exclusive strategic agreement’ with Beijing Wallgate Information Technology to deliver digital oilfield capabilities to key operators in China. The agreement will see Tendeka’s leading edge intelligent well technology installed in multiple wells over a minimum period of two years.

WellDog Pty. announces that it has been working with Shell over the past eighteen months to develop a new ‘technical service’ for locating natural gas and natural gas liquids in shale formations. Shell is now leading beta trials of the technique.

Weatherford International has licensed the Ikon Science RokDoc RT suite for use by its global rig-based real-time pore pressure specialists. RokDoc RT helps recalibrate the well plan to the actual formation characteristics and boundaries that have been encountered while drilling.

Yokogawa has signed a contract with Dell for the global provision of PC models that are selected and customized for Yokogawa use. Yokogawa will offer high end systems that combine Dell PCs and Yokogawa industrial automation system-related products.

FMC Technologies has signed a long term frame agreement with Wintershall Norge for the supply of subsea production systems for its developments offshore Norway. A $280 million call-off under the agreement, covering subsea equipment for the Maria field, has been awarded.


Fiatech JORD issues final report

Engineering reference data endpoint operational. Great cost and effort needed for sustainability.

The Fiatech standards organization has just published the final report from JORD, its ‘joint operational reference data’ project. JORD, a joint effort between Fiatech and Norway’s POSC/Caesar Association (PCA), began in 2011 with the intent of delivering a ‘scalable and sustainable’ reference data service (RDS) within five years. The report heralds the end of JORD Phase 2, which was completed in August 2014. An optional Phase 3 is planned to start later this year.

JORD’s focus is the software tools that underpin major capital projects in engineering and construction. The idea is to provide developers and end users with quality-assured reference data using the ISO 15926 standard for information management in the process industries, in particular in its Semantic Web technology flavor.

The 15-page report covers compliance with ISO 15926, development tools, training and the operation of V2.0 of the data endpoint. This exposes ISO 15926-compliant data as a Sparql endpoint, using Apache Fuseki to serve semantic RDF data over HTTP. The endpoint is now operational and serving data, but its future development and maintenance will require substantial effort and resources.
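
The report does not publish the endpoint’s address or schema details, but for readers unfamiliar with the mechanics, the following minimal Python sketch shows how a Fuseki-served Sparql endpoint is typically queried over HTTP using the standard Sparql protocol. The URL and query here are generic placeholders, not the actual JORD service or its reference data classes.

```python
# Minimal sketch of querying a Sparql endpoint (such as one served by
# Apache Fuseki) over HTTP, standard library only. The endpoint URL and
# the query are hypothetical placeholders, not the actual JORD service.
import json
import urllib.parse
import urllib.request

ENDPOINT = "http://example.org/jord/sparql"  # placeholder endpoint URL

QUERY = """
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?item ?label WHERE {
    ?item rdfs:label ?label .
} LIMIT 10
"""

def run_query(endpoint: str, query: str) -> dict:
    """GET request per the Sparql 1.1 protocol, JSON results format."""
    url = endpoint + "?" + urllib.parse.urlencode({"query": query})
    req = urllib.request.Request(
        url, headers={"Accept": "application/sparql-results+json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    results = run_query(ENDPOINT, QUERY)
    for binding in results["results"]["bindings"]:
        print(binding["item"]["value"], "->", binding["label"]["value"])
```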

The authors observe that ‘stakeholders are diverse and their needs conflict,’ that ‘very large amounts of reference data are required [and] the system will always be in development,’ and that ‘the cost of quality reference data […] is significant and not understood by all stakeholders.’ The report acknowledges contributions from DNV-GL and from TopQuadrant, whose Composer was chosen as the RDL Expert Manager Tool. Overall JORD project costs come to around $1.3 million for the first three years.


Detechtion compressor diagnostics pinpoint valve damage

Enalysis toolset transforms routine Scada data into actionable information.

Speaking at the GMRC Gas Machinery Conference in Nashville, Tennessee this month, Zachary Bennett described how Detechtion’s Enalysis Scada data visualization and analytics package is used to perform online compressor diagnostics, transforming routine Scada data into ‘valuable and functional’ information. Enalysis leverages complex data sets captured by Scada and other systems to identify maintenance and performance enhancement opportunities. Users can spot issues such as cylinder wear, leaking valves, high engine load or imminent failure and plan timely remedial action.

For Bennett, ‘Management of compressor fleets is the heartbeat of profitable natural gas operations.’ While Scada data collection is now widespread, the large data volumes and complex dynamics of natural gas production mean that early detection of potential issues cannot be realized by observing changing temperatures and pressures in a spreadsheet.

Bennett’s case study involved a compressor that appeared to be running fine, with zero alarms. But diagnostics of tensile rod loads on one compressor stage showed these were exceeding 90% of the manufacturer’s maximum rating and trending upward. Further drill-down pinpointed the root cause, deteriorating cylinder efficiency. Following inspection, the problem was identified as a damaged ‘witch’s hat’ filter (well, it was Halloween!). The unit was replaced and catastrophic failure was avoided.
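
Detechtion has not published its algorithms, but the kind of screening Bennett describes, flagging a channel that exceeds a fraction of its rated maximum and is still trending upward, can be sketched in a few lines of Python. The tag, rating and thresholds below are invented for illustration and are not Enalysis internals.

```python
# Illustrative sketch only: flag a compressor rod-load channel that exceeds
# a fraction of the manufacturer's rating and is trending upward.
# The rating and thresholds are assumptions made for this example.
import numpy as np

RATED_MAX_KLBF = 50.0     # hypothetical manufacturer rod-load rating
LOAD_THRESHOLD = 0.90     # alert above 90% of rating, as in the case study
TREND_THRESHOLD = 0.0     # a positive slope means the load is still rising

def screen_rod_load(hourly_loads_klbf):
    """Return a screening result for a window of hourly rod-load readings."""
    loads = np.asarray(hourly_loads_klbf, dtype=float)
    utilisation = loads / RATED_MAX_KLBF
    # Least-squares slope over the window as a crude trend estimate.
    slope = np.polyfit(np.arange(len(loads)), utilisation, 1)[0]
    breach = utilisation[-1] > LOAD_THRESHOLD and slope > TREND_THRESHOLD
    return {
        "latest_pct_of_rating": round(100 * utilisation[-1], 1),
        "trend_per_sample": slope,
        "alert": breach,
    }

# Example: a load creeping past 90% of rating triggers the alert.
print(screen_rod_load([43.0, 44.1, 44.8, 45.3, 45.9]))
```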


Back to school

IFP School’s ‘sustainable mobility’ MOOC. IWCF/Oilennium develop well control eLearning.

IFP School is launching a massive open online course next month. The free, four week English language course titled ‘Sustainable mobility: technical and environmental challenges for the automotive sector’ covers engine design and related environmental and societal issues. Teaching material includes videos, interactive quizzes, forums and ‘serious games.’

The International Well Control Forum has commissioned an eLearning course on well control from Petrofac’s Oilennium unit. The course is a component of IWCF’s campaign to increase understanding of what triggers a well control incident and is based on recommendations made by the OGP in the wake of the Macondo tragedy.


IFS announces employee rotation and Azure cloud option

Work with Odfjell Drilling sees new IFS Applications module for offshore workers.

IFS, along with key clients Odfjell Drilling and Archer, has launched an employee rotation solution for oil and gas. Employee rotation is a complex task that has typically been handled with ad hoc or third-party solutions. IFS’ staffing and rotation planning module, a component of IFS Applications, enables day-to-day planning and compliance with industry standards. IFS also recently announced a Microsoft Azure cloud-based version of IFS Applications.

The solution can be delivered either as a self-service infrastructure-as-a-service edition or as a fully hosted and managed solution. IFS CMO Mark Boulton said, ‘Azure is well aligned with our clients’ requirement for a trusted platform and a quick set-up.’ The solution was validated in trials with IFS client Ebara (a precision instruments manufacturing company), which has ‘made a company-wide decision to put key IT systems into the cloud.’ CIO Frank Lowery described Ebara as ‘comfortable with the cloud environment, having recently moved to Office365.’ Your mileage may vary!


Total’s fiber optic pipeline intrusion detection

OptaSense fiber acoustic detection protects ‘very dangerous’ Yemen pipeline.

An article in the latest edition of Total’s Techno Hub magazine describes a new pipeline intrusion detection system (PIDS) that Total has deployed in what is described as a ‘very dangerous’ 320 km section of the pipeline that supplies the Balhaf Liquefied Natural Gas (LNG) terminal, Yemen. The line was experiencing frequent sabotage necessitating enhanced security measures including passive monitoring with a fiber optic cable.

The system leverages technology from Qinetiq unit OptaSense that uses back-scattered pulses of infrared light to ‘listen’ to all activity within a few meters of the pipe. Acoustic signals are decoded in real time and an alert is raised in the control room when any excavation or other activity is detected.

The PIDS system piggy-backs onto the existing fiber optic Scada communications cable—there is no need for a dedicated cable. The system pinpoints activity along the pipeline with a 10m positional accuracy. Qinetiq’s OptaSense technology was also deployed on the BP Baku Tbilisi Ceyhan pipeline along with Future Fibre Technologies’ ‘Secure Fence’ perimeter protection (OilITJ Oct. 2010). Total LNG is also an FFT client. The system was installed following a peak of sabotage activity in 2012. In 2013, following installation, operations were uninterrupted.
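
To give a rough sense of how back-scatter timing translates into position along the fiber, the sketch below works through the basic time-of-flight arithmetic. The refractive index and timings are generic textbook values for single-mode fiber, not OptaSense specifics.

```python
# Back-of-envelope distributed acoustic sensing (DAS) position estimate:
# the round-trip delay of a back-scattered pulse locates a disturbance
# along the fiber. Generic physics only, not OptaSense internals.
C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
N_FIBER = 1.468            # typical refractive index of single-mode fiber

def distance_along_fiber(round_trip_delay_s: float) -> float:
    """Distance (m) from the interrogator to the back-scatter source."""
    return (C_VACUUM / N_FIBER) * round_trip_delay_s / 2.0

# A pulse returning after ~1 millisecond has travelled out and back
# along roughly 100 km of fiber:
print(f"{distance_along_fiber(1.0e-3) / 1000:.1f} km")

# Resolving events to ~10 m implies timing the return to within ~0.1 µs:
print(f"{distance_along_fiber(98e-9):.1f} m")
```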


SpectrumData’s new data management center

Western Australian data center deploys Prevenex hypoxic fire prevention.

Perth, Australia-based data management and storage specialist SpectrumData recently completed construction of its new Australian data management center (ADMC) in Leederville, Western Australia. The AU$1 million facility houses physical tapes and other media types for long term retention and management.

The new unit includes an active fire prevention system using technology from Prevenex of Louisville, Colorado. Prevenex uses hypoxic technology to reduce oxygen levels below the 16% threshold at which a fire can start. This relatively small reduction in oxygen is claimed to make fire impossible while remaining safe for workers. Being in a Prevenex environment is comparable to being at an altitude of 7,000 feet (2,100 meters). In fact, Prevenex’ sister company Colorado Altitude Training offers low-oxygen environments for athletes and others to train in.
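
As a back-of-envelope check on the altitude comparison (our own arithmetic, not Prevenex figures), 16% oxygen at sea-level pressure gives roughly the same oxygen partial pressure as ordinary 20.9% air at about 2,100 m, as the short calculation below shows.

```python
# Sanity check of the 'equivalent altitude' claim using the ICAO standard
# atmosphere. Our own back-of-envelope arithmetic, not Prevenex data.
P0_KPA = 101.325        # sea-level pressure
O2_NORMAL = 0.209       # oxygen fraction in ordinary air
O2_HYPOXIC = 0.16       # oxygen fraction in the protected room

def pressure_at_altitude(h_m: float) -> float:
    """ICAO standard atmosphere, troposphere only (kPa)."""
    return P0_KPA * (1 - 2.25577e-5 * h_m) ** 5.25588

po2_room = O2_HYPOXIC * P0_KPA                      # about 16.2 kPa
po2_2100m = O2_NORMAL * pressure_at_altitude(2100)  # about 16.4 kPa
print(f"O2 partial pressure in 16% room: {po2_room:.1f} kPa")
print(f"O2 partial pressure of normal air at 2,100 m: {po2_2100m:.1f} kPa")
```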

Another ADMC innovation is a flexible vault construction from DX Platforms. New hardware includes high-capacity disks and an array of fiber and SCSI tape devices ‘to suit every known media type.’ SpectrumData has over 100 tape drives, from 9- and 21-track drives of the 1960s to present-day LTO and 3592 technology. New large-format, high-speed scanners are also available for capturing logs and other large documents up to 30 meters long.


Accenture’s SAP HANA appliance for E&P

New offering for smaller E&P shops offers ‘end-to-end’ technology platform.

Accenture and SAP have launched Accenture enterprise solution for energy (ESE), an upstream enterprise resource planning (ERP) solution for independent oil and gas companies that leverages SAP’s Hana cloud solution. The ‘end-to-end’ technology platform targets small to medium sized North American operators and promises ‘enhanced production management and improved return on investment.’

The cloud-based offering includes SAP business suite for core ERP functions, SAP Hana live for analytics and reporting and SAP production revenue and accounting (including joint venture accounting). ESE includes turnkey services for implementation, application support and hosting, plus master data services. ESE was developed by the Accenture/SAP business solutions group and was rolled out at the SAP Oil & Gas Best Practices conference in Houston last month.


IPIECA remediation best practice

New ‘good practices’ publication provides risk-based framework for remediating old oil and gas sites.

A new publication from the International Petroleum Industry Environmental Conservation Association (IPIECA) sets out ‘Good Practices’ for managing and remediating releases to soil and groundwater from petroleum refineries, terminals, retail gas stations and onshore exploration and production facilities. The 68 page guide summarizes recommended practices from ASTM International, the US EPA and the UK Environment Agency. The publication targets environmental specialists working in countries where legislative guidance on environmental land management may be less developed.

The guide presents a generic, globally applicable, risk-based framework for identifying and managing release impacts and for assessing and implementing corrective actions. The process centers on a conceptual site model (CSM), said to be fundamental to informed risk-based management of environmental impacts, which addresses potential sources, exposure pathways and receptors.

The guide describes data collection and the use of software models in assessing site-specific risks. These include HRI’s Risk Assistant, Risc-Human and GSI’s RBCA. On-line tools are also available for risk assessment and groundwater fate and transport modelling.

