May 2013


Shell—not ‘big’ data, Odata!

Shell CIO Johan Krebbers unveils new enterprise IT architecture at OSIsoft conference, including in-memory computing with SAP Hana and multiple endpoint support via the 'Odata' open data protocol.

In a wide-ranging presentation to the OSIsoft user conference in San Francisco last month, CIO Johan Krebbers outlined Shell’s plans for an overarching IT infrastructure spanning the whole company, ‘from drilling to retail.’ Shell’s new infrastructure is built on three pillars—the cloud, an enterprise data model and endpoint flexibility. The cloud is now Shell’s preferred way of buying IT, with a preference for software as a service—acquiring an ‘end to end’ solution. While in the recent past Shell was a strong supporter of Microsoft Windows and browser-based endpoints, ‘those days are going rapidly.’ Today’s users work with iPhones, tablets or, in the near future, ‘smart glass’ devices. Today, people ‘live and die with their smart phones,’ now a critical part of the environment that brings its own security and authentication issues.

For Krebbers, ‘big’ data is irrelevant. Data, of any size, needs to be transformed with analytics to be actionable. Today, production, drilling and retail are all components of an enterprise-wide real time environment. In every corner of the enterprise there is a move from look-back reporting to forward looking analytics that predict future performance.

Data is getting bigger though, as sensor counts increase and as the ‘internet of things’ envisages more connected devices. A modern seismic survey can generate 10PB of data which may expand tenfold during processing—a true big data challenge. Shell’s global, remote operations span unmanned platforms that mandate a ‘data driven’ environment. Systems and users need more trustworthy data. Data quality is improved as data is made visible. The other quality driver is the enterprise data model—an initiative to move towards a single version of truth and systematic naming of devices and business objects.

Tools of Shell’s big data trade include an SAP/Hana in-memory appliance with a petabyte of main memory. Less mission-critical data may be stored in SQL databases or in Hadoop if current use cases are unclear. Data in Landmark and PI is combined with historical data—a key enabler of non conventional ‘factory’ drilling.

The enterprise data model ‘Edam’ is under development by a team of 20 data modelers along with rules for data ownership and governance. A data services layer provides access to multiple data sources, hiding complexity. Endpoint access leverages the Odata protocol. Shell’s SmartApps (OITJ April 2012) are being retooled around Odata. Ultimately all PI data sources will be represented in a 3D virtual reality ‘living’ model of Shell’s assets that will last throughout the plant’s lifetime. Watch the OSIsoft video of Krebbers’ presentation.
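For readers unfamiliar with Odata, the sketch below shows what endpoint access over the protocol could look like from a client. It is a minimal, hedged illustration in Python: the service root, entity set and property names are invented for the purpose and do not describe Shell’s actual services.

```python
# Minimal sketch of an OData consumer. The service root, entity set and
# property names below are hypothetical, not a real Shell endpoint.
import requests

BASE = "https://data.example.com/odata"  # hypothetical OData service root

def latest_readings(tag, top=10):
    """Fetch recent sensor readings for a tag using standard OData query options."""
    params = {
        "$filter": f"TagName eq '{tag}'",   # hypothetical entity property
        "$orderby": "Timestamp desc",
        "$top": str(top),
        "$format": "json",
    }
    resp = requests.get(f"{BASE}/Readings", params=params, timeout=30)
    resp.raise_for_status()
    # Assumes an OData v2-style JSON payload ({"d": {"results": [...]}}).
    return resp.json()["d"]["results"]

if __name__ == "__main__":
    for row in latest_readings("FLOWRATE.WELL-001"):
        print(row["Timestamp"], row["Value"])
```

The point of the protocol is that the same query options ($filter, $top and so on) work from any endpoint, whether a browser, a tablet app or a script such as this.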


PetroWeb/NeuraDB

Combination of Neuralog’s database and PetroWeb’s web services technology gets thumbs-up from SouthWestern Energy.

PetroWeb is to acquire NeuraDB, a PPDM-based upstream database management system, from developer Neuralog. The all-equity deal means that Neuralog will receive an undisclosed share in PetroWeb. Neuralog will continue to market its Log, Map and Section tools and its Scanner and LaserColor hardware. Neuralog president Robert Best, who described the deal as a ‘partnership,’ is to split his time between the companies.

The plan is to port NeuraDB to a web-based platform for managing commercial and proprietary E&P data together. A key part of this will be to develop standard web services for data loading, editing and access from geoscience client applications. End-user web access to NeuraDB via PetroWeb is also planned.

PetroWeb CEO Darcy Vaughan added, ‘Cloud and on-premise web deployments are the future of E&P data management and promise end-user access for users from a spectrum of devices.’ PetroWeb adds an ESRI ArcGIS front end to NeuraDB along with other E&P data stores and vendor data feeds. PNEC speaker Sean Turcott welcomed the move as providing his company, SouthWestern Energy, with a quick route to web portal deployment. More from PetroWeb.


Data management ‘professionalization.’ Good or bad?

Oil IT Journal editor Neil McNaughton believes that data management is indeed a ‘people problem’ but not in the way you may think. He argues that ‘professionalization’ will simply create yet another information silo and that the real solution to data management is automation.

At any point in time, the world, and what we perceive of as ‘knowledge’ or ‘science,’ contains a mixture of stuff that later turns out to be right and stuff that turns out to be bunkum*. Back in the day we had phlogiston, the luminiferous ether and Mars’ canals. All bunkum! Today we hear news reports of many medical remedies and beliefs that turn out to be … bunkum.

I have just got back from the excellent PNEC data integration conference (report on page 6) and it might seem churlish to argue that much that is said on the subject of data management tends towards the bunkum side of the equation. But that is what I propose to do.

For me, PNEC had two enlightening moments. During the panel session on future trends and best practices, a speaker from the floor who had returned to the geodata world after a stint in accounts said he was ‘appalled at the lack of automation in geodata.’ In finance, the expectation is that data comes into the system and flows through untouched by human hand. This interesting observation got short shrift from the panel, along the lines of ‘oil and gas data is different,’ and the debate moved on to discuss what a well was and other ‘best practices.’

The other enlightening moment came in Karl Fleischmann’s answer to my own query (sometimes enlightenment needs a bit of priming) as to whether Shell’s analytical approach to data management might pave the way to automating parts of the workflow. Fleischmann, who was presenting on what technical data management has to learn from modern manufacturing, came back with something along the lines of ‘yes indeed.’

The problem today is that far too much of technical data management is considered to be a ‘workflow’ i.e. a ‘people thing,’ when it should be a process i.e. automated. Data management is at the stage that financial services were many years ago when a multitude of individuals were involved in a transaction, filling out forms, stamping and signing and passing over to the next person. Today’s forms may be digital and the messages may be files or emails, but the result is the same.

Why are things this way? The computer business has a lot to answer for here in the way it has to reinvent itself all the time. Again people are center stage and we are encouraged to ‘collaborate,’ BYOD** or network socially. Such a free for all makes a straightforward, process approach hard to realize. Putting the individual at the heart of the workflow is good for the IT business but it means that much of what we have come to know and like is an obstacle to the automation of data flows.

Matthias Hartung’s presentation on Shell’s attempt to provide ‘trusted data at your fingertips’ was a good summary of the state of the art. Hartung called for the harmonization of applications and architectures and for more and better standards from the SEG, PPDM and Energistics, and saw hope in the new Standards Leadership Council.

But wait a minute. Calls for standards, application harmonization … that reminds me of something. Well it reminds me of the whole 17 years of PNEC and Oil IT Journal. If you don’t believe me, read my very first 1996 editorial from what was then Petroleum Data Manager—a call for more and better standards and interoperability. As they say, ‘plus ça change, plus c’est la même chose.’

Hartung advocated the professionalization of data management, turning it from ‘Cinderella to enduring beauty.’ The call for professionalization was the big thing to emerge from this year’s PNEC. PPDM, the UK’s Common data access and the emerging Saudi Aramco-backed DMBoard initiative are pushing for professionalization. For PPDM and CDA the plan is for a data management ‘society’ along the lines of the SPE or the SEG. There are a few problems with this.

First of all, petroleum engineering and geophysics have a hundred years or so of academic and practical experience to build on; data management has twenty years of experience and little to show for it. Academic bunkum!

Secondly, the ‘call for standards’ has been heard so long that it is getting a bit rusty. What exactly does it mean when a multibillion major or national oil company makes a cry for help to an underfunded standards body? I submit that it means that the problem at hand is in reality less than mission critical. Standards bunkum!

Thirdly, consider the bigger picture, that of the business. It is a truth universally acknowledged, if rarely acted on, that business is hampered by the lack of communication between its traditional silos, geosciences, drilling engineering, production and so on. The only major change to this picture in the 16 years of this publication is the emergence of yet another silo, the IT department, another disconnect in the way we do business. Silo bunkum!

Finally we have the mantra that a data project is all about people and not about technology. This effectively precludes any serious attempt to treat the problem as one of automation—which is all about technology. People bunkum!

The mistake of sanctifying data management as a profession is that it perpetuates the data problem with yet another silo. Instead of one hard to negotiate gap between IT and the business, there will now be three hard to negotiate gaps, between the business, IT and data management. And inside this new silo what do we find? Useful knowledge of data flows and applications—just what is needed to support an automation effort.

But all this lovely know-how is in the wrong place! It would be put to much better use if the business and IT could get together to eliminate vast swaths of data management completely just as finance has managed to do.


* Nonsense—see its amusing etymology.

** Bring your own device—i.e. an iPhone.

@neilmcn


Barclays analyzes the big four

Baker Hughes, Halliburton, Schlumberger and Weatherford under the microscope. Schlumberger leads the field. IT segment small but highly profitable. ‘Buy’ recommendation for all four!

Barclays has just produced a compendium of reports on the ‘big four’ oil and gas service companies, Baker Hughes (BHI), Halliburton (HAL), Schlumberger (SLB) and Weatherford (WFT), comprising individual reports of around 90 pages and an overview. Barclays recommends all four as buys and sees 2013 as ‘the year of the large-cap diversified companies.’ The big four’s ongoing success is ascribed to ‘one of the largest strategic mistakes of the last 50 years—the divestment and lack of investment in the oil service and drilling industry by oil companies.’

The big four differentiate themselves from smaller players through consistently investing in new technologies and processes. SLB leads the field in R&D with an $11 billion spend since 1997, 2% more than the other three combined. SLB was the first to use a multi-national tax structure and its tax rate has been almost 10% below that of its peers for a decade.

Prior to the 2009 downturn, the big four’s share of the North American oilfield services and equipment business was around 15%. Since the recovery, the market has increased to some $180 billion in 2012—of which around 25% went to the big four. Growth has been spectacular in the non conventional sector with a 35% per year hike in the horizontal rig count which stood at 1,150 in 2012. Barclays anticipates continued expansion in horizontal drilling.

Coverage of information technology is light. Petrel is described as ‘the glue that binds SLB’s reservoir characterization products and services.’ Schlumberger Information Solutions (SIS) and PetroTechnical Services are dwarfed by the rest of the behemoth, bringing in around 2% of the Reservoir Characterization division’s revenue. The latter, although SLB’s smallest segment by revenue today (27% of total in 2012), is by far the most profitable, with a 28% operating margin and ‘38% of total EBIT.’

Halliburton’s Landmark unit ‘has a powerful position in seismic interpretation.’ Landmark generates roughly 2-3% of HAL’s revenue. HAL’s acquisition of Petris last year gets a mention in the context of Landmark’s focus on managing the ‘ever-greater volumes of E&P data’ where oils ‘lack effective integration and collaboration technology.’ Weatherford’s Field Office also gets a mention with an installed base of ‘over 400,000 wells.’ The reports, authored by Barclays’ James West, recommend investment in all of the big four, but see most upside in Halliburton and Schlumberger.


OSIsoft 2013 User Conference, San Francisco

Chevron’s surveillance optimization program. Marathon’s unconventional real time infrastructure.

Speaking at the 2013 OSIsoft user conference in San Francisco last month, Chevron’s Jim Crompton announced his imminent retirement after 37 years with the company. He is apparently in good company as 50% of oil and gas knowledge workers are to retire in the next five years. Their replacements are more likely to have 8 years’ experience than 37 and ‘we have left them the hard problems.’

Crompton turned to the hype around big data, observing that such data has been commonplace in the industry for decades. Oil and gas is good at handling big data sets although we are always looking for ways to improve. Oil and gas is a ‘very dangerous industry’ and we need to avoid mistakes by using data in a timely manner. On the Deepwater Horizon, data indicative of the nature of the problem was available 45 minutes before the explosion, but operators were ‘focused on other things.’

While OSIsoft’s PI System is often associated with the downstream, it is seeing use in the upstream to provide a holistic view of real time asset data. Here data volumes can be overwhelming. Fiber sensors can produce 2TB/hour, billion cell reservoir simulators have been deployed and seismic generates petabytes of data. A modern production facility has more sensors than a refinery although it shares many components with the downstream.

A new offshore Angolan FPSO has some 80,000 sensors of which 1,500 are down hole. The upstream ‘factory’ is discovered not built—this is not a ‘retail type’ issue. There is a strong but healthy tension between standardization and system complexity. As the sensor count rises, there is an emerging problem with the information pipeline. Most users are only interested in their own output. Data is disappearing into shadow systems (or Excel!) through a lack of standards and a common vocabulary.

But there are significant successes with management by exception and in the use of virtual reality in training—used to develop standard operating procedures years before a platform is on-site. Alarm management is also an area of focus where systems can direct operators straight to a root cause. But ‘as we get better the problem is getting harder.’

Crompton believes that silos remain a problem as there is a tendency to educate and reward by function. Drillers don’t talk to petroleum engineers. Chevron is addressing such issues with a new focus on modeling and analytics, the surveillance, analysis and optimization program.

Here, a preliminary maturity analysis has shown the company to be ‘all over the place.’ With worldwide recovery stuck at around 33% of oil in place, the potential for improvement is vast. ‘We are not at the start but we have not yet finished the race.’

Ken Startz presented on two of Marathon’s PI System projects, MaraDrill (more of which next month) and on production operations support in the Piceance basin, Colorado. Marathon spotted an opportunity for an improved real time infrastructure with the arrival of a new engineering hire from a company ‘with better data visualization than [Invensys’] Wonderware.’ The new functionality was developed with PI Coresight in under a day. The solution has produced a 720% return on investment through speedier start up for operators and a reduced need for contract staff. The system also provides one year of look back production data. More from the OSIsoft User Conference.


Python-based analytical toolset well received by Hess

Enthought Canopy desktop for analytics and data management. DoE helps parallelize NumPy.

Austin, TX-based developer Enthought has announced ‘Canopy,’ a Python-based environment for scientists, engineers and analysts. The Canopy desktop supports data collection, analysis, algorithm prototyping and testing. Marcos Lopez de Prado, head of quantitative trading at Hess Energy Trading Company and a research fellow at Lawrence Berkeley National Laboratory, said, ‘Unlike proprietary analysis languages and tools, Canopy offers the analytical environment we need, built on open-source Python. Thanks to Python’s popularity among Research Institutes and National Laboratories, thousands of libraries are available for complex mathematical modeling. Using it, we are able to research investment opportunities in an efficient and flexible manner.’

Canopy builds on Enthought’s Python distribution with an editor, integrated IPython console, graphical package manager and documentation. Canopy is available on Windows and Mac OS and, in beta, on Linux. A free Express edition including the desktop and core scientific and analytic packages is also available.

Enthought has also just been awarded a grant from the US Department of Energy’s SBIR program to port the flagship NumPy library to high-performance parallel computing environments. The aim is an intuitive front end to array computing and parallel libraries. The project will be released as open source. Other oil and gas Enthought users include Shell and ConocoPhillips. More from Enthought.
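To illustrate the array-oriented style that such a port would accelerate, the short NumPy sketch below prototypes a moving-average crossover signal of the kind a quantitative trading desk might explore. It is plain single-node NumPy on synthetic data, not Enthought’s (as yet unreleased) parallel tooling.

```python
# Illustration of vectorized, array-oriented NumPy; synthetic data throughout.
import numpy as np

rng = np.random.default_rng(0)
prices = 100 + np.cumsum(rng.normal(0, 1, 1000))   # synthetic daily price series

def moving_average(x, w):
    """Trailing moving average computed with a vectorized convolution."""
    return np.convolve(x, np.ones(w) / w, mode="valid")

fast, slow = moving_average(prices, 20), moving_average(prices, 100)
signal = fast[-len(slow):] > slow          # element-wise comparison, no explicit loop
print(f"Fast average above slow on {signal.mean():.1%} of days")
```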


IFPen claims full-basin modeling success

Integrated study of Nova Scotia’s eastern seaboard leverages TemisFlow and Dionisos modelers.

The French Petroleum Institute IFPen and its Beicip-Franlab software and consulting unit are claiming a breakthrough in large scale basin modeling. Beicip’s consultants, working on contract for the Nova Scotia regulator, used IFPen’s TemisFlow fluid flow modeling package to model the generation, migration and trapping of hydrocarbons along the eastern seaboard of Nova Scotia. Another IFPen package, Dionisos, was used to control basin scale sedimentary fill over the duration of the hydrocarbon cycle. The study demonstrated a valid, oil-prone petroleum system in the area.

IFPen reports that the study, which integrated seismic, geological and geochemical data, has sparked renewed interest in the zone, with Shell and BP about to embark on two separate $1 billion exploration programs. More from IFPen.


GLJ Petroleum takes stake in Visage Information Solutions

Deal addresses oil country ‘big data’ analytics and visualization.

GLJ Petroleum Consultants is to take a stake in Calgary-based Visage Information Solutions, a provider of visual analytics software to the oil and gas industry. The deal is said to bolster both companies’ offerings in oil country big data analytics. GLJ CEO Jodi Anhorn said, ‘As an early adopter of Visage we have been impressed by the software’s power. The software team fits our culture and understands the role of big data in the energy industry.’

Visage provides ‘self-serve’ visual analytics to the oil and gas vertical, using both public and proprietary data to improve the decision making and strategy of producers, investors and financiers. Visage provides an infrastructure layer over public and in-house systems from third party vendors through its ‘Dynamic ETL Technology’ which assembles information on the fly and offers ‘in-memory’ data processing.

Visage recently teamed with Canadian data provider Geologic Systems to investigate production from British Columbia’s prolific Montney natural gas province. The study showed how horizontal well azimuth influences production and helped identify the Montney’s ‘sweet spots’ (OITJ Oct 2011). More from Visage and GLJ.


Cairn Energy’s new upstream information management system

Flare Solutions teams on E&P Catalog-based solution for document and data management.

UK-based Cairn Energy has deployed a new oil and gas information management system with help from Flare Solutions. The jointly developed system will help Cairn capture, store and retrieve critical documents, data and records across its global operations. The solution is built on three of Flare’s information management tools: the E&P Catalog, the oil and gas taxonomies and EPCat-Tracker. Flare has also supplied an ‘operations toolkit’ that tracks project actions and incidents and captures lessons learned.

Flare director Glenn Mansfield said, ‘Collaborating with Cairn’s technical staff allowed us to build components that directly meet the needs of the business.’ The solution was initially developed as an information management system for Cairn’s drillers, using Flare’s EPCat-Tracker to capture critical documents at key stages in the drilling of a well. Information templates allow document deliverables to be adapted to local standards or legislation.

Cairn’s head of information management John Caldwell said, ‘The new system has improved the management of information in the exploration and production process and is a valuable addition to our information management portfolio.’ According to Flare, the E&P Catalog provides ‘an intuitive interface to store and retrieve information from multiple information stores.’ More from Flare Solutions.


Software, hardware short takes

PetroDaq, Geovariances, AspenTech, Tofino, Epsis, Geoforce, KSS Fuels, Mechdyne, PGS, RAE Systems, Schlumberger.

PetroDAQ has rejigged its PetroDAS and Remote tools with support for frack data, wireless sensors and a Wits to Witsml configurator. A new web viewer has been released to support drilling, mudlogging and formation evaluation.

The 2013 release of Geovariances’ Isatis introduces ‘multi-acquisition factorial kriging,’ a method to extract commonalities and differences in redundant measurements of the same quantity. The technique is of particular interest in time-lapse seismic evaluation.

AspenTech’s 2013 release of AspenOne includes a new, web-based user interface and AspenOne Exchange, a marketplace for equipment data, third-party and AspenTech design resources.

Belden and Tofino have teamed on an offshore security solution combining Belden’s armored cable with Tofino’s Eagle security appliances. The devices prevent cyber attackers from tampering with the redundancy protocols.

Epsis has announced TeamBox 5.2 with one-click video and voice conferencing with remote participants.

Geoforce has announced an iOS (iPhone/iPad) version of its oil country asset tracking solution.

KSS Fuels has ported its flagship retail fuels pricing application to the smartphone, offering users a ‘price anytime, anywhere’ function.

Mechdyne has rolled-out ‘Meeting Canvas,’ a tool that lets users connect and participate in secure meetings from any location.

PGS has announced the ‘ultimate’ seismic vessel, the 24-streamer Ramform Titan, claimed to be the ‘widest ship in the world’ with a 70 meter beam. The Titan can tow an array of ‘several hundred thousand’ hydrophones.

RAE Systems has announced ProRAE Guardian, an XML-based software development kit for developing real-time gas and radiation detection applications for oil and gas and other verticals.

Schlumberger has announced Studio Manager, adding information management capabilities to the Petrel Studio E&P knowledge environment. IM professionals can track asset team progress and resolve data issues. Studio embeds Microsoft technology including Lync, SharePoint, Exchange and Office and runs on Oracle or SQL Server.


2013 GPU Technology Conference, San Jose

Nvidia Index for Landmark’s DecisionSpace Desktop. TerraSpark ports code base to GPU. Nvidia on GPU number crunching. Repsol’s hybrid computing. Saudi Aramco’s GPU-based giga cell simulator.

At the 2013 Nvidia GPU Technology Conference in San Jose, California earlier this year, Halliburton’s Joe Winston showed how Nvidia’s Index, a subsurface data graphics accelerator unveiled at last year’s SEG, has been integrated with Landmark’s DecisionSpace Desktop. Landmark is moving to GPU-based processing to speed rendering of large data sets and also to prepare for a move to the cloud. At issue is the visualization of hundred gigabyte data sets on megapixel rated displays. The use of Index is part of a trend to complex, multi-core, heterogeneous computing. In fact, two Nvidia software tools are used: Dice (see below) and Index. The latter implements volume visualization through graphics primitives for object rendering and lighting control. Winston’s 50-slide presentation shows how Index has been integrated with the DecisionSpace scene graph and the various tricks and transformations that this entailed.

Jon Marbach presented TerraSpark’s work in implementing seismic attribute computation on the GPU. TerraSpark is using GPU acceleration because 3D seismic is essentially ‘image’ data, amenable to image-processing inspired, data-parallel algorithms that suit GPU-based computation. Computing seismic attribute volumes such as horizon tracks, curvature or coherency can be very compute intensive. Fault extraction can take hours of CPU time. Targeting Nvidia Fermi and Kepler GPUs with a modest amount of VRAM, TerraSpark has shown that GPU acceleration works. Most tasks achieve a 3-5x speedup (although curvature computation does much better at 32x). The porting exercise brought significant code quality enhancements. ‘Porting forces a hard look at the existing code base.’ But the results are improved algorithm accuracy and a better product. ‘Seismic attributes are a no brainer for GPU acceleration.’
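TerraSpark’s GPU code is proprietary, but the NumPy sketch below illustrates the data-parallel, image-processing style of attribute computation described above: the same per-voxel arithmetic, applied uniformly across the volume, is what maps so naturally onto a GPU. The ‘attribute’ here is a toy local-difference measure, not one of TerraSpark’s algorithms.

```python
# CPU sketch (NumPy) of a data-parallel seismic attribute computation.
# The attribute is a crude stand-in for coherency-type measures.
import numpy as np

def local_difference_attribute(volume, half=1):
    """Per-voxel mean squared difference to neighbouring traces."""
    out = np.zeros_like(volume)
    for di in range(-half, half + 1):
        for dx in range(-half, half + 1):
            shifted = np.roll(np.roll(volume, di, axis=0), dx, axis=1)
            out += (shifted - volume) ** 2     # same arithmetic at every voxel
    return out / ((2 * half + 1) ** 2)

cube = np.random.rand(50, 50, 200).astype(np.float32)   # toy seismic volume
attr = local_difference_attribute(cube)
print(attr.shape, attr.dtype)
```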

Nvidia’s Stefan Radig showed how to port generic number crunching to large CPU/GPU heterogeneous clusters with the Dice library. The Dice API is claimed to allow domain experts to develop scalable software running on GPU clusters, without the need to manage low level parallelization or to handle network topologies. Dice is presented as an alternative to other cluster frameworks like Open MPI. Dice leverages an in-memory NoSQL database which provides resource allocation and scheduling. The database supports ‘ACID’-transactions in multi-user environments.

Max Grossman outlined Repsol’s hybrid implementation of a 3D Kirchhoff seismic migration algorithm on a heterogeneous GPU/CPU cluster. Migration is deemed a good target for hybrid execution as CPU-based systems can take weeks to process the massive data sets. Legacy implementations involve ‘pointer chasing,’ compute-intensive kernels and multiple I/O bottlenecks. Repsol is adopting an incremental approach to the port—starting with a CPU-only development before moving to a hybrid CPU+GPU deployment with dynamic work distribution. Tests showed good migration kernel execution times (speedups of up to 35x). But overall performance was hampered by the significant portion of non parallelizable code in the application.

Ahmad Abdelfettah (Stanford) described Saudi Aramco-sponsored work performed at King Abdullah University of Science and Technology (Kaust) on numerical techniques for reservoir simulation on GPUs. The project is part of Kaust’s ‘strategic initiative in extreme computing’ as well as Aramco’s ‘giga cell’ reservoir modeling project. Fluid flow modeling comprises ‘physics’ and ‘solver’ phases. As core counts increase, the latter comes to dominate compute bandwidth. Today, applications have yet to ‘feel’ this effect, but as core counts rise, they certainly will. Kaust has developed a library of basic linear algebra subprograms (Kblast) to prepare for the new massively parallel, heterogeneous environments. More from the GPU Technology Conference.


PNEC Data Integration 2013, Houston

Chevron’s ‘next generation’ data management. Shell on ‘lean’ data management and harmonizing applications. Noah, EnergyIQ, ExxonMobil, IHS, on data specifics in the non conventional space. ExxonMobil on regression testing the data asset. XTO on SOX. Professionalizing data management.

The 17th edition of the PNEC data integration conference was introduced by Cindy Crouse with a short reminder of PNEC founder Phil Crouse’s life. Cindy and the team have proved able conference organizers and managed a successful and well attended conference with a record 580 head count and some 43 presentations.

Chevron’s Jennie Gao outlined the requirements of a modern upstream interpretation shop. These include data integration between multiple geoscience applications and datastores. Top priority is data access between Petrel, Epos and OpenWorks R5000. However, commercial tools only support selected data types and file exchange is limited as vendors offer more ‘import’ than ‘export.’ It is challenging to keep data in sync between these three core applications. OpenWorks is used as a master data store for well data and ‘gold standard’ interpretations. Data is pushed out to Petrel and Epos. OpenSpirit Copy Manager and Schlumberger’s Innerlogix are deployed. Chevron has developed a complex system to handle well symbology and coordinate reference systems across the different platforms. TOAP, Tibco OpenSpirit’s Petrel plug-in, provides bi-directional exchange with OpenWorks. The solution is maintained by Chevron’s ‘next generation’ data management team with help from Larsen & Toubro Infotech. A wry comment from the floor suggested that, ‘25 years on, and we’re back to Geoshare!’

Karl Fleischmann (Shell E&P America) has been investigating what technical data management has to learn from manufacturing. The ‘lean’ approach to manufacturing can be applied to upstream data. Much of today’s work is unnecessary—data quality standards may be set too high and data (especially seismic) is in ‘overproduction,’ as the same work is repeated. As an example, Fleischmann cited Shell’s attempt to build a ‘well book,’ an overview of a set of wells in a field. This used to be an ‘incredibly arduous task,’ taking six months to create a 600 page book for the North Sea Nelson field. To IT this seemed like a good candidate for automation à la manufacturing. But it was harder than it appeared. To properly understand the process, Shell embedded a data management group in the Nelson asset, which helped compile the well book, hand collating documents etc., until ‘they really understood the process.’ Next, ‘value stream mapping’ was used to high-grade process improvement opportunities and implement a first-pass improvement through better project management and deployment of a corporate data store, but as yet no automation. The results were spectacular. Where previously it took five employees six months to handle one field, the same team managed 74 fields in half the time using a ‘highly scripted’ process. The scripts, which are recipes for standard procedures rather than computer code, make sure that (only) key data is captured quickly. In the Q&A, Fleischmann said that the process was good at identifying elements of the workflow for further automation—a concept that has support from Shell’s CEO.

Shell’s data guru, Matthias Hartung, emphasized the need for ‘trusted data’ across many domains, from geotechnical through HSE, emergency response and compliance. But the current schedule-driven culture means that while there is no time to ‘do it right,’ there is always time to do it over! The result is that data management is not at the maturity level it should be. This is impacting novel forms of exploration and development, such as non conventionals with their very rapid cycle times. Data management is lacking in standards and data managers’ career paths are unclear. Yet there are sustainable solutions and successes—Hartung cited the Standards Leadership Council and advocated data management education and accreditation. In Shell, technical data management is now established as a globally managed technical discipline and Shell is developing or harmonizing its data standards, applications and architecture. Hartung suggests two urgent steps. One, professionalize data management and turn it from ‘Cinderella’ into an enduring beauty. Two, develop an academic curriculum including agreed-upon standards and data ownership.

Judging by Fred Kunzinger’s (Noah Consulting) presentation, data management in the unconventional space is more of an opportunity than a problem. Unconventional is different. Developing a shale play mandates an integrated approach, unlike the old exploration, development and production stage gates. Challenges for unconventional exploration center on supply chain and logistical efficiency—think Six Sigma and ‘lean’ as above. Operators have to schedule multiple operating rigs, manage a complex patchwork of land holdings, truck water and produce the oil and gas. Unconventional exploration is also forcing operators to abandon information silos in favour of integrated, real time systems. All the usual gotchas of data management, such as differing well identifiers and nomenclature, need to be fixed to enable the new ‘assembly line.’ Unconventionals represent a true ‘big data’ problem, with great benefits to be gained from integrating large data sources to identify sweet spots and optimize completions in a continuous improvement process. A relatively small investment in IT can provide a significant return for a company drilling 1,000 Bakken wells per year at around $10 million a pop.

EnergyIQ’s Steve Cooper, in a joint presentation with TGS/Volant and Perigon, outlined a project performed for another unconventional player, Hess Corp. Again, speed is the name of the game. Data loading has been partly automated. ‘Drop boxes’ are used for QC before loading to Hess’ Technically validated database (OITJ February 2010). Perigon’s iPoint toolset and TGS’ Envoy viewer also ran.

Gbolade Ibikunle returned to the ‘lean’ manufacturing approach to data management in a presentation of Shell Nigeria’s ongoing integrated drilling and production review project which leverages Shell’s well book concept (see above). Data from multiple sources, Recall, Landmark’s EDM and OpenWorks and Schlumberger’s Oilfield data manager are all linked in. Workflow control leverages Adobe digital signatures for document sign-off. Deployment has involved a mixture of encouragement (with branded T-shirts), enforcement and annual review. Engineers now spend much less time looking for data and production is up. Quality data and good collaboration between developers and users were keys to success.

Many North American unconventional plays are found in proven oil and gas provinces with a long, richly documented history. Extracting information from such historical data can be challenging. As Stephan Auerbach (ExxonMobil) and Cindy Cumins (IHS) asked, ‘Is legacy data a graveyard or a treasure trove?’ Exxon Mobil is using IHS’ electronic document custody service to get a handle on its huge document archive. Millions of hard copy documents and other media are at or near end of life. These represent a rich heritage from 50 plus acquired companies which captured documents with what was at the time the ‘latest’ technology. Exxon and IHS are now working to unlock value from a 25 million item data set covering the US lower 48, representing over 3 million wells. The process has uncovered significant hidden data of interest to US unconventional plays. Old well logs, not available to competitors, revealed hazardous igneous intrusions—allowing Exxon to walk away from the deal. Scanned images have been geotagged in a ‘rigorous’ workflow. Hard copy is triaged for relevance before metadata capture to Iron Mountain’s Accutrac records management system.

ExxonMobil’s CT Gooding asked why more companies do not perform regression testing to maintain the quality of their data asset. The answer is that regression testing is hard, requires multiple tools and the underlying systems ‘morph constantly.’ The solution is to automate testing on sample data sets generated from user input and metadata constraints. One use case is checking data sync between engineering and geoscience databases. A test record is injected into the two schemas, which are then compared with an automated SQL query ‘to see if anything breaks.’ Gooding recommends a dedicated test environment, including tests on user roles and identities: ‘don’t run a test with super user privileges.’ The test environment itself can be generated automatically with triggers inside the production database.
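The sketch below illustrates the inject-and-compare idea on two throwaway SQLite databases standing in for the engineering and geoscience stores. Table and column names are invented; ExxonMobil’s actual tooling and schemas are not described in the presentation.

```python
# Hedged sketch of 'inject a test record, then compare the two stores'.
import sqlite3

def make_store():
    """Create an in-memory database with a toy well table."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE well (uwi TEXT PRIMARY KEY, total_depth REAL)")
    return db

eng, geo = make_store(), make_store()   # stand-ins for the two real stores

# Inject the same synthetic test well into both schemas.
for db in (eng, geo):
    db.execute("INSERT INTO well VALUES (?, ?)", ("TEST-0000", 3120.5))

# Automated comparison query: flag any attribute that differs between stores.
eng_row = eng.execute("SELECT total_depth FROM well WHERE uwi='TEST-0000'").fetchone()
geo_row = geo.execute("SELECT total_depth FROM well WHERE uwi='TEST-0000'").fetchone()
assert eng_row == geo_row, f"out of sync: {eng_row} vs {geo_row}"
print("engineering and geoscience stores agree on the test record")
```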

Eileen Mahlow (XTO Energy) recalled the times when companies were told that in-house custom development was deprecated, heralding a move to off-the-shelf (COTS) software. Then came Sarbanes-Oxley (SOX) and a suite of new controls on accounting and other systems. COTS tools and accepted practices were not generally aligned with the new SOX regulations. For instance, software should not need full privileges to run and users’ roles should align with their responsibilities. Other issues such as software defect and change management may likewise expose companies to SOX issues. What is the answer? Mahlow advised vendors to do a better job of keeping code separate from configuration files and to improve entitlement management. Oil companies need to improve how they administer privileges: ‘don’t let just anyone access the dbadmin account.’ Another gotcha is the ability of an unauthorized user to bypass access controls through direct access with tools like Toad.

One PNEC plenary session was devoted to the subject of professional data management and competency training and testing. Oil IT Journal has already reported on PPDM and UK CDA’s initiatives to formulate a data management training program, backed up with certification and testing. PPDM CEO Trudy Curtis pushed the boat out a little further, comparing PPDM, now the ‘Professional petroleum data management association’ with the SEG and SPE. CDA CEO Malcolm Fleming likewise called for the establishment of a professional association with its own journal, annual conference, workshops and seminars. ‘Data managers need their own club.’ In a similar vein, Omar Akbar outlined Saudi Aramco’s plans for the DMBoard, an ‘operators only’ club that aims to succeed in upstream data management where the vendors have failed. Akbar stated that, ‘We don’t want to be controlled by solutions, we want to be controlled by the business.’ Early work is to focus on a ‘reference model’ of industry terminology.

Our PNEC coverage continues next month with more presentations on unconventional data management, on emerging solutions to managing Petrel data and on Pioneer’s, err, pioneering use of data virtualization. Meanwhile, visit the PNEC home page.


PPDM Data management symposium, Houston

Devon’s data program. BP reports progress. DrillingInfo, Bonus on PPDM to WITSML mapping.

Speaking at the 2013 PPDM Association’s Houston data management symposium earlier this year, Joe Seila outlined Devon Energy’s Oklahoma City Data Management Program (OKC DMP). This set out to manage data as a corporate asset and provide timely and accurate data feeds from rig to desktop. OKC-DMP, a ‘fully funded multi-year initiative’ is addressing data governance, master data management and data quality. Data governance is seen as critical because today, ‘data is badly messed up.’ Assets use different standards, there is a lack of authoritative sources and data producers do not take consumers’ needs into account. Key technology behind Devon’s data initiative includes DataFlux QC Pro and Web Studio, used to translate requirements into business rules. Devon’s data initiative was also the subject of a PNEC presentation on data governance (with Noah Consulting).

Rusty Forman provided an update on BP’s ‘data management strategy for the long haul.’ BP has a ‘double digit multi-billion dollar’ upstream data asset that used to be ‘poorly and inconsistently managed.’ Starting in 2009, BP initiated a program to create a new vision and strategy for upstream data and has now developed an operating model, shifting the emphasis away from tools, projects and support services to governance-based ‘sustainable’ management. BP’s data staff has grown from 40 to over 150 professionals with teams around the world. A data ‘Xcellence’ program is now being rolled out. Staffing up at ‘warp speed’ has proved challenging. It is difficult to find people with the right skills, especially for leadership roles. Turf wars at the IT/business boundary are also an issue. Folks are concerned that others are ‘doing my job!’ Thought needs to be given to the division of labor and to developing a symbiotic relationship between stakeholders.

Alan Berezin outlined DrillingInfo’s initiative to combine PPDM and Witsml data sources to capture hydraulic fracturing data. DrillingInfo’s service combines applications and a ‘unified data environment’ that can be in the cloud or inside the firewall. To serve frac data, DrillingInfo blends real time data streams, in-house PPDM data and its own data feeds. Mapping from Witsml to PPDM has been a challenge and different stakeholder needs must be addressed. DrillingInfo has developed an overarching ‘canonical’ model of a frac job and completion which can be repurposed to local business needs.

Jeffry Bonus (Bonus Consulting) returned to the theme of PPDM/Witsml mapping in the context of the ‘official’ proof of concept. This is worthy of note as it is a poster child for standards interoperability and collaboration under the auspices of the Standards leadership council. A joint PPDM/Energistics work group has been kicked off to deliver a standard mapping that will address use cases such as importing Witsml deviation data into a PPDM data store. Download the PPDM presentations.
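By way of illustration, the Python sketch below maps trajectory stations from a WITSML-style document into rows for a PPDM-style directional survey table, the kind of use case the work group is addressing. Element and column names are simplified stand-ins rather than the official mapping, and WITSML namespace handling is omitted.

```python
# Hedged sketch of a WITSML-to-PPDM style mapping; names are illustrative only.
import xml.etree.ElementTree as ET

WITSML_SNIPPET = """
<trajectory uidWell="W-1">
  <trajectoryStation><md uom="m">500</md><incl uom="dega">1.2</incl><azi uom="dega">87.0</azi></trajectoryStation>
  <trajectoryStation><md uom="m">1000</md><incl uom="dega">12.5</incl><azi uom="dega">92.3</azi></trajectoryStation>
</trajectory>
"""

def witsml_to_ppdm_rows(xml_text):
    """Yield one dict per survey station, keyed by PPDM-style column names."""
    root = ET.fromstring(xml_text)
    uwi = root.get("uidWell")
    for station in root.findall("trajectoryStation"):
        yield {
            "UWI": uwi,                                   # illustrative columns
            "STATION_MD": float(station.findtext("md")),
            "INCLINATION": float(station.findtext("incl")),
            "AZIMUTH": float(station.findtext("azi")),
        }

for row in witsml_to_ppdm_rows(WITSML_SNIPPET):
    print(row)
```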


Folks, facts, orgs ...

Aker, Advanced Resources, Bill Barrett, CGG, Chilworth Technology, CVR Energy, Landmark, Transocean, DNV, Accenture, Shell, Energistics, PointCross, Data Gal, Petrofac, Geoforce, Inova, Intsok, SAIC, Kongsberg, Lonestar, Energy Industry Council, Michael Baker, OS-Geo Lab, Paradigm, Parker, Sigma3, Tiandi, Weatherford.

Nicoletta Giadrossi has joined Aker Solutions as head of operations and Roy Dyrseth as head of drilling technology. Rune Fanetoft is head of process systems Norway.

Stephen Bumgardner has been promoted to director of field operations with Advanced Resources International and senior reservoir engineer Anne Oudinot has been promoted to project manager.

Scot Woodall has been named CEO and president of Bill Barrett Corporation. He had been interim CEO since January 2013.

CGG has appointed David Dragone as executive VP, human resources. He hails from Schlumberger and later Areva. Allen Taylor joins CGG as senior Linux engineer in the UK.

Kevin Connelly has joined process safety specialist Chilworth Technology as general manager.

CVR Energy has named Dennis McCleary as VP project management and optimization. He hails from Turnaround Consulting Service.

Landmark technology fellow Robello Samuel has been named SPE distinguished lecturer for 2013-2014.

Terry Loftis, Transocean’s director of engineering, has been appointed chair of DNV’s rig owners’ committee.

Dean Forrester, MD of Accenture’s energy industry group, and Shell’s Matthias Hartung have been elected to Energistics’ board of directors.

Joe Tischner has joined PointCross as general manager and VP E&P Operations.

Madelyn Bell is now president of her own startup, Data Gal. Bell was previously with ExxonMobil.

Petrofac is to invest £1.5 million in enhancements to its fire training facilities in Montrose, Scotland. The project includes a purpose-built simulator.

Bruno Trigoly heads up Geoforce’s new office in Macaé, Brazil.

Keith Witt has been promoted to CFO and senior VP of operations and administration at Inova Geophysical. Carey Mogdan is COO and senior VP, manufacturing and customer service.

Azam Ali Khan is Intsok’s new oil and gas advisor in India.

James Moos is now group general manager of SAIC’s engineering solutions group.

Mike Topp heads-up Kongsberg Oil & Gas Technologies’ new EAME HQ in Guildford, UK.

LoneStar Geophysical Surveys has appointed Mitch Thilmony as HSEQ Director. He hails from Viking International, Poland.

Lord Howell of Guildford, a.k.a. David, is now president of the Energy Industries’ Council.

Michael Baker Corp unit, Michael Baker Jr. has appointed Cory Wilder as VP and national pipeline engineering practice lead.

An Open Source Geospatial Laboratory has been established by the GeoData Institute and the geography and environment department of Southampton University.

Arshad Matin has been appointed as Paradigm’s president and CEO and a member of the Board of Directors. He hails from IHS.

Parker Drilling Co. has selected former VP and treasurer of Ensco, Chris Weber, as senior VP and CFO.

Sigma3 integrated reservoir solutions has promoted Kevin McKenna to VP, technology solutions.

Stephen Holditch has joined Tiandi Energy’s corporate advisory team. He was previously head of the department of petroleum engineering at Texas A&M.

John Gass and Francis Kalman have been named to the Weatherford board.


Done deals

Career Builders, Chevron Energy Technology, 2TD Drilling, Innovation Norway, Delta Marine, CSL Capital, Enerlabs, IHRDC, Invincible, Idox, Artesys, Knowledge Reservoir, RPS Group, TGS-Nopec, Reservoir Solutions, Stingray Geophysical, Trelleborg, Ambler Technologies.

Chicago-based CareerBuilder has acquired Manchester, UK-based Oil and Gas Job Search, said to be the oil and gas industry’s leading online job site outside North America.

Chevron Energy Technology Company has entered into a joint development agreement with Norwegian 2TD Drilling to develop a new rotary steerable drilling tool, OrientXpress. Innovation Norway has chipped in with 13.5 million NOK to further tool development. Chevron is to pay $3 million over a two year period.

Delta Marine Technologies has formed Delta SubSea by contributing its pre-existing DMTI business to the new company, which has been recapitalized by CSL Capital Management.

Enerlabs is to acquire a second testing laboratory in Oklahoma for $3.4 million.

IHRDC has acquired UK-based petroleum training consultancy Invincible Energy.

McLaren Software parent Idox Group has acquired French engineering document management specialist Artesys.

Knowledge Reservoir has been acquired by UK-based RPS Group. CEO Ivor Ellul and all current staff will remain with the business which will continue to operate from its existing locations.

In its 2012 accounts, TGS-Nopec recognizes a $25 million impairment charge related to the Reservoir Solutions business it purchased in 2011 as part of its acquisition of Stingray Geophysical. The company does not consider it probable that the criteria for additional cash payments to Stingray’s previous owners will be met.

Trelleborg’s Offshore & Construction unit has acquired UK-based oil and gas engineering specialist Ambler Technologies.


DNV rolls out Safeti Offshore risk package

Standards-based quantitative risk analysis takes holistic approach to offshore safety.

DNV Software has released Safeti Offshore, a ‘complete software tool’ for offshore risk analysis. Safeti Offshore allows operators to evaluate potential hazards and associated risks in a quantitative risk analysis solution based on DNV’s three decades of analytical experience. Quantitative risk analysis (QRA) of offshore installations identifies challenges such as congested equipment layouts. Current techniques include spreadsheets, with poor validation and traceability, and computational fluid dynamics investigation of specific issues. Safeti Offshore takes a holistic approach to the evaluation of the broader risks.

DNV’s Nic Cavanagh explained, ‘Safeti Offshore was developed to support state-of-the-art, complex offshore QRA. It was developed by our risk analysts to meet the requirements of international standards such as ISO 17776 and Norsok Z-013. All credible hydrocarbon accident scenarios are considered, including fire, explosion, toxics and smoke. In addition, detailed escalation analysis is provided to model potential domino effects and evolving safety function impairment.’

Offshore QRA risk metrics, such as potential loss of life are captured to a database for reporting and charting. An interactive event tree allows navigation and data drill-down. A 3D visual representation of the facility allows key risk results to be viewed in context. In the design phase, Safeti Offshore can address issues such as layout alternatives, fire and blast protection, escape and evacuation measures and other risk mitigation measures.

DNV Software managing director Are Føllesdal Tjønn added, ‘Safeti Offshore uses the same software architecture as our leading onshore analysis packages, Phast and Safeti, embedding 30 years of mathematical modelling and software engineering expertise.’


Palantir’s economics and portfolio optimization for Tullow

Cash and Plan to provide ‘dynamic and consistent’ view of complex, growing portfolio.

UK-based Tullow Oil has selected Palantir Solutions’ toolset to optimize its oil and gas asset portfolio. Tullow is to deploy PalantirCash for petroleum economics and PalantirPlan for its portfolio optimization and visualization. The combined solution will provide a ‘dynamic, consistent and accurate view of Tullow’s complex and expanding portfolio.’

The Palantir suite also allows investment strategies to be compared and potential acquisition targets evaluated.

Pete Dickerson, head of commercial planning and economics with Tullow, said, ‘Palantir enables us to gain a wider understanding of our portfolio and improves the way we conduct our business planning and strategy process. We can analyze and perform sensitivity analysis on different aspects of our business, from high level corporate decisions to individual asset evaluations.’


CGG Jason’s ‘major new’ 3D package

3D interpretation functionality added to seismic-to-simulation offering.

CGG’s Jason reservoir characterization software and services unit has announced a new 3D interpretation module for its EPlus package, the analytical component of the Jason suite (previously the Geophysical Workbench). The Jason suite is CGG’s ‘seismic-to-simulation’ software offering.

The new 3D interpretation module lets users build, refine and complete structural, stratigraphic and rock property models without leaving the Jason suite. The tool offers visualization of wells, horizons, faults, seismic and rock property volumes and is integrated with Jason’s deterministic and geostatistical tools for quantitative interpretation.

Sophie Zurquiyah, CGG executive VP geology, geophysics and reservoir, said, ‘The new tool adds volume visualization and interpretation capabilities to Jason’s inversion technology. This launch underlines CGG’s strategy to expand its offering in the interpretation market.’ CGG acquired Jason with Fugro’s geoscience division last year. More from Jason.


Badger plugs-on with mission impossible!

Autonomous driller not yet prime-time ready—good video though.

At the AGM last month, Badger Explorer CEO David Blacklaw reaffirmed the company’s goal of developing an autonomous drilling device. The tool is to drill ‘autonomously,’ closing the path behind it with a plug ‘to prevent oil and gas from escaping.’ On board logging tools send data back home along a trailing wire.

Badger has abandoned its plasma channel drilling technology in favor of a conventional drill bit, selling its Plasma Technology unit last year. While progress on development is slow, the company has come up with a fabulous video. Badger has financial backing from ExxonMobil, Chevron and Statoil.


Sales, contracts, partnerships and deployments

Aker, Allegro, Clariant, FEI, FMC, Fugro, Geofizyka, Paradigm, Inova, IPnett, Kadme, Kepware, PEM, JSC, Orange, Petrofac, Mubadala, Sigit, Asset Guardian, Takatuf, Technical Toolboxes, Pure.

Aker Solutions has won a five year, 900 million NOK contract for engineering services on Husky Energy’s Canadian White Rose field. The company also received a £30 million contract for commissioning and facility management services on Premier Oil’s west of Shetland Solan field.

Associated Energy has chosen Allegro Development’s Allegro 8 platform for its petroleum marketing and crude oil purchasing unit in Houston.

Clariant Oil Services is investing in a new laboratory facility located at its oil and mining campus in The Woodlands, Texas.

FEI reports that it has now installed over 200 Quemscan automated mineralogy systems worldwide.

FMC Technologies has signed a four year frame agreement with Petrobras for subsea services in Brazil.

Fugro Subsea Services unveiled a subsea simulation game at the Aberdeen Maritime Museum. A DeepWorks simulator provides visitors with an ROV piloting experience.

Geofizyka Torun has licensed Paradigm’s common reflection angle migration technology for use at its depth imaging centre in Poland. Geofizyka and Paradigm are to collaborate on the publication of workflows for unconventional exploration leveraging Paradigm’s EarthStudy 360 full azimuth imaging.

Inova has announced sales surpassing 100,000 channels for its G3i mega channel recording system.

IPnett has signed an agreement with ENI Norge to deliver an IT admission control and security solution for offshore operations at its Goliat development.

Kadme has signed a contract with Ies Brazil to broaden its presence in Brazil.

Kepware Technologies has established a partnership with Houston, Texas-based EnerSys, which is to market Kepware’s flagship KEPServerEX communications platform, including all electronic flow measurement related products, as well as its LinkMaster and RedundancyMaster solutions.

PEM Offshore has signed a multi-million dollar contract with Kongsberg Maritime for the supply of anchor handling, dynamic positioning, power management and crane simulation systems for a training centre in Lagos, Nigeria.

Russian engineering firm JSC NIIK has selected Aveva Plant to support its engineering and design effort.

Orange Business Services has signed a network infrastructure agreement with Aramco Services Co., the US-based subsidiary of Saudi Aramco.

Petrofac unit Petrofac Emirates, a Joint Venture with Mubadala Petroleum, has been awarded a contract worth approximately US$3.7 billion by Zadco for the Upper Zakum, UZ750 development in Abu Dhabi. Petrofac’s share of the contract is valued at US$2.9 billion.

Sigit Automation has selected Kepware Technologies’ solutions for its connectivity and communications. KEPServerEX will serve as the backbone to Sigit’s e-Scada solution, delivering historical flow data to its custody transfer accounting systems.

Asset Guardian has won a contract from Stena Drilling of Aberdeen. Asset Guardian’s eponymous software management tool will centralize Stena’s software and data storage across the company’s fleet of drilling vessels.

Oman Oil Co. unit Takatuf has signed a memorandum of understanding with Petrofac to establish a technical training centre in Oman.

Technical Toolboxes has partnered with Pure Technologies to offer a ‘complete solution’ for pipeline leak and theft detection, combining Technical Toolboxes’ pipeline integrity management with Pure’s ‘SmartBall’ leak detection.


Standards stuff

PODS V6. Energistics’ NDR country index. PPDM well facets. Subsea Wireless. W3C Provenance.

The Pipeline open data standards association, PODS, has announced version 6.0 of its data model. PODS 6.0 is said to be an ‘important and foundational’ step in the evolution of the model. The updated PODS model has been broken down into 31 modular components which may be implemented independently. Modularization lets companies select the parts of the model they wish to deploy and allows updates to be delivered as minor releases to individual modules.

Energistics’ national data repository (NDR) arm is developing a country index including an overview of the national oil province, the NDR’s history and links to NDR templates and reports. A beta version is available.

The PPDM association is launching a well status and classification system following consultation with members and industry. The new well status ‘facets’ and their associated value and qualifier definitions are available for download (members only!). A second phase of the project is underway to review and update an industry standard plot symbology set, based on the well status facets.

The Subsea wireless group, an international initiative established to promote interoperability among subsea systems and components, is drafting standards for radio frequency, acoustic and free space optical communications. The group’s mission is to ‘raise awareness and acceptance’ of through-water communications as a viable solution for the industry. Membership includes BP, Cameron, Chevron, GE, Statoil, Subsea 7, Technip and Yokogawa. Work is focused on evaluating technologies, setting communications standards and on the production of an archive of case studies to demonstrate how wireless is already being used within the industry.

The W3C’s provenance working group has published the ‘Prov’ family of documents as recommendations. Users can interchange provenance information in RDF and XML. Prov includes a mapping to Dublin Core.
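As a rough illustration (not part of the W3C publication), the Python sketch below uses the rdflib library to assert a minimal Prov statement, that a processed dataset was derived from a raw file by a named activity, and serializes it as RDF/Turtle. The resource names are invented for the example.

from rdflib import Graph, Namespace
from rdflib.namespace import RDF

# PROV-O namespace published by the W3C provenance working group
PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/")  # hypothetical namespace for this sketch

g = Graph()
g.bind("prov", PROV)
g.bind("ex", EX)

# A derived dataset, the raw data it came from and the activity that produced it
g.add((EX.processed_log, RDF.type, PROV.Entity))
g.add((EX.raw_log, RDF.type, PROV.Entity))
g.add((EX.depth_correction, RDF.type, PROV.Activity))

# Core Prov relations: derivation, usage and generation
g.add((EX.processed_log, PROV.wasDerivedFrom, EX.raw_log))
g.add((EX.depth_correction, PROV.used, EX.raw_log))
g.add((EX.processed_log, PROV.wasGeneratedBy, EX.depth_correction))

print(g.serialize(format="turtle"))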


Cyber security round-up

EU cloud security. Industrial Defender, NCCoE, ACI news. Tofino on ‘good, bad and ugly’ SCADA patching.

The European Network and Information Security Agency has published a guide to critical information infrastructure protection in the context of cloud computing. Such a concentration of resources is a ‘double-edged sword’: while cloud providers can deploy state-of-the-art security, if a breach does occur the consequences could be major. The 30 page report cites the digital oilfield as ‘at risk.’

A new white paper from Industrial Defender, ‘Protecting intellectual property theft from industrial control systems,’ warns of the risk of IP loss from hackers accessing control systems. While less high profile than outright attacks on the plant, such intrusions may be harder to detect. The publication offers mitigation techniques and advocates ‘log, log, logging’ so that a breach, should one occur, can be backtracked after the fact.
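Purely by way of illustration (not taken from the Industrial Defender paper), the sketch below shows the kind of structured, timestamped audit logging that makes such backtracking possible. The event names and fields are invented.

import json
import logging
from datetime import datetime, timezone

# Plain Python logging configured to emit one JSON record per line,
# suitable for shipping to a central log store for later forensics.
logging.basicConfig(filename="ics_audit.log", level=logging.INFO, format="%(message)s")
log = logging.getLogger("ics_audit")

def audit(event, **fields):
    """Record a structured audit event with a UTC timestamp."""
    record = {"ts": datetime.now(timezone.utc).isoformat(), "event": event}
    record.update(fields)
    log.info(json.dumps(record))

# Hypothetical events a control-system gateway might record
audit("login", user="engineer1", source_ip="10.0.0.42", result="success")
audit("config_read", user="engineer1", asset="PLC-07", object="ladder_logic")
audit("file_export", user="engineer1", asset="PLC-07", bytes=1048576)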

The US has set up a National Cybersecurity Center of Excellence (NCCoE) in Rockville, Maryland. The public-private partnership is hosted by NIST with partners Cisco, HP, Intel and Microsoft. Senate appropriations committee chair Senator Barbara Mikulski described Maryland as ‘the global epicenter of cybersecurity.’ Visit the NCCoE.

Over the state line in Arlington, Virginia, the Division of Advanced Cyberinfrastructure (ACI) of the National Science Foundation is encouraging collaboration with China-based researchers to develop a ‘framework for developing shared software infrastructure.’

Tofino Security’s Eric Byres has been blogging on ‘Patching for SCADA and ICS security, the good, the bad and the ugly.’ The ‘ugly’ part comes from the realization that around 20% of fixes are ‘incorrect’ and, of these, 40% result in ‘crashes, hangs, and data corruption,’ i.e. roughly 8% of all patches. According to Kevin Hemsley of ICS-CERT, 2011 saw a 60% failure rate in patches that were supposed to fix reported control system vulnerabilities.


Software usability lab seen as driver of cultural change

Baker Hughes UX/UI group pitches MWD/LWD usability enhancements to Statoil.

Baker Hughes (BHI) has lifted the lid on its software development effort in an article in its Connexus Magazine. BHI’s user interface/user experience (UI/UX) group performs testing on the software used in directional drilling and logging-while-drilling activities. The idea is to deploy software that, like a smart phone, does not require a ‘binder full’ of operating instructions. UI/UX lead Joel Tarver said, ‘Today’s users want an uncomplicated experience whether they are making a phone call or logging a well.’ The team works out of the BHI usability lab in Houston. This ‘Silicon Valley-inspired’ complex has an observation room with one-way glass, surveillance cameras, and eye-tracking equipment. Some employees were recruited from the video gaming industry. Tarver described the lab as a ‘microcosm of culture change’ for the company.

BHI has presented its data visualization concepts to Statoil in a pitch to support a new field development. The forward-looking program seeks to leverage concepts ‘that may become reality’ within the 30-year lifetime of the project. Tarver believes that today, industry wastes time fighting software that is ‘working against us.’ The UI/UX lab is set to transform the tide of available data into information to support clients’ decision making. More from Baker Hughes.


PCubed helps Chevron revamp its IT portfolio

IT function has gained credibility thanks to ‘inspirational’ change management advice.

Ann Arbor, Michigan-headquartered PCubed reports that its change management processes have led to a business transformation success at Chevron. According to PCubed, ‘two out of three major transformation initiatives are doomed to failure,’ hence the focus on mastering change.

Chevron’s Joy Patel, speaking to Amanda Akass, editor of PCubed’s Insight bulletin, said ‘Our ten year effort started with an IT transformation program to prioritize, simplify, and integrate.’ Chevron’s journey began with input from an ‘inspirational’ leader from PCubed driving the program. Chevron then adopted a portfolio management solution and a common classification methodology. Chevron’s IT function has now gained credibility with its business leaders thanks to its ability to introduce and sustain change and validate the benefits. More from PCubed.


Encryptics data protection for MCR Oil Tools

Endpoint-level security protects intellectual property in storage and during transmission.

MCR Oil Tools is to deploy data protection technology from Frisco, TX-based Encryptics to protect digital intellectual property stored on its servers and employee devices and communicated over email. Encryptics provides strong data encryption and trusted peer-to-peer digital rights management technology that controls who can see the information and whether it can be printed or forwarded.

Encryptics Professional provides endpoint-level as opposed to server-based encryption. MCR also deploys other Encryptics tools for emailing, mobile and web-portal users. MCR, which manufactures pipe cutting tools, frequently creates and shares its design specs, CAD drawings and schematics with partners and clients around the world. MCR CEO Mike Robertson observed, ‘Sending confidential data through email meant our IP was vulnerable to hackers and theft. We have spent thousands of dollars defending our designs in court. The new technology means our engineers can encrypt messages and attachments without disrupting their workflow.’ More from Encryptics.
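Encryptics’ technology is proprietary, but the general principle of endpoint-level (as opposed to server-based) encryption can be sketched with the widely used Python cryptography package: the message is encrypted on the sender’s device before it ever reaches the mail server, and only a recipient holding the key can read it. The key handling below is deliberately naive and for illustration only.

from cryptography.fernet import Fernet

# In a real deployment the key would be exchanged via a trusted key service;
# here we simply generate one to illustrate the principle.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt on the sender's endpoint, before the message touches any server
attachment = b"CAD drawing: proprietary pipe-cutting tool, rev 7"
ciphertext = cipher.encrypt(attachment)

# The mail server only ever sees ciphertext; decryption happens on the
# recipient's endpoint using the shared key.
assert cipher.decrypt(ciphertext) == attachment
print(len(ciphertext), "bytes of ciphertext")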


Mojix’ RFID for BP/Clair Ridge asset tracking

Passive and high-end GPS-enabled tags track equipment from manufacturers to construction site.

BP is to use RFID* technology to track mobile assets on its $7 billion Clair Ridge redevelopment north-west of the Shetlands. The technology is supplied by Los Angeles-based Mojix whose ‘track-and-trace’ wide-area passive RFID solution provides real-time visibility of crates and large components in shipping. BP has deployed a Mojix STAR 3000 system to support construction by suppliers including Hyundai Heavy Industries in Korea, Aker Solutions in Verdal, Norway and others.

Two tag technologies will be used. Passive EPC Gen 2 RFID tags will be attached to all materials and components while high-end global positioning system telemetry tags will be used on containers and heavy equipment. Tag location information is consolidated in Mojix’ hosted, web-based ‘visibility platform,’ which integrates commercial ocean vessel tracking information to provide a single view of the location and flow of goods on roads, at sea and in warehouses.
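As a purely illustrative sketch (not Mojix’ actual platform), consolidating passive RFID reads and GPS telemetry into a single latest-position view per asset might look something like the following; the field names and data are invented.

from datetime import datetime

# Position reports from two sources: fixed RFID readers (known locations)
# and GPS telemetry tags on containers. All values are invented.
reports = [
    {"asset": "CRATE-0017", "source": "rfid", "location": "Verdal yard, gate 3",
     "ts": datetime(2013, 5, 2, 8, 15)},
    {"asset": "CNTR-0452", "source": "gps", "location": "57.1N 1.9E (vessel)",
     "ts": datetime(2013, 5, 2, 9, 40)},
    {"asset": "CRATE-0017", "source": "rfid", "location": "Quayside laydown area",
     "ts": datetime(2013, 5, 2, 14, 5)},
]

# Keep only the most recent report per asset to build the 'single view'
latest = {}
for r in reports:
    if r["asset"] not in latest or r["ts"] > latest[r["asset"]]["ts"]:
        latest[r["asset"]] = r

for asset, r in sorted(latest.items()):
    print(asset, r["location"], r["ts"].isoformat(), "(" + r["source"] + ")")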

Clair Ridge was featured in a BP presentation at the OTC this month. The project involves the world’s first offshore full field deployment of BP’s ‘LoSal’ enhanced oil recovery technology to modify the salinity of water injected into the reservoir and increase oil recovery. More from Mojix.

* Radio frequency identification.


Canrig floats cloud-based drilling automation service

Drilling has many stakeholders and integration is hard. So why not move its IT to the cloud?

Speaking at the SPE Digital Energy event earlier this year, Canrig VP software development, Pradeep Annaiyappa, provided an overview of drilling automation. Offshore, much automation is already achieved, while on land, the rental model makes integration harder. This is changing as operators seek to improve efficiency by messaging between mud pumps and the autodriller.

Automating managed pressure drilling, pipe handling or directional drilling requires many safety interlocks, millisecond timing and deterministic communication—all IT challenges. Another issue is what happens when two companies want to control the mud pump. Communications need to be clarified so a driller knows whether pipe is being pulled or pumped out of hole. An ‘integration console’ is required rather than the current ‘cyber chair’ displays.

Ultimately, ‘third level’ advisory automation, including offline, off-site computations, will inform a slower, one-second-plus loop integrating Wits/Witsml and Http data streams. Echoing earlier talk of a rig ‘data exhaust,’ Annaiyappa observed that under 1% of available data actually leaves the rig, making later complex event processing difficult.
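A minimal sketch of such a slow, advisory-only loop, polling a rig-data endpoint over HTTP roughly once a second and publishing an advisory message, is shown below. The URL, field names and advisory rule are all assumptions for illustration, not Canrig’s implementation.

import time
import requests  # third-party HTTP client

RIG_DATA_URL = "https://example.com/rig/witsml/latest"  # hypothetical endpoint

def fetch_snapshot():
    """Fetch the latest rig-data snapshot over HTTP (assumed JSON payload)."""
    resp = requests.get(RIG_DATA_URL, timeout=0.5)
    resp.raise_for_status()
    return resp.json()

def advise(snapshot):
    """A toy advisory rule: flag high hookload while pulling out of hole."""
    if snapshot.get("activity") == "POOH" and snapshot.get("hookload_klbs", 0) > 350:
        return "Advisory: hookload approaching limit while pulling out of hole"
    return None

# Slow, one-second-plus advisory loop, never in the real-time control path.
# A real service would loop indefinitely; a few iterations suffice for the sketch.
for _ in range(10):
    try:
        advisory = advise(fetch_snapshot())
        if advisory:
            print(advisory)
    except requests.RequestException:
        pass  # the advisory layer degrades gracefully; control loops are unaffected
    time.sleep(1.0)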

Canrig proposes a private, cloud-based architecture, running applications on virtual machines. The approach would allow for data integration and automation without the need for more and more systems on site. DSATS interoperability standards should be relevant in this space. More from Canrig.


CyrusOne’s Seismic internet exchange

Data center service provider offers low cost bandwidth from 45 acre Houston campus.

Data center services provider CyrusOne has launched an internet exchange (IX) platform targeting the seismic processing and data management requirements of oil and gas companies. Seismic IX provides low-cost data transfer between CyrusOne’s data center facilities across the Houston area, allowing clients to connect with service providers and partners’ facilities. The offering provides bandwidth for ‘massive,’ real-time data transfer and multi-point accessibility of exploration and operational software. Today, the Texas IX connects major sites in Austin, Dallas, Houston, and San Antonio.

CyrusOne has underscored its support for the energy vertical with the purchase of a 32-acre lot adjacent to its Houston West campus. The acquisition increases the size of the campus to more than 45 acres. The company also reports partnerships with Texas universities to link research institutions with its oil and gas clients. CEO Gary Wojtaszek described the data center, which houses a high-performance computing cloud solution for oil and gas, as a ‘geophysical center of excellence.’ More from CyrusOne.


Endeeper, Carl Zeiss roll out petrographic appliance

Axio polarizing microscope comes with semantically enabled Petroledge/RockViewer software.

BG Group unit BG Brazil, along with R&D partner the Federal University of Rio Grande do Sul (UFRGS), has selected a petrographic knowledge and data management solution from Brazilian software developer Endeeper and microscopy specialist Carl Zeiss. The solution integrates Endeeper’s Petroledge and RockViewer tools with Zeiss’ Axio Imager A2 polarizing microscope, equipped with an AxioCam MRc scientific camera and ZEN Lite software. The system will be used to perform an integrated study of the Santos and Campos Basins.

Petrographic data is managed using the server version of Petroledge and RockViewer along with the Carl Zeiss hardware. Workstation-generated data is stored centrally, enabling information exchange between researchers, assuring data security and accelerating data processing and interpretation.

Endeeper was cited in Oil IT Journal last month as an early adopter of the semantic concepts advocated in a recent publication from IFPen. More from Endeeper, Carl Zeiss and UFRGS.

