The highlight of the 2012 SPE Intelligent Energy event held in Utrecht, Netherlands this month was the low-key introduction by PointCross (HQ, Foster City, California) of two upstream data stores and an open source software stack of data management tools. The new products comprise seismic and well data management solutions along with a generic enterprise taxonomy and search engine. PointCross’ Abhilash Naroth told Oil IT Journal, ‘Today’s data status quo turns around Witsml well data and relational databases—often PPDM. We are changing this paradigm with a unified repository based on Apache Hadoop, combining these and other industry formats using the scalable technology deployed by Google, Facebook and Twitter.’
Hadoop is a distributed storage and processing framework designed for use across ‘petascale’ clusters. PointCross’ drilling data server and repository (DDSR) uses the ‘HBase’ non-relational database derived from Google’s ‘BigTable.’ This multi-dimensional data store underpins Google Earth and Google Finance. Its data model is simple: every value is addressed by a row string, a column string and a timestamp. Developers can further structure the strings for a particular purpose. PointCross uses BigTable to spatialize seismic positional data for rapid retrieval.
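For readers unfamiliar with the BigTable model, the following minimal Python sketch (our illustration, not PointCross’ code—the well and curve identifiers are invented) shows how (row, column, timestamp) triples support versioned lookup:

```python
from collections import defaultdict

class MiniBigTable:
    """Toy sketch of the BigTable/HBase data model: every cell is
    addressed by (row key, column key, timestamp) and holds a value."""

    def __init__(self):
        # row -> column -> list of (timestamp, value), newest first
        self._cells = defaultdict(lambda: defaultdict(list))

    def put(self, row, column, timestamp, value):
        versions = self._cells[row][column]
        versions.append((timestamp, value))
        versions.sort(reverse=True)  # keep the newest version first

    def get(self, row, column):
        """Return the most recent value for (row, column), or None."""
        versions = self._cells[row][column]
        return versions[0][1] if versions else None

# Structured row/column strings let developers encode domain meaning,
# e.g. a well identifier plus a log curve mnemonic:
t = MiniBigTable()
t.put("well:15/9-F-12", "curve:GR", 1, "gamma-ray v1")
t.put("well:15/9-F-12", "curve:GR", 2, "gamma-ray v2")
```

A lookup returns the latest version while older timestamps are retained—the property that makes the model attractive for audit and history.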
Other elements of the open source stack provided by the Apache foundation include Chukwa, used to parse and store data across HBase and Hive. Hive maps ‘traditional’ SQL queries from industry-standard applications onto the HBase data. Spatial data is indexed using a convex hull algorithm that pulls up seismic polygons in fast, scale-sensitive retrieval. Alongside the DDSR is a seismic data store and a taxonomy engine that provides lookup and cross referencing of common industry data types such as well log curve mnemonics from the major contractors.
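A convex hull reduces a scatter of shot points to its bounding polygon, which is cheap to index and test against a map window. As an illustration of the generic technique (not PointCross’ implementation), here is Andrew’s monotone chain algorithm in Python:

```python
def convex_hull(points):
    """Andrew's monotone chain convex hull, O(n log n).
    Returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of the cross product (o->a) x (o->b)
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # concatenate, dropping the duplicated endpoints
    return lower[:-1] + upper[:-1]

# A shot-point scatter reduces to its outline; interior points drop out:
hull = convex_hull([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)])
```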
All of PointCross’ Hadoop solutions are available for deployment within the corporate firewall or through Amazon’s Elastic Compute Cloud for sharing access with oil field service providers. PointCross sees the scalability of such solutions as well-suited for data mining applications—looking for patterns in well logs, seismic data volumes and operational data. PointCross’ leveraging of an open source stack is an interesting departure from its previous ‘allegiance’ to the Microsoft Upstream Reference Architecture. MURA is apparently absent from the DDSR—as indeed it seems to have been from Microsoft’s own Global Exploration Forum—see our report on page 7 of this issue. More from PointCross on email@example.com.
At a Baker Hughes-hosted meet in Houston this month, C-level execs from eight standards bodies met to form the ‘Standards Leadership Council’ (SLC), which sets out to ‘enhance collaboration on standards for the benefit of industry.’ Participants hailed from Energistics, PPDM, PIDX, Open Geospatial Consortium, OPC Foundation, PODS, POSC Caesar and MIMOSA. The SLC plans to identify areas of intersection, to avoid duplication of effort and to determine business value metrics for standards adoption, enhancing membership benefits and maintaining financial sustainability.
Hess’ Fred Kunzinger commented ‘As an operator, it’s difficult to comply with standards from so many separate organizations. We hope that the SLC will provide strategies that use industry standards to their maximum potential.’ The SLC also received endorsement from BP. Oil IT Journal understands that one initial use case for the SLC will be mapping between WITSML and the PPDM data model. It will be interesting to see how long it takes the SLC to achieve this compared to similar ‘commercial’ work by PointCross and others (see this month’s lead). Next meeting of the SLC is slotted for November in Oslo, Norway. More from Energistics.
The Jaarbeurs exhibition center, Utrecht, Netherlands, home to the 2012 Reed Exhibitions/Society of Petroleum Engineers’ Intelligent Energy event takes safety seriously. Three DayGlo-orange clad individuals opened the proceedings with in-depth instructions of what to do in the event of a ‘calamity.’ Just before leaving the stage, a last minute warning was issued about the imminent use of stroboscopic lighting which ‘may inconvenience some.’ Any strobe-sensitive individuals had about two microseconds to react before all hell broke loose. Swizzling strobe lighting worthy of a Black Sabbath revival was accompanied by music so loud that my neighbor had his fingers in his ears. The effect was quite devastating and when it ended, one’s senses were not so much dulled as temporarily extinguished.
I am not sure if that was the intention, but when the following spoof presentation began—a vision of the future featuring the ‘discovery’ of new technology to ‘extract electrical energy from depleted oilfields’ by using them as batteries—I (as, I subsequently discovered, did others) took it at face value. As my composure returned, it was soon replaced with a growing sense of irritation at this preposterous ‘discovery’ along with its totally unconvincing mock real-time interviews with ‘experts’ around the world. The oilfield battery was, seemingly, the greatest invention since the invention of the err... battery!
The ‘invention’ was apparently the fruit of the ‘parallel processing power’ of billions of brains—a ‘crowd sourcing’ exercise which the authors of this brain fart conflated with ‘open sourcing.’ This was the only decipherable message from the production—that we are at the dawn of a brave new world of collaboration and crowd-sourcing.
Collaboration is an apple pie-ish concept. There is an implicit menace to those who are just getting on with their jobs. Maybe you/we/I should be collaborating more? Having inflicted on us ‘collaboration centers’ where folks with better things to do watch PowerPoint presentations on very big screens, the collective wisdom of the global IT marketing department is now telling us—‘You are not collaborating enough! Throw away the collaboration center! Go forth into the cloud and collaborate with the hordes!’ Such fear, uncertainty and doubt (FUD) is the oldest weapon in the marketing armory. It plays to the other tag-lines of the ‘greying workforce,’ the ‘digital natives’ and the general uselessness of the older generation in the face of err.. Facebook!
Three cameras assiduously recorded these absurd goings-on. I guess that this is so that students—sorry, I should say ‘YPs*’—can see what an inventive and cool profession petroleum engineering is. I like to think that any student worth hiring would run a mile when confronted with such nonsense—although I grant you that the prospect of a six figure starting salary probably makes up for a lot.
Next up was one Jay Rogers of a company called ‘Local Motors.’ Was this too a spoof? Rogers was introduced as a ‘former employee of Dallas-based Ewing oil.’ Apart from this fiction, Local Motors is for real and was ‘recognized’ as a 2012 Industry Pioneer at CERA Week this month for its ‘open source’ vehicle design. In what is after all rather an improbable business model, Local Motors claims to crowd source design, inviting ‘inventors’ to its factory to build the vehicle of their dreams. The results are somewhere between a Batmobile and a lawn mower. Rogers made a brave comparison between his own oeuvre and that of Linus Torvalds—with both at the head of an empire of collaborators. Torvalds’ Linux has, according to Rogers, a 60% market share of ‘installed code base,’ which may be true. The jury is out on how long it will take LM to achieve something similar.
Back home from Intelligent Energy (by the way, the show got better after the opening session), I was going to dutifully deconstruct the ‘crowd sourcing’ nonsense. Then I realized that I already did this in my editorial of February 2009 (I would apologize for repeating myself if it weren’t for the fact that the collaboration/crowd sourcing advocates are repetitive in the extreme!).
As I was writing this editorial, Christopher Caldwell, writing in the Financial Times, pitched in on the crowd sourcing debate—backed up with some up-to-date research. Caldwell’s piece, ‘Groupthink is no match for individual genius,’ observes that by ‘locking people of high IQ in a room’ you do not achieve a higher collective IQ. Instead, according to a Virginia Tech study, the group gets dumber! Caldwell believes that ‘group intelligence,’ a.k.a. the ‘wisdom of crowds,’ is a ‘cognitive science trend on which the tide is now receding.’ Caldwell observes that while ‘individuals produced King Lear and the Discourse on the Method,’ the wisdom of crowds produces ‘a few retail fads at best, book burning and pogroms at worst.’ To which one might add that it was the individual Torvalds who created Linux—even though its subsequent maintenance has crowd support.
OK, if I am so smart as to dis the favorite topics of the great and good of the SPE, what do I think they should be talking about? Two things spring to mind: safety and shale gas. I say ‘spring’ because while the SPE is conspicuously silent on such issues, you just have to pick up a paper to read about the Macondo aftermath or about the ongoing potential ‘calamity’ of Total’s Elgin gas leak. While there was a session devoted to HSE, with a notable contribution from Chevron on how ‘Digital oilfield principles enhance safety,’ this would have merited a bit more visibility. It could have usefully displaced the whole opening plenary. Shale gas likewise, particularly in Europe, is getting the industry a bad name—largely because of the amount of contentious stuff from folks like Josh Fox and his ‘Gasland’ movie. Which is met by an embarrassed silence from the SPE!
* Young professionals.
Drew Conway and John Myles White’s credentials are impeccable—Conway used to work in intelligence, White is a psychologist, statistician and maintainer of several ‘R’ packages. Their new book, ‘Machine learning for hackers*’ (ML4H), uses the successful O’Reilly mix of code and conversation to take the reader through various sample tasks.
ML4H uses the open source ‘R’ programming language throughout and assumes familiarity with command-line work. R provides data formatting, visualization, statistics and analysis in a moderately terse syntax. According to ML4H, R has seen a ‘meteoric rise’ in the data sciences and machine learning communities—making it the ‘de facto lingua franca’ for analytics.
R provides tools for massaging ugly real world data into shape, ready for visualization. Father of data analysis John Tukey is credited with the distinction between exploratory and confirmatory data analysis—the first used to identify trends, the second to see if they are statistically significant. R does both of these—a few lines of code turn a ragged public domain ‘big data’ set into an almost SharePoint-esque dashboard.
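Tukey’s exploratory/confirmatory distinction can be illustrated outside R too—here in Python, with invented porosity figures from two hypothetical wells: first eyeball a difference in the means, then compute a Welch t statistic to see whether it is likely to be real:

```python
import statistics as st

# Exploratory: eyeball a trend in two (invented) porosity samples.
well_a = [0.12, 0.14, 0.13, 0.15, 0.16, 0.14]
well_b = [0.10, 0.11, 0.12, 0.10, 0.13, 0.11]
mean_a, mean_b = st.mean(well_a), st.mean(well_b)

# Confirmatory: is the difference statistically meaningful?
# Welch's t statistic, computed from first principles.
def welch_t(x, y):
    vx, vy = st.variance(x), st.variance(y)
    return (st.mean(x) - st.mean(y)) / ((vx / len(x) + vy / len(y)) ** 0.5)

t_stat = welch_t(well_a, well_b)  # large |t| suggests a real difference
```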
Then comes the smarts. Key to real machine learning is Bayesian statistics, combining data driven probabilities with prior knowledge. Examples such as determining a person’s gender from ‘her height and weight’ (a she maybe?) are easy to envisage in more pertinent domains such as deriving lithology from seismic attributes and logs, or in root cause analysis of maintenance data.
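The height/weight example can be sketched as a Gaussian naive Bayes classifier—prior times a product of per-feature likelihoods. The class statistics below are invented for illustration, not figures from the book:

```python
import math

def gaussian(x, mu, sigma):
    """Probability density of x under a normal(mu, sigma)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Illustrative class statistics as (mean, std dev) for height in cm
# and weight in kg -- made-up numbers standing in for training data.
classes = {
    "female": {"prior": 0.5, "height": (163, 7), "weight": (62, 9)},
    "male":   {"prior": 0.5, "height": (177, 7), "weight": (79, 10)},
}

def classify(height, weight):
    """Naive Bayes: pick the class maximizing prior x likelihoods."""
    scores = {}
    for label, p in classes.items():
        scores[label] = (p["prior"]
                         * gaussian(height, *p["height"])
                         * gaussian(weight, *p["weight"]))
    return max(scores, key=scores.get)

label = classify(160, 58)
```

The same machinery transposes directly to, say, lithology classes with log-derived features in place of height and weight.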
R’s capability to work on text analytics is demonstrated in the Bayesian spam detector. A single line of code removes ‘stop words,’ further natural language processing performs word counts and more statistics to separate the spam from the ham. Again—industrial use cases of such techniques abound. Also of interest is the section on multi-dimensional ‘spatial’ analytics which uses matrix operations like distance metrics and multi-dimensional scaling to put US senators’ voting patterns on the map.
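A toy version of such a spam filter—stop-word removal, word counts and a smoothed log-likelihood ratio—might look like this in Python (the training ‘corpora’ are invented and tiny; ML4H works in R on real mailboxes):

```python
import math
from collections import Counter

STOP_WORDS = {"the", "a", "an", "to", "of", "and", "in", "is", "you"}

def tokenize(text):
    """Lower-case, split and drop stop words."""
    return [w for w in text.lower().split() if w not in STOP_WORDS]

# Tiny illustrative corpora -- real spam/ham sets run to thousands
# of messages.
spam_counts = Counter(tokenize("win free cash now click free offer"))
ham_counts = Counter(tokenize("the meeting is moved to tuesday morning"))

def spam_score(text, smoothing=1.0):
    """Log-likelihood ratio of spam vs ham under a unigram model
    with add-one smoothing; positive means 'looks like spam'."""
    spam_total = sum(spam_counts.values()) + smoothing * 1000
    ham_total = sum(ham_counts.values()) + smoothing * 1000
    score = 0.0
    for w in tokenize(text):
        p_spam = (spam_counts[w] + smoothing) / spam_total
        p_ham = (ham_counts[w] + smoothing) / ham_total
        score += math.log(p_spam / p_ham)
    return score
```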
A couple of niggles—some unnecessary neologisms—what the heck is a ‘gization?’ and some graphs could do with better labeling. The authors warn against using R on really big data—suggesting that such should be recoded in C. This is not necessarily the only option. A press release from Teradata this month announced an R interpreter for its high-end data appliance. R is also available in Spotfire’s statistics services layer.
* O’Reilly press 2012. ISBN 9781449303716.
Following its stake in drilling case-based reasoning boutique Verdande Technology last year (Oil ITJ September 2011), Baker Hughes has rolled Verdande’s DrillEdge technology into a new WellLink radar remote drilling advisory service. WellLink RRDA uses case-based reasoning and event detection to minimize non-productive time and increase safety. RRDA mines historical drilling data from nearby wells to build a database of ‘cases’ such as lost circulation, tool sticking and vibration. Cases are categorized as sets of key drilling parameters such as weight on bit and flow rates and stored in a repository.
During drilling, the RRDA compares current measurements with the historical data in real time, issuing warnings when potential problems are identified. The ‘radar’ reference refers to a customizable polar plot of different situations showing the current situation of a well and the onset of potential problems. Baker Hughes reports that the system warned a client drilling in the ultra deepwater Gulf of Mexico of multiple events including pack-offs, overpull, maxed-out torque, string stalls and pore pressure changes. The RRDA is a component of BHI’s remote ‘beacon’ collaboration and monitoring service offering. More from Baker Hughes.
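At its simplest, case-based reasoning amounts to matching live measurements against a library of historical cases—a nearest-neighbour lookup over key drilling parameters. The sketch below illustrates only this general principle; the parameter names, scales and numbers are invented, not Verdande’s schema:

```python
import math

# Illustrative case base: each historical 'case' is a vector of key
# drilling parameters plus the problem it preceded.
CASES = [
    {"wob": 18.0, "flow": 600.0, "torque": 12.0, "outcome": "normal"},
    {"wob": 25.0, "flow": 350.0, "torque": 22.0, "outcome": "stuck pipe"},
    {"wob": 15.0, "flow": 800.0, "torque": 9.0,  "outcome": "lost circulation"},
]

FEATURES = ("wob", "flow", "torque")
SCALES = {"wob": 10.0, "flow": 300.0, "torque": 10.0}  # rough normalization

def nearest_case(current):
    """Return the historical case most similar to the live readings
    (scaled Euclidean distance, 1-nearest-neighbour)."""
    def dist(case):
        return math.sqrt(sum(
            ((current[f] - case[f]) / SCALES[f]) ** 2 for f in FEATURES))
    return min(CASES, key=dist)

# Live readings drifting toward the stuck-pipe case would trigger
# an early warning:
match = nearest_case({"wob": 24.0, "flow": 380.0, "torque": 20.0})
```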
The almost-formed World Wide Web Consortium’s oil, gas and chemicals business group met chez Chevron in Houston last month to envision a semantic web-based future for the industry. Attendees hailed from Apache, ConocoPhillips, Chevron, Energistics, ExxonMobil, Halliburton and Statoil. Participants discussed possible application of semweb technology in several areas.
Data provenance and audit across the value chain is seen as a potential application. Semantic technologies seem well suited for capturing complex, ‘anastomosing’ data flows with new standards for data provenance. Synergies are seen with the W3C data provenance working group and with ISO’s 19115-1 energy profile.
Linked data is another area of ‘considerable enthusiasm,’ with industry interest in a semweb approach to drilling information management—focused on non-conventional operations and fracking. The University of Tulsa ‘Bricks’ taxonomy is a potential candidate for semantification. Linked data would replace current ‘static’ enterprise information models.
A more speculative case was made for semweb technology in the regulatory and compliance arena—extending the W3C’s work in the field of e-government. Finally, the semantic web is seen as a potentially more sophisticated basis for ‘big data’ analytics of unstructured or semi-structured data. One idea is to ‘push’ analytics further down into large volumes of real time data to ‘provide more sophisticated generation of events that can be used to optimize operations.’ The W3C business group needs to achieve critical mass in the form of five committed member organizations. Currently only Chevron and Statoil have signed up.
Earlier this month, the 36th District Court of Beauregard Parish, Louisiana, issued its reasons for judgment in a case that set Olympia Minerals (then an El Paso unit) against Aspect Resources and a predecessor of Kerr-McGee (subsequently acquired by Anadarko). In 2000, Aspect and Kerr-McGee agreed to conduct a 3D seismic survey on Olympia’s 445 sq. mi. lease in western Louisiana, to sublease at least 15% of the acreage and to provide Olympia with a copy of the seismic data, including in the form of raw data or field tapes. But Aspect provided only processed seismic data to Olympia. Subsequently, Aspect and Kerr-McGee claimed that Olympia had ‘misused’ its seismic data by allowing a third party to remove data from a data room and put it on its computer system. The court found that the evidence ‘strongly suggested’ that Aspect had played an ‘inexcusable shell game,’ and struck the contractual limitation on consequential damages, awarding damages for loss of royalties and ordering Aspect to provide Olympia with the field tapes.
A US Department of Energy (DOE) funded research program at the West Virginia University (WVU) department of community medicine has resulted in a self contained device to monitor environmental conditions during shale gas drilling. The units consist of a battery-powered monitoring device and radio transceiver, and provide real-time measurement of volatile organic compounds, dust, light and sound. The units help operators comply (and demonstrate compliance) with environmental requirements. The requirement for monitoring is high in West Virginia where over 1,400 Marcellus wells are producing, with permits issued for another 1,200. NETL-RUA researcher Dr. Michael McCawley developed the device to remotely monitor the environment around Marcellus shale gas wells. The device works in areas remote from power and where terrain may be an obstacle. The units are charged by solar panels and link to a base station module with a computer and cell phone modem that transmits data on to WVU.
The research was carried out under the auspices of the National Energy Technology Laboratory’s Regional University Alliance for Energy Technology Innovation. NETL-RUA was formed in 2010 as a partnership between NETL and a consortium of five mid-Atlantic universities: West Virginia University, Carnegie Mellon, Penn State, the University of Pittsburgh, and Virginia Tech. The NETL-RUA research program assists NETL in conducting basic and applied energy and environmental research that supports DOE’s mission to advance U.S. national, economic, and energy security.
Schlumberger’s Business Consulting (SBC) unit has released the 8th edition of its oil and gas human resources benchmark survey. SBC surveyed 37 upstream companies accounting for approximately 37% of global oil and gas production. The survey found an ‘outflow’ (retirement) of some 22,000 senior petrotechnical professionals (PTP) by 2015, a net loss of 5,500. While recruitment of new graduates will compensate for the loss, it ‘will not fill the experience gap*, ... threatening the timely completion of projects.’ 70% of NOCs and 60% of majors acknowledged project delays due to staffing difficulties.
Companies reported a 60% hike in recruitment targets for 2011 but an imbalance in supply and demand has resulted in increased ‘attrition rates’ (poaching) with turnover rates of up to 7% for PEs. Regulatory-backed ‘nationalization’ of talent is a challenge in many countries where there may be a lack of experienced local talent. SBC reports that PTP ‘intensity’ has a measurable impact on production growth. High growth companies tend to have a higher ratio of PTPs per unit of operated production than lower-growth companies. High-growth companies also tend to foster diversity in the workforce with more women in their talent pool. Training and ‘time to autonomy’ is another key indicator. SBC believes HR policy, in terms of PTP intensity and talent management practices, is a driver of long-term production growth.
* It may not fill the experience gap, but the new grads will be welcomed by those desperate for Gen-X ‘digital natives.’
UK-based Vialogy reports successful demonstration of its ‘QuantumRD’ seismic processing technology on a dataset provided by Chevron. QuantumRD ‘probes’ a seismic data set by adding synthetic noise to the data to detect attributes of interest, such as porosity. A test last year involved 3D seismic data in the lower Permian Basin where QuantumRD was applied to 50 sq. km. of Devonian, Wolfcamp, Strawn and Atoka formations between 8,000 and 13,000 feet.
Vialogy’s core ‘quantum resonance interferometry’ technology is claimed to ‘detect weak signals buried in background noise,’ and can be applied to seismic, electromagnetic and magneto telluric data. This is achieved by exploiting ‘normally disregarded’ noise variations induced by changes in porosity, fluid presence and permeability. Vialogy claims that its patented ‘active signal processing’ technology, which adds a synthetic noise source to data, is ‘unlike conventional signal processing*.’ More from Vialogy.
*Actually ‘conventional’ deconvolution frequently involves adding noise to data.
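The general idea of noise-aided detection can be demonstrated with ‘stochastic resonance’: a signal too weak to cross a detection threshold becomes visible once a dose of noise is added, because threshold crossings cluster on the signal peaks. The Python sketch below illustrates only this generic principle, not Vialogy’s proprietary method:

```python
import math
import random

def threshold_detections(amplitude, noise_sigma, threshold=1.0, n=5000, seed=42):
    """Count how often a weak sinusoid plus Gaussian noise crosses
    a hard detection threshold."""
    rng = random.Random(seed)
    hits = 0
    for i in range(n):
        signal = amplitude * math.sin(2 * math.pi * i / 50)
        if signal + rng.gauss(0, noise_sigma) > threshold:
            hits += 1
    return hits

silent = threshold_detections(0.8, 0.0)  # sub-threshold, no noise
noisy = threshold_detections(0.8, 0.4)   # added noise reveals the signal
```

With no noise the 0.8-amplitude signal never reaches the threshold of 1.0; with noise, detections appear and concentrate where the signal peaks.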
New features in AVS/Express 8.0—more. AVS/OpenViz 3.0 adds big data and in-memory analytics—more.
CGGVeritas’ Hampson-Russell unit has released HRS-9 with a common interface and data management across all modules. User-customizable workflows connect the historically separate applications. Multi-threaded 64-bit computing supports multiple CPU cores—more.
The new 1.2 release of GPlates from Sydney University, Caltech and the Norwegian Geological Survey represents the motions of tectonic plates through geological time. The software is a free download.
IDS’ VisNet2.0, a web-based drilling data dashboard is enhanced with data manipulation tools, formulae and dataset enhancement providing multiple views of data. IDS also released TourNet2 with bi-directional WITSML capabilities—IDS.
The 3.2 release of INT’s log and seismic viewers facilitates web-based visualization and data QC along with support for DLIS in Atlas TIFF, SEG-D, CSEG-Y and streaming CGM files—INT.
HRH Geology has announced ‘Spectra,’ a mass spectrometry service for wellsite formation fluid analysis—HRH.
The latest release of Headwave’s eponymous seismic toolset performs on-the-fly pre-/poststack conditioning. New AVO crossplots use color blending to label anomalies. Headwave works alongside Schlumberger’s Petrel—Headwave.
GE’s SmartSignal Shield 4.0 for oil and gas detects faults in large equipment, providing early warning of impending equipment problems. Shield 4.0 was developed by mining a ‘blind’ data set of customer data, millions of machine hours and tens of thousands of failures from 12,000 assets—GE.
The global environmental management initiative (GEMI) has released a ‘local water tool’ (LWT) and an LWT for oil and gas. The LWTs provide best practices and water risk assessment for sustainable water management. GEMI worked with IPIECA, the global oil and gas industry association for environmental and social issues, on the LWT development—GEMI.
AspenTech’s Aspen Plus now represents an integrated simulation environment for users of AspenOne process engineering products including a comprehensive physical properties database, a capital cost estimator and heat exchanger design tools—AspenTech.
A meeting of the US Groundwater Protection Council (GWPC) earlier this year looked into ‘the hot topics in injection wells,’ providing an inventory of claims and counterclaims in respect of shale gas’ safety. Jean-Philippe Nicot of the Bureau of Economic Geology (BEG) at the University of Texas at Austin advocated ‘fact-based regulation for environmental protection in shale gas development.’ The BEG has performed reviews of technical literature, regulations and records of violations and media claims, noting that, ‘whether hydraulic fracturing for shale gas production has resulted in contamination of ground water and polluted water wells is a very controversial issue.’ Opponents have made bold claims of shale gas’ potential to contaminate groundwater with methane, as shown by the ‘large number of incidences of explosions and contaminated wells in Pennsylvania, Wyoming, and Ohio in recent years*.’ On the other hand, ‘there have been over a million wells hydraulically fractured in the history of the industry [and] not one reported case of a freshwater aquifer ever having been contaminated from hydraulic fracturing.**’ We’ll review the BEG report in next month’s Journal.
Lisa Molofsky (GSI Environmental) has likewise found that methane in Pennsylvania water wells is unrelated to hydraulic fracturing. A pre-drill regional survey found that 78% of water wells contained methane—emanating from shallow glacial till and naturally fractured gas-charged sandstones. Although often similar in chemistry, Marcellus shale gas can be distinguished from these shallow sources. The work is the subject of a 2012 RPSEA proposal for a ‘stray gas investigation protocol’ which will include a pre-drill water sample database and isotopic fingerprinting of gas sources.
Joe Tiago of the EPA provided a status update on the geologic sequestration (of CO2) data system (GSDS). While CO2 use for enhanced recovery is a long-standing practice, sequestration is different—both technically and scale-wise. Federal requirements have introduced a new class of injection well with new reporting requirements for sequestration data. These will feed into a new GSDS repository. Various designs are under consideration—including an HPC platform for analysis and a Drupal content management platform.
Regarding water use, the Texas Water Development Board’s Dan Hardin observed that ‘fracturing represents less than one percent of the state’s total water use.’ Frac water use is projected to triple by 2020. But even this will still be ‘less than one percent of the state total!’
Scott Kell has been investigating groundwater contamination in Texas and how states are responding to the findings. Contrary to popular belief, incidents are on a downward trend (at least through to 2007). 76 incidents were reported in Texas in the five year period from 1983-87, only 7 in the 2003-07 interval. ‘Incidents’ include deficient primary cement jobs, unsealed flow zones and over-pressured annuli. Putting this into context, in the 15 year period from 1993-2008, some 190,000 wells were drilled in Texas and over 5 billion barrels of water injected. Kell concluded that ‘sound science should be foundational to public policy’ and that ‘State investigations are critical drivers of regulatory reform.’ More from the GWPC.
* Robert Howarth, Cornell University, in a written submission to the EPA, 2010.
** Rex Tillerson, Chairman of Exxon Mobil Congressional Testimony, 2010.
Torr Hoff kicked off the 2012 SMi E&P data management conference in London last month with a presentation of data management strategy and policy—the ‘Statoil way.’ An internal survey in Statoil determined that ‘work processes should not be more important than the work itself.’ Processes such as data QC can become bureaucratic unless they are turned into ‘business enablers.’ Statoil has been through several IT/data standardization programs since 1998. A major reorganization in 2011 saw a shift to ‘managed diversity,’ with the recognition that ‘exploration and production are different.’ The exploration business evolves quickly—and new situations, such as Statoil’s involvement in Canadian non-conventionals, require data ‘agility.’ In more mature exploration and production environments, more rigor is required as data volumes explode.
In such different contexts, ‘somebody needs to write the obvious stuff down—such as how we work and how much we can standardize on a generic exploration value chain.’ Principles, processes, decision gates and detailed functional requirements are captured in the ‘Statoil Book.’ This maps out who does what across the exploration ‘funnel’—from its ‘wide end’ of basin evaluation and bid rounds down to the narrower business of drilling, discovery and appraisal. In the early days, data management was performed largely by geoscientists. Today, Statoil has full time data management professionals working to supply data and documents to its explorationists and on promoting and enforcing data standards and procedures.
Such documentation informs knowledge workers as to data requirements—but also helps data managers to understand the underlying business. Data roles span IT, information and data management. Roles include data definition owner, data value owner (keeper of the official ‘STAT’ value) and a data owner proper who oversees the other roles. A data administrator runs the actual data stores. Statoil leverages the DAMA data methodology along with Deming-style quality management. Statoil manages Petrel data in an OpenWorks project database kept in synch with a Petrel reference project.
Garrick Fraser’s (DataCo) message is ‘don’t manage, spatialize.’ Spatialization (GIS-enablement) is much easier than dealing with complex native E&P formats and unlocks data value through instant access. Even so, it requires considerable up-front effort in automation and scripting. But the route to data aggregation via GIS allows for data mining at the global level, leveraging Google-like search. DataCo’s ‘QIP’ solution was developed for Shell and provides a ‘single, complete, verified data set for each well with quality flags and an audit trail.’ The QIP lets end users load data to OpenWorks projects and corporate data stores. Fraser believes that data visualization allows problems to be fixed before data loading—this is ‘better than resorting to Innerlogix or Exprodat.’ In any event, cleanup such as ‘UWI alignment’ of multiple data sources is ‘essential prior to spatialization’—DataCo.
Digital Earth’s Robert Winslow is advocating a global service providing a unique web page for every well (WP4EW). The idea was inspired by the electronic industry’s ‘CAPS/PartMiner*’ service which has a web page for each component—‘maximizing the chance that Google will rate it highly in search.’ The proposed WP4EW service would provide pointers to third party data held by IHS, WoodMac, Deloitte, Drillinginfo—and in-house corporate sources. The service could be financed by advertising from service providers and/or by subscriptions for heavy users. In the Q&A, a call was made for improving the quality of the component data sets by ‘crowdsourcing’ quality improvement from users. This could be achieved by a Wiki for comments on data accuracy and possible alternative information available for upload—Digital Earth.
Joe Johnston (Fugro) thinks that far too much time is wasted checking data from numerous independent data sources when kicking off an interpretation project. What is needed is a package of cleansed data to start from. Enter the Fugro ‘Path,’ a pre-load, workstation-ready composite of seismic, core, geological data and more—perhaps including scanned hard copy. The Path creates Kingdom and Petrel projects, inter alia—Fugro.
David Lloyd (GDF Suez E&P UK) summarized the data/information conundrum with an equation,
IS = (IT+IM) x f(P,p,t)
Which, being interpreted, says ‘information services are the sum of information technology and information management, multiplied by a function of process, people and technology.’ GDF Suez’ UK unit is upping its IS act as it becomes a North Sea operator. IT has moved out from the basement following a disaster that took six months to rectify, reclaiming stuff from C drives and USB keys! Training is key: ‘nobody knows how to use anything.’ Folks need to be trained to use the phone, videoconferencing, Microsoft Office and the project management framework. GDF is continually updating its information security model and acceptable use policy. A global ‘partnership project management framework’ leveraging the CMMI benchmark is set to save GDF around $3 million/year through use of the same terms and templates to avoid project rework. A highly simplified PRINCE project delivery framework has also been developed and is applied equally to a geological interpretation project or an IT deployment. ITIL also ran. ‘IT has turned a corner’ and has gone from ‘losing data to making things work!’
Arief Joenaedy revealed that a 2008 site assessment at Kuwait Oil Company found ‘undocumented workflows, insufficient data management resources, poor exploration data quality and unsystematic results capture.’ To rectify this situation and ready KOC for future ‘explosive’ growth in seismics, real time, GIS and novel remote sensing data types, KOC has been working with Schlumberger on a ProSource/Seabed-based data infrastructure that also manages KOC’s Landmark SeisWorks/OpenWorks data. A corporate database of seismic data has been developed around Seabed. The result is that data retrieval that previously took two days is now down to 15 minutes. Quality is up thanks to use of original format data. Business process and disaster recovery are also improved.
Mario Fiorani outlined a ‘prototype’ approach to data management under study at ENI. The new technique contrasts with the ‘traditional’ top-down approach of data gathering, validation and QC, which still takes up 80% of the time, leaving a meagre 20% for core business activity. Many previous data initiatives have had poor outcomes, failing to secure commitment from management or take-up by affiliates. Moreover, it is hard to demonstrate the monetary value of data management systems, which take ‘too long to design, procure and implement.’ This leaves plenty of time for people to change, for the business to cancel a project as oil prices move, and for attention to return to operational matters.
Enter the new approach: engaging top business managers and identifying those with a real interest in the value of data—ideally someone ‘not too focused on cost/time reduction, efficient data management and who is bored with procedures and policies.’ Buy-in works if top management can monitor and control what is going on, focusing less on time saved than on opportunities gained from timely information. This may be as simple as providing KPIs on the cost of compressor maintenance or on operational safety. Such questions typically involve information scattered across reports, Excel spreadsheets, PowerPoints and other sources. The answer may lie in integration through GIS, high performance computing, ‘Web 2.0’ and business intelligence.
An ENI prototype production prediction uses Monte Carlo analysis to provide a risk-based budget based on actions and events. The ‘BudRisk’ application took 3 months to develop internally and has proved a ‘huge success.’ Data from the Oracle server farm is accessible from an iPad showing graphs of downtime, lost production etc. Key to such initiatives are ‘internal knowledgeable resources.’ BudRisk’s success has triggered additional IM initiatives for logistics, portfolio management and drilling/well integrity. More from SMi’s E&P Data Management conference.
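A BudRisk-style calculation can be approximated with a plain Monte Carlo loop over discrete events. This is a sketch of the general approach only; the event list, probabilities, downtimes, production rate and oil price below are all hypothetical illustrations, not ENI figures.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of Monte Carlo trials

# Hypothetical events: (name, probability of occurrence, nominal downtime in days)
events = [
    ("compressor failure", 0.15, 12.0),
    ("weather shutdown",   0.30,  5.0),
    ("well intervention",  0.10, 20.0),
]

daily_production = 8_000  # bbl/day (assumed)
oil_price = 90.0          # $/bbl (assumed)

losses = np.zeros(N)
for _, prob, downtime in events:
    occurred = rng.random(N) < prob            # does the event happen in this trial?
    days = rng.uniform(0.7, 1.3, N) * downtime # downtime itself is uncertain (+/-30%)
    losses += occurred * days * daily_production * oil_price

# Risk-based budget expressed as percentiles of lost revenue
p10, p50, p90 = np.percentile(losses, [10, 50, 90])
print(f"P10 ${p10:,.0f}  P50 ${p50:,.0f}  P90 ${p90:,.0f}")
```

The percentile spread, rather than a single deterministic number, is what turns a production forecast into a ‘risk-based budget.’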
* CAPS has since been acquired by IHS—it’s a small world!
Antonio Narváez described a major drive to improve production on Pemex’ Chicontepec asset. Key to understanding and monitoring the 660 wells slotted for 2012 are The Information Store’s Petrotrek software and Microsoft’s SharePoint. Narváez also announced that Pemex will be seeking partners from the international oil and gas business to help ‘maximize production.’ In Q4 2012, interested parties will be invited to view data on the spot or via a hosted ‘virtual data room’ (1401).
Sophy Liu observed that Baker Hughes’ product line-based corporate structure meant that much potentially useful information used to be tucked away in silos. This was fixed with WellData Discovery, built on the PetrisWindsEnterprise (PWE) platform. Petris offers text and geospatial search across all of BHI’s data sources. The new system exposed some data quality ‘challenges,’ which were fixed with an enterprise well master database whose global unique well identifiers standardize header entry and feed common well attributes to other systems. BHI uses a scorecard to keep the system aligned with users’ expectations. Recently, Petris has migrated its front end from Java Server Pages to Microsoft Silverlight.
Yacimientos Petroliferos Fiscales Bolivianos (YPFB) CEO Luis Alberto Sanchez outlined an ambitious nationwide measurement and control system for Bolivia’s hydrocarbon production. YPFB was looking for an HMI provider with oil and gas experience that could assure scalability, connectivity with third party SCADA systems and OPC support. Iconics’ Genesis32 was selected particularly for its 64 bit support and interfacing capability with Emerson DeltaV, Honeywell Experion, Invensys Wonderware and GE Intellution. YPFB has been using the production reporting system for two years now. Communications have proved critical, the internet satellite sometimes fails and loses data. The system has now been upgraded to the TIA 942 telecommunication standard for data centers and the ISO 10012 measurement management standard. The Iconics Hyper Historian provides disaster recovery with a ‘store and forward’ paradigm. A SharePoint site, embedding Iconics’ PortalWorx, is used to publish information for local government access. The system now ‘monitors and controls 80% of Bolivia’s hydrocarbon chain.’
Pipeline equipment and service provider T.D. Williamson, with help from Hitachi Consulting, has been delving a bit deeper into the Microsoft stack, rolling out Microsoft customer relationship management (CRM) and the Dynamics AX ERP system. TDW’s ERP scope spans process standardization and software automation from order to cash—‘ERP touches virtually every aspect of our business.’ TDW investigated ERP systems from Infor, SAP and Oracle before settling on Microsoft Dynamics. A green field manufacturing center in India provided a global ERP template that was rolled out across TDW’s other centers. Hitachi’s FSA fleet management tool also ran.
A joint Chevron/Covisint presentation covered an ‘emerging business case’ for a software-as-a-service (SaaS) model of connectivity and security. The presentation echoes previous Covisint work with Shell (Oil ITJ June 2011). Chevron plans to provide access to their internal resources for select customers, partners and joint ventures via the Microsoft ADFS federation gateway and the Microsoft card management solution (1405). More from the GEF in next month’s issue and from Microsoft.
Ex-Technoguide founder and CEO, Jan Grimnes, has joined the ffA board as non-executive Director.
Fugro has nominated Arcadis CEO Harrie Noy as member of its supervisory board.
Ron Hovsepian has been appointed to the Ansys board.
Charlie Williams is now executive director of The US Center for Offshore Safety. He was previously with Shell.
Stéphane-Paul Frydman and Pascal Rouiller are now senior executive VPs at CGGVeritas.
Mike Lawrie is new president and CEO of Computer Science Corp., succeeding Michael W. Laphen, who has retired. Lawrie was formerly CEO of UK-based Misys.
Remi Eriksen is new CEO of Det Norske Veritas maritime and oil and gas.
Deborah Humphreville is now Global Accounts manager at EMC.
Curt Terje Espedal is new EU regional manager for Roxar Software Solutions. He was formerly with Landmark Graphics.
Gary Masters is retiring after 17 years with POSC and Energistics.
Expro has appointed George Buckley as its new Chairman.
GSE Systems has appointed Chris Sorrells to its board.
Mark Sams has joined Ikon Science as quantitative interpretation manager, Asia Pacific. He hails from Fugro Jason.
Thailand’s national oil company PTT Exploration & Production has joined the oil and gas industry technology facilitator ITF.
UK-based OPITO has launched a website to raise awareness of career opportunities in the UK oil and gas industry.
KSS Fuels has recruited Adrian Preston as CTO and Brad Ormsby as CFO. Anila Siraj is now VP R&D.
David Aldous has joined Rive Technology as CEO. He hails from Shell.
SM Energy has appointed Herbert Vogel as senior VP portfolio development and technical services. He was previously with BP Energy Co.
Chevron’s Tom Lennon is now VP PIDX International. GE Oil & Gas’ Suresh Rajamani is chair of the standards and guidelines committee.
The 2012-13 pipeline open data standard (PODS) board of directors includes Jeff Allen of Coler & Colantonio, Rob Brook of Pacific Gas & Electric, Ron Brush of New Century Software, Kenneth Greer of CenterPoint Energy, Paul Herrmann of Chevron, Mike King of BP, J. W. Lucas of Enterprise Products, Elizabeth Ziemer of Enbridge, Scott Moravec of Eagle Information Mapping, Dominic Palazzolo of Williams, and Tim Willms of Boardwalk Pipelines.
John Briscoe has been appointed as Senior VP and CFO of Weatherford International. Prior to joining Weatherford, Briscoe was Vice President and Controller of Transocean.
Garrett Geck is now an account manager at Seisland.
David Edson is president and CEO of James W. Sewall Company.
Marshall Watson (Texas Tech) is president of the society of petroleum evaluation engineers (SPEE) executive committee.
Lars Christian Bacher is VP development and production international for Statoil.
Sutherland Asbill & Brennan has launched a crisis management and complex litigation website.
Venture Information Management is inviting participation from UK-based E&P companies in a survey of Microsoft SharePoint usage.
In last month’s review of Jean-François Nauroy’s book, Geomechanics applied to the Petroleum Industry we made the inexcusable error of saying that it did not have an index. It does. 3 ½ pages in fact. Our humble apologies to M. Nauroy and his editor, Technip.
Schlumberger has acquired multiphase flow modeling boutique SPT Group from the Altor Fund II. Reports in the Norwegian press put the price at 2.5 billion NOK ($440 million).
Acorn Energy is to purchase OmniMetrix for $8.5 million. OmniMetrix develops remote monitoring equipment and systems for emergency power generators and pipelines. Pritchard Capital Partners served as an advisor to Acorn in the deal.
IHS has acquired the Computer Assisted Product Selection (CAPS) electronic components database and tools business, from PartMiner and the digital oil and gas pipeline and infrastructure information business from Hild Technology Services.
Lufkin Industries has acquired Zenith Oilfield Technology for £81.1 ($127.3) million net of cash. Zenith provides technology for monitoring and analysis of down-hole data and related completion products for the artificial lift market.
Digitalcore and Numerical Rocks are to merge into an as yet unnamed new company. Odd Hjelmeland is CEO and Vic Pantano is COO. $10 million has been secured for expansion into the Middle East and the Americas.
France’s Fonds Stratégique d’Investissement and IFP Energies nouvelles have agreed to ‘support’ CGGVeritas’ long-term strategy. The units together own 10.7% of the capital and 14.2% of the voting rights.
Coil Tubing Technology has withdrawn from its proposed SEC listing, preferring to ‘focus its resources and capital on its operations and growth,’ and that the costs of preparing and filing SEC reports would ‘impede such focus.’
Insight Venture Partners is leading a $165 million equity investment in Drilling Info (DI). Proceeds will be used to expand DI’s data and analytics offerings and to repay early investors.
Aker Solutions has taken full ownership of its Clean Carbon unit and is to build on its CO2 technology strengths in particular at Norway’s Mongstad testbed. Aker CTO Åsmund Bøe commented, ‘The commercial market for full scale Carbon Capture and Storage (CCS) is further away than was expected a few years ago.’ Cost of the 50% stake was NOK 0 (zero) plus a share in earnings over the next 10 years.
The Green Grid, a consortium of companies, government agencies and educational institutions reports a ‘breakthrough’ in energy efficiency at ‘Project Mercury,’ eBay’s new data center in Phoenix, Arizona. The Green Grid’s ‘power usage effectiveness metric’ and ‘data center maturity model’ guided the design and build of the ‘tens of thousands’ of servers.
Hess Corp has launched Hess Energy Solutions, offering comprehensive energy services for commercial, industrial and institutional customers. The unit uses real-time control and monitoring systems to manage energy-consuming and energy-producing assets (1703).
Honeywell’s ‘Attune’ advisory services (AAS) combine cloud-based tools and analytics with a network of operations centers and expertise in energy management in buildings and facilities. AAS translates facility information into actions that deliver ‘energy, operational and environmental outcomes’ (1704).
The US EPA’s office of transportation and air quality (OTAQ) has awarded a five year $15 million contract to consultants ICF International for analytical and modeling support in the evaluation of mobile greenhouse gas sources, pollutants and air toxics.
The UK government has put up £20 million in prize money for innovations in CCS technology—complementing the £1 billion ‘already committed’ to CCS in the UK.
Shell has added Toronto-based NRX’ Asset Hub to its upstream enterprise application portfolio following a successful 18 month pilot project in the Kashagan oilfield. Asset Hub will be used to manage maintenance master data for new capital projects and brownfield sites. Asset Hub is a central maintenance reference library of validated maintenance information and best practices. The library’s value is enhanced as new site data is added—providing maintenance personnel with rapid access to complete and validated data.
Asset Hub provides connectors for enterprise asset management systems and document management systems. An ‘open’ API framework is available for integration with internal or third party systems. Data governance capabilities and a ‘rules engine’ ensure that all data meets internal and external standards for maintenance, safety and regulatory compliance. Other NRX clients include Chevron and Bechtel. NRX partnered in the Fiatech ISO 15926 ‘work in progress’ (WIP) implementation 1801. More from NRX.
Det Norske Veritas (DNV) has selected Palisade’s @Risk and DecisionTools to analyze and quantify environmental and financial risk in oil, gas and other energy sectors. DNV uses @Risk’s Monte Carlo-based simulation in its models to analyze risk factors in several projects across different industries. @Risk was used in deepwater offshore riser calculations in a probabilistic analysis of flow conditions and corrosion rates. More probabilistic analysis led DNV to adjust the design of a CO2 injection project when the value chain model showed sodium hydroxide consumption to be a major cost.
DNV senior researcher Davion Hill said, ‘Assessing risk factors accurately is critical to our business, as our customers rely on us to help them make better decisions and save costs. This is especially important in the energy industry, where one of the main goals is to bring down costs. We are committed to our role as an objective third party, and use Palisade’s @Risk to help us quantify risk for our customers.’ More from DNV and Palisade.
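A probabilistic corrosion analysis like the riser study above can be sketched as a Monte Carlo simulation in a few lines. The distribution, wall thickness, design life and limit below are illustrative assumptions for the sketch, not DNV or @Risk figures.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50_000  # Monte Carlo trials

# Hypothetical riser parameters (illustrative values only)
wall = 25.0   # initial wall thickness, mm
life = 20.0   # design life, years
limit = 12.0  # minimum allowable wall thickness, mm

# Corrosion rate uncertainty modelled as a triangular distribution, mm/year
# (min 0.1, most likely 0.3, max 0.8 -- assumed values)
rate = rng.triangular(0.1, 0.3, 0.8, N)
remaining = wall - rate * life

# Probability that the wall falls below the allowable limit within design life
prob_fail = float(np.mean(remaining < limit))
print(f"P(wall < limit within {life:.0f} years) = {prob_fail:.1%}")
```

Swapping the triangular distribution for field-calibrated corrosion data is where a tool like @Risk earns its keep; the sampling logic itself is this simple.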
Variance Reduction International’s ‘lean six sigma’ evangelist Kevin Johnson has been blogging on how ‘mind mapping’ can be used to optimize design and operations of industry-specific activities such as pipeline pigging (cleaning). Mind mapping (MM) is a natural graphical method of representing complex concepts, usually as a single-page representation of a large concept space. MM software includes FreeMind (with a Python interface), its open source spinoff FreePlane, Visio or even PowerPoint.
Johnson uses the case of improved pigging operations. An MM needs as much detail as possible to make sure that potential areas of opportunity for improvement are not overlooked. MMs can use symbols or text. The idea is to capture the big picture simply and efficiently and see parts of the problem we may not have thought about. More from VRI.
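A pigging mind map like Johnson’s could even be generated programmatically: FreeMind’s .mm files are plain XML built from nested <map>/<node> elements. The sketch below assumes that simple schema, and the branch and leaf names are hypothetical examples, not Johnson’s actual map.

```python
import xml.etree.ElementTree as ET

def mind_map(root_text, branches):
    """Build a FreeMind-style .mm document as an XML string."""
    mm = ET.Element("map", version="1.0.1")
    root = ET.SubElement(mm, "node", TEXT=root_text)
    for branch, leaves in branches.items():
        b = ET.SubElement(root, "node", TEXT=branch)
        for leaf in leaves:
            ET.SubElement(b, "node", TEXT=leaf)
    return ET.tostring(mm, encoding="unicode")

# Hypothetical pigging-operations map, in the spirit of Johnson's example
xml = mind_map("Pipeline pigging", {
    "Pig selection": ["foam", "brush", "smart/ILI"],
    "Launch/receive": ["trap isolation", "pressure checks"],
    "Risks": ["stuck pig", "wax accumulation"],
})
print(xml)
```

Writing the returned string to a file with a .mm extension should yield a map that mind-mapping tools supporting the FreeMind format can open and extend interactively.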
Shell has signed a $4.7 million contract for use of AGR’s riserless mud recovery system on the Prelude floating LNG development off Australia’s north-west coast. AGR’s facilities solutions unit has also signed a £760,000 frame agreement with Lundin Norway for engineering and technical services through 2013.
Aker Solutions has been awarded a FEED contract from Statoil to design the world’s largest spar platform for the Aasta Hansteen field development in the Norwegian Sea.
Genpro Engenharia is to deploy Aveva PDMS on its projects in Brazil.
Common Data Access has awarded Schlumberger’s information solutions unit a new five-year contract to operate its data store.
DrillingInfo has selected Energy Navigator’s Value Navigator forecasting, economic analysis, reserve evaluation and management tool for exclusive use in the United States.
GSE Systems has won new contracts valued at approximately $8.0 million, including $1.1 million in contracts for process simulation applications in the United States and Saudi Arabia, as well as the sale of simulation and computer-based learning modules to a GSE EnVision global refining customer.
Petrofac has been awarded a US$330 million lump-sum engineering, procurement and construction contract by Gazprom Neft Badra for the first phase of the Badra Oilfield Development Project in Iraq.
Honeywell has signed an agreement with Magnetrol International to incorporate its spillage overfill protection products into Honeywell Enraf Alarm Scout.
ICON Engineering has chosen IFS Applications’ project-based ERP suite to streamline its operations and execute its growth plan. Technip has also implemented IFS Applications in its Flexi France unit. Systems integration was performed by Capgemini.
Industrial Defender and Good Harbor Consulting are to team on cybersecurity of critical infrastructure.
Intergraph and Skire have signed with INPEX for use of Skire Unifier on the Ichthys LNG project.
Niger Delta E&P has chosen Optimization Petroleum Technologies’ PEOffice suite for reservoir and production analysis.
Panopticon Software has partnered with QlikTech to offer real-time visualization capabilities for the QlikView business discovery platform.
Petrosys and geoLOGIC systems have released a direct link from the Petrosys mapping package to hosted data in the gDC database. Users can access well header, directional survey, tops and production data for display in Petrosys maps, 3D visualization and modeling.
Rapid7 and Modulo are working on a holistic view of cyber risk and an integrated solution for risk management.
Ecorp and Gasfrac Energy Services have signed a MOU for EU use of LPG-based fracking technology.
Price reporting agency Argus has introduced North American electricity and natural gas forward curves through ZE Power Group’s ZEMA data management platform.
Spatial Energy and Blue Marble Geographics have formed an alliance to distribute content and software. Spatial Energy’s imagery content will now be freely available within the Global Energy Mapper software application.
Petrofac’s SPD unit has won a two year £200,000 contract to deliver its WellAtlas drilling management package to Wintershall.
Teledyne Oil & Gas group was awarded a three year global frame agreement with FMC Technologies, to supply a portfolio of interconnect and sensing products and services in support of FMC’s offshore oil & gas business. The deal includes new high power and fiber optic technologies.
Xplore Technologies has received a purchase order from a North American major for its iX104C5 rugged tablet computers. The deal is worth $2.8 million. The tablets will be used to improve the productivity of the customer’s field operations.
A paper by Karl Schleiche of the University of Texas at Austin describes how to process an internet-downloadable 2D land line data set with Seismic Unix. The paper is the beginning of an effort to build an open data/open source library for ‘reproducible’ research, software testing, demonstrations and user training.
The association of international petroleum negotiators (AIPN) has published the 2012 version of its model joint operating agreement—the fruit of four years of drafting by 180 industry representatives from five continents. First published in 1990, the AIPN’s JOA is claimed to be the most widely used JOA in the upstream.
Energistics’ infrastructure team is developing a specification for packaging and archiving models using Microsoft’s ‘open’ packaging convention. The Resqml schema will be split into a set of independent schemas and properties, and packaging recommendations will be issued to the Witsml and Prodml special interest groups. The latter teams are also working on a completion object for synchronizing data across drilling systems, well surveillance systems and nodal analysis.
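At its core, an Open Packaging Convention package is a zip archive carrying a [Content_Types].xml manifest alongside the payload parts. The sketch below shows that minimal structure; the Resqml part name and its content are hypothetical placeholders, not the Energistics specification.

```python
import io
import zipfile

# Minimal [Content_Types].xml: the manifest every OPC package must carry at its root
CONTENT_TYPES = """<?xml version="1.0" encoding="UTF-8"?>
<Types xmlns="http://schemas.openxmlformats.org/package/2006/content-types">
  <Default Extension="xml" ContentType="application/xml"/>
</Types>"""

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
    z.writestr("[Content_Types].xml", CONTENT_TYPES)
    # A hypothetical model part; real part names and content types would follow the spec
    z.writestr("resqml/model1.xml", "<EarthModel/>")

# Re-open the package and list its parts
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as z:
    names = z.namelist()
print(names)
```

Because the container is plain zip-plus-manifest, any OPC-aware tool (or a generic archiver) can open such an archive, which is presumably the appeal for packaging and archiving earth models.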
The artificial lift R&D council (ALRDC) is forming an industry consortium to advance artificial lift in horizontal wells. Note that the ALRDC’s recommended practices are ‘not necessarily recommended’ by the ALRDC! The consortium will be led by the University of Tulsa.
Lutelandet Offshore and Christian Michelsen Research are inviting interested parties to join a consortium to establish environmentally sound technology for decommissioning of offshore installations. The initiative came out of a report from Norway’s climate and pollution agency on ‘Decommissioning of offshore installations’ which forecasts a steep increase in decommissioning around 2020 when some 200,000 tonnes of steel a year will need to be dismantled and recycled.
The UK Decc, Norwegian Intsok and the Norwegian petroleum directorate have announced the fourth UK/Norway cross-border business to business mentoring program. This year the program has an enhanced recovery focus and sets out to ‘match operators and large contractors with small and medium sized companies in the reciprocal country.’ Mentors include Statoil, GDF SUEZ, ConocoPhillips and Aker Solutions along with several UK companies.
TNO has announced the second phase of its integrated system approach to petroleum production (ISAPP). The project centers on history matching using an ensemble Kalman filter, integrated with JOA’s JewelSuite and CMG’s simulation tools. TNO is inviting interested companies to participate in ISAPP2 with the goal of ‘increasing oilfield recovery by 10% or more.’
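The ensemble Kalman filter behind ISAPP’s history matching updates an ensemble of reservoir model realizations with observed production data. Below is a minimal linear-observation analysis step with toy dimensions; it illustrates the standard EnKF update with perturbed observations, not TNO’s implementation, and all names and values are illustrative.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_op, obs_err):
    """One EnKF analysis step with perturbed observations.
    ensemble: (n_state, n_members) prior realizations
    obs:      (n_obs,) observed data (e.g. production history)
    obs_op:   (n_obs, n_state) linear observation operator
    obs_err:  observation error standard deviation
    """
    rng = np.random.default_rng(1)
    n_obs, n_m = obs.size, ensemble.shape[1]
    Y = obs_op @ ensemble                         # predicted data per member
    A = ensemble - ensemble.mean(1, keepdims=True)  # state anomalies
    D = Y - Y.mean(1, keepdims=True)                # data anomalies
    C_yy = D @ D.T / (n_m - 1) + obs_err**2 * np.eye(n_obs)
    K = (A @ D.T / (n_m - 1)) @ np.linalg.inv(C_yy)  # Kalman gain
    perturbed = obs[:, None] + obs_err * rng.standard_normal((n_obs, n_m))
    return ensemble + K @ (perturbed - Y)

# Toy example: estimate a 2-parameter state from one noisy measurement
rng = np.random.default_rng(0)
prior = rng.normal(0.0, 1.0, (2, 200))  # 200-member prior ensemble
H = np.array([[1.0, 0.5]])              # observation operator
truth = np.array([1.0, -0.5])
obs = H @ truth + 0.1 * rng.standard_normal(1)
post = enkf_update(prior, obs, H, 0.1)
print(post.mean(axis=1))
```

In a real history-matching loop the state vector would hold gridded permeabilities and the observation operator would be a reservoir simulator run rather than a matrix, but the update algebra is the same.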
OFS Portal has provided CEN, the European Committee for Standardization, with a code of practice for e-invoicing in the European Union, to facilitate adoption of e-commerce in Europe. CEN had been asked by the European Commission to develop a framework of best practices for trading parties, service providers and public authorities. OFS Portal CEO Bill Le Sage said, ‘Representing the needs of the oil and gas industry, we were pleased to assist CEN in its efforts and bring our global experience to bear. The code of practice is a benchmark for the rest of the world.’
Key concepts for the code of practice: e-invoices should be formatted with royalty-free standards from international standards organizations, and service providers should pledge that data transmitted will be kept confidential. Audit controls should be proportionate to the taxable persons’ individual circumstances. The best practices are available as CEN workshop agreements on 2401. More from firstname.lastname@example.org.
The globally harmonized system of classification and labeling (GHS) will become the hazard communication standard (HCS) of the Occupational Safety and Health Administration (OSHA) later this month. IHS director of EHS and sustainability products Jeff Ladner explained, ‘When the final rule is published, we will update our Product Stewardship Solution and our managed regulatory content. Clients can incorporate GHS into business processes to reduce risk and costs.’
GHS promotes uniform standards for authoring, labeling and document creation in more than 30 countries, and has enabled businesses in those jurisdictions to reduce risk exposure and enhance workplace safety. IHS recommends that companies treat GHS as a ‘strategic operational concern,’ not a regulatory matter. GHS will require a level of detail and precision that ‘exceeds the capability of manual processes, legacy systems and spreadsheets.’ IHS advises use of robust information management solutions for hazard communication. More from email@example.com.
Calgary-based field data collection specialist HotButton has selected Panopticon’s real-time monitoring and analysis dashboard for its new energy field data analytics (EFDA) system. HotButton uses the Panopticon 5.9 release to deliver data visualizations including tree maps, heat maps and scatter plots, along with data connectors for business intelligence applications. Field data is collected on a device-independent SD card, providing resilience in the event of device failure. Panopticon’s data visualization technology allows field workers to monitor and analyze production data and flow data in refineries and pipelines. HotButton claims that Panopticon’s time series and treemap visualizations are useful for spotting anomalies in production and environmental data and for tracking equipment operating performance.
HotButton president and CEO Jane Glendon said, ‘We selected Panopticon after extensive technical testing. The data intelligence products are well designed and facilitate visual analysis, particularly of large volumes of dynamic data. Tabular reports at the end of a shift are no longer adequate for optimal operational and safety performance.’ HotButton’s core technology, ArrowSync, is embedded in third party applications such as Petris’ Field-Pro/Production Access as a data collection front end for production reporting. More from HotButton and Panopticon.
Schlumberger’s Interact real time wireline data unit has released a log data browser for the Apple iPad/iPhone. Interact subscribers can access logs as soon as they are uploaded to the central Interact server and visualize time and depth logs with the traditional Schlumberger log display formats. We saw a demo of the app at the SPE Intelligent Energy event and the display is pretty compelling. The fully-featured composite log presentation is enhanced with various ‘smarts’—a tap on the screen pops up track scales and data values.
The app also embeds the Interact user interface for navigating a log archive. User-defined presentation formats allow for scale change, zoom and addition of curves on the hoof. The app currently supports all measured-depth and time log curves. Security builds atop the existing Interact subscription along with the native Apple iOS4 device encryption. While using the app, multitasking features are disabled to ensure that all data is deleted on close. Schlumberger also has iPhone apps for drilling hydraulics calculations and for browsing its authoritative Oilfield Glossary. One wonders what the .NET folks over in the Petrel/Ocean department think of all this Apple app infidelity? Download the log viewer app from the Apple app store.
Ion Geophysical reports a win in a patent battle with Sercel as the US Court of Appeals for the Federal Circuit affirmed an earlier judgment that Sercel’s 408UL, 428XL, and SeaRay digital sensor units infringe Ion’s US Patent No. 5,852,242. The outcome is that Sercel continues to be prohibited from using or selling its sensors in the US.
CGG observes that the injunction only covers the Sercel digital sensor technology, is limited to the territory of the United States and will remain in effect until the patent expires in December 2015. The injunction does not affect Sercel’s right to use or sell its technology in other parts of the world. According to CGG, the 408UL and 428XL recording systems are not covered by the injunction and can continue to be made, sold and used in the United States. A $13 million provision will be included in Sercel’s Q4 2011 results to cover the $10.7 million award plus interest. More from Ion and Sercel.
The UK-based oil and gas producers association (OGP) has established a process safety subcommittee (PSS) to track ‘leading indicators’ and prevent accidents. At the kickoff meeting in London this month, Hans Jørn Johansen was elected to the role of subcommittee steward. Johansen explained, ‘We’ve established the PSS because work with low-frequency/high-impact incidents is important for the industry and society at large.’ PSS chair is Statoil’s Jan Roar Bakke, a 30 year oil and gas safety veteran who is also professor of safety technology at the University of Stavanger. Bakke was an expert witness at the Piper Alpha inquiry.
Bakke explained, ‘We want to go beyond the analysis of lagging indicators—the tiers 1 and 2 indicators that tell you how a leak or a release led to a major accident. By looking at leading indicators, we hope to be able to identify factors that influence major accident risk.’ The PSS is to interface with other OGP activities, including the wells expert committee, and with external bodies. The plan is to ‘define leading indicators for major accidents, evaluate their potential and then get the wider industry to use these indicators to ensure safer operations.’
Faced with a proposal from the European Commission on EU-wide regulation of offshore oil and gas activities, the OGP appears less enthusiastic. Speaking at an offshore regulation stakeholder meeting in Brussels earlier this year, OGP executive director Michael Engell-Jensen held that, ‘Such regulation would conflict with current national rules [and] lead to duplication, confusion and uncertainty for the industry. While OGP fully supports the Commission’s objectives of further improving safety and environmental performance, these objectives could be better achieved through a properly worded Directive.’ More from OGP.
Wright Express’ ‘Octane’ app for mobile devices provides fleet operators with a fuel site locator along with real time, transaction-based fuel prices and text-to-speech capabilities. Octane accesses a database of time-stamped transaction data to provide geo-located information on fueling locations and prices. Text-to-speech technology allows for hands-free operation, providing audio directions to the selected fueling site. Wright executive VP Dave Maxsimic said, ‘Drivers need to arrive at fuel sites safely and quickly while fleet operators are looking to save money by better managing fuel costs and consumption. Octane meets all these needs, underscoring our commitment to providing best-in-class fleet management tools.’
Octane also locates sites with alternative fuels, such as diesel, E85 and compressed natural gas. In times of adverse weather or power outage, a 24-hour transaction toggle shows the nearest site where fuel is still available. The app is available free for users of Wright’s payment solutions on both iPhone and Android mobile devices. More from Wright.