Writing on shale gas production and hydraulic fracturing falls into two camps. There is the industry camp, where the technology and benefits are held in awe and the environmental impact is, let's say, glossed over. The other camp—driven by Nimbys* and the Green movement—holds that shale gas is the devil's work and must be resisted at all costs.
Possibly most readers of Oil IT Journal will be in the former camp. But the importance of the anti-shale-gas movement is hard to overestimate, at least in Europe. In France, where I live, government policy on the subject has been largely informed by Josh Fox's documentary 'Gasland'. France has banned hydraulic fracturing and revoked three exploration permits. The effect of the film, the Macondo disaster and the revelation that oil companies 'do not pay tax' mean that in an election year, few politicians will defend the industry. The situation in the US varies from State to State. But objection to non-conventional exploration is a very real phenomenon—and it is one that is likely to extend into more anti-industry sentiment.
I found the presentations from the US Groundwater Protection Council event, held late last year in Atlanta, a worthwhile contribution to the debate. The GWPC has been working to track the health of the US' groundwater with its 'risk-based data model' and the FracFocus portal. You can read our report on page 5 of this issue and/or check out the meeting's presentations.
Of course the GWPC is exactly the kind of thing that some would like to see abolished—although quite why is hard to understand. I know, ‘self-regulation’ is a wonderful idea. But when there is an incident, it is a bit facile to attribute it to a ‘rogue operator.’ Regulation needs to be independent and properly resourced. If you want safety, you need oversight. If you want oversight, it costs money. What was the name of that agency again?...
The GWPC has been operating a ‘risk-based data management system’ (RBDMS) of drilling activity that impacts US groundwater for a couple of decades. The RBDMS is expanding apace, with a new hydraulic frac module to track the increasingly complex operations of the non conventional boom. But, while such a database is undoubtedly a good contribution, in-field inspection and surveillance will be necessary to catch the rogues!
Another contribution to the shale gas debate comes in a new publication, the Economics of Energy and Environmental Policy journal (currently a free download) published by the International Association for Energy Economics. The inaugural issue includes a lengthy article on the influence of shale gas on US energy and environmental policy by researchers from the Massachusetts Institute of Technology. This provides a measured analysis of the shale gas phenomenon, putting the additional regulation-driven cost at around $500k per well. While the authors claim this will not 'very significantly' affect economics, some will undoubtedly find this prohibitive.
The study also looks at the likelihood of a more active LNG market becoming ‘more akin to the global oil market, with prices differentiated only by transportation cost.’ While gas producers may be interested in ‘aligning’ gas prices with oil, this is definitely not the way the US is heading right now. Also, although data in the EEEP study has prices rising from $5 per MMBTU, today they are in free fall—around $2.50 at the time of writing.
Which brings me to another fascinating analysis, from Ziff Energy. Ziff compares the relative value of gas versus oil when measured on an energy equivalent basis. With oil at $100 per barrel, the natural gas price ‘should be’ around $17, i.e. a ‘natural’ ratio of 6:1. Ziff states, ‘the ratio has widened to current unprecedented levels of 24:1, and 32:1 going forward into 2012 as valued on the New York Mercantile Exchange.’ Ziff concludes that the current ‘value disconnect’ is a driver for proposals to export liquefied natural gas (LNG) to countries ‘where natural gas is priced relative to oil on an energy equivalent basis.’
The incredible and widening gap in the cost of energy from the two fuels must be making domestic energy consumers wonder why more natural gas isn't being used in transport. One such 'wonderer' is Chesapeake's Aubrey McClendon, quoted recently in the Financial Times as stating that 'natural gas is not oversupplied in the US, it is under-demanded.' Right now, the price ratio of a barrel of oil to an MMBTU of natural gas is around 40:1. If the 'natural' ratio is 6:1, that means that energy from natural gas is around seven times cheaper than energy from oil.
Exactly what this means at the pump is a bit trickier to figure. I am indebted here to Wikipedia, where I learned that compressed natural gas (CNG) is measured and sold in 'gasoline gallon equivalents' (GGE)—i.e. the amount of fuel that gives the same amount of energy as a gallon of gasoline. At $2.50 natural gas and $100 oil, the Ziff ratio is now 40:1. Divide this by the 'natural' ratio of 6:1 and we find that energy from natural gas is seven times cheaper than oil. So with gasoline at $3 per gallon, a GGE of CNG 'should' be around 40¢! Unfortunately this is far from retail reality, with CNG at some $2.50 per GGE in the Houston area**. A problem of infrastructure? Instead of powering a transport revolution, in some areas shale gas is being flared as companies go for the more profitable associated liquids.
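For readers who want to check the arithmetic, the back-of-envelope calculation runs as follows (using the standard conversion of roughly 5.8 MMBTU per barrel of crude, which is where the 'natural' 6:1 ratio comes from; prices are illustrative, not live quotes):

```python
# Back-of-envelope comparison of oil vs. natural gas on an energy-equivalent
# basis. Prices are illustrative snapshots, not market data.
OIL_PER_BBL = 100.0      # $ per barrel of crude
GAS_PER_MMBTU = 2.50     # $ per MMBTU of natural gas
MMBTU_PER_BBL = 5.8      # energy content of a barrel of crude (~6 MMBTU)

# If energy were priced equally, the oil:gas price ratio would be ~6:1.
natural_ratio = MMBTU_PER_BBL
actual_ratio = OIL_PER_BBL / GAS_PER_MMBTU   # 40:1 at these prices

# How many times cheaper gas energy is than oil energy right now:
discount = actual_ratio / natural_ratio      # roughly 7x

# Implied 'fair' price for a gasoline gallon equivalent (GGE) of CNG,
# given $3/gallon gasoline and the same energy-cost discount:
GASOLINE_PER_GAL = 3.00
implied_gge = GASOLINE_PER_GAL / discount    # around 40-45 cents

print(f"ratio {actual_ratio:.0f}:1, gas ~{discount:.1f}x cheaper, "
      f"implied CNG ~${implied_gge:.2f}/GGE")
```

Which is how we arrive at the 'around 40¢' figure against a Houston retail price some six times higher.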
Actually, in the couple of conferences I attended last month in the US, the unrepresentative sample of folks I chatted to were less involved in non-conventionals than in a myriad of projects to re-vamp existing production with a facility upgrade here, and a new well or workover there. Sometimes the 'big picture' is the little picture. In fact for most, it is business as usual—even if things are happening at an unusually frenetic pace. Which is more than can be said for the rest of the economy.
* Not in my back yard.
** More calculations on the Oil IT website.
Shell and HP are working on an ‘enterprise IT capability of the future.’ The companies are currently qualifying the innovation effort and are to kick off a consortium to jointly develop new intellectual property that will be productized and marketed by HP under the ‘ERPforIT’ umbrella. Emily Orton from HP’s Autonomy unit told Oil IT Journal, ‘ERPforIT will let companies integrate, provision and manage IT services from diverse suppliers with different delivery models under a single service management system. ERPforIT will enable new business models and form the basis for a new Enterprise IT automation as a service offering.’ The initiative builds on Shell’s existing relationship with HP which provides the oil and gas behemoth’s 100,000 employees with core service management automation services. HP’s Enterprise Services unit runs Shell’s Workplace Services and provides multi-supplier integration, supplemented by HP’s software as a service offering and Shell’s own enterprise IT expertise.
An earlier edition of HP’s ERPforIT framework was rolled out for Baltimore-based Constellation Energy last year when Constellation CIO and VP of operations and infrastructure Jeffrey Johnson observed that while the IT organization was making [ERP] ‘shoes’ for the business, it was itself running barefoot! ERPforIT has replaced Constellation’s ‘fragmented’ processes and tools and ‘heroic’ IT management with a structured process comprising a configuration management database, HP’s discovery and dependency mapping software and leveraging the Information Technology Infrastructure Library (ITIL) V3.0.
Shell likewise uses HP’s Universal Configuration Management Database (UCMDB) to manage software and infrastructure components and relationships providing control in the face of evolving hardware and software configurations and helping avoid downtime. Orton intimated that ERPforIT also embeds HP’s TRIM enterprise records management and compliance solution. Following HP’s acquisition of Autonomy, TRIM has seen major changes with the replacement of its Oracle Inso filtering technology and open source Lucene search engine with Autonomy’s ‘Intelligent Data Operating Layer’ (IDOL). IDOL automatically indexes and ‘understands’ related content across structured and unstructured sources. More from HP.
RAE Systems’ ‘ProRae Guardian CloudServer’ is claimed to be the industry’s first cloud-hosted service for real-time detection and sharing of sensor data from multiple remote sites. ProRae Guardian targets, inter alia, upstream and downstream oil and gas facilities offering secure access to hosted data streaming from real time sources such as gas, volatile organic compounds and biometric information on field workers’ health. Thomas Nègre, RAE Systems VP products and marketing said, ‘Organizations cannot afford to risk worker, public, responder and industrial safety when a critical situation arises. The sharing of gas and other data in real-time with geographically dispersed teams is extremely important. The ProRAE GCS meets this need.’
RAE Systems provides rugged, intrinsically safe and globally certified monitors along with wireless connectivity to ‘elevate’ worker and public safety and improve incident-response time. The subscription-based service is hosted and maintained by RAE Systems in a secure and fail-safe environment to ensure uninterrupted service 24/7. The system operates across firewalls, supports multiple hosts for data aggregation and up to 32 remote viewers. More from RAE Systems.
Following the signing of Invensys Operations Management’s ‘multi-year, multi-million dollar’ contract with Shell for the provision of simulation solutions to its global upstream, downstream and petrochemicals operations, Oil IT Journal interviewed Invensys’ director of simulations and optimization, Harpreet Gulati.
Oil IT—Tell us about the deal with Shell.
This agreement covers the heritage SimSci technology (Romeo, Data Reconciliation, PipePhase and Pro/II). SimSci has some 45 years' history in refining and is now a global standard for design and optimization that is used by Exxon, Shell and now Total. Shell in particular has been both customer and partner for over 25 years. By partner I mean that Shell technology and IP feeds into a product's development. The key philosophy is that quality, validated basic data about what is happening in the refinery is the foundation of all monitoring and optimization. Great effort is placed on ensuring the accuracy of basic data. This 'common reconciled data' approach is the focus of a joint Shell/Invensys effort.
Is this stored in a real time database?
We do have our own RTDB (Wonderware). But we integrate with what the customer has in place. This may be Osisoft PI or Honeywell’s PHD. Whatever is there.
And what about the IT infrastructure for interoperability between these point applications?
Well, it is not really for me to talk about Shell’s IT infrastructure. But Shell has now standardized on our simulation tools across all upstream, downstream and petrochemicals divisions. These are indeed stand-alone applications. But this significant commitment means that Invensys is now a member of Shell’s IT infrastructure design team. Shell is trialing a lot of new IT trends and is for instance a heavy user of virtualization. This means that we have packaged our tools for deployment in a virtualized environment.
Of course Invensys has its own data integration strategy. Our software is modular and shares, for instance, thermodynamics and Excel drag and drop connectivity. This ensures interoperability and data consistency. But Shell uses a lot of other applications and its focus is different to ours. We can always pass data back and forward between applications. In fact most use cases only require a limited subset of data interchange.
Is the Sim4Me portal in the mix?
Yes, Sim4Me is used by non specialists such as operators and planners to access the simulators. The portal also acts as a bridge to other environments like mechanical and control engineering. Shell has not been using Sim4Me for long, but initial feedback is good and we expect take-up to increase as new versions are rolled out.
Is this dynamic or steady state simulation?
Dynamic simulation is a part of the deal but the main focus today is steady state simulation for design and process optimization. Shell is heavily into design, revamp and improving operations with analysis and decision support. One Romeo module, automated rigorous performance monitoring (ARPM) was developed with input from Exxon and Shell. ARPM provides model-based advice for predictive monitoring and optimization. This shows trends, correlations and provides KPIs to the Historian and dashboards. The key issue for predictive monitoring is that it can tell you not just where you are now, but where you should be. You can perform ‘what if’ analyses to see what would happen if you clean a heat exchanger or compressor blades. It is all about operating close to the ideal efficiency level. This can be done by tracking the ‘delta’ between ideal and actual performance over time and using modeling to support the decision making process.
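The 'delta' tracking Gulati describes can be sketched in a few lines of hypothetical code. The unit names, efficiencies and threshold below are illustrative only, not the ROMeo/ARPM API:

```python
# Hypothetical sketch of 'delta' performance monitoring: compare actual
# equipment efficiency against a model-predicted ideal and flag units that
# have drifted far enough to warrant action (e.g. cleaning a heat exchanger).

def performance_deltas(units, threshold=0.05):
    """Return (name, delta) for units whose ideal-minus-actual efficiency
    gap exceeds the threshold."""
    flagged = []
    for name, ideal, actual in units:
        delta = ideal - actual
        if delta > threshold:
            flagged.append((name, round(delta, 3)))
    return flagged

# Illustrative readings: (unit, model ideal efficiency, measured actual).
readings = [
    ("heat_exchanger_01", 0.92, 0.84),   # fouled: 8-point gap
    ("compressor_02",     0.88, 0.86),   # within tolerance
    ("column_reboiler",   0.90, 0.81),   # 9-point gap
]

for name, delta in performance_deltas(readings):
    print(f"{name}: efficiency gap {delta:.0%} vs model ideal")
```

The same delta, trended over time and fed to a historian, is what drives the KPIs and 'what if' analyses described above.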
Do you take the AI approach with canned scenario-based model comparison?
No, we do not use the correlated modeling/AI approach. Almost all of our modeling is first-principle science based on sound chemical engineering principles.
This is all very well in the refinery, but the upstream is a different kettle of fish with more unknowns in the well bore and reservoir.
Indeed, and the upstream has different skill sets and culture. The traditional upstream is not so concerned about efficiencies in the produce, deplete, abandon process. But this is changing, especially in Shell which is increasingly using the optimizing techniques of the downstream. This is happening at more mature producing assets as well as on complex assets such as FPSOs, which look very much like refineries anyhow with heat exchangers and columns. The upstream is getting more sophisticated. Mature assets benefit from ‘what if’ modeling, to investigate possible upgrades to surface facilities as more water is produced.
But to get back to the well bore and the skill sets, do you plan any partnerships with the ‘upstream upstream’ to better integrate the well bore and reservoir.
Yes. We are working closely with Computer Modeling Group of Calgary to incorporate its GEM fluid flow modeler. This targets SAGD oil sands developments, an area where traditional reservoir modeling tools fail. Other partnerships in the upstream are in the offing.
Another issue in the up/downstream divide is that in the refinery you can always solve a problem with more measurement. This can be hard in the upstream context.
Sure, but the upstream is changing. There are more three phase meters deployed, fields are more and more instrumented. We are moving on from the days of periodic well tests. Today there is more real PVT and mass balance measurement in the oil field. But while the upstream is getting more sophisticated, there is one big 'gotcha.' It is relatively easy to develop a tool and get it to work, initially. It is much harder to ensure that it is still running and delivering benefits a couple of years down the line. Will people still use it? Will they trust the data? This is where our strategy of sustainable integration comes in, automating as much as possible and eliminating data re-entry. We are back to the data infrastructure and the importance of validated, accessible data. But what goes for data is even more true of software. This is an endemic problem: folks buy an application, deploy it and use it for a while; then the excitement goes, usage declines, and the system gets neglected and unsustainable.
Vendors are at fault too, with upgrades and changing interfaces...
Absolutely, part of our offering is ensuring that new releases are not disruptive. More from Invensys.
The US Department of Energy has opened an online portal to help oil and gas producers comply with the New Mexico waste pit legislation. The Portal was created by researchers at New Mexico Tech’s Petroleum Recovery Research Center with support from DOE’s Office of Fossil Energy. Pits containing drilling waste, produced water and production fluids are required to meet certain hydrologic, engineering and legal criteria to protect soil and groundwater from contamination.
The portal aims to simplify compliance for smaller operators and encourage the development of marginal resources. Users can check potential pit locations in terms of aquifer depth, geology, soil maps, watercourses, mineral usage and geotechnical data. Topographic and aerial photography base layers show distances from nearby structures. Site criteria maps can be generated and included as part of an operator’s online permit application. The portal was developed using ESRI ArcGIS 9, ArcSDE, ArcIMS, and ArcGIS Server. The work builds on development for the Go-Tech online production database. PRRC hardware includes Dell PowerEdge 4600/6400 dual Xeon servers. Visit the portal on the Source3 website.
The American Records Management Association and the Electronic Discovery Reference Model (EDRM) have published a white paper explaining how the EDRM’s information governance reference model (IGRM) ‘complements’ ARMA’s generally accepted recordkeeping principles (GARP). The IGRM supports GARP principles by identifying cross-functional groups of information governance stakeholders and by depicting their intersecting objectives as a relationship between ‘duty, value and the information asset.’
The white paper purports to identify the many business benefits to be accrued from 'proactive adoption' of GARP and explains how the IGRM supports these by enabling an organization to achieve the desired level of information governance maturity. The white paper concludes with an exhortation to 'lower risks and achieve greater efficiencies' through process improvement and electronic discovery, by increasing integration with an organization's information governance policy, procedures and infrastructure.
EDRM has developed an XML model for recordkeeping/information discovery. ARMA's GARP offers more qualitative advice on recordkeeping. Reading between the lines, it might appear that the organizations have gotten together to retrofit the two approaches. Such an activity is not without merit as records responsibilities are shared between IT, legal, records managers themselves and the business. Current members of EDRM include Chesapeake; Exxon and Halliburton were involved in the past. ARMA has many oil and gas members. More from ARMA and EDRM.
IHS has announced IHS Connect, a new online front end to its ‘intellectual wealth and thought leadership’ resources a.k.a. ‘data.’ IHS Connect targets upstream oil and gas professionals, including strategic planners, analysts, economists and new business developers. The Connect platform is a new single point of delivery for IHS’ data and consulting services. IHS Connect Oil and Gas integrates IHS’ products and services into a single dashboard view, providing ‘seamless access’ to combined content from the whole gamut of IHS’ services. Connect promises subscription content aggregation in a customizable dashboard providing navigation, search, GIS-based display and ‘streamlined workflows.’
Following years of acquisitions, IHS occupies a strategic position in upstream information supply. Sub units include legacy PI/Dwights US well data, Petroconsultants’ international datasets, Cambridge Energy Research Associates and others. Most recently the company has expanded into interpretation software with the acquisition of Seismic Micro Technology. More from IHS.
Statoil has awarded a contract to a Kongsberg Oil & Gas Technologies-led consortium for the development of a real-time environmental monitoring solution. The three year, 150 MNOK ($25 million) project was awarded by Statoil’s R&D center in Trondheim as a component of the company’s ‘New Energy and HSE’ research program. The project sets out to ‘demonstrate’ solutions for environmental monitoring of operations in sensitive areas during drilling, production and decommissioning. The plan is to integrate environmental monitoring with daily operations to facilitate early detection and reaction to potential environmental impact.
Kongsberg’s Maritime Subsea unit will contribute sensor and communication technologies to the project. IBM will provide information integration and business analytics while Det Norske Veritas (DNV) is to supply marine environmental analytics and risk management. Statoil itself is to play an active role in the project contributing its offshore operational know-how. More from Kongsberg.
In the November 2011 edition of the TOP500 list of high performance computers, ENI's 15,000 core HP Proliant based machine came in at N° 87 with 163 teraflops of performance—Top 500.
Intertec has released an explosion proof instrumentation enclosure. An inert gas pressurisation system qualifies non explosion-protected equipment such as analyzers for use in IEC-Ex and ATEX Zone 1, 21, 2 or 22 hazardous areas—Intertec.
C&C Reservoirs has announced the user field knowledge platform. Accessible through the DAKS, digital analog knowledge system online service, UFK lets users input and manage their proprietary E&P data in a standard format—C&C Reservoirs.
MIT researchers have bested Cooley and Tukey with a 'nearly optimal sparse' Fourier transform algorithm. The lossy technique promises a tenfold speedup but your mileage may vary—MIT.
IFS has tuned its oil and gas construction market focused ERP solution to the Brazilian market. The new release of IFS Applications supports Brazilian legal including taxes and ‘Nota Fiscal’ documents. IFS customers include: Technip, Wellstream, Heerema, Babcock, Yantai Raffles and SeaDrill—IFS.
ffA, a.k.a. Foster Findlay Associates, has released ‘GeoTeric,’ a novel seismic interpretation package that ‘directly translates geophysical data into geological information.’ GeoTeric embeds ffA’s frequency decomposition and RGB colour blending technologies and introduces new ‘adaptive geobodies’ technology, co-developed with Lundin Norge—ffA.
Meyer's 2012 suite adds time-dependent midfield fracture pressure decline analysis to MShale and an enhanced 3D wellbore model for deviated wells. The latest release can handle much larger datasets thanks to a new 64-bit edition—Meyer.
New Century Software has released a geometric network synchronizer (GNS) for complex transmission pipeline systems. GNS comprises an Esri ArcMap extension and a server component that builds and maintains geometric networks in a PODS database. GNS can feed business intelligence applications and New Century's own Gas HCA tool—New Century.
Version 3.0 of WellWhiz, a pre/post processor for CMG's IMEX reservoir modeller, adds perforation, casing and open hole gravel pack modelling, support for retrograde condensate, 'wiggly wells' and more—WellWhiz.
The 2011 Meeting of the Groundwater Protection Council, a grouping of US regulators, was held in Atlanta, GA late last year. ALL Consulting's J. Daniel Arthur offered a primer on well integrity testing, noting that, contrary to popular belief, integrity testing is widely used by industry. For example, the tubular casing used in shale gas wells is tested at the steel mill. Before a frac job, it is pressured up to check for leaks. If it fails it is replaced and tested again. Other tests are carried out at the wellsite. Arthur concluded that internal and external well integrity tests have been the cornerstone of US EPA's underground injection control (UIC) program since 1980. Today, with the growth of non-conventional exploitation, well integrity is more critical than ever before.
Mike Nickolaus presented the GWPC's activity in chemical disclosure and groundwater protection. The GWPC's attention was drawn to fracking by the LEAF case (an early challenge to fracking in coalbed methane operations) and it has been active in advising Congress on regulation since 2000. The GWPC published the Shale Gas Primer in 2009, followed by a two state study of contamination incident investigations and oilfield practice in September 2011. Most recently, the GWPC has launched the FracFocus hydraulic fracturing chemical disclosure registry, which now has over 4,400 frac records loaded. Nickolaus concluded that the time some spend protesting against fracking would be better spent protesting more significant sources of pollution such as septic tanks, agriculture and inadequate storm drainage.
The GWPC has a history of digital data collection going back to 1992 when it introduced its ‘risk-based data management system’ (RBDMS), developed for the oil and gas regulatory program. GWPC’s Paul Jehn outlined the RBDMS Environmental module which collects qualitative and measured field observations for soil, sediment, water and air metrics. RBDMS includes roles-based security, forms for facility creation, sample data, and field and laboratory results all running on a ‘server-neutral’ GIS. Data is delivered using the EPA’s WQX schema, allowing for EPA-compliant data straight from a LIMS*. A new RBDMS hydrofrac module tracks water use and the ‘sentinel indicators’ of fracking’s effects on watersheds.
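The EPA's WQX schema mentioned above is an XML format for water quality results. A much-simplified sketch of what emitting one result might look like follows; the element names are a pared-down illustrative subset, and a real WQX submission requires the full schema, namespaces and submission header:

```python
# Simplified, illustrative sketch in the spirit of the EPA WQX water quality
# exchange format. Element names are a pared-down subset for illustration;
# real submissions must validate against the full WQX schema.
import xml.etree.ElementTree as ET

def wqx_result(location_id, characteristic, value, unit):
    """Build a minimal water-quality result record as an XML string."""
    result = ET.Element("Result")
    ET.SubElement(result, "MonitoringLocationIdentifier").text = location_id
    ET.SubElement(result, "CharacteristicName").text = characteristic
    measure = ET.SubElement(result, "ResultMeasure")
    ET.SubElement(measure, "ResultMeasureValue").text = str(value)
    ET.SubElement(measure, "MeasureUnitCode").text = unit
    return ET.tostring(result, encoding="unicode")

# Hypothetical chloride reading from a monitoring location near a waste pit.
xml = wqx_result("NM-PIT-0042", "Chloride", 250.0, "mg/l")
print(xml)
```

The appeal for regulators is that data structured this way can flow straight from a LIMS to the EPA without manual rekeying.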
The GWPC and the Interstate Oil and Gas Compact Commission are now planning a web site of national oil and gas well information for public access. The RBDMS National Oil and Gas Portal will feature live data feeds of oil, gas, and injection information across contiguous state boundaries in a clickable map application. Bi-directional data transfer between states and the portal will be enabled by web services from partnering agencies. Data ‘harvesting’ will initially target areas where frac operations are raising concerns such as the Susquehanna and Delaware river basins.
Chris Harto presented Argonne National Lab.’s investigations into the environmental costs of managing produced brines. Harto recommends a ‘hybrid life cycle assessment’ approach to balance energy consumption, emissions and water use.
Sheila Olmstead of the Washington-based think tank 'Resources for the Future' introduced a new study on managing the risks of shale gas development. The survey will seek expert opinions on the risks of shale gas development, survey public perceptions, assess the drivers of and policy levers to reduce risks, understand the current and prospective regulatory landscape and develop recommendations for regulation and voluntary action by firms to reduce risks. The $1.2 million, 18 month study has just kicked off, funded by a $1.8 million grant from the Alfred P. Sloan Foundation. Proceedings on the GWPC website.
* Laboratory information management system.
In his keynote, CEO Bret Bolin welcomed the 600 plus attendees—P2ES’ biggest show ever, noting that 2011 was a ‘transformational year’ for the company with international expansion and significant acquisitions including Explorer, WellPoint Systems and most recently, Beyond Compliance. More acquisitions are planned for 2012.
Jennifer Thorpe (El Paso) and Todd Burdette (P2ES) described how El Paso has used Tobin GIS Studio (TGS) to feed data from AutoCAD to its in-house 'Gas Map' application. Gas Map provides El Paso's 300 plus users with the 'big picture' of its land and geoscience situation. Geoprocessing with Tobin's cartographic toolkit and ArcGIS are used in the nightly refresh. While small companies may use TGS to manage data in a personal geodatabase, larger units like El Paso (and Chesapeake) store spatial data in ESRI SDE.
Hazel Wilkins (P2ES) offered some advice to those engaging in acquisitions and divestments. Land data administration needs to be an integral part of the A&D process and, as ever, data quality is critical. Wilkins suggests making friends with the deal makers and getting land issues included in the debate. The cost of data cleanup should be included in the deal. In one deal, the client recovered $500k after the deal from undue revenues on properties that had not been sold! But the golden rule is that 'today's buyer may be tomorrow's seller.' Oh, and 'make sure you are not selling stuff that you don't want to sell.' You don't want the buyer to be doing a 'happy dance' because of acreage acquired in error. Companies may not have the resources to do everything in the deal's time frame. In which case P2ES' professional services unit is there to help.
Stonebridge Consulting's Travis Osborne, who has been involved in well master data management for decades, noted that the subject has been 'rebranded' a couple of times and wonders, 'Why are we still talking about this?' MDM means deciding what to call things and what is important to your company. Some customers can't handle more than ten data items, some like more. The PPDM data model can easily run into the hundreds. MDM is usually done for a purpose, to link to accounting, engineering or geoscience. If the link is too heavily weighted to one segment, then that segment will use and own it. So it is better to try to offer something for everybody. Target attributes need to be traceable through the data life cycle, so it may be better to stay with just a few. Project scope needs ring fencing to keep it manageable. It is also important to get at data early in the life cycle.
It is a good idea to avoid 'swivel chair integration,' cutting and pasting data from email to the ERP system. While bi-directional read/write data access may be desirable, it is unlikely to be worth the cost and complexity. Most benefits accrue from exposing otherwise hidden information. MDM vendors are plentiful, but real solutions are few, although Well360° (from Stonebridge partner IDV) got a plug, as did P2ES' Excalibur.
Beyond Compliance (now part of P2ES) was introduced by founder and CEO Ron Visser, who observed that worker competency and training are increasingly under scrutiny from the regulator and that today it is necessary to demonstrate due diligence in recruiting. Compliance can be onerous. BC's stance is that the process should work for employees, not the other way around. Unmonitored employees and safety procedures can mean risk of injury or shut down. BC replaces paper and spreadsheets ('our N° 1 competitor') with an integrated compliance management system, 'ICMS,' for skills tracking, change management, workplace incident tracking and more. ICMS is a hosted solution that can be tailored to client requirements. For onsite asset management, forms can be downloaded to ATEX certified tablets with checklists to ensure, for instance, that a new tank is earthed and fire code certified. All of which is bringing compliance very close to maintenance management, but Visser recommends keeping the activities separate. The field device can be set up so that operators only see what they have to. The information gathered can then be consolidated at a higher level in software, checking against other systems. BC powers Husky's operations information management system (HOIMS). Other clients include Exxon, Nexen, Devon, Murphy, Sonatrach, PetroCanada, ConocoPhillips and Pason.
Brent Douglas described how Encana has merged SCADA data into its P2ES Enterprise Upstream field operations database. Encana went live with EU in 2008 and now has around 12,000 active wells. In the old days, an operator would visit each well every day. But increasingly, well sites are being instrumented, at a typical cost of $100k per site. Enter the ‘well site of the future’ with cameras, infrared, H2S monitors, emergency shutdown devices and automatic chokes. The question now arises, what should be brought into the database? This is a moot point. Currently CygNet provides wellhead and line pressure, tank levels, turbine meters and (real soon now) cathodic protection readings and more. Water management is a key compliance issue in North Louisiana and the system tracks water hauls, water/gas ratios etc. Some data goes back out to the operators—notably production shortfalls from Aries—getting operators to explain why they have been adrift. Aries forecasts are stored in CygNet by API number. A useful tool, ‘More4Apps’ uploads data—more4apps.com.
The perennial issue of customization versus maintainability was discussed in the Enterprise Upstream user forum. Noble Energy’s John Paciotti prefers to talk of personalization of Oracle forms, for example to issue alerts. Is this customization? Possibly, but personalizations are not overwritten in a patch cycle and are supported by Oracle. They can be put to good use to define conditions, context and actions and to force compliant data entry. The well report name on the form title bar can set defaults and blank out non-relevant options. ‘Sophisticated’ vs. unsophisticated working interest partners can be flagged so they receive the appropriate agreements. Forms can call outside applications such as Allegro for gas marketing data. Noble is prepared to share its customizations and has requested a repository for sharing customization best practice between users.
Julie Burgan (Swift Energy) has been leveraging Oracle alerts. When daily SCADA data is loaded to EU, alerts notify the lease operator as to which meters failed to load. Alerts are issued for well completions that have not been tested in the last 60 days and for meters without gas analysis in the last 90 days. Other checks like completions with production but status ‘shut in’ keep the land department on its toes.
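Exception checks of this kind are straightforward to express in code. A hedged Python sketch of two of the alerts described, assuming simple record dicts; the field names and the 60-day window are illustrative, not Swift Energy’s actual schema:

```python
# Illustrative data-quality checks of the sort raised as Oracle alerts.
# Record layout ("id", "last_test", "status", "production") is invented.
from datetime import date, timedelta

def stale_tests(completions, today, max_age_days=60):
    """Completions whose last well test is older than max_age_days."""
    cutoff = today - timedelta(days=max_age_days)
    return [c["id"] for c in completions if c["last_test"] < cutoff]

def status_conflicts(completions):
    """Completions reporting production while flagged shut in."""
    return [c["id"] for c in completions
            if c["status"] == "shut in" and c["production"] > 0]
```

In practice such checks run after each daily SCADA load and route their hit lists to the lease operator or the land department.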
P2ES’ Clay Allison provided insights and a forum for debate on the business intelligence roadmap for P2ES’ portfolio and on product management practices. P2ES’ Excalibur analytics offers embedded Spotfire for production reporting and capital allocation. With the WellPoint acquisition came Bolo and its intelligent dashboard. There is overlap in the BI space and P2ES is currently figuring out how to rationalize its P2 Analytics offering. The debate turned to questions of security—a big issue for all-seeing BI tools. Oracle’s business intelligence suite is good in that it picks up underlying security models. But it is harder to prevent a Spotfire power user from accessing, say, HR or sensitive well data. Another issue is the in-house Toad power user writing SQL embedded in Excel. Real time reporting is highly desirable but ‘there are lots of holes in it as of now.’ BI tools are more complex than the transactional environment. And departments tend to have different databases and deploy whatever reporting tool comes with them. Part of the problem is that Oracle buys companies without much thought to rationalization. P2ES is at least trying to offer better interoperability. More from P2ES.
Speaking at the OPC Foundation’s EU roadshow in Paris last month, president Thomas Burke traced the history of what originated as OLE for process control back in 1995. Since then, OPC has grown to become the ‘interoperability standard for a connected world.’ OPC has more members and take-up in the EU than in any other region. While the Microsoft COM/DCOM technology was ‘a good idea at the time,’ it has proven complex and insecure and has shut out some markets. DCOM was tried in the PLC environment but ‘it did not work.’ Notwithstanding the misgivings, classic OPC was a resounding success with an estimated installed base of 40 million devices. By 2004, with the wide take-up of XML, it was ‘time to push the reset button.’ Work began on the new Unified Architecture (UA) spec, standardizing on the message rather than on lower-level protocols. OPC is ‘the open alternative architecture to closed proprietary systems.’ OPC-UA extricates the standard from the Microsoft world to enable more platforms, applications and vertical integration.
OPC-UA is scalable from embedded systems to the mainframe and is key to multi-vendor interoperability. Along with Microsoft platforms, OPC spans servers, SCADA systems, Linux and smart phones. SAP ‘may not be the best client,’ but OPC-UA is the basis of SAP’s device-level interoperability. Almost all automation vendors are on board.
Michel Condemine (4CE-Industry), who heads up the French branch of OPC, observed that the COM version of OPC only addressed the ‘easy’ problems and had too large a footprint (20 MB of RAM) for embedded systems. Despite the move away from a Microsoft-only environment, Condemine appears to believe that more can be done to ‘open’ the spec. Hence the joint industry project to develop an open source (but not ‘free!’) IT stack to support the higher level protocol. A group of French companies (Areva/Euriware, EDF, 4CE and Schneider Electric) is engaged on a joint project to develop a complete stack for Windows (including CE) and various Linux systems. The aim is for a compact (under 500 KB) code base that runs across all platforms, along with a reference implementation for a ‘mainstream’ OPC UA server. The C++ code is developed outside of .NET/Visual Studio to make it ‘information model independent’ and nonproprietary. Security and regulatory compliance are assured by PKI certificates and a certification authority.
In the Q&A, Condemine was asked how ‘real’ OPC-UA was in the face of the huge installed base of ‘vanilla’ COM-based OPC. He assured the audience that Siemens, Schneider Electric and others all use OPC-UA today. Legacy OPC servers can be ‘wrapped’ to turn them into UA servers. Areva’s wind generators have OPC UA servers inside. SAP has a true UA client. Just as COM is dead, so is vanilla OPC. UA is the ‘route to the future.’
While the Open UA project delves deeper into the infrastructure, another project, the ‘Meta model,’ sets out to occupy the high ground—federating the information models of different plant communities. Target domains include ISA S95, S88, IEC 61850, 61970, BACnet, Mimosa and EDDL. Here collaboration is planned with Mimosa/OpenO&M, S88/95, OAGiS and more. Sparx Systems’ Enterprise Architect is the development tool of choice.
Claude Gomez (SciLab Enterprises) presented SciLab’s work on a free and open source scientific calculator. The computation engine can be embedded into applications providing a high level language, hundreds of math functions, data structures and user defined types. Matrix computations and simulations are also included. Math teachers use it, ‘so it must be easy!’ The SciLab project has backing from the French Petroleum Institute (IFP). An ‘Enterprise’ edition adds support and training.
AGR Field Operations has appointed John-Paul Guerrero manager of business development for the Americas region.
David Anderson has joined Aker Solutions as global maintenance manager, drilling services. He hails from Ability Drilling.
Martin Craighead is Baker Hughes’ new president and CEO, succeeding Chad Deaton, who remains chairman. Craighead was previously COO.
Arnstein Øvsthus has been appointed as general manager of CGGVeritas and Eidesvik Offshore’s ship management joint venture. He hails from GC Rieber Shipping.
Ruben Moreno has joined Concentric Energy Advisors as assistant VP in its Washington, DC office. He was formerly with R.W. Beck.
Dahlman Rose has appointed Robert Meier as head of equity sales and trading. He was previously with Gleacher & Co.
DNV has launched a Deepwater Technology Centre in Singapore in collaboration with the industry, universities and governmental R&D institutes.
TengBeng Koid has replaced Jo Webber as CEO of Energy Solutions International. Both are members of the board. Koid was formerly President, international for SMT.
Former Commissioner of Environment for the City of Chicago Suzanne Malec-McKenna has joined communications firm Jasculca Terman and Associates as Senior Counsel, working with JT’s energy and environmental clients.
Freepoint Commodities has hired Michael Gamson as a Senior Managing Director in charge of developing a domestic refined petroleum products business. He was most recently a manager at Vitol.
GSE has named Phil Polefrone, formerly of UniStar Nuclear Energy, Senior VP of Workforce Solutions.
Murry Gerber has been named to Halliburton’s board of directors. He was formerly chairman and CEO of EQT Corp.
Honeywell has named Shane Tedjarati as President, High Growth Regions and Stephen Shang is to take over as President, Honeywell China. Tedjarati will continue to report directly to Honeywell Chairman and CEO Dave Cote.
The UK Energy Industries Council (EIC) has appointed Ian Stokes as its new CEO, taking over from Mike Major.
Philippe Pinchon has taken over as Director of the French Petroleum Institute/IFP School, replacing Jean-Luc Karnik who is moving over to IFP Training.
IHS has appointed former SMT president and CEO Arshad Matin as executive VP to lead its Information and Insight Operations unit and the IHS Research and Analysis organization.
Brian Hanson is the new ION Geophysical CEO. He became President in August 2011. Former CEO Bob Peebler became Executive Chairman on January 1, 2012.
Kosmos Energy has named Darrell McKenna COO, responsible for its global drilling, development, and production functions, and health, environment, and safety programs. McKenna was formerly president of Hess Australia.
Carlito deSouza is to head up Maptek’s new office in Calgary.
John Harp has been named CEO of MDU Resources Group unit Knife River Corp. He continues as CEO of MDU Construction Services. Dave Barney is president of Knife River and Jeff Thiede, president of MDU Construction Services.
The first chair of the MIT Energy Initiative has been filled by Christopher Knittel, William Barton Rogers Professor of Energy Economics at the MIT Sloan School of Management.
Steve Malcolm, former president and CEO of Williams, has been elected to the Oneok Partners board of directors.
P2 Energy Solutions has named Amy Zupon Chief Technology Officer. She was previously Global Head of Product Management for Ventyx.
Pace Global Energy Services has named Krish Chettur as director in its consulting practice. He will be based in its London office. He hails from IHS CERA.
Shailesh Saksena is the new sales director at the Paradigm New Delhi office.
Tracey Dancy’s new consultancy Dancy Dynamics is offering marketing solutions to companies in the oil and gas vertical.
January 2012 sees the launch of the new TU Delft Process Technology Institute headed up by Professor Andrzej Stankiewicz.
FEI has acquired rugged scanning electron microscope manufacturer Aspex Corp. for $30.5 million.
Avatar Systems has acquired Ogas, an oil and gas software house.
Barco has received a €50 million loan from the European Investment Bank to finance R&I into networked visualization and software for, inter alia, control rooms.
Merrick Systems has received a majority investment from private equity firm HitecVision. Founders Samina and Kemal Farid remain ‘significant’ shareholders.
Esri has acquired business intelligence software boutique SpotOn.
IFS is to acquire all the share capital of LatinIFS Tecnologia da Informação in an all cash transaction.
Wood Group’s Mustang unit has acquired Latin American automation engineering and consultancy company ISI Solutions for an initial consideration of US$5.2 million.
Pareto Staur Energy and Drilling Technologies are to invest NOK 50 million in Numerical Rocks.
Oceaneering International has acquired AGR Field Operations for approximately $230 million.
Timesway Group has offered $3.76 per share for the outstanding shares in Pansoft.
Schlumberger has acquired a Shell Technology Ventures-backed company, through-the-bit logging specialist ThruBit.
Total Safety has acquired Z-Safety Services.
The Abu Dhabi Global Environmental Data Initiative (AGEDI) hosted the 2011 ‘Eye On Earth’ summit last month. The group has implemented a new ‘international geospatial platform for sharing environmental data’ i.e. a website. The revamped ‘Eye on Earth’ was jointly developed by Esri, Microsoft, and the EU Environment Agency. The site provides tools for creating maps, accessing datasets and managing geospatial content. Content can be shared with the public, among groups or used privately. Stakeholders can use the network to develop policy, design plans and ‘take action.’ Under the hood, Esri’s ArcGIS Online runs in a Windows Azure cloud. EEA hosts and maintains the platform.
A ‘stakeholder-based certification system’ for oil and gas was launched late last year by Equitable Origin (EO) to promote higher environmental and social standards, greater transparency and more accountability. The market-based mechanism is designed to incentivize and reward best practices in the industry. EO has developed the ‘EO100’ standard as a basis for certification across environmental, social and sustainability performance. Consumers will be able to buy from companies that responsibly produce and source oil and gas, ‘just as they can buy Fair Trade coffee.’
The current (Jan 2012) issue of GE Measurement & Control’s Orbit magazine includes an introduction to cyber security. This begins with a bewilderingly long list of standards and regulations that impact cyber security. Risk management and defense-in-depth are essential strategies, with NIST SP 800-37 providing a framework for managing information system security risks. ‘Defense in depth’ involves multiple layers—network, host computer and application defense. GE uses ‘misuse’ and ‘abuse case’ scenarios to identify security requirements. Manual and automated code reviews, ‘fuzz testing’ and penetration testing are also useful. GE has begun to evaluate its security posture using the ‘building security in maturity model’ (BSIMM). This joint industry working group publishes a suite of recommended practices and self-evaluation tools. GE has also initiated an internal certification program to secure products early in the development lifecycle.
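Of the techniques GE lists, fuzz testing is perhaps the easiest to illustrate: hammer a parser with random input and verify that it fails gracefully rather than crashing. A toy Python sketch; the device message format and parser are invented for illustration:

```python
# Toy fuzz test: the parser should reject malformed input with a clean
# ValueError, never crash or accept garbage. Everything here is invented.
import random
import string

def parse_register(msg):
    """Parse a 'NAME=VALUE' device message; raise ValueError on bad input."""
    name, sep, value = msg.partition("=")
    if not sep or not name.isalpha() or not value.isdigit():
        raise ValueError("malformed message")
    return name, int(value)

def fuzz(parser, trials=500, seed=42):
    """Feed the parser random printable strings; count graceful rejections."""
    rng = random.Random(seed)
    rejected = 0
    for _ in range(trials):
        msg = "".join(rng.choice(string.printable)
                      for _ in range(rng.randint(0, 20)))
        try:
            parser(msg)
        except ValueError:
            rejected += 1   # graceful failure is the expected outcome
    return rejected
```

Real fuzzers (and GE’s process) are far more sophisticated, mutating valid inputs and tracking code coverage, but the principle is the same.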
Anti-virus vendor Kaspersky Labs has produced a white paper enumerating ‘Ten ways the IT department enables cybercrime.’ These include neglecting proliferating copies of data on USB sticks etc., failure to appreciate the value of data on mobile devices and being in denial about personal use of laptops and other mobile devices. It is a moot point whether these are really the IT department’s fault. Left to its own devices, IT would ban iPhones etc. and go back to diskless workstations without USB ports.
A new report from Industrial Defender and Pike Research highlights the need for an integrated approach to security, compliance and change management in industrial control systems. The report, ‘Convergence in Automation Systems Protection,’ found that as automation environments were developed over decades without a master plan, they now contain heterogeneous systems that are difficult to manage. This, combined with limited resources and the ‘exponential growth’ of intelligent device deployment, is creating an environment in which operators have limited experience.
Christian Michelsen Research unit GexCon has received approval from the US Pipeline and Hazardous Materials Safety Administration (PHMSA) for the use of its FLACS simulator for LNG vapor dispersion modeling scenarios according to federal regulations (49 CFR 193.2059). The approval culminates a multi-year effort which included the validation of FLACS against a total of 33 dense gas dispersion experiments, as specified in the model evaluation protocol. To date, FLACS is the only approved model for the simulation of all LNG vapor dispersion scenarios required for the siting of an onshore LNG facility in the United States.
GexCon continues to develop FLACS via a joint industry project (ExxonMobil, IRSN, Statoil and Total are partners) developing an incompressible solver, a parallel version of the software and improvements to dispersion and explosion modeling. GexCon is also involved in an EU Framework program working on CO2 releases from pipelines to study dense gas dispersion in terrain.
A next generation postprocessor, FlowVis, is under development at GexCon’s parent company, Christian Michelsen Research. Other ongoing R&D concerns methods and models to investigate offshore and onshore dispersion, explosions and fires. CMR is also looking to establish a Norwegian center of excellence for safety and security (NORCESS).
Presentations from last year’s FLACS user group and more from GexCon.
Absoft has won a 250,000 contract for Dolphin Drilling in Brazil to implement materials management and financial software.
Aveva and Riegl Laser Measurement Systems are to integrate Riegl’s laser scan hardware with Aveva’s portfolio of laser engineering and design software solutions.
Swedish ÅF AB is to deploy Aveva Plant as its principal design portfolio for new large and medium sized design projects. Brazilian Setal has implemented Aveva Net to manage engineering information management for Petrobras.
Wood Group’s Alliance Engineering unit has been awarded the detailed engineering and design of topside facilities for Williams Partners’ Gulfstar FPS spar.
Dong Energy used 3D visualization technology supplied by Aker Solutions to analyze and plan operations on a damaged subsea control module.
Cadac Organice has implemented its SharePoint-based solution for engineering document management at pipe and cable infrastructure provider Visser & Smit Hanab.
DHL has been named logistics partner of South Africa’s Mining Oil and Gas Services in the development of a new Oilfield Services Complex in Saldanha, South Africa.
Numerical Rocks has teamed with Drilling Technologies to offer reservoir property characterization services including advanced mud logging service, traditional and computational core analysis.
The Gulf of Mexico Research Initiative has issued a new request for proposals for research into the effects of the Deepwater Horizon incident. Up to $7.5 million per year is available. Funding comes from BP’s $500 million, ten year commitment—1801.
The Houston Technology Center and the NASA Johnson Space Center are teaming to accelerate the growth of emerging technology companies in the Houston region and to develop the insights required to support NASA’s long-term goals of increasing private/public collaboration. NRG Energy is supporting the effort with a $25,000 donation.
Southwestern Energy has engaged Noah Consulting to work on its ‘Sumit’ solution for well data management along with support for Southwestern’s new application environment.
BP has awarded a Foster Wheeler unit the engineering, procurement and construction management contract for an enhanced gas separation project (EGSP) in Scotland.
Woodside Energy has awarded a $150 million contract to FMC Technologies’ Australian subsidiary for the design, manufacture and supply of subsea production systems to support the Greater Western Flank Phase 1 Project.
Petrofac unit SPD has secured its first contract for the implementation of its drilling management software, WellAtlas, with an Aberdeen-based independent.
Integrated Energy Services unit has signed an agreement with Schlumberger’s Production Management division to supply the ‘emerging and growing’ production services and production enhancement market.
Yokogawa Engineering Asia has been awarded a contract for front end engineering design of an integrated control and safety system for the Arrow Energy LNG Project.
Speaking at the ANSI-ESO Joint Presidents Group meeting in Washington late last year, American Petroleum Institute standards director David Miller recalled the API’s 90-year history of standards-setting. The first drilling threads standard was published in 1924. Today the API maintains some 600 standards across the industry, striving for ‘openness, balance, consensus and due process.’
For the UK-based Oil and Gas Producers association (OGP), the Montara and Macondo incidents have underscored the need for ‘robust and comprehensive’ standards. OGP therefore welcomes the resolution of the International Regulators’ Forum (IRF) meeting last November in support of ISO/IEC standards as the basis for global offshore regulation. The IRF is setting up a standards subgroup to ‘engage’ with the OGP standards committee, the ISO/TC67 management committee and other relevant groups.
To minimize the impact of disasters, terrorist attacks and other major incidents, ISO has published a new standard for emergency management and incident response—ISO 22320:2011. Workgroup convener Ernst-Peter Döbbeling said, ‘Incident response requires participation of public and private organizations working at international, regional and national levels. Harmonized international guidance is needed to coordinate efforts and ensure effective action. ISO 22320 can be used by all types of organizations to improve their incident response capability.’
The US National Shipbuilding Research Program (NSRP) has announced a $1.4 million project to automate data exchange between CAD models and production planning systems, leveraging earlier NSRP ERP/CAD integration—1903.
Fiatech has just published a report titled, ‘Harmonization of pump schemas with the ISO 15926 reference data library,’ an ‘exploratory analysis’ of the work required to harmonize activities across ISO 15926 RDL, the automating equipment information exchange (AEX) cfiXML and the Hydraulic Institute’s standard for electronic data exchange for pumping equipment, HI-EDE 50.7. The mapping effort is a new Fiatech project for 2012.
Aker reports progress on its ‘myDrilling’ customer portal, now several weeks into a pilot project with client Seadrill and its West Phoenix ultra deep water semi-sub. myDrilling is an interactive portal under development by Aker’s Drilling Technologies unit that sets out to enhance collaboration with customers and to improve performance and safety. On login, users can access real time and stored data on installed equipment and collaborate on ongoing activities. myDrilling helps an operator returning from four weeks off the rig to get up to speed on current service orders, bulletins, and project status. Aker plans to productize the tool adding real time data from Riglogger, project support and support for mobile clients including BlackBerry, iPhone and Android devices.
Aker’s Sjur Henning Hollekim said, ‘Feedback from the initial 20 active Seadrill users is positive and we plan to increase the number of users to 100 over the next weeks and months and add functionality. We are still fairly early in the development of myDrilling, which is part of our eConcept program.’ More from Aker.
Maersk Drilling has selected simulators from Kongsberg Maritime for its Maersk Offshore Simulation and Innovation Centre (MOSAIC 2) training facility in Svendborg, Denmark. The frame agreement covers delivery of a range of simulators and ongoing support for offshore drilling operations training, extending an existing facility.
The deal includes an offshore vessel simulator that can be configured variously as the control room on-board a semi-sub, a jack up ‘Towmaster’ station or an anchor handling vessel. The simulator is mounted on a motion platform to give crews a highly realistic experience in a life-like setting. The MOSAIC includes Kongsberg’s K-Chief automation system, K-Pos dynamic positioning trainer and the Neptune engine room simulator with an interactive touch screen controller. A new riser management and crane simulator will be available in the final version scheduled for 2013.
Maersk Training operations manager Tonny Moeller said, ‘Lessons learnt in the MOSAIC 1 simulator have demonstrably improved our safety record. The new centre is expected to enable time and cost savings for Maersk Drilling because offshore crews and specialists can train on procedures and emergency situations prior to a mission.’ More from Kongsberg.
Speaking at the recent Nokia/Qt Developer Days conference in Munich, Mike Krus, principal software engineer with Midland Valley Exploration (MVE), offered attendees a geological fireworks display of colorful maps, cross sections and structural geological models. MVE’s software runs on both Windows and Linux in 32 and 64 bits. Qt enables a common toolkit across these platforms and a single common code base.
Qt’s WebKit browser code is put to good use, enabling co-visualization of MVE’s results with Google Earth data. The cross-platform nature of Qt has also been leveraged in porting to rugged tablets and other devices for field work. But Qt is not just a GUI. MVE uses the Qt Concurrent toolkit to farm out compute-intensive tasks across multi-core architectures for hassle-free parallel programming. Structural geology’s requirements for kinematic modeling stress a graphical toolkit to the limits. To date, MVE has relied on SIM’s Coin3D for its tools. Coin is in the process of being phased out and replaced by OpenSceneGraph. Another change is that since Nokia now only supports mobile platforms, MVE gets desktop support for Qt from Digia. More from Midland Valley.
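The Qt Concurrent pattern, mapping a compute-intensive function over a collection and letting the runtime spread the work across cores, is language-neutral. Here is the same idea sketched in Python with the standard library’s concurrent.futures, as a rough stand-in for QtConcurrent::mapped; the ‘work’ function is hypothetical:

```python
# Sketch of the parallel-map pattern MVE uses via Qt Concurrent,
# re-expressed with Python's standard library. A thread pool keeps the
# example portable; CPU-bound work would use a process pool in practice.
from concurrent.futures import ThreadPoolExecutor

def parallel_map(func, items, workers=4):
    """Farm func over items across a worker pool, preserving input order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(func, items))
```

The appeal, in Qt as here, is that the caller writes an ordinary function and a collection; scheduling across cores is the toolkit’s problem.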
FEI Company has announced a ‘core-to-pore’ imaging workflow for shale gas reservoirs. The petrographic workflow includes novel automated, high-resolution imaging of cores at scales ranging from centimeters to nanometers. Multi-scale imaging is key to relating the fracture network observed in core samples to the nanometer-scale distribution of pores.
Paul Scagnetti, VP of FEI’s Natural Resources unit explained, ‘The large data sets generated in the analysis preserve the spatial relationships between features at all scales. The relationship between permeability and pore network connectivity determines how hydrocarbon will flow to a fracture for production.’
The solution is implemented on FEI scanning electron microscope systems by adding the modular automated processing system (MAPS) imaging software and the Qemscan petrographic analyzer. A grid of high-resolution electron microscope images covering the sample surface is stitched into a single data set. 3D structural data from a DualBeam system can be overlaid, as can optical microscopy, micro-CT or cathodoluminescence data. More from FEI.
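The stitching step, assembling a grid of image tiles into one data set, can be illustrated with a toy sketch that concatenates equally sized tiles into a single mosaic. This ignores the registration and overlap handling a real system like MAPS performs; tiles are plain lists of pixel rows:

```python
# Toy mosaic stitching: tiles is a 2D grid of equally sized tiles,
# each tile a list of pixel rows. Real stitching must also register
# and blend overlapping tiles; this sketch simply concatenates.
def stitch(tiles):
    """Stitch a grid of tiles into one mosaic (list of pixel rows)."""
    mosaic = []
    for tile_row in tiles:
        rows_per_tile = len(tile_row[0])
        for r in range(rows_per_tile):
            # Concatenate row r of every tile in this grid row.
            mosaic.append([px for tile in tile_row for px in tile[r]])
    return mosaic
```

The point FEI stresses is that the stitched data set preserves spatial relationships across scales, which is exactly what a naive per-tile analysis would lose.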
Saudi Aramco’s ‘EXPEC’ research center has announced successful field testing of a ‘downhole drilling microchip.’ The device is a low cost combination of sensors and data recorder that can be injected into the drilling mud, recording mud pressure and temperature as it circulates down the drill string and back up to the surface. Aramco’s ‘drill pill’ has been developed in a four-year joint program between EXPEC and Tulsa University’s drilling research project (TUDRP). The idea developed when Aramco’s Shaohua Zhou met up with TUDRP associate director Mengjiao Yu at a meeting of the Society of Petroleum Engineers. Zhou explained, ‘Our idea was for a low-cost downhole data acquisition system that could help optimize mud and cement formulations while drilling and reduce data acquisition cost.’
The 7mm prototype pills travelled 6.6 km in an 8-3/8” wellbore to 11,050ft. The chips provided ‘meaningful and realistic’ dynamic bottom-hole pressure and temperature measurements. The drill pill is an ‘open’ platform that can be adapted for other mini sensors. More from TUDRP and from EXPEC.
Aker Solutions has opened a new drilling equipment simulator in Houston to provide offshore rig operators with safety and drilling efficiency training. The $2.5 million ‘state-of-the-art’ facility doubles the capacity of Aker’s Katy-based training center. The technology behind the simulator is supplied by Aker’s Performance Technology Center (PTC) unit, formerly First Interactive. PTC’s X-Factor tools include ‘mathematically correct’ models of rigs along with visionarium-type virtual reality environments.
Aker’s Glenn Ellis said, ‘Realistic, real-time visualization of drilling operations enables rig operators to make better and faster decisions. The result is safer operations, more efficient drilling and increased rig uptime.’ The new 9,800 square foot training center includes a 240° domed screen and real drilling control systems software. Commercial rigs are ‘meticulously’ recreated as a virtual asset, including equipment and control systems. Some 88 servers power dual simultaneous simulations. Aker also operates simulators in Brazil, Singapore, Norway and South Korea and, real soon now, in Baku, Azerbaijan. More from Aker.
Casper Wassink has just received a PhD from TU Delft for his iconoclastic work on non-destructive testing (NDT) and safety systems in refineries and other plants. Wassink found that potentially life-saving new technologies can spend ‘decades’ gathering dust before industry is forced to implement them—often following a major accident. Wassink’s study found accidents are not caused by technological or regulatory failure but rather by cultural issues and compartmentalization in the sector. Wassink explained, ‘The practice of dividing budgets into a number of separate stockpiles for inspection and maintenance and the ensuing cost cutting has led to a culture in which NDT departments have developed an aversion to innovation and look no further than the next quarterly report.
As a result, innovations that reduce safety risks but which only start to yield cost savings in the longer term are left on the shelf. To bring about a change in this undesirable situation requires the cooperation of the science, service provision and industrial sectors with the common aim of innovating more rapidly.’ What is often overlooked in industry is the fact that NDT is performed with two different objectives—a social one, to safeguard the public, and a commercial one, to optimize asset availability. Ideally, monitoring should be independent of industry, but current government policy is leaning in the opposite direction with the privatization of key NDT positions. More from ApplusRTD.
Rockwell Automation has released a free Safety Return on Investment (SRI) tool to compute potential savings and productivity gains from safety systems-related investment. The SRI was co-developed with machine safety consultant J.B. Titus.
Rockwell’s Mark Eitzman said, ‘Investment in safety programs and systems can reduce the financial and human impact of incidents in a facility. But engineers, plant managers and EH&S professionals struggle to accurately cost-justify safety investments. The SRI tool calculates the potential cost of an incident and demonstrates the financial benefits of a proactive safety program.’
The tool uses data such as cost of controls and software, equipment uptime, direct injury cost and indirect costs such as noncompliance fines and repair costs. Rockwell plans to update the tool to keep it relevant and will add a new category for quality improvement. Download the safety tool and read Titus’ presentation.
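The arithmetic behind such a calculator is simple in outline: weigh the cost of the safety investment against the probability-weighted cost of the incidents it prevents. A minimal sketch with invented figures and a deliberately crude model, not Rockwell’s actual methodology:

```python
# Hedged sketch of a safety ROI calculation. All parameters and the
# linear model are illustrative; real tools also factor in uptime,
# noncompliance fines, repair costs and indirect injury costs.
def safety_roi(system_cost, incident_cost, incidents_per_year,
               risk_reduction, years=5):
    """Return net savings over `years` from a safety investment.

    risk_reduction is the fraction of expected incident losses the
    new system is assumed to avoid (0.0 to 1.0).
    """
    expected_loss = incident_cost * incidents_per_year * years
    avoided = expected_loss * risk_reduction
    return avoided - system_cost
```

A positive result is the cost-justification Eitzman says engineers and EH&S professionals struggle to produce; a negative one signals the investment needs a longer horizon or a broader accounting of indirect costs.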