IT security specialist McAfee has just released a report, ‘In the Crossfire: Critical Infrastructure in the Age of Cyber War.’ The report, prepared by a team from the Center for Strategic and International Studies (CSIS), surveyed 600 IT and security executives from critical infrastructure enterprises around the world. Operators report that their IT networks are under repeated cyber attack, often by ‘high level adversaries’ and with ‘severe’ impact.
‘Theft of service’ cyber attacks are most prevalent in the oil and gas sector (75% of respondents), which also reported the highest rate of ‘stealthy infiltration’ (71%). Typically, service theft is carried out by distributed denial of service (DDoS) attack from a rented ‘botnet.’ Motives for such attacks may be mischief or financial gain. Their impact includes making websites inaccessible and disrupting e-mail, IP telephony and other ‘operationally significant functions.’ In the oil and gas sector, the latter include attacks on SCADA/control systems that could give hijackers control of operations, creating the potential for large environmental disasters.
The study analyzes the degree and impact of regulation on cyber security—and the involvement of foreign governments in such nefarious practices. Regulation is high in India, China and Germany, but (despite Homeland Security) lowest in the US. Most believe that foreign governments are involved in network attacks against their country’s critical infrastructure, with the US and China seen as the ‘most worrisome’ potential cyber aggressors. Cost is the biggest obstacle to ensuring security. But in oil and gas, lack of awareness of the problem is a serious issue. ‘Management does not understand the scale of the threat.’ In Saudi Arabia a remarkable 90% said that their sector was underprepared to some degree.
The report introduces a Security Measure Adoption Rate scorecard that evaluates security fixes such as encryption, authentication and application whitelisting. The SMAR evaluation showed China as leading the field in SCADA security. Chinese operators have adopted nearly three times as many key security measures as Indian and Spanish operators. The 2008 Conficker worm attack on Microsoft Windows-based systems was ‘a wake-up call’ as ‘it got into places that raised real concerns.’
Addressing the issue of SCADA/control system connectivity to the internet, Phyllis Schneck, McAfee VP and CSIS member, noted, ‘Remote access to control systems poses a huge danger. We must either protect them appropriately or move them to more private networks.’
Such ‘reperimeterization’ was a hot topic in our report from last month’s API oil and gas IT Security conference. The full McAfee/CSIS report and slides are available on oilit.com/links/1002_1a and 1b.
Speaking at the SMi Data and Information Management conference in London this month (full report in next month’s Oil IT Journal), Hampton Data MD Wally Jakubowicz offered a ‘magic formula,’ relating decision quality to parameters including data quality, expertise, data volumes, available time frame and financial resources. The formula was used to develop a taxonomy of decision types, from data-driven to ‘gut feeling’ based. Current data quality approaches fail to capture either the expertise of the decision maker or what data was used. Modern technology can help here, as user profiles and actions can be captured and used to extend a data object.
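Jakubowicz’s formula itself was not reproduced in the presentation, but the general idea of scoring decision quality against weighted parameters can be sketched in a few lines of Python. Everything below (the parameter names, the equal default weights and the linear combination) is a hypothetical illustration, not the Hampton Data formula:

```python
# Illustrative sketch only: the actual Hampton Data 'magic formula' was
# not published. Parameter names and weighting scheme are hypothetical.
def decision_quality(data_quality, expertise, data_volume,
                     time_available, budget, weights=None):
    """Combine Jakubowicz-style decision parameters (each scaled 0-1)
    into a single 0-1 decision-quality score."""
    factors = {
        "data_quality": data_quality,
        "expertise": expertise,
        "data_volume": data_volume,
        "time_available": time_available,
        "budget": budget,
    }
    # Equal weights by default; a real formula would calibrate these.
    weights = weights or {k: 1.0 for k in factors}
    total = sum(weights.values())
    return sum(weights[k] * v for k, v in factors.items()) / total

# Low data quality and volume but high expertise: a score toward the
# 'gut feeling' end of the decision taxonomy.
score = decision_quality(0.2, 0.9, 0.1, 0.5, 0.5)
```

A score like this only becomes useful once, as Jakubowicz suggests, user profiles and actions are captured alongside the data objects themselves.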
A one day workshop followed up on Jakubowicz’ presentation, investigating how the human/data interaction could be captured and augmented with relationships and time stamps. The idea is to propose a new open industry standard for metadata relevance, knowledge and data quality. Feedback from the workshop (participants included Aramco, Total, KOC, Dong, Vale and Schlumberger) was positive. Jakubowicz told Oil IT Journal, ‘Getting real metrics seems to be the elusive holy grail that many are looking for.’ The plan now is to initiate an E&P Metrics Interest Group, possibly under the auspices of the UK’s PILOT program. More from hamptondata.co.uk.
I was surprised when I read, in Steve Ballmer’s address to the Microsoft Global Energy Forum (more on page 7), his explanation of Microsoft’s latest marketing concept of ‘three screens and the cloud.’ Previously I understood the ‘three screens’ in Microsoft’s new paradigm for world rule to be a cell phone, a laptop and a workstation. Not at all! Screen three is a TV! While the TV as a ‘third screen’ makes a lot of sense for Microsoft, which has been working for years to get a bigger slice of the media gateau, the TV in the enterprise is harder to swallow. Maybe the ‘one size fits all’ approach to marketing has its limits!
One size doesn’t fit all for upstream data modeling either, as our report in this issue from the PPDM ‘What is a Well’ WIAW workgroup underlines. As PPDM CEO Trudy Curtis observed, ‘As an industry, we don’t agree on these fundamental objects.’
Of course, not agreeing on terminology is not limited to oil and gas. When I moved from London to Aberdeen, I was surprised to find that what I called a ‘turnip’ was called a ‘neep’ in Scotland. But that’s not all. A turnip can also be a rutabaga, a swede, or vice-versa depending on where you are and who you are talking to (1). The point is, whatever we decide to call a ‘turnip’ we are still likely to come unstuck in some localities and with some usages.
Listening to PPDM’s effort to cage the ineffable reminded me of an earlier attempt, POSC’s (now Energistics) Epicentre E&P data model. While I do not claim personal knowledge of Epicentre’s entrails, I understand that its modeling of wells and wellbores was authoritative at the time and may constitute ‘prior art’ of some use to today’s practitioners. Unfortunately, Energistics’ website refresh has all but written Epicentre out of its history, breaking inward-pointing links to seminal facets of the project. This (hopefully temporary) glitch is a shame for such a major modeling effort. Epicentre, by the way, was defined in the Express (3) data modeling language, generally considered to be as rich a tool as it was hard to master! It also embedded a strong E&P modeling body of knowledge going back to the American Petroleum Institute’s RP66, itself a standardization of Schlumberger’s DLIS log data format.
Tim Berners-Lee’s (TBL) idea of ‘linked data’ has seen some encouraging take-up from the US and UK governments. The US Department of Energy has launched openEI.org, an open-source, ‘linked data’-based website of energy data. In the UK, data.gov.uk seeks to provide a ‘programmable entry point’ into government data. I say ‘encouraging’ rather than, for instance, ‘world-shaking,’ because both sites have something of the skunk works about them. If you poke around a bit looking for something specific, you may be in for a run around.
The important thing about both initiatives is that governments appear to be pushing more data into the public domain, revisiting restrictive copyright issues. A subsidiary issue is the actual format in which the data is supplied.
You might like to reflect on the relative merits of different ways of publishing data online. If you are in a government department and have just received a Word document with a table of the current month’s oil and gas production data for your country, you could scan the document and include an image in a PDF file on the website. This is about the worst you could do. Better would be to put the Word document online—but this is far from optimal, as getting at the data in a table inside Word is awkward. Flash (as used on wiaw.org) is probably not the greatest thing either—especially for a standard! There are other options. Comma separated values aren’t bad. An HTML table is OK. But these still require frustrating editorial gymnastics to get at the data.
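To see why comma separated values are ‘not bad,’ consider how little code is needed to get at them. The snippet below, using only the Python standard library, totals a hypothetical monthly production table (the field names and figures are invented for illustration). Doing the same against a scanned image inside a PDF would require OCR and manual clean-up:

```python
import csv
import io

# Hypothetical production table published as CSV rather than as a
# scanned image or a Word table. Names and numbers are illustrative.
published = """field,oil_kbbl,gas_mmscf
Forties,1200,350
Brent,800,210
"""

# One call turns the text into a list of dictionaries keyed by header.
rows = list(csv.DictReader(io.StringIO(published)))

# Aggregation is now trivial; no 'editorial gymnastics' required.
total_oil = sum(int(r["oil_kbbl"]) for r in rows)  # 2000 kbbl
```

The same data buried in a Word table or a Flash applet forces every consumer to re-do this extraction by hand.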
TBL’s aim for linked data goes beyond just being able to ‘get at the data.’ The key idea is to be able to get at it programmatically. Data providers should provide an application programming interface (API) so that programmers can write programs, grab data across multiple sources, and ‘mash-up’ the data in interesting new ways. Now the question is, which API is right for the web?
Enter the W3C’s Resource Description Framework (RDF). We have written extensively about RDF in previous issues of Oil IT Journal, in reports from ‘Semantic Days’ conferences and the Chevron-hosted 2008 W3C ‘Semantic Technology in Oil & Gas’ event. But I have a confession to make. The promise of machine readability, ‘reasoning’ and ‘open data’ led me, like many others, to ascribe almost magical properties to RDF. In particular, it has been represented as (somehow) holding the key to solving the ‘what is a turnip (or well)’ problem.
A new book (4) from O’Reilly, which we will review in a future issue, sets the record straight on RDF. RDF is not about ‘reasoning,’ it is not even really about ‘semantics.’ RDF is just a very minimalist data modeling language—at the opposite end of the modeling spectrum from say, Express. RDF offers no more help in ‘disambiguating’ turnip or well terminology than previous modeling languages.
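The minimalism is easy to demonstrate. RDF reduces everything to subject-predicate-object triples, a model so spare that it can be sketched in plain Python without any semantic web library (the identifiers below are invented for illustration, not drawn from any real vocabulary):

```python
# RDF's entire data model is a set of (subject, predicate, object)
# triples. The 'ex:' identifiers here are illustrative only.
triples = {
    ("ex:well42", "rdf:type", "ex:Well"),
    ("ex:well42", "ex:operator", "ex:AcmeOil"),
    ("ex:well42", "ex:spudDate", "2009-06-01"),
}

def objects(subject, predicate):
    """Return all objects matching a subject/predicate pair."""
    return {o for s, p, o in triples if s == subject and p == predicate}

# Nothing in the triples themselves says whether 'ex:Well' means a
# wellbore, a completion or a surface location. The disambiguation
# problem the industry argues over remains untouched.
operator = objects("ex:well42", "ex:operator")
```

The triple store is trivially machine-readable and mergeable across sources, which is the real point of linked data; what it does not do is tell you what a ‘well’ is.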
Another relevant new book is ‘Data Modeling for the Business (5),’ to be reviewed in next month’s Oil IT Journal. Of particular interest is a sub-chapter by Mona Pomraning on data modeling in BP. The approach here is light-years away from the hard core modeling of Express or even PPDM. Hoberman and co-authors advocate what is almost a ‘touchy feely’ approach, with stakeholder involvement and multiple (four) levels of graphical modeling. ‘Logical’ models such as PPDM, MIMOSA and PRODML are second to bottom (above the physical model) and there are ‘high level’ and ‘very high level’ ‘C-Suite’ models above these. It seems as though the whole modeling focus has shifted from data to disambiguation.
(2) Unbelievably, a googlewhack!
(4) Programming the Semantic Web, Segaran et al., O’Reilly, 2009.
(5) Data Modeling for the Business, Hoberman et al., Technics Publications, 2009.
Around 35 turned out last month for the 1st PPDM European User Group, hosted by BP at its Sunbury, UK location. PPDM CEO Trudy Curtis kicked off the proceedings with an insightful analysis of the current situation regarding upstream master data management (MDM). Companies are faced with the challenge of multiple lists of ‘wells’ that use widely differing definitions. Such differences make it nigh impossible to work across subsurface, drilling, production and finance. Each domain only lists ‘owned’ wells. Company lists are thus incomplete, overlapping and use ‘competing and conflicting’ names. When you do try to pull everything together, you discover that folks are talking about different ‘stuff.’ This is the classic MDM problem that sparked off PPDM’s ‘What is a Well?’ (WIAW) workgroup, which kicked off in 2008. To demonstrate the nature of the WIAW challenge, Curtis set attendees a short practical exercise involving various combinations of producing intervals, lateral boreholes and completions. The results showed that beyond the simplest configurations, there was little consensus on exactly how the various geometries and completion configurations should be named. Curtis observed that ‘Some come to fisticuffs over such issues! In general, as an industry, we don’t agree on these fundamental revenue generating objects.’
The WIAW group is working to define terms, eliminate ambiguity and duplication by mapping usage across applications, and by figuring out how to address gaps and overlaps. The idea is to create a WIAW/MDM ‘Rosetta stone.’ Anadarko, BP, Chesapeake, Chevron, ConocoPhillips, Hess and Nexen were involved in WIAW Phase I and now have ‘baseline definitions’ in an Adobe Flash application on whatisawell.org. These include regulatory constraints on naming for the US, Canada and Australia. PPDM is keen to do the UK and Norway next. All WIAW is mapped to PPDM 3.8.
Fred Kunzinger (Hess) and Shannon Tassin (Noah Consulting) gave back to back presentations on how Hess has leveraged the WIAW work to ‘raise the bar’ on data management—transforming it from a ‘necessary evil’ to a critical factor in Hess’ success. Hess has four different well log databases and ‘no money to merge them—despite multi-million dollar logging jobs in a single GOM well!’ Hess’ upstream mission statement includes ‘standard processes, practices and technology to support its global business’ and ‘credible data for fact-based decisions.’ Hess engaged Noah Consulting to develop a technical information lifecycle (TIL) strategy before initiating a structured data governance approach. Hess spent 14 months on an off-the-shelf solution from Schlumberger and Halliburton but this ‘did not fit with what we were trying to do.’ In the end a bespoke solution was developed around PPDM and Volant’s EnerConnect middleware.
The solution, a ‘technically validated database’ (TVDB), uses the PPDM 3.8 data model and is now the hub of a constellation of application software including Hess’ Well Master (on PPDM 3.7), SDE Spatial, GeoQuest, Paradigm and Petrel. Validation is assured by assigning authoritative sources, standard naming conventions, data ownership, roles and responsibilities—all with documented procedures. Consistency with Hess’ standard technical architecture is ‘where battle lines are drawn!’ Kunzinger wound up by warning of possible confusion between the ‘single source of data’ approach and the TVDB. The latter embeds the former but crucially adds data governance.
Host Gavin Goodland noted that some of BP’s data acquisition categories cost ‘in excess of a billion dollars per year.’ The data thus acquired is carried as an asset and it is possible to calculate the cost of not managing data in terms of data asset degradation, time lost/wasted and e-discovery costs (BP is sued several times per year in the US).
BP carried out some competitor analysis on data management in the upstream by comparing return on investment with how centralized the data role was. The result was a ‘U’ curve: companies with a clearly centralized or a clearly distributed approach did well, while those with a ‘wishy-washy’ approach performed poorly.
BP units are increasingly following standard processes, as laid down in a ‘Standard IT Bill of technology’ that includes a BP operating model for data management. Data governance includes ‘smart standards,’ i.e. ‘you don’t have to standardize everything—particularly business processes.’ BP initially ‘wrestled’ with data management. Is it an IT or business function? In fact it is both! Technology and tools represent a small part of the equation and are very subsidiary to performance management inter alia. BP outsources much of its data management. Key elements of data governance include genuine authority to take decisions, visible executive-level support, and a business that owns and accounts for its data. BP has instigated data governance boards with representation from all business segments. On the topic of standards Goodland observed that ‘there are many and there is a temptation to drink in all the standards’ saloons.’ Often companies may participate in a standards effort but the benefits may be unclear. With help from Matthew West, BP has mapped the standards landscape across activities and domains and filtered down the potential watering holes. These include PPDM (Goodland is now on the PPDM board), various ISO standards including 15926, PODS, Energistics and others. What will actually be officialized is as yet undecided. The plan is to ‘harvest’ short term potential and later, to identify gaps and overlaps where harmonization can take place.
Matthew West (Information Junction and ISO 15926 luminary) described a ‘comprehensive approach to information quality,’ noting that managing data and documents represented essentially the same problem set. West enumerated some information quality (IQ) myths as follows.
1) IQ is hard. Not true—there is no intellectual challenge. IQ is about attention to detail—the problem is that it is not generally done and companies prefer to ‘reconcile’ data with armies of accountants.
2) IQ adds costs. Not true: it is not about adding checks, it is about getting it right first time.
3) IQ is about being more accurate. No, just fit for purpose. It is the lack of quality that adds costs—through having to ‘fix’ bad data or by making bad decisions.
In the end, quality is about meeting customer requirements—although these may involve educating the customer as to the risk of poor data quality. West recommends use of the ISO 9001 product quality standard since, ‘information is just another product.’ Quality needs to be built in to an Enterprise Architecture since ‘all IT is about information.’ More from ppdm.org.
Document capture and content management specialist User Friendly Consulting has published a booklet titled ‘Data Capture and Document Management,’ with ten tips for saving time and money. The words of wisdom stem from UFC’s 20 years of scanning, OCR and content management for companies including Halliburton, KBR, Oneok, TransWestern Pipeline and Southern Union Gas.
Those embarking on a document capture system should think beyond reduction in paper filing cabinets. A feature rich and correctly utilized system can be much more. Email capture is now a whole segment of enterprise content management (ECM) unto itself. A good system monitors emails and attachments along with fax machine traffic. Even ‘Office’ files (Word, Excel etc.) need ‘capture’ to be properly integrated into the workflow.
UFC’s oeuvre recommends the elimination of manual intervention where possible. Automation features such as image clean-up, database lookup and validation and text recognition (OCR) and extraction add to a project’s value.
Data capture projects scale badly and require attention to issues such as choice of hardware and software. The booklet covers topics such as extending Microsoft SharePoint with a data capture or document management system for an enterprise solution. These systems come at a cost, but a good design reduces integration costs and allows for flexibility.
UFC warns against low-cost, entry level solutions with limited customization functionality, preferring a ‘scalable, easily integrated, easily customized feature rich data capture or document management system.’ What might these be? Well the booklet is vendor-neutral, but a few hints of systems that have UFC’s seal of approval are to be found on the UFC website. Quillix Capture is used by UFC on a major ongoing project for Halliburton, augmented by UFC’s own ‘MuWave’ helper app. The Laserfiche document management system was deployed as a back end because of its SharePoint integration and its capacity to include remote capture. More from UFC.
Houston-based IT services, data and compute infrastructure provider CyrusOne has been chosen by Petroleum Geo-Services (PGS) for its North American data center. A 10,000 square foot ‘data hall’ in CyrusOne’s West Houston data center will house ‘mission-critical’ data and application servers in support of PGS’ worldwide exploration projects and operations. The center will provide seismic data processing, reservoir analysis, interpretation and electromagnetic services. CyrusOne claims a ‘secure, scalable, high density environment’ and ‘minimal’ environmental impact.
Other CyrusOne energy clients include Repsol, whose Kaleidoscope supercomputer runs on a CyrusOne cluster (OITJ Nov 08) and Dynegy who outsourced its IT infrastructure and business continuity systems to the service provider in 2004. Around the same time, CyrusOne signed with Schlumberger Information Solutions (OITJ Oct 04) to provide clients with ‘managed’ petrotechnical computing services. More from cyrusone.com.
Oak Ridge National Laboratory, Lawrence Berkeley National Laboratory and Los Alamos National Laboratory have collectively announced the creation of the Hybrid Multicore Consortium (HMC). The three US R&D units are convinced of the potential of current and emerging accelerator technologies. Hybrid multicore architectures have an ‘unrealized potential’ to deliver high-end production computing capabilities for the most demanding scientific applications. The HMC will address the migration of existing applications to accelerator-based systems and will identify obstacles to near term use.
A kick off meeting was held last month with presentations from the three labs, NASA and other members including Georgia Institute of Technology and the Swiss Federal Institute of Technology (ETH). More from computing.ornl.gov/HMC.
Oil and gas economics boutique Palantir Solutions is teaming with reservoir evaluations specialist Fekete Associates on a ‘seamless’ workflow spanning technical analysis, economic, financial and portfolio planning. Fekete’s FAST Harmony is to be interfaced with PalantirCASH. Harmony offers reserves evaluation and production analysis while PalantirCASH provides economic valuations of global portfolios.
Fekete VP Kevin Dunn said, ‘We are working with many progressive producers who want the efficiency and reliability that this solution will provide. The first implementations of these tools will give early adopters a competitive edge in the market.’
Fekete’s production forecasts flow directly into Palantir’s cashflow and portfolio models, resulting in an integrated planning system that is ‘responsive and dynamic.’ Changes in the technical or commercial environment can be modeled and updated on the fly as new business opportunities arise. More from fekete.com and palantirsolutions.com.
Halliburton’s WellDynamics unit has rolled-out ‘SmartWell Master,’ a supervisory application for the control and monitoring of SmartWell downhole equipment including permanent monitoring systems. SmartWell Master is based on Iconics’ Genesis32 web-enabled Human Machine Interface/SCADA software.
IBM Maximo for Oil and Gas has been updated with asset integrity management and integrated operations. The 7.1.1 release now integrates with the IBM Chemicals and Petroleum Integrated Information Framework.
New Century Software and LandWorks are to combine their software solutions providing a direct connection between pipeline right-of-way records, pipeline engineering and operations data.
Arm waving business process modelers should check out Future Strategies’ new ‘hands-on’ modeling templates. The printable shapes of BPMN modeling elements are said to be useful for group discussions and early planning sessions. More from futstrat.com
Emerson Process Management has announced a wireless video solution that integrates video streams from Industrial Video & Control’s relay server with its DeltaV digital automation system. IVC’s enterprise-class camera management software and rugged IP cameras are certified for hazardous area operations.
V10.0 of Pitney Bowes’ Encom PA ‘overflows’ with new features such as enhanced profile-spreadsheet linkage, data management and 3D visualization.
Foster Findlay Associates (ffA) has announced a synchronized release of its SVI Pro and SEA 3D Pro 2010 applications on Windows and Linux. The Linux release now leverages NVIDIA/CUDA-based seismic attribute computation and is compatible with Halliburton’s GeoProbe R5000 release.
IHS and SeisWare have developed a dynamic data link between Petra and SeisWare. The tools can now share well data, tops, grids, log curves and deviation surveys.
IDS has announced a new web-delivered drill site management tool, ‘SiteNet.’ SiteNet tracks a well site from permitting and initial survey to recovery and remediation. SiteNet tracks costs, connects locations to events and equipment, captures site surveys and approvals and logs communications and transactions. SiteNet integrates with IDS’s DataNet2 suite of web-based reporting tools.
FaultSeal has released FaultRisk Enterprise for in-house fault seal analysis. Users’ FaultRisk GUI connects over the network to a FaultRisk compute engine server in the data center.
The latest release (V7.0) of Heliosoft’s Seismaster Pro adds a stand-alone 2D/3D SEG-Y viewer, noise suppression using a ‘KL’ transform, attribute generation and principal component analysis.
MetaCarta has released its Geotagger geolocation based search engine on Red Hat Linux as a stand-alone product. The move unbundles MetaCarta’s technology from current appliance offering and ‘opens up new markets for the company with solution providers.’
Ikon Science has upgraded its RokDoc-ChronoSeis 4D modeling package with, inter alia, a reservoir simulator link and a model validator that checks the model for consistency, alerting users to missing logs, surfaces and attributes.
Emerson Process Management’s (EPM) Roxar unit has announced RMS 2010, a new release of its reservoir modeling system. New features include a well correlation system, enhanced structural modeling and a 3D gridder. Curiously the new release does not appear to leverage or integrate with EPM technology per se.
The 2009.1 release of Paradigm’s Sysdrill includes localization for Chinese and Russian speakers and integration with Paradigm’s Epos 4 data infrastructure.
Some 125 turned out for the 33rd Gas Lift Workshop, hosted by the Artificial Lift R&D Council, held in Houston earlier this month. Roman Molotkov (Weatherford) made a case for a ‘paradigm shift’ in gas lift analysis, with the roll-out of ‘WellSavvy.’ WellSavvy is a new take on the ‘digital engineer’ concept—this digital engineer is a robot! WellSavvy ‘perceives and acts on its environment without human intervention, analyzing the whole field in seconds and providing optimization recommendations.’ The trainable system monitors SCADA and other data sources and performs real-time pattern recognition and diagnosis. The WellSavvy server houses a knowledge base of ‘all potential conditions.’ The server ‘learns’ from the WellSavvy client as new data and situations are encountered, updating the knowledge base. The knowledge base addresses continuous gas lift for injection and production pressure operated gas lift valves and covers multiple inlet and outlet conditions. In all, some fifteen attributes are monitored. WellSavvy leverages some of the concepts co-developed by Oxy, Weatherford and IntelligentAgent, as we reported from last year’s SPE ATCE (OITJ October 09 and SPE 124926*).
Ken Decker (Decker Technology) traced the history of the Valve Performance Clearinghouse (VPC) which originated with the API 11V2 spec in 1995 and a joint industry project, administered by Decker Technology. JIP members now include Chevron, ConocoPhillips, ExxonMobil, Petronas, PTC, Saudi Aramco and Shell. Many gas lift software vendors now have VPC correlations built into their code. 46 valves have been tested to date, including some of the new high pressure and ‘barrier’ valves. Decker commented that a gas lift valve is a mechanical device: it obeys the laws of physics, whatever flow performance may be attributed to it. He warned that one inch valves do not perform the same way as one and a half inch valves, and asks if you are using the same design techniques for both. Current gas lift design methods do not make this distinction. Maybe you should!
More from the ALRDC meet in next month’s Journal and from alrdc.com.
President and CEO Claudi Santiago described 2009 as a ‘challenging year’ for GE’s Oil and Gas unit and for the industry in general. Demand was down and volatility affected investment decisions. 2010 is looking ‘encouraging’ as demand is ‘set to return to 2007 levels.’ The US rig count is up 37% since mid 2009 and E&P spend is up 10%. We now live in a ‘multi polar’ world with hydrocarbon demand growth coming from India, the Middle East and China. In 2020 China and India will consume the equivalent of all of Saudi Arabia’s production. Combined with our currently depleting reservoirs, this means that we will need ‘five times Saudi Arabia’s oil production’ to satisfy worldwide demand. This opens up the market for gas as a ‘bridge fuel.’ But for this to happen, we have ten years to find ‘four times Russia’s current gas production!’
Working on mega projects inside and outside oil and gas, GE has developed a process that ‘mitigates the risk of innovation.’ The process starts with customer ‘intimacy’ to ensure GE understands what’s needed. Next, appropriate domain expertise is assembled from scientists, engineers and systems integrators. GE then leverages technologies from sister companies (notably aviation, converting jet engines for use as gas compressors). GE Oil and Gas plans to spend $500 million on R&D over the next three years and will also benefit from its parent company’s global $5 billion per year R&D spend.
Michael Illane (Chevron) described ‘mega project’ performance on the Gorgon liquefied natural gas (LNG) project where GE has captured around $2 billion of project value. Alongside the initial eight wells, each with a 65 ton subsea tree, there is an LNG plant on Barrow Island and the largest carbon capture and storage project in the world, complete with 4D seismic monitoring. Mega project issues include the impact of subsurface risk on topside design and the potential large cost of rework. ‘Dis-economies’ of scale are an unfortunate reality. Illane noted that engineering companies ‘don’t do a very good job of managing productivity.’ In a Canadian oil sands project, less than 50% of time is actually on the job. ‘We need to put more effort on analyzing and mitigating risk, particularly of high impact low probability events such as security and labor relations.’
Jackie Mutschler described the impact of new technologies on BP’s E&P challenges. These include thin beds, high pressure/high temperature wells, hard to image traps, seals as reservoirs and ‘access issues.’ Wide and multi-azimuth seismic has been a big breakthrough for BP along with ocean bottom seismic. On land, ‘independent simultaneous sweep’ seismic has brought a five-fold seismic productivity hike. In production engineering, ‘BrightWater,’ (a joint Nalco, BP and Chevron development) has been used successfully in Pakistan, Argentina and elsewhere to shut off bad water flood pathways and optimize sweep efficiency. BrightWater has added 500 million barrels to BP’s books at a cost of $3 per bbl.
More generally, BP’s Field of the Future is ‘transforming the value chain.’ BP’s Gulf of Mexico digital infrastructure (fiber) incorporates information with workflows and is ‘changing how people work.’ The ‘real time challenge’ is the hardest, BP has around two million data tags worldwide. Production data feeds into reservoir models along with data on well bore conditions and shape. BP’s ‘advanced collaboration environment’ (ACE) brings it all together. ACE is used in the North Sea to perform holistic model-based analysis of slugging issues. The Field of the Future program has added 30,000 bopd net to BP.
A combined ‘legacy’ Hydril/Vetco Gray (now both GE) session conducted by Bob Judge and Jim Allison looked at trends in high-end drilling technology. GE’s kit was used in the current world water depth record of 3,051 meters. Ultra deepwater brings challenges of pressure, hydrostatic load, and bending load on the drill string. Wells are getting hotter (up to 500°F) and are encountering higher pressure formations. Deepwater day rates of $500k/day make rework and non productive time serious issues. One intriguing answer to ultra deep environments is ‘dual gradient’ drilling. A seabed-located unit and secondary riser system isolate the subsurface from the pressure of the drilling mud in the primary riser. Dual gradient drilling detects kicks sooner than surface systems. Elsewhere, the combination of remote diagnostics and increasingly instrumented blow-out preventers and other subsea components is extending the digital oilfield concept to the seabed. More from ge.com/oilandgas.
Last year, Oil IT Journal visited GE Oil & Gas’ Nailsea, UK-based ‘Smart Center’, a mini operations control room for conducting remote monitoring and diagnostics of subsea equipment. A typical Smart Center operation might involve a remote desktop connection to a device located half way around the world, along with a voice link to talk an on-the-spot engineer through commissioning or testing.
The Smart Center can also act as a large scale system emulator, leveraging GE’s ‘Mimic’ process simulation package, acting as an offline scenario emulator or in a risk avoidance mode for testing new software before rolling out in the field.
Today the Smart Center’s focus is for ‘occasional’ use in specific scenarios, but the plan is to work towards more real time production optimization, rolling-in software from SPT Group (Oil ITJ May 2009) and research in artificial intelligence underway at GE’s Shanghai Center.
We asked if the Smart Center might ‘compete’ with the majors’ internal operations centers. GE acknowledges that the Smart Center may not suit all. But the system is perfect for independents faced with immediate issues such as slug monitoring and hydrate build-up. As more compressors, booster technology and separation is happening on the seabed, remote monitoring is a given.
Today such systems are optimized manually, but this means staying well away from the envelope. As refinery optimization has demonstrated, automation allows operations nearer to the margin. One fact should make the potential clear—currently, a single high end device can stream around 20,000 data points, but typically, the operator only looks at half a dozen. Most Smart Center interest comes from remote assets. It is Baku or the Bass Strait rather than Aberdeen. More from ge.com/oilandgas.
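The ‘20,000 points in, half a dozen looked at’ situation amounts to filtering a wide stream down to a watchlist with limit checks. A minimal sketch, with invented tag names and thresholds (not GE’s actual data model):

```python
# Hypothetical one-sample snapshot of a ~20,000-point device stream.
stream = {f"tag_{i:05d}": 0.0 for i in range(20_000)}
stream.update({"comp_discharge_temp": 181.0, "comp_suction_pres": 12.4})

# The few operator-facing tags, with (low, high) operating limits.
watchlist = {
    "comp_discharge_temp": (0.0, 180.0),
    "comp_suction_pres": (10.0, 25.0),
}

def scan(sample, limits):
    """Return watched values and any limit violations from one sample."""
    view, alarms = {}, []
    for tag, (lo, hi) in limits.items():
        value = sample[tag]
        view[tag] = value
        if not lo <= value <= hi:
            alarms.append((tag, value))
    return view, alarms

view, alarms = scan(stream, watchlist)
print(view)
print("alarms:", alarms)  # discharge temperature exceeds its high limit
```

The point of automated optimization is precisely that the other 19,994 points need not be discarded: they can feed models and diagnostics that no human watchlist could cover.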
Herb Yuan’s keynote introduced Shell’s new approach to technology. This involves integrated workflows that embed Shell’s proprietary application portfolio while reinforcing Shell standards and data management policy. All of which is being crafted around Schlumberger’s Petrel and its Ocean development environment—both flagship oil and gas products for Microsoft. Shell’s Petrel/Ocean modules include a link to ‘nDI’ (an upgrade to Shell’s own 3DI seismic interpretation system), and Shell’s ‘Simple Visualization Software’ toolset for fracture network modeling and geomechanics, a curiously titled ‘Voice BodyWorld’ component and many others—in various stages of development. Modules are bundled and deployed with Petrel across Shell. For Yuan, the key enablers are the .NET platform and the Ocean domain-specific environment. The development leverages Microsoft’s VisualStudio/TeamSystem for project/source code management and documentation. The Shell/Schlumberger/Microsoft love-in dates back to the 2002 ‘Shellfish’ .NET pilot. OpenSpirit provides the gateway to third party apps and data sources such as SMT, Finder, OpenWorks, GeoFrame, ‘managed SEGY’ and Recall.
Reid Smith updated Marathon’s enterprise content management solution that was unveiled as ‘ViewPoint’ last year, but is now ‘MaraView.’ MaraView has extended its data/visualization focus with a dashboard that includes support for project teams and social networking functionality including a ‘colleague tracker,’ blogs, a myStocks component (!) and newsfeeds. Bing Maps, search and production KPIs complete the dashboard. Guiding principles behind MaraView include the fact that information should be managed as an asset, that standard processes and technology are beneficial and that the system ‘embeds legal and regulatory compliance in business processes.’
Andy Morley described how Baker Hughes’ ‘Knowledge WorkSpaces for Collaboration’ (KWC) originated in a 2002 move to eliminate internal ‘rogue’ websites. Today, the KWC portal integrates knowledge management, well data management, GIS, SAP, communities of practice and more. Baker Hughes, in collaboration with the SPE, conducted a technology review that led to the selection of SharePoint and the migration of around 1,000 internal Documentum eRooms. KWC is credited with successes including a global HR consolidation, a ‘GaugePro’ product launch workspace, SAP implementation tracking and SOX auditing and field-based knowledge capture, a template Wiki for capturing oil field and area-specific information. External KWCs support collaboration with customers and suppliers and eLearning. Baker Hughes is currently working to add content, improve ‘findability’ and integrate more with business and technical systems.
Wes Couch’s presentation outlined how Southwestern Energy Company (SWEC), with help from Stonebridge, has added a SharePoint to leverage its OpenText/Livelink-based well file solution. The company was challenged by poor information sharing between its operational teams and across disciplines. Livelink was deployed to provide ‘a solid and consistent framework’ for information management. SWEC’s technique is to use SharePoint to allow operational teams to manage their data and to develop their own websites. SharePoint’s content management and workflow approvals system were considered key. A Partner Information Portal was rolled out in 2008 to distribute documents to the appropriate working interest owners in a timely manner—replacing some 15,000 emails per week! Couch warned, ‘SharePoint is powerful, but it is not the solution for everything. An information architecture must be in place before the platform can be fully utilized.’
Microsoft CEO Steve Ballmer wound up the proceedings describing oil and gas as ‘a leading-edge user’ of IT with modeling, real time, visualization and collaboration ‘pushing the state of the art.’ A futuristic video showed the shape of things to come—full, wall-sized screens for collaboration and visualization, biometric security, real-time translation, ‘enhanced’ reality and use of sensor information. The ‘three screens and a cloud’ paradigm, the natural user interface, smart cloud and pervasive security promises a paradigm shift that will transform the way information gets used in the oil and gas industry. Advances in consumer technology such as the Natal Xbox peripheral will feed back into advanced environments like oil and gas. More from links/1002_2.
Last year (OITJ Feb 09) Shell’s Mike Hinkle described early trials of social networking. The idea then was to ‘do more with less, to reduce travel and connect with the deep expertise of an ageing workforce.’ Shell’s enthusiasm for social networking was backed up by a survey from Microsoft and Accenture that found ‘over 70% of respondents believed that collaboration and knowledge sharing are important.’ The 2009 survey revealed that organizations are still using ‘older means of collaboration’ like face-to-face meetings, e-mails and conference calls instead of snazzy social media.
Today, it is Shell group IT architect Johan Krebbers who is banging the collaboration drum noting that ‘80% of our teams are global, with members in multiple locations and we must provide a world-class collaboration capability.’ A new study from the same team found (again) that energy professionals ‘want more effective ways to share and communicate’ and that they are desperately waiting for social media and collaboration tools to help them do so. The 2010 study also found (again) that internal barriers inhibit effective collaboration.
Despite the fact that this year’s findings are the same as last year’s, Microsoft’s Craig Hodges described the 2010 findings as an ‘eye-opener.’ Maybe Microsoft should subscribe to Oil IT Journal to refresh its collective memory! An RSS feed is available for ‘seamless’ SharePoint integration. More from the Microsoft survey on links/1002_3.
Nic Snape is now CEO of 1Spatial as Michael Sanderson moves to Chairman.
Acorn Energy has appointed David Beatson as VP and CTO. Beatson was previously president of Confero Solutions.
AGR Field Operations has named Scott Thetford as Senior VP, Americas. Scott hails from Pace Global.
The Artificial Lift Company has promoted Hassan Mansir to VP Engineering and Technology.
Ricky Holloman has joined consultants Chief Outsiders. He was previously with Expro Group.
In an internal move, Apache Corp. has named David French VP business development.
Atos Origin has appointed Alexandre Gouvêa, previously with Orange Business Services, CEO for Latin America.
Best Energy has named Eugene Allen president and COO and announced the arrival of David Voyticky on its board. Allen previously ran Best’s Well Services unit.
Steven Warshauer has joined BNK Petroleum as director of geology. Warshauer was previously with Devon Energy.
Cindy Reece of ExxonMobil has been elected to chair Energistics’ board of directors replacing BP’s David Latin who retires. Reece is the upstream technical computing manager for ExxonMobil’s Technical Computing Company.
CSC has named Donald Purdy chief cyber security strategist. Purdy was previously president of DRA Enterprises.
CNOOC’s Zhou Shouwei is to chair DNV’s new Greater China Offshore Committee, a forum for the exchange of information on new technology and innovation relating to offshore engineering.
ETL Solutions has named Phillippa Hallewell as marketing executive.
Lisa Howat has joined FaultSeal as junior structural geologist.
CenterPoint Energy’s Cindi Salas, GITA president, and Oracle’s Xavier Lopez have been appointed to the US National Geospatial Advisory Committee.
Ron Rothman has been named president of Honeywell Security Group.
Ingrain has named Marcus Ganz COO and Gary Sinclair regional manager, Middle East and North Africa. Ganz hails from Schlumberger, Sinclair from Weatherford.
Petrophysicist Martin Storey is ‘moving on’ from Inpex. He plans to continue consulting from his Perth, Australia location.
Mohamed Hales is to head-up Invensys Operations Management’s new facility in Algiers. Alongside the regional sales and operations HQ are an engineering center of excellence, a technology showcase and a training facility.
B.J. Carney, formerly Geoscience Coordinator, Marcellus Shale at Chesapeake Energy is now VP Geoscience at Northeast Natural Energy.
Neuralog reports the following new sales hires—Dustin MacNeil (Canada), Bryan Mills (US), Juan Muniz (Mexico) and Renato Cerna (Ecuador).
Samantha May is now commercial manager at UK-based Oilennium.
Pearson-Harper has engaged three new team members: Keith Joseph (Business Development Manager), Brian Dagnall (Technical Author) and returnee Graham Smales (Technical Assistant).
PAS has appointed Jim Huff, previously with Invensys, as VP Technology.
FileTek has named Mark Seamans Executive VP and CTO. Seamans was previously head of Autonomy’s Cardiff business unit.
Kjell-Arne Bjerkhaug has been named MD of Kadme AS. Bjerkhaug joins Kadme from Geograf. Gianluca Monachese moves to the post of business development director.
Alexey Tyurikov is to head-up AspenTech’s new Moscow office.
RigNet has appointed John Bush as manager of business development.
Paal Kibsgaard has been named Schlumberger COO.
Seismic Micro Technology has appointed TengBeng Koid president, international. Koid hails from Ion Geophysical.
Since Schlumberger has taken over the running of UK’s DEAL website, users of the Firefox web browser are greeted with the following message ‘The DEAL data store currently only supports Internet Explorer. Please try again using this browser.’ Deal was set up to ‘facilitate’ access to UK oil and gas data.
Triple Point Technology (TPT) has acquired Softmar, a vessel operations and freight rate risk management solution provider. TPT has also acquired Enerbility Software, improving connectivity to EU energy traders.
Rio-based GeoQuasar Energy Solutions is to act as ARKeX’s representative in Brazil and Mexico.
Aspen Technology has ended its reseller agreement for Russia and the CIS with Hyperion Systems Engineering.
Foster Findlay Associates and TerraSpark Geosciences are to explore ‘parallel’ R&D and synergies between TerraSpark’s Insight Earth and ffA’s SVI Pro and SEA 3D Pro seismic analysis software.
Chinese automation services provider Recon Technologies has received a RMB 1.1 million (approximately $161,000) grant from Nanjing city. The award acknowledged Recon’s successful NASDAQ float.
Schlumberger has signed with New Tech Engineering for the provision of wellsite consultants and engineering services to its Integrated Project Management activities worldwide.
Stallion Oilfield Services has emerged from Chapter 11 protection following a successful ‘pre-negotiated’ restructuring agreement entered into late last year.
Technip is involved in discussions with the SEC and US Department of Justice regarding the ongoing investigation of TSKJ, a Nigerian joint venture. Technip recorded a €245 million charge in Q4 2009 reflecting the estimated ‘cost of resolution.’
Intellog has closed its initial round of fundraising for its planned development of web based search, data management and collaboration tools for petroleum producers. The company also announced that Timothy O’Rourke has been appointed Chair.
Badley Geoscience has announced OCTek, a joint venture R&D project with Prof. Nick Kusznir of Liverpool University’s Geodynamics Research Group. OCTek uses gravity inversion to produce maps of crustal thickness at rifted continental margins and their ocean-continent transitions. Maps will be available in their present-day positions as well as at the time of breakup, immediately prior to ocean-basin formation. OCTek should benefit new ventures’ exploration strategy in deepwater areas and in petroleum systems modeling. More on OCTek from badleys.co.uk.
Christian Michelsen Research, Statoil and the University of Bergen are teaming on ‘GeoIllustrator,’ a new software package for geological data concept analysis. GeoIllustrator will leverage ‘state of the art’ computer graphics and interaction hardware such as Wacom’s ‘Cintiq’ sketch-based gestural interaction device, to ‘help understand surface and subsurface geology.’ The project is funded by Statoil and the Norwegian Research Council. More from links/1002_8.
BP, the Massachusetts Institute of Technology and the University of Manchester are to team on oilfield corrosion R&D in an extension of BP’s ‘Inherently Reliable Facilities’ (IRF) program. The major research collaboration is endowed with an initial $2 million investment and BP intends to follow on with a similar amount for up to a further four years. The R&D program includes corrosion and corrosion-fatigue modeling, environmental cracking, novel coatings and new monitoring technology. Corrosion management is ‘crucial to achieving safe, reliable and efficient operation of processing facilities and infrastructure.’
Simon Webster, BP’s VP for the IRF program said, ‘Corrosion control, mitigation, and monitoring are significant concerns in our industry. The success of the IRF program depends on access to specialized materials and corrosion expertise and laboratory facilities.’
BP is also funding the Corrosion and Reliability Engineering initiative at The University of Akron, USA with a one-off, $500,000 grant. BP’s 20-strong IRF team is targeting ‘a billion incremental barrels of non-proven reserves’ by extending production facilities’ lifespan. More from links/1002_5.
The Houston Advanced Research Center (HARC) and Austria’s University of Leoben are teaming on an ‘Environmentally Friendly Drilling Systems’ R&D initiative, a component of the Environmentally Friendly Drilling (EFD) program. Program manager, HARC senior scientist Rich Haut said, ‘Opportunities for innovation here at home and abroad will broaden as operators and regulators learn about each other’s programs and technologies. Working with European countries will leverage our resources to improve environmental performance.’ EFD-EU chapter manager Gerhard Thonhauser (UoL) added that ‘European operations can learn about technologies used in environmentally sensitive areas, such as hydraulic fracturing techniques.’ More from efdsystems.org.
A white paper ‘Making Data Real’ authored by the ARC Advisory Group, on behalf of Aveva and Mustang Engineering, discusses managing information across the asset lifecycle—and across ‘handover,’ when the engineering contractor passes a new facility to its owners.
For ARC, information is deemed a ‘quintessential’ part of the Asset Lifecycle Management (ALM) process. For engineering companies like Mustang, IM is critical for a successful business. But the same processes and tools have the potential to provide ALM benefits beyond handover, to an owner operator.
Mustang’s project information management unit supports six business units and 14 locations around the world. In the old days, project information management (PIM) consisted primarily of engineering document control. Handover was a ‘loosely defined’ workflow with data ‘buried in hundreds of hard-copy vendor books and technical documents.’ Following handover, owner-operators had to ‘mine’ information manually from documents and drawings to populate operations databases—an inefficient and error-prone process.
Today, EPCs and owner-operators have sophisticated database applications to generate and manage their information—but this actually adds complexity to data handover. Electronic data introduces compatibility issues between different systems, requiring data reconciliation across engineering data management systems and taxonomies—a challenging process.
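The reconciliation problem described above boils down to mapping one system’s tag taxonomy onto another’s. A minimal sketch of the idea, with invented tag names and a hypothetical mapping table (not Aveva Net’s actual mechanism):

```python
# EPC-side equipment register, keyed by the contractor's tag convention.
epc_register = {"PUMP-0101-A": "centrifugal pump", "VLV-0201": "gate valve"}

# Owner-operator taxonomy uses different tag names; values to be filled.
owner_taxonomy = {"P-101A": None, "V-201": None}

# Explicit mapping table maintained during handover.
tag_map = {"PUMP-0101-A": "P-101A", "VLV-0201": "V-201"}

unmapped = []
for epc_tag, description in epc_register.items():
    owner_tag = tag_map.get(epc_tag)
    if owner_tag in owner_taxonomy:
        owner_taxonomy[owner_tag] = description  # reconciled
    else:
        unmapped.append(epc_tag)  # flag for manual review

print(owner_taxonomy)
print("unmapped:", unmapped)
```

In practice the mapping table itself is the hard-won artifact; entries that fail to map are exactly the items that once required manual ‘mining’ from vendor documents.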
Consequently, Mustang was looking for a data management system that could provide a master tag and document register for use by its own engineering groups and its owner-operator clients.
Following a proof of concept trial, Mustang chose Aveva Net as its IM platform because it was application neutral, scalable and promised improved future ALM workflows.
Mustang reports a significant improvement in the handover process. Instead of the traditional end-of-project ‘big bang,’ handover is a continual process throughout the design phase. Mustang’s engineers can start populating the commissioning database immediately after front end engineering design (FEED) and the initial handover of the instrumentation database to the client’s automation business unit. The progressive approach shortens the time required to achieve operational readiness, an important milestone for project owners.
One critical Aveva Net capability is highlighting data gaps. These include items that are shown on an engineering diagram but which have not yet been created in the 3-D model, or where there is a failure to comply with client handover specifications. The toolset’s ability to associate tagged objects and their documents provides insights into the cascading effect of a change. The new solution is described by ARC as ‘a quantum improvement in data quality and project execution.’ The full report is available from arcweb.com.
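Data-gap detection of the kind described above reduces, at its simplest, to a set difference between the tags on an engineering diagram and the tags in the 3-D model. The following sketch uses invented tag names and is an illustration of the idea, not Aveva Net’s implementation:

```python
# Tags extracted from the P&ID (engineering diagram) vs the 3-D model.
pid_tags = {"P-101", "P-102", "V-201", "HX-301"}
model_tags = {"P-101", "V-201", "HX-301", "HX-302"}

# Items drawn on the diagram but not yet created in the model, and
# model objects with no corresponding diagram item.
missing_from_model = sorted(pid_tags - model_tags)
missing_from_pid = sorted(model_tags - pid_tags)

print("on diagram, not modeled:", missing_from_model)  # ['P-102']
print("modeled, not on diagram:", missing_from_pid)    # ['HX-302']
```

The same comparison run against a client handover specification (a required-tag set) flags compliance failures before the project closes out.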
Chevron is to deploy the R5000 release of Landmark’s OpenWorks geosciences interpretation suite as a component of its ‘next generation’ interpretation environment where it will see use by some 1,500 of Chevron’s earth scientists. Jim Green, CIO of Chevron’s Energy Technology business said, ‘Chevron continues to make strategic investments in critical information technologies to drive performance and growth in our exploration and development. OpenWorks has been a part of Chevron’s upstream workflows for the past 15 years and it remains an important component of our IT portfolio.’ Landmark, a Halliburton brand, claims ‘more than 12,000 interpreters and 800 corporate clients’ for its OpenWorks flagship.
Tullow Oil has awarded Ikon Science a global contract for the provision of rock physics and quantitative interpretation technologies and services.
Newfield Exploration has selected Paradigm’s Geolog as its corporate standard for well log data management and petrophysical analysis. Newfield senior petrophysicist Dick Merkel said, ‘Geolog enhances our reservoir description capabilities with a single software environment, eliminating data conversion and transfer delays.’
Marathon Oil has chosen MicroSeismic Inc. (MSI) to execute two buried array seismic surveys in the Haynesville and Marcellus shale areas. MSI’s Passive Seismic Emission Tomography (PSET) technology will be used to provide passive seismic imaging of two 20 square mile areas and will be used for monitoring and analysis of hydraulic fracturing. MSI has conducted ten buried array programs in the past 14 months.
Flare Solutions has signed with Wipro Technologies to provide global application support and system administration of its flagship E&P Catalog Application for an unnamed ‘multinational oil company.’
Buenos Aires-headquartered Pluspetrol is to deploy Blue Sky Network’s D2000 GPS-based aviation tracking devices. The system will be used to stay in contact with its contracted helicopter fleet operating in Peru. The D2000 system includes a short code messaging interface, a ‘Mayday’ button and an Ethernet port for connectivity. Blue Sky operates via the Iridium global satellite network.
Woodside has awarded KBR a ‘basis of design’ study for its Browse Liquefied Natural Gas (LNG) Development. The nine month study includes a 12 million tons per annum onshore liquefaction facility in Western Australia and associated infrastructure and marine facilities.
WorleyParsons is to ‘significantly increase’ use of Aveva Plant. The deal extends a two year global contract placed by WorleyParsons’ Kuala Lumpur unit.
Shell has awarded Emerson Process Management a five year global framework agreement to act as main automation contractor on future global capital projects. The deal includes project and support services on brown and greenfield sites for distributed control systems and safety instrumented systems along with maintenance of existing systems. In a separate announcement, Shell has named Yokogawa as its, err, ‘main automation contractor’ for integrated production control and safety instrumented systems. Postscript—Shell says ‘main’ does not mean ‘exclusive.’
ConocoPhillips Australia has entered into a four year, extensible agreement with AGR Field Operations for the provision of professional engineering services and support on operations in Australia and the waters between Timor-Leste and Australia.
Fuel retailer Wayne Oil has chosen KSS Inc.’s PriceNet and PriceNet Web systems to automate and optimize fuel price management at 13 locations across the Southeastern US. The solution will be delivered via KSS’ application service provider (ASP) model.
Engineering design house Siirtec NIGI has selected Intergraph SmartPlant Enterprise for its oil and gas projects.
Cortex Business Solutions and partner Powervision Software are to supply Murphy Oil’s Canadian unit with a supply chain automation solution. Murphy’s suppliers will be hooked into the Cortex Trading Partner Network, leveraging Powervision’s workflow management package. The service providers also announced a similar deal with an unnamed ‘large Calgary-based Energy Trust.’
Cortex has also struck a deal with Helsinki-based Basware Inc. to expand its geographical coverage into the USA. Under the agreement, Basware customers receive purchase invoices electronically through an interface to the Cortex supplier base. The companies also announced Apache Corp. as anchor client for the new joint venture. The joint solution will utilize Basware’s Invoice Automation solutions and the Cortex Trading Partner Network to reach Apache’s top suppliers in North America. Basware claims over 850,000 users for its purchase-to-pay solutions.
Oildex reports record activity in 2009 for its ePayables solution with some 58 million transactions and a 200,000-strong user base demonstrating the popularity of its Software as a Service solution set.
BP Oil UK has signed with Malta-based Electronic Shipping Solutions (ESS) for the use of its ‘CargoDocs’ electronic shipping documents application. The first CargoDocs e-bill of lading covered a lifting delivery to BP’s Belfast terminal last month. BP has been involved with ESS for five years and contributed to CargoDocs’ development.
Technip is revamping its e-sourcing portfolio with the deployment of Ivalua’s ‘Buyer’ spend management solution. Ivalua Buyer will be used to manage Technip’s annual $4.4B spend. Technip’s international reach also represents an opportunity for Ivalua to ‘spread its wings’ according to Philippe Levy, CEO for Ivalua US.
Apache Corp. has joined the OFS Portal trading community.
SmartSignal has announced ‘Shield,’ a predictive-diagnostic package for the oil, gas and power verticals. SmartSignal Shield uses shared ‘blind’ data from customers—comprising what is claimed to be ‘the world’s largest base of predictive-diagnostic intelligence.’ SmartSignal has analyzed data from hundreds of millions of machine hours and tens of thousands of failures to correlate fault patterns with operating behavior. This database forms the intelligence of Shield.
Jim Gagnard, SmartSignal CEO, said, ‘Shield delivers predictive diagnostics while our competitors just talk about it. Only SmartSignal has the historical data to make this possible. We’re glad to take this major step forward and provide the intelligence our customers need so they can avoid surprises that damage their performance.’
Weatherford has unveiled a new online Tubular Connection Database (TCD) to provide clients and field technicians with the ability to verify connection size and makeup criteria in the field. TCD provides fast access to thousands of connections, ‘hevi’ and ‘spiral-wate’ drillpipe and collars, specification sheets and contacts.
The site is a one-stop shop for information on technical specifications and sizes of tubing, casing and drillpipe connections from various vendors. Information in the database is supplied directly from the connection manufacturers and is the latest published data. Current, updated information is easily accessible in one place, saving time and reducing error. In addition, TCD has a ‘mobile interface,’ enabling access from any web-enabled, personal-digital-assistant device or smart phone. More from tcd.weatherford.com.
Leica Geosystems has boosted its laser scanning offering in deals with software providers MicroSurvey and INOVx. 3D laser scanning is used to ‘capture’ and digitize real-world environments such as offshore production facilities, refineries and geological outcrops. The captured ‘point cloud’ data is then processed for incorporation in CAD/CAM and other interpretation and design tools.
Leica is to resell MicroSurvey’s ‘PointCloud’ CAD mapping package. PointCloud was developed using Leica’s point cloud engine (pcE) technology to manage large laser scan data sets from scanners or aerial Lidar.
The other deal announces a joint development initiative with INOVx to convert laser scan data into ‘intelligent’ plant models. The new products will leverage Leica’s Cyclone software and INOVx’s RealityLINx package. More from leica-geosystems.com/hds.
Petrobras has deployed Emerson’s Smart Wireless solution, a component of the PlantWeb digital plant architecture, to automate the collection of pressure, temperature, and vibration data from compressors at its Sao Mateus gas compression facility in southeastern Brazil. Some 56 transmitters feed into a Smart Wireless gateway providing round-the-clock performance information from two of the plant’s seven compressors into the DeltaV digital automation system.
The WirelessHART compliant installation replaces daily clipboard rounds and error-prone manual data entry into Excel.
Weldon Araújo, Petrobras Sao Mateus maintenance operator said, ‘We now have online access to real-time data about the compressors’ operation, and can review historical data and trends. Alarms are generated when problems occur, enabling staff to react to an abnormal situation. Connecting the existing monitors on the compressors to the central control system with cables was not feasible because of installation and maintenance costs. Also, since the plant is prone to flooding, a wired installation would be less reliable.’ The wireless solution took four days to install and saved Petrobras $200,000 over a wired installation.
Another Emerson deployment is reported by Nynas AB at its Nynäshamn, Sweden refinery. Here a ‘self-organizing’ wireless network reduced installation costs and enabled online monitoring of vapor pressure and levels in tanks. The deployment saved €10,000 over a comparable wired alternative. More from emerson.com.
FreeWave Technologies has released a new ‘data radio,’ the FGR2-PE, offering long distance industrial serial and Ethernet wireless connectivity using license-free spread spectrum. The FGR2-PE is said to be well suited for oil and gas SCADA applications.
Product manager Matthias van Doorn said, ‘The FGR2-PE is an ideal platform for the wireless transmission of critical data and offers the same proven reliability and quality that our customers have come to know and expect in our radios. Utility customers appreciate its industry-leading RF performance as well as the versatility and flexibility of the new FGR2-PE to support automation, and many other telemetry and SCADA applications.’ The FGR2-PE provides security options including AES encryption and proprietary spread spectrum technology for secure wireless data communications. More from freewave.com.
Alaska-based geospatial solutions provider GeoNorth is leveraging new virtualized hosting technology a.k.a. ‘the cloud.’ Virtualization means that neither GeoNorth nor its clients need to purchase or maintain physical hardware, speeding startup and reducing costs. Resources can be tuned for a given workload and augmented on-demand at minimal cost. GeoNorth has trialed several major cloud providers including Heroku, Gogrid, Rackspace and Amazon.
One high-profile client recently migrated to the Heroku cloud (an Amazon Elastic Computing reseller). Heroku offers a complete Ruby on Rails application stack. GeoNorth reports the new site is ‘much more responsive and easy-to-maintain’ on the Heroku cloud. Cloud-based development means less time wasted on IT niceties and more focus on building the business solution. Moreover, cloud hosting leverages state of the art ‘high-availability’ clusters backed up with best practices and redundant physical machines. More from geonorth.com.
Ontario-based N4 Systems has signed with safety equipment manufacturer MSA (originally Mine Safety Appliances) of Pittsburgh, to offer inventory tracking and safety-compliance documentation to fall protection equipment users. MSA is integrating N4 Systems’ Field ID system into its new ‘EvoTech’ full body harness. Field ID’s RFID will be embedded into the harness’ label pack, allowing administrators to track usage, inspection and distribution of harnesses as they transition between jobs and users.
MSA’s Robert Apel said, ‘Electronic logging of equipment inspection data is the future of safety compliance management. Field ID gives our customers new options and greater flexibility. Field ID is among the most advanced and easy-to-use safety management systems on the market today.’
N4 Systems CEO Somen Mondal added, ‘Field ID goes beyond traditional inspection systems—allowing interaction between all components of the safety ecosystem. The Field ID Safety Network lets all stakeholders share safety data and simplifies the compliance process.’
MSA sells approximately $1 billion of safety-related goods and services into many verticals including oil, gas and petrochemicals. More from msanet.com and fieldid.com.
Aberdeen-based Senergy has announced ‘WellScope,’ a wellbore planning and design tool that leverages computational fluid dynamics (CFD). CFD is a widely used technique that models fluid flow in and around components such as airplane wings, automobiles, drilling bits and now, producing wells.
WellScope produces 3D models of horizontal, deviated or vertical wells to evaluate inflow performance across different damage conditions and completion scenarios. According to Senergy, WellScope models, which may comprise up to 10 million cells, provide ‘accurate predictions’ of future well performance. Developer Michael Byrne, an SPE distinguished lecturer said, ‘Predicting well performance prior to drilling is one of the greatest challenges faced by the industry. Previous analytical solutions required simplification of reservoir layers, completion geometry and formation damage. This lack of in-depth modeling led to over simplification of the impact of formation damage and poor inflow prediction. Wellscope provides a more scientific approach and allows for informed decision making.’
WellScope models include fluid flow physics and chemistry and can be updated from an interactive GUI. According to Senergy, the CFD approach is ‘set to transform the industry’s approach to inflow performance management.’ More from senergyworld.com.
Grant Thornton’s 8th annual Survey of Upstream US Energy Companies includes a review of 2009 authored by Partner Loretta Cross. Cross described 2009 as ‘a year of significant challenges and transitions’ with prices down over 70% from the 2008 high and with many companies taking steps to restructure their financial obligations. An absence of buyers made disposals virtually impossible. Hedge positions became critical and banker price decks were at times higher than the traded commodity price. Chapter 11 filings were rife in E&P and renewables. Out-of-court solutions employed debt-for-debt and debt-for-equity exchanges to restructure balance sheets and obtain relief from creditors.
Today, there are signs of a return to normalcy. In September and October alone, energy issuers were able to access over $14 billion in unsecured corporate debt in the public markets. It is expected that the energy bond market will continue to grow through the second quarter of 2010 as companies take advantage of continued access to the capital markets to alleviate near-term pressures and upgrade balance sheets. Oil and gas reserve and property acquisitions have picked up steam. Joint venture and ‘asset monetization’ transactions are back. In summary, the exploration and production industry appears to have survived a short, deep down cycle and is now looking attractive for the capital markets at least as compared with the rest of the US economy (everything is relative!). More from gt.com.