I have to say that the demise of the W3C’s ‘semantic’ oil, gas and chemicals business group came as little surprise to me. What was a surprise was the role that Oil IT Journal played in its demise. A year or so ago, when the group initially formed, we signed up, causing something of a kerfuffle among the group’s members who objected to the presence of ‘the media’. Following a call from a senior W3C person who asked us nicely to leave, we did.
This was no big deal. We already had over a hundred articles on the semantic web, in and out of oil and gas, online, and we were quite happy to wait for the group to present the results of its deliberations in public.
Also, the W3C bash was not the first time that the oil and gas industry has attempted to get semantic. We previously reported on Chevron’s now defunct ‘Oilfield ontology repository’ (June 2007) that set out to ‘collect public oil and gas ontologies and make them freely available to the industry at large.’ And we provided comprehensive coverage of the earlier encounter between the W3C and the oil industry in our 2008 report in the Journal and in our Technology Watch report from the event, which I have just migrated into the public domain. So you can now download The Data Room’s Technology Watch report from the 2008 World Wide Web Consortium’s Workshop on the Semantic Web in Oil and Gas.
Of course, because we try to think about and analyze what we report, our articles are not the usual sales-pitch-cum-guff that came from the semantic community in the early days: ‘seamless, machine-to-machine interaction’ and ‘understanding,’ format-free data exchange and interoperability. I confess to having been guilty of great gullibility in my own early writing on this topic.
The real irony of the W3C group’s reluctance to go public is that while the ‘semantic web’ per se has failed to gain traction, its close cousin, the ‘open data’ movement, is gaining ground, at least in some quarters. So how are we going to square ‘confidentiality’ and ‘open?’ That’s an interesting question that spins off many issues for standards bodies regarding intellectual property rights over the standard itself and over the data used to develop and demonstrate the same. And of course over who pays for all of the above. That’s enough on standards for now. Tomorrow I’m attending the inaugural EAME meeting of the Standards Leadership Council so more next month I’m afraid.
We haven’t published an ‘Industry at large’ section since March 2011. I’m not sure how popular such general background information is with our readership. Many complain of having too much to read already—which cuts both ways in that if we do ‘industry at large’ it might cut down on your reading. Anyhow I thought that a look back through some of the material I have accumulated in my scrapbook might be more interesting than rants on standards.
Last night there was not much on the telly so we watched a DVD of the BBC’s dramatization of Anthony Trollope’s The way we live now. This relates the bodice ripping, deceit and scheming that accompanied the late nineteenth century boom in railway construction. The situation struck me as similar to today’s non conventional exploration. No, I’m not suggesting that Aubrey McClendon is Augustus Melmotte or that Chesapeake is the South Central Pacific and Mexican Railway Board. In fact what is most interesting about the railway mania is not that it was a con, as Trollope would have it, but that it was an unsustainable speculative bubble—more on railway mania from Wikipedia.
Unsustainable or not, the EU wants a bit of the action and is agonizing over earlier knee-jerk reactions that saw the banning of hydraulic fracking in France and elsewhere. Brussels and EU governments are starting to back-pedal on this with various entreaties to ‘respond’ to the shale gas ‘Eldorado.’ But because fracking is ‘bad,’ the debate has shifted to investigation of ‘other’ techniques. These include the use of propane and helium as frac ‘fluids.’ No doubt when the EU has approved these, folks will then discover that these techniques are a) not very effective and b) do nothing to address the real issues of non conventional development—resource use, traffic intensity and the disposal of frac effluent. Fracking with laughing gas, very appropriate.
On the environmental front, I am sure you noticed the interesting development whereby natural gas is displacing some coal in US power generation. Meanwhile, some EU countries go ‘green’ by eliminating nuclear power. This means that relatively cheap coal is ending up being burned in the EU’s power stations. It’s a crazy world all right!
Finally, hidden in the good news from the non conventional front is some very bad news indeed. I refer to the extraordinary satellite image that appeared on the front page of the Financial Times of shale gas being flared in North Dakota. This image made me recall Robert Skinner’s (former director, Oxford Institute for Energy Studies) address at the 2006 San Antonio SPE ATCE.
At the time it had been suggested that the booming conventional natural gas supply in Canada could be used to produce oil from the Athabasca tar sands. Skinner described this waste of a valuable resource as ‘unconscionable.’ Here we are now with plethoric natural gas just over the border in the US that is actually being flared off! You can view this either as off the scale of the ‘unconscionable’ or as a potential business opportunity. How about an XL-II pipeline taking natural gas from the Bakken to Alberta to fuel the Cold Lake shale oil retorts? To this casual observer that sounds like a better idea than just lighting up the night sky.
At Cold Lake in Alberta, Exxon-Mobil unit Imperial Oil is developing a huge oil sands resource using cyclic steam stimulation, steam-assisted gravity drainage and other more exotic technologies. Today, Cold Lake produces 140,000 bbls/day of bitumen from four major plants and 4,600 wells. Ongoing drilling from ‘megapads’ targets a doubling of production and generates information at a rate that is stressing Exxon’s data management to the limit.
Two presentations this month offered an unprecedented insight into how Exxon’s long-established data infrastructure is adapting to the ‘big data’ era of continuous drilling. Speaking at the SMi E&P data management conference in London, Jim Whelan described how ‘everything is changing’ in data and information management. ‘We used to have well-defined data sources but acquisitions and mergers especially in non-conventionals have meant that we now have a wealth of information in archives that are more or less well managed.’ Real time ‘big data’ is swamping established processes.
Speaking at the 2013 Microsoft Global energy forum in Houston, Exxon’s Bret McKee offered a new slant on information management in this high throughput environment leveraging Microsoft SharePoint. Here an information framework developed with help from Access Sciences is used to register incoming documents in SharePoint according to enterprise keywords from a standard list.
But the ‘back office’ as described in Whelan’s talk is a mighty enterprise data infrastructure based around a corporate database and well established procedures. These seek to capture and QC incoming data into the ‘InfoStore’ before it gets used and abused by the projects. Exxon’s environment includes Recall, Petrel, PetroWeb, ESRI, SharePoint and many proprietary applications, all connected by ‘StanLAN,’ a standardized network. The Cold Lake use case worked because of the time spent on data up front, notably on scanning around 2.5 million legacy documents. Users are now very happy, but the project will need ongoing resources to assure sustainability and combat data ‘entropy.’
The next major task is hooking up all the real time data coming from Cold Lake’s drilling—here Witsml has been mooted. Control system integration is also on the cards. A ‘digital energy’ environment is under construction at Cold Lake linking the Honeywell TPS/Experion PKS control system via secure StanLAN network devices from Enterasys. Exxon is developing a SharePoint site for its Cold Lake programmatics group, tasked with migration to a new ‘General managed environment.’ More on Cold Lake, and on SMi E&P Data Management and the 2013 GEF.
The French Petroleum Institute IFP Energies Nouvelles has taken delivery of a new supercomputer from Bull. The new machine, ‘Ener 110,’ will be located at Axel’One, part of the IFP’s downstream R&D unit in Lyon, France. It will be used to model internal combustion engine dynamics, catalytic conversion processes and geological CO2 sequestration.
Ener 110 is a massively parallel Bull-X B700 series DLC blade cluster delivering 110 teraflops of peak performance. Its 378 nodes (6,048 cores) are linked by a 56 gigabit Infiniband/FDR network. Local and remote 3D visualization allows for interaction and collaboration on large models. The facility cost €4.5 million, of which €3.1 million went on the hardware and €1.5 million on the electrical infrastructure.
Ener 110’s B700 direct liquid cooling blades make the system self regulating—no independent air conditioning is required.
The facility was funded with a €2 million grant from ERDF, the European Union’s Regional Development Fund, €500k from the Rhône-Alpes region and the remainder from the IFP. More on ENER 110 (in French), on ERDF and on the Bull-X B700.
At the GE Oil & Gas forum in Florence (more next month), remote monitoring was the big topic. How do you assure cyber security in this environment?
Actually, the measurement and control unit is not directly responsible for remote diagnostics, although we are involved with the Florence I-Center. M&C is a part of the control solutions business unit, where we cover oil and gas, power gen and nuclear. Our cyber security services are leveraged in new builds and upgrades of major facilities, with attention to security risks and change management, planning what infrastructure needs to shut down and for how long. We test for operating system vulnerabilities with anti-virus signatures, host intrusion detection and alignment with new regulatory requirements. We are now addressing network optimization, minimizing latency. In 2010 we introduced central account management, with all of the above running from a central device that also controls updates and backup.
So this is mostly about green field sites?
Actually even in the retrofit market, real estate is at a premium. Hence the importance of our October 2012 CAP Update release which introduced new virtualization technology along with a combined outer perimeter defense and firewall device.
What does virtualization mean on an offshore platform?
Space saving—in one such environment we have moved from 4 servers to 2 physical machines driving all the above services plus a roll-over capability. All integrated with the client’s architecture.
Does this mean integration with business systems and networks?
We tend to stay in the control systems space. But sure, these systems can broadcast information up to business systems. But you have to be careful here. You really don’t want to put commodity solutions in control systems. For one thing, very few people have the combined control systems and security background necessary to manage such a complex environment.
What operating system is used here? Is it a specialist real time system?
We use a core Microsoft operating system that we have hardened and tested to ensure that every control system works.
A few years back we reported from a cyber security event where the debate turned on ‘perimeterization’ vs. ‘deperimeterization.’ What’s the thinking on this today?
The focus is still on protection. Oil and gas companies in general follow the Purdue ‘defense in depth’ posture for control systems. This divides networks into protected segments and provides four layers of protection with different focus. Some parts of the network will only allow unidirectional data flow. Systems can be hardened or loosened up as needed. The mesh network approach uses ‘daisy-chain’ filtering with embedded controls.
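By way of illustration, the layered Purdue posture can be reduced to a simple policy check: zones are numbered levels, traffic moves one layer at a time, and some boundaries only pass data one way. A minimal sketch (the level names and the unidirectional boundary are illustrative assumptions, not GE’s actual zoning):

```python
# Illustrative Purdue-style zone levels: 0 = field devices, 1 = control,
# 2 = supervisory, 3 = site operations, 4 = enterprise IT.
LEVELS = {"field": 0, "control": 1, "supervisory": 2, "operations": 3, "enterprise": 4}

# Boundaries where data may only flow upward, e.g. a data diode
# publishing process data to the business network.
UNIDIRECTIONAL_UP = {(3, 4)}

def allowed(src, dst):
    """Permit traffic only between adjacent levels, and only upward
    across unidirectional boundaries."""
    a, b = LEVELS[src], LEVELS[dst]
    if abs(a - b) != 1:
        return False          # no skipping layers
    if (b, a) in UNIDIRECTIONAL_UP:
        return False          # downward flow across the diode is blocked
    return True

print(allowed("operations", "enterprise"))  # upward across the diode: allowed
print(allowed("enterprise", "operations"))  # downward: blocked
print(allowed("field", "supervisory"))      # skips a layer: blocked
```

The point of the ‘daisy-chain’ filtering mentioned above is that each boundary enforces its own subset of this policy, so a compromise at one level does not open the whole network.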
Tell us more about the CAP Update.
The new release reflects our continuous process improvement effort, adding central patch management, patch/tag inventory, push updates and reporting. We have added a common vulnerability scoring system (CVSS). This evaluates the criticality of a fix, whether a reboot is required and how long this took in the lab.
What’s the service level? Do you assure protection against anything?
No! The thing is that customers struggle with foundational security. There are lots of scary threats out there, customers are being targeted. Things are complicated further with BYOD and users accessing external media on the web. Employees can be a strong risk. So we support customers in their attempt to maintain a secure posture. Cyber security challenges are shared by vendors and customers alike. We advise on issues like allocating a security budget and partner selection. But no we can’t guarantee protection against all and any threats. We help customers understand their changing systems. But clients are the stewards of their assets.
And what of third party kit?
We operate on common control technology for turbomachinery and take a holistic approach to security. We do manage some third party components but our business is to provide a secure system. This requires a very good level of control over what is deployed and domain-specific knowledge in our field, rotating equipment, turbo compressors, gas/steam turbines and generators.
It sounds as though the ‘digital oilfield’ ideal is receding when this subset alone requires such focus.
There are plenty of challenges in understanding and operating this kind of equipment properly. We can communicate information out to other, central systems. Also we do integrate from our sub domain out to other systems. But always in the context of overall plant security.
Do you advise on new builds?
Our role is in securing GE kit and helping integrate with other stuff. We are not really in the consulting space but will offer input and advice to clients, often reaching out to consulting specialists to make sure they have the required expertise.
Can you name a flagship client?
It is hard to get people to talk in the security space. We work across several verticals globally. There are some very closed security conferences, reporting on this topic is a challenge!
The abstract of an upcoming poster presentation to be delivered at the AAPG provides a snapshot of digital field geology. The paper, authored by geologists from UK-based Midland Valley Exploration, traces the history of geological mapping to the advent of digital technology, in particular MVE’s own digital geologic mapping application, FieldMove.
The trial set teams of ‘paper based’ geologists against ‘digital’ colleagues and put them to work, mapping a small area in NW Scotland. The digitals used ruggedized tablet PCs equipped with Midland Valley’s 2DMove. Digital geologists were ‘slightly slower’ from outcrop to outcrop, but faster and more effective than paper based geologists overall. Paper based geologists had more work to do at the end of the day, digitizing or inking-in maps. Digital geologists had more software tools available for analyses, which ‘improved geologic understanding’ and planning of the next day’s work. The study’s findings were used in the specification and development of MVE’s FieldMove package, used for tablet-based field mapping. More from MVE and on the upcoming AAPG gathering in Pittsburgh.
In a blog post this month, Chris Jepps (Exprodat) and Rich Priem (Priemere GeoTechnology) exhort upstream data specialists to ‘stop making basic positioning mistakes!’ This authoritative advice is based on the OGP’s geodetic awareness guidance note (Oil ITJ July 2007). OGP has published examples of geodetic errors that have led to wells being drilled in the wrong lease, missing targets and even intersecting adjacent boreholes.
Such errors stem from improper use of geodetic datums and coordinate reference systems. Jepps and Priem believe that many ArcGIS users in the petroleum industry are making these mistakes ‘on a regular basis.’ While ArcGIS includes tools for managing geodetic transformations, their application is up to the user. Enter the Geodetic Assistant, a free ArcGIS extension that checks for typical geo-gaffes. The Assistant alerts users when a coordinate transformation is ambiguous or wrong and suggests a fix. Read the Exprodat blog and download the Assistant.
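To give a sense of scale: a datum mismatch of a few arc-seconds translates to tens of metres on the ground, easily enough to miss a target or stray into the wrong lease. A quick stdlib sketch (the coordinates and the three arc-second offset are illustrative numbers, roughly the order of a NAD27/NAD83 shift in parts of North America):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two points, mean-Earth-radius model."""
    R = 6_371_000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# A hypothetical well head near the Gulf Coast, with a 3 arc-second
# latitude error, i.e. coordinates quoted against the wrong datum.
err = haversine_m(29.0, -95.0, 29.0 + 3 / 3600.0, -95.0)
print(f"3 arc-seconds of latitude is about {err:.0f} m on the ground")
```

Real geodetic work of course goes through proper datum transformations rather than back-of-envelope distances, which is exactly the gap the Geodetic Assistant aims to close.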
While entry level tape has practically disappeared, enterprise tape is doing fine as reliability, capacity and bandwidth continue to improve. Eurotech’s Bernie Boyce offers this state-of-the-art snapshot.
The latest IBM TS1140 Magstar tape drive is being adopted by seismic contractors, datacenters and transcription companies. Magstar JC cartridges have a 4TB native capacity and are backward compatible with the well established 3592-E05. Also of interest is the new LTO6 solution with a 2.5TB native capacity. Both offer the ‘Linear tape file system.’ LTFS-based NAS* tape appliances add extra smarts like a non-proprietary file-based format, online, long term low energy storage, data protection, reporting and security. Another interesting development is Crossroads’ StrongBox, an easy-to-deploy tape-based NAS appliance.
Eurotech recently installed a StrongBox for a non oil and gas client to automate project document capture, storage and access. The solution leverages DocuProtection’s secure DMS that collects all relevant files and documentation into a single project container.
HP is also on the LTO6 bandwagon with its StoreEver portfolio. Moving from a ‘fragmented’ approach to data retention to a ‘converged’ data center reduces cost and complexity. However, convergence mandates a ‘comprehensive approach’ to data protection. HP’s David Scott observed, ‘Tape is experiencing a resurgence as companies struggle with big data archives and retention issues. StoreEver provides up to 44 petabytes per system, while removable media makes for virtually limitless capacity. Also LTO6 cartridges offer a shelf life of up to 30 years.’ HP puts LTO6 costs at 2.5 cents per gigabyte. A 2010 study by the Clipper Group concluded that ‘energy costs alone of disk-based systems exceed the total cost of the average tape-based solution.’ More from Eurotech.
* network attached storage.
Energistics’ standards work continues apace with Microml, a standard for microseismic recording and a new IT architectural foundation for all of Energistics’ standards. The architecture initiative is a reprise of an earlier attempt to homogenize Energistics’ modeling languages, EnergyML (Oil ITJ December 2010) which crashed and burned. The first deliverable, a shared completion object, will be jointly released by the Witsml and Prodml communities in Q2 2013.
An update for the Java Witsml client dev kit has been released with Witsml 1.2/1.3/1.4 compatibility, multi-threading and full ‘Crud*’ capability. The Prodml production data standard, somewhat dormant of late, was revitalized with a set of change requests from Saudi Aramco and interest from ExxonMobil.
Work with the Standards Leadership Council has led to a joint PPDM/Energistics project to deliver a Witsml connector for PPDM data stores. In a similar vein, OPC is collaborating with Energistics on OPC-UA data exchange. The Witsml SIG is also working on its own high-frequency, low-latency API for real time data. Work with the US FracFocus organization will likely result in a merge of FracFocus data and Witsml. Energistics is also coming up with a certification program, which appeared to garner interest from one major which reported having been ‘burned’ with out of date software.
* create, read, update, delete.
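For readers unfamiliar with how the ‘Crud’ verbs play out in a Witsml store, reads are expressed as query-by-template: the client sends a skeletal XML object whose empty elements mark the fields it wants back. A minimal sketch of building such a template (Python stdlib only; the uid and the requested fields are illustrative assumptions):

```python
import xml.etree.ElementTree as ET

WITSML_NS = "http://www.witsml.org/schemas/1series"  # Witsml 1.4.x namespace

def well_query_template(uid):
    """Build a query-by-template for one well: empty child elements
    mark the fields the server should populate in its reply."""
    ET.register_namespace("", WITSML_NS)
    wells = ET.Element(f"{{{WITSML_NS}}}wells", version="1.4.1.1")
    well = ET.SubElement(wells, f"{{{WITSML_NS}}}well", uid=uid)
    ET.SubElement(well, f"{{{WITSML_NS}}}name")      # ask for the well name
    ET.SubElement(well, f"{{{WITSML_NS}}}timeZone")  # ...and its time zone
    return ET.tostring(wells, encoding="unicode")

query = well_query_template("W-12345")  # 'W-12345' is a made-up uid
print(query)
```

In a live client the resulting string would be passed to the store’s SOAP read call; create, update and delete each take a fully populated object instead of a skeleton.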
Late last year, participation in the World Wide Web Consortium’s (W3C) Oil, Gas & Chemicals Business Group (OGCBG) dropped below critical mass and the group has been shut down. The group has published an analysis of what went wrong and a possible path forward. One problem was the $10,000 per company cost of membership (only Statoil and Chevron were prepared to pony up). Some thought that the subject matter of the business group was already covered in existing industry forums.
One way forward for the now 17-member unit is to downsize to a W3C ‘community group.’ The advantage here is no membership fee. But, according to the OGCBG analysis, there is an obstacle to this move as it would expose the group’s semantic reflections to the scrutiny of ‘the media.’ The current state of the group is that it is billed as a ‘community group.’ Read the OGCBG’s plaintive swan song.
Blue Marble Geographics has consolidated its geospatial ‘Desktop’ modules into a new ‘Geographic Calculator.’
The 10.2 release of Caesar Systems’ PetroVR adds unconventional project evaluation across wells, facilities, drilling and EOR.
Drilling Info has announced a new reporting service, DI Rigs. The service provides real-time location of thousands of rigs throughout the continental United States, currently representing around 80% of active domestic rigs.
Enersight has expanded its eponymous drilling support system. Well planning now integrates previous plan iterations with actual results computing reserves, production, cash flow and NPV over time. A drilling and completions scheduling module supports pre-drill planning across land, site construction, rig scheduling, and fracturing.
Exprodat’s Team-GIS Discovery adds oil and gas-specific workflows to ESRI ArcGIS for Server, providing data-discovery tools accessed from a web-mapping application, along with a suite of tools for search, map editing and annotation and high quality scaled hardcopy.
The 2013 edition of Safe Software’s FME Desktop adds support for X3D/VRML, 3D transformations and QA for 2D and 3D geometries. Point cloud tools now include calculations on intensity and echo count and filtering on color and intensity. Six new point cloud formats have been added, including ASTM E57.
Geologic Systems’ GeoScout 7.15 includes a Fluent/ribbon interface for the core analysis module with improved plotting, cutoff-based statistics and LAS and core gamma ray data import.
GSE Systems has announced ‘3Di-TouchWall,’ a software/hardware bundle that provides a highly immersive simulation experience similar to a cave automatic virtual environment (CAVE) at a ‘fraction of the cost.’
iVizEx from Larson Software is a CGM and TIFF viewer for the iPad and iPhone allowing viewing and annotation of large well logs and other geoscience images. Larson has also introduced various routines for manipulating PDF files in its Studio plotting solution.
Ipcos has a new service offering ‘Controllers@Max’ comprising a review of existing controls and an analysis of P&IDs, processes and historical performance. Ipcos then generates a list of control ‘pain points’ to be addressed in order to stabilize plant performance.
LMKR has announced GeoGraphix 2013 with improved scalability and reliability to address ‘exponential’ increases in data size, evolving geologic workflows and data mining requirements. The release is based on the highly scalable SAP Sybase SQL Anywhere 64 bit relational database. A new 3D topology engine supports more robust geomodel creation. LMKR is also working with Landmark to complete the transfer of GeoGraphix to LMKR.
Midland Valley has announced Move 2013 with new 2D/3D forward modeling and enhanced log viewing and correlation. Move now offers sector dependent nomenclatures for oil and gas and mining and a new Kriging tool.
Tofino reports that testing performed by Digital Bond has shown its security appliance resistant to the industry’s known sophisticated cyberattacks. The Modbus firewall withstood a variety of reverse engineering attacks, flooding, fragmentation and fuzzing—passing the tests with flying colors.
Merrick Systems has released a production data conversion toolkit to ease migration from Landmark’s TOW/cs to its own Production Manager.
Visual Solutions’ B2|Virtual Arena workflow collaboration tool introduces a new ‘high fidelity’ virtual presence technology for remotely shared desktops. B2|VA supports major oil and gas technical applications for science, engineering and operations.
AspenTech Software claims to have demonstrated ‘economically viable’ carbon capture at the Norwegian government’s Mongstad technology centre. Mongstad uses AspenOne software to reduce energy costs and lower greenhouse gas emissions.
Edition IV of the US Department of Energy’s carbon use and storage atlas puts the US’ potential for CO2 sequestration at over 2,400 billion tonnes. Storage in saline formations, depleted oil and gas reservoirs, and unmineable coal seams could store ‘hundreds of years worth’ of emissions from power plants and heavy industry. Some 225 billion tonnes could be used to enhance production from producing oil and gas fields.
The Energy Department’s ‘FutureGen’ carbon capture and storage (CCS) initiative has entered phase II. A coal-fired power plant in Illinois is to deploy ‘oxy-combustion’ technology to capture a million tons of CO2 per year. A sequestration site is also under development.
Energy Points has announced ‘EnergyPoints Integrated Reports,’ a new application that enables businesses to issue a single, integrated sustainability report spanning financial and environmental energy and resource use. The proprietary analytics platform converts water, fuel, waste, electricity and natural gas usage into a ‘universal’ key performance metric.
The US Department of Energy’s carbon capture simulation initiative (CCSI) has come up with a suite of computational tools and models that it expects will enable rapid deployment of carbon capture technologies. No less than 21 tools and models are available from the CCSI website. Potential cost savings from the CCSI toolset are put at $500 million.
Terra Technology has announced a ‘sustainability calculator’ to measure the environmental impact of inventory reductions, considered as possibly ‘the most important action a manufacturer can take towards meeting its corporate sustainability goals.’
The 15th edition of SMi’s E&P Data Management London conference offered a good snapshot of the state of the art, underscoring the trend to maturity of recent years. A good example was Jeanette Yuile’s presentation of Shell UK’s approach to data management in support of emergency response. Moving the data discipline into such a mission critical area involved changing culture. Data management used not to be perceived as a good career path compared with geoscientists and other ‘fancy people.’ Users were in general ‘too tolerant of bad practices.’ So Shell has now elevated the profession with better career paths and used a Kaizen-inspired approach to address data ‘waste,’ effect small changes and measure the results.

Yuile’s vision is of DM as a ‘formal garden’ with clear boundaries and patterns, and of data managers as gardeners working to ‘evergreen’ the data. The result is quality data and documents served from a central file plan and data atlas, backed up with information quality metrics and a business rule repository. It was then feasible to leverage this ordered dataset on an emergency response portal giving immediate access to a standard set of verified subsurface and facility information across 20 key data types for use in case of a well emergency. The data improvement project and ER portal were initiated by Shell’s drilling and development teams and are to be deployed globally. Yuile attributed the success to Shell’s new information management culture, now ‘just the way we do things.’
David Lloyd and Malcolm Bryce-Borthwick updated GDF Suez UK’s 2012 presentation on the use of information management frameworks, in particular the novel use of an ITILv3-based ‘partnership project management development framework,’ a set of Lego components to be built into different work streams. An assessment of GDF Suez’ data quality by Venture Information Management found the ‘usual things.’ Data was loaded directly to projects, bypassing the data team and leading to ‘skepticism and lack of trust.’ There was ‘severe Petrel project infestation,’ with over 1,000 projects on disk making it hard to know where an interpretation actually was. Developing the Cygnus field mandated improvement and got strong support from the CEO.

GDF Suez is now working on a data quality framework and on project rationalization. This involves a migration from OpenWorks to a ‘scalable long term replacement.’ Following the lead from Paris HQ, the UK unit opted for Schlumberger’s ProSource, with InnerLogix for QC. Work stream rationalization leveraged Blueback Reservoir’s Project Tracker, ‘an absolute lifesaver.’ This catalogued and rationalized 1,100 projects down to under 200 active on disk. New projects are now monitored at creation to ensure that regional reference projects of QC’d data are used correctly, with a ‘friendly conversation’ offered to users if needed. Along with its QC role, InnerLogix is used to transfer data from ProSource to Petrel and Kingdom—OpenSpirit also ran. Approved tops and horizons go back to the reference project via InnerLogix too. ProSource had, to an extent, been ‘oversold’ and work by third parties Venture and DataCo was required to debug. Bryce-Borthwick advised, ‘Be wary of vendor claims.’
Mario Fiorani reported that, a couple of years back, ENI’s users were having a hard time accessing basic geoscience data and were asking for access à la Google maps. Enter ENI’s ‘InfoShop Maps’ offering Google style search across ‘official’ data and text sources. Some 3.6 million items have been associated with an XY location and indexed with Microsoft Fast. The solution was considered more flexible than using MetaCarta. Geo-indexes are stored in a 1TB geodatabase. Fine tuning the geo-index took 60% of the eight month project. InfoShop Maps was developed by Venice, Italy-based OverIT.
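The principle behind such a geo-index is simple: each document carries an XY location, and a map search reduces to a bounding-box filter plus a keyword match. A toy sketch (titles and coordinates are made up; at ENI’s 3.6 million item scale a real spatial index in the geodatabase replaces this linear scan):

```python
# Illustrative in-memory geo-index: each document carries an (x, y)
# location, so 'Google style' map search becomes a box-plus-keyword filter.
docs = [
    {"title": "Well report A-1",  "x": 12.48, "y": 41.89},
    {"title": "Seismic survey B", "x": 12.33, "y": 45.44},
    {"title": "Core study C",     "x": 9.19,  "y": 45.46},
]

def map_search(docs, keyword, xmin, ymin, xmax, ymax):
    """Return documents whose location falls inside the box and whose
    title matches the keyword (case-insensitive)."""
    return [
        d for d in docs
        if xmin <= d["x"] <= xmax and ymin <= d["y"] <= ymax
        and keyword.lower() in d["title"].lower()
    ]

hits = map_search(docs, "survey", 12.0, 45.0, 13.0, 46.0)
print([d["title"] for d in hits])
```

The hard part, as the 60% figure above suggests, is not the query but assigning a trustworthy XY location to each item in the first place.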
IPL’s Chris Bradley and Trevor Hodges outlined how Composite Software’s data virtualization technology has been deployed by, inter alia, BP to hide the ugliness and complexity of SAP or Oracle LIMS. Virtualization ‘wraps’ such sources with a PPDM-based data model that feeds reporting and other apps. Hodges observed that, ‘If you have LGC apps they are moving towards data virtualization’ and/or a ‘data access layer.’ BP has around 1,000 applications in its exploration portfolio. Prior to the virtualization project, BP’s decision makers were creating their own data collections and storage systems, leading to ‘inconsistent reporting’ and low confidence. All of which was accepted as normal. Composite’s solution has brought about a ‘90% gain’ in productivity and a 40% reduction in development costs. BP has implemented a high quality data model addressing the MRO (maintenance, repair and operations) space and is now working on real time drilling with Witsml. Bradley concluded that data virtualization is only one component of enterprise information management, referring to a suite of papers he presented at another data conference recently, which we will be reporting on next month.
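The ‘wrapping’ idea is easy to caricature in a few lines: source-specific schemas stay behind a facade that exposes one common model, so reporting code never sees them. A toy sketch (the field names and rows are invented, and bear no relation to BP’s actual model):

```python
# Illustrative data-virtualization facade: two mock 'source systems'
# with different schemas are exposed through one common model.
sap_rows = [{"WERKS": "P-001", "MENGE": 120.0}]    # mock SAP-style rows
lims_rows = [{"site_id": "P-002", "qty": 75.5}]    # mock LIMS-style rows

def virtual_view():
    """Yield rows mapped into one common (facility, quantity) model,
    regardless of which source system they came from."""
    for r in sap_rows:
        yield {"facility": r["WERKS"], "quantity": r["MENGE"]}
    for r in lims_rows:
        yield {"facility": r["site_id"], "quantity": r["qty"]}

rows = list(virtual_view())
print(rows)
```

A real virtualization layer adds query push-down, caching and security on top, but the payoff is the same: one model, many sources, no ad hoc data collections.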
Samit Sengupta (Geologix) related a tale of ‘how Witsml saved the day’ for a deepwater West African operator. Rig sites still usually supply only binary Wits, which is error prone and harder to use. Geologix managed to add metadata and translate real time Wits feeds to Witsml, creating curve data and mud log objects on the fly. A ‘cloud’ infrastructure, WellStore On-line, feeds conditioned Witsml on to users, with some log processing for gas, pore pressure and net pay en route. The system is still exposed to the inbound Wits feeds and care is required on a crew change, when the channels may get swapped.
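To see why channel swaps hurt, consider the text-framed Wits level 0 flavor: a ‘&&’ line opens a record, each data line carries a four-digit code (two-digit record id plus two-digit item id) followed by a value, and ‘!!’ closes the record. The meaning of each code lives in an out-of-band channel map, which is exactly what can silently change on a crew change. A minimal parser sketch (the channel map and feed are illustrative assumptions):

```python
# Illustrative mapping of Wits item codes to mnemonics; the rig-site
# configuration defines these, and a crew change can silently swap them.
CHANNEL_MAP = {"0108": "bit_depth_m", "0110": "rop_m_per_hr"}

def parse_wits0(feed):
    """Parse Wits level 0 text: '&&' opens a record, '!!' closes it,
    data lines are a 4-digit code followed by the value."""
    records, current = [], None
    for line in feed.splitlines():
        line = line.strip()
        if line == "&&":
            current = {}
        elif line == "!!":
            if current is not None:
                records.append(current)
            current = None
        elif current is not None and len(line) > 4:
            code, value = line[:4], line[4:]
            current[CHANNEL_MAP.get(code, code)] = float(value)
    return records

feed = "&&\n01083650.40\n011012.5\n!!\n"
recs = parse_wits0(feed)
print(recs)
```

Nothing in the feed itself says what ‘0108’ means, which is why adding metadata and promoting the stream to self-describing Witsml objects removes a whole class of errors.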
Jess Kozman (Westheimer and Adnoc unit Mubadala Petroleum) described a ‘standardized data platform for an ambitious company.’ Previous case studies have failed to establish a correlation between data management and financial performance. Kozman believes he has discovered why. He has added a ‘complexity’ metric to the analysis, encompassing ‘technology, company size, geographic diversity and focus.’ For instance, the complexity of a pure play domestic exploration shop will be less than that of an international outfit.
Armed with his findings, Kozman showed management how they could increase production with better data management. The study also pinpointed a lack of resources (not technology) as Mubadala’s true problem. A change management process à la Harvard Business School was enacted. This process focused on three ‘quick win’ projects, all people/process related rather than software or technology. ‘Folks expected me to recommend a new data model…’ While there was no technology spend in the first six months of the project, a company-wide rollout of SharePoint 2010 was hijacked and used to plot operated licenses and track progress with graphics and Google Earth.
Jim Whelan took a very top down approach to ExxonMobil’s data management, drafting a letter to be signed by CEO Rex Tillerson that established a strong data ownership and governance model. This has resulted in a standard, global environment of processes and tools and an ongoing effort to continually enhance data in Exxon. Incoming data goes to the ‘data room’ for QC and legal ownership verification. Raw data goes to Exxon’s ‘InfoStore’ environment before loading to Petrel. InfoStore comprises the Exxon subsurface master DB, a log database (Recall), a seismic database (PetroWeb) and a standardized LAN. Interpreted and cleansed data goes to the ‘cross function database’ (XFDB), which is used in exploration, production and development and by the asset. Chunks of XFDB data can be carved-out for sale as required. The system has proved its worth in the unconventional arena (see this month’s lead). A good dataset can add tens to hundreds of millions of dollars to the sale price in a major transaction. Front end tools such as the SharePoint-based ‘ShowMe’ offer a GIS interface to Exxon’s crown jewels. Blueback’s project management package is being tested on Exxon’s plethoric Petrel projects (5,000 at the last count), ‘There is so much to do and so little money!’
Kishore Yedlapalli revealed that today Shell has around 50 petabytes of data online and this is set to grow tenfold in the next three years. In general, the industry’s data is in poor shape. Often, quality is not even measured and folks shy away from corporate data sources. Many are too busy to reach out to data suppliers to explain requirements and fix errors up front. The push for improvement is coming from the top. CEO Peter Voser said recently that Shell needs to ‘improve processes and use data better.’ Shell is implementing a single top-down business oriented KPI per organizational unit, along with bottom-up reports of errors and remedial hints. These are rolled-up into global, regional, asset and data source traffic light/KPIs. Data management is an ‘eternal’ issue, and should not be treated as a ‘project’ but as continuous improvement. Despite Shell’s KPI fixation, ‘You should not believe in green traffic lights, often only a small data sample is actually checked.’
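The roll-up Yedlapalli describes, bottom-up error reports aggregated into traffic lights per data source, asset, region and globe, can be sketched very simply. This is an illustration of the pattern, not Shell’s system; the thresholds are assumptions.

```python
# Minimal sketch of a traffic-light KPI roll-up: per-source error
# rates map to lights, and higher levels take the worst light seen.
ORDER = {"green": 0, "amber": 1, "red": 2}

def light(error_rate, amber=0.02, red=0.10):
    # thresholds are illustrative assumptions, not Shell's values
    return "red" if error_rate >= red else "amber" if error_rate >= amber else "green"

def roll_up(lights):
    # an asset (or region, or the globe) is only as good as its worst source
    return max(lights, key=ORDER.__getitem__)

sources = {"wells_db": 0.01, "logs_db": 0.05, "seismic_db": 0.12}
asset_lights = {name: light(rate) for name, rate in sources.items()}
print(asset_lights)
print(roll_up(asset_lights.values()))  # asset-level light: "red"
```

The caveat in the talk applies directly: if `error_rate` is computed from a small sample, a green light proves little.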
Neuralog’s Robert Best reports that support for Finder will stop at the end of the year, although no one’s sure which year! In any event, the time is ripe for a shift from an ‘end-of-life’ legacy system like Finder to a PPDM-based solution (read NeuraDB). Such a modern solution allows reference value management, business rules and versioning to promote operated wells over public data sources. Upstream IM presents a broad problem requiring a customizable solution. Mapping from Finder to PPDM can leverage NeuraDB, Informatica or ETL. After migration, workflows can be tested through to SAP, the CDB and on to Petrel. A straw poll showed that maybe three attendees still use Finder, although there are still users in the States and lots in the Middle East.
Dave Wallis (OFS Portal) traced the history of e-business standards in the upstream ending up with PIDX, the global forum for IT standards for oil and gas e-business. PIDX was originally based on EDI but is now ‘all XML.’ It has been successful, one major’s annual PIDX invoices are worth $1bn and Chevron uses the protocol for 98% of its business. PIDX documents exist for purchase orders, invoices and field tickets. For smaller suppliers, connectivity extends to QuickBooks and Excel. PIDX also manages units of measure, currency and non repudiation. The free, technology-neutral standard works with SAP and Oracle Financials. PIDX 2.0 has just launched with mobile transactional capability and the inclusion of UNSPSC codes for ‘more granular’ asset tracking. Downstream, PIDX maintains the refined products code list for the industry covering ‘100% of the US Market.’ The next PIDX meet in April will be in the prestigious RAC club in London.
* special interest group.
** a reference to Matthew West’s book?
*** maintenance, repair and operations.
The 2012 EMEA Honeywell Users’ Group held in Istanbul, Turkey late last year heard Ignace Verhamme’s overview of Honeywell’s Experion PKS Orion, a new ‘universal’ control system that is claimed to replace plethoric legacy IO systems. Orion offers enhanced operator displays, alarm management, processing and engineering efficiency. On the IT side, virtualization is the name of the game as was explained by Paul Hodge.
Virtualized system engineering is recommended for new builds or major retrofits, where it reduces computer infrastructure and improves project ‘agility’. Virtualization allows designers to separate functional design from physical deployment. Virtual field acceptance testing on an onshore staging environment in the data center allows design freeze dates to be extended. This impacts hardware procurement as final equipment is not required until late in the project cycle, when its specifications are well established. On site, virtualization and thin client operator workstations reduce the hardware footprint and energy requirements. High availability, fault tolerant blade servers can be reconfigured from engineering design templates to provision additional virtual machines on demand. Already, over half of Honeywell’s major clients are virtualizing and an estimated 75% of all servers will be virtualized by 2014.
Richard Siereveld’s presentation on terminal automation highlighted use cases in Saudi Aramco and ConocoPhillips. He argued against the supplier fragmentation that typifies many major terminals. Multiple vendors make for multiple ‘point solutions’ and, ‘point solutions are pointless.’ Using Honeywell as the main automation contractor, ConocoPhillips reported safer, more efficient operations with a tenfold improvement in a batching KPI. For Aramco, Experion PKS enabled a new loading system along with SAP integration to be deployed in record time.
Andy Coward’s case history of HollyFrontier’s Navajo refinery described how long-established ‘mandraulic’ control by experienced operators was incapable of adapting to new processes. As time went on, more and more loops were stuck in manual mode and safety incidents were rising. Navajo decided to implement a regulatory control improvement program using Honeywell as MAC. This required fixing a lot of out-of-spec equipment before implementing a very successful loop tuning program with OpeTune. This has resulted in much more stable loop performance, better refinery throughput and environmental compliance. Read the HUG presentations.
Thomas Voytovich is to assume the newly created position of executive VP of international operations for Apache.
Stephen Jennings has resigned as chairman of the AspenTech board. He is replaced by Bob Whelan.
Board chairman Chad Deaton is to retire from Baker Hughes. Martin Craighead will take on the post in addition to his current roles of president and CEO.
Dustin Auch is to head-up Braun Intertec’s new office in Minot, North Dakota.
Sophie Zurquiyah has joined CGG as senior executive VP of its Geology, Geophysics & Reservoir division. She was previously with Schlumberger.
Co-founder, CEO and president of Chesapeake Energy, Aubrey McClendon, is to retire. He will continue to serve as CEO until a successor is found.
Maria Lindenberg has been appointed chief procurement officer of Chevron, succeeding Leo Lonergan, who has retired. She will be based in Houston.
Andrew Lundquist has resigned as a director of Pioneer to join the executive leadership team of ConocoPhillips. He succeeds retiree Red Cavaney.
Herbert Pohlmann has stepped down as a director of Coil Tubing Technology, but will continue to serve as a member of its advisory board.
Richard Pattarozzi has been appointed to the board of Environmental Drilling Solutions. He is also on the boards of FMC Technologies, Tidewater and Stone Energy.
Patrick O’Brien is the new CEO of the ITF technology facilitator. New ITF members include CNR, Expro and FMC Technologies.
Serge Rambaud of FairfieldNodal is to relocate from the US to France to promote its cable-free seismic systems in Europe.
SeisWare International has recruited Mike Tribble as senior business development manager and Hayley Marie Sands as sales consultant.
New members of the Fiatech advisory board are Peter Blake, Zuhair Haddad, Tom Hannigan, Patrick Holcomb, Bill Muldoon, Cameron Rezai, John Sanins and Ray Simonson.
Facility Solutions Group has promoted Tiago Silva to commissioning manager.
Franz Cremers has quit the supervisory board of Fugro, sparking a 10% slide in the company’s share price.
Hercules Offshore has appointed Terrell Carr as senior VP drilling operations.
Jim Martin is now COO of Rock Solid Images. He hails from Spectrum.
Jim Pearson of ConocoPhillips is now oil and gas community head of the IACCM.
Ikon Science has recruited Denis Saussus as VP global development and Cristian Malaver as VP Americas QI services. Don Basuki joined Ikon as geopressure manager, Asia Pacific.
Glenn Hauer is now president and CEO of Inova Geophysical and Tim Hladik is senior VP of product development.
Knowledge Reservoir has appointed Dan Gualtieri as VP production solutions. He joins from Baker Hughes.
LoneStar Geophysical has named Doak Anderson as business development executive. He hails from Bertram Drilling.
Umair Khan has joined New Digital Business.
SGI has appointed Cassio Conceicao as executive VP and COO.
Sigma3 has appointed John Ughetta as executive VP of sales and business development. He was previously with MicroSeismic.
David Grenier has joined Spectraseis as sales director. He hails from Halliburton. Richard Marcinew has also joined as engineering adviser. He was formerly with Schlumberger.
Charles McConnell, head of the DOE’s carbon capture and storage program, has resigned as assistant secretary for fossil energy.
Yokogawa has appointed Takashi Nishijima as president and Shuzo Kaihori as chairman.
Aker Solutions is to acquire a majority stake in Aberdeen-based subsea well control equipment specialist Enovate Systems.
Baker Hughes has changed its mind regarding the sale of its Process and Pipeline services business. BHI announced the sale of the division, a component of its Industrial Services segment, last October.
Following completion of its acquisition of Fugro’s geoscience division, CGGVeritas has ‘simplified’ its name to good old ‘CGG.’ The Group is now organized around three divisions—equipment, acquisition and geology, geophysics and reservoir.
Reservoir Group has announced that following a management buy-out, Interica, its data management business, is now operating as an independent company. Interica was formed from the combination of RG’s earlier acquisitions, InfoAsset and Enigma Data Systems. Reservoir Group is now focusing on its core business of downhole tools, technologies and associated sub-surface services. The company reports that, ‘Bespoke software and digital data management no longer aligns with our portfolio.’
Trading software house Triple Point Technology has acquired WAM Systems. TPT CEO Pete Armstrong said the deal would ‘help companies manage volatile commodity costs through the use of real-time, market-based costs in enterprise plans.’ WAM provides supply chain planning and optimization solutions for process industries including oil and gas. WAM’s flagship clients include LyondellBasell, PetroChina, and Saudi Aramco.
This month, President Obama announced the ‘timely production of unclassified reports of cyber threats to the US homeland that identify a specific targeted entity.’ Read the full address here.
UK-based ABI Research has estimated cyber attacks on oil and gas infrastructure will drive $1.87 (exactly!) billion in cyber security spend by 2018. Oil and gas Scada systems, full of vulnerabilities, are connected to the internet ‘where cybercriminals roam in all impunity.’ Researcher Michela Menting observed, ‘Lack of appropriate security has allowed destructive cyber-attacks to lay waste some of the most high-profile companies in the industry!’ More verbal scareware here.
ISN blogger Neil Meadows reports that attacks on the oil and gas industry have resulted in the theft of secrets and intellectual property by cyber thieves and so called ‘hacktivists’. A report from McAfee found that hackers have ‘run rampant’ through five oil and gas corporate networks for years, stealing trade secrets. Enter Cisco’s intrusion prevention system offering real-time protection against viruses, trojans and even ‘zero-day’ attacks. More from the ISN blog.
Tofino’s Eric Byres reports that opinions differ as to how industrial systems should be secured. While deploying the latest VPNs, anti-virus, firewalls and IDS is great, getting them to interoperate is like ‘pulling teeth’. A new spec, the Interface for Metadata Access Points (IF-MAP), is a possible way forward. The idea is for a central clearing house for network security events and information*. More from Tofino.
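The clearing-house idea can be illustrated with a toy publish/subscribe metadata store: security devices publish what they see about a network identifier, and other devices subscribe to the identifiers they care about. This sketches the pattern only, not the actual IF-MAP protocol (which is SOAP-based and defined by the Trusted Computing Group).

```python
# Toy sketch of the IF-MAP clearing-house pattern: one shared store
# of security metadata, with publish and subscribe operations.
from collections import defaultdict

class MetadataStore:
    def __init__(self):
        self.metadata = defaultdict(list)     # identifier -> events seen
        self.subscribers = defaultdict(list)  # identifier -> callbacks

    def publish(self, identifier, event):
        # record the event and notify every interested device
        self.metadata[identifier].append(event)
        for callback in self.subscribers[identifier]:
            callback(identifier, event)

    def subscribe(self, identifier, callback):
        self.subscribers[identifier].append(callback)

store = MetadataStore()
alerts = []
# a firewall subscribes to events about a suspect host...
store.subscribe("10.0.0.7", lambda host, ev: alerts.append((host, ev)))
# ...and an IDS publishes what it observed
store.publish("10.0.0.7", "port-scan detected")
print(alerts)  # [('10.0.0.7', 'port-scan detected')]
```

The point Byres makes is precisely that, without such a shared store, each of the VPN, anti-virus, firewall and IDS boxes keeps its observations to itself.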
Yokogawa and McAfee have partnered to offer ‘holistic and value-added’ IT security solutions for the industrial automation world. The partnership will embed anti-virus software into Yokogawa’s control systems.
Industrial Defender has published a short, five step guide to planning for the latest NERC CIP V5 regulations covering cyber security of critical infrastructure. ID recommends that the lengthy procedure required to prepare for V5 audit next July should be started right away. More from Industrial Defender.
* Sounds a bit like the US ICS CERT service.
Trevor Harris and Suzanne Morsfield of the Columbia Business School report that some have questioned the usefulness of XBRL data and are ‘attempting to destroy the SEC’s XBRL regulations.’ XBRL has succeeded in providing users with free, interactively-available data as soon as it is filed. But its use and development could be improved. The authors recommend that filers, regulators, and developers focus on the data’s reliability and on value-added end user tools. Also, the FASB and SEC need to work on simplifying XBRL’s taxonomy. Filers should direct their energy toward improving the quality of their own data rather than bashing the SEC.
Also, XBRL needs to be taken over and run by technologists rather than accountants. This could be done through partnering with vendors such as IBM, Oracle and SAP and with the web-based financial information suppliers and aggregators. Read the Columbia report here.
Energy Solutions International (ESI) has teamed with Houston-based Ingenious to embed Ingenious’ ProRPM performance monitoring software within ESI’s Pipeline Dashboard. ProRPM is to form a visualization and data integration backbone to the dashboard which can be deployed in the cloud or on a local server. The dashboard offers a window on data in ESI’s component software Pipeline Manager, Transporter and TransactionManager.
ProRPM is a cloud-based real-time performance monitoring solution that connects to third party data sources including OSIsoft’s PI System, simulation software and Microsoft SharePoint. ProRPM provides real time asset optimization from a ‘knowledge management and calculation engine’ and dashboards for self-service visualization. ProRPM was also used to extend Microseismic’s eponymous offering with third party data connectivity. More from ESI and Ingenious.
RWE AG is deploying the QlikView business discovery platform for ad-hoc analysis and monitoring of its KPIs*. The self-service BI approach will empower RWE’s 800 users to create their own analyses and reports. Qlik has already been used to develop an infrastructure ‘cockpit’, reports for IT controllers and project management analysts.
Qlik provides out-of-the-box access to existing data sources via standard interfaces including ODBC, OLE and an SAP connector. RWE has now consolidated information from its SAP business warehouse, Business Objects and a variety of other data sources. More from Qlik.
* key performance indicator.
Aker Solutions has signed a NOK 2 billion deal with Statoil for the supply of a subsea production system for Norway’s Aasta Hansteen field.
PAENAL, a joint venture of Sonangol, SBM and DSME, has implemented Aveva Marine for 3D design and outfitting.
Chevron Energy Technology has joined the Badger autonomous drilling group.
Belltree has partnered with Merlin Energy Resources on an integrated E&P offering.
Capgemini has teamed with EMC to offer cloud-based services in Brazil.
ESG Solutions has deployed its ‘SuperCable’ acquisition system for the Microseismic Research Consortium in Western Canada.
FMC Technologies has received an order from BP for the manufacture and supply of subsea equipment for the Thunder Horse field. It has also received an order from CNR International for subsea equipment for the Baobab field.
Evolution Well Services has selected GE’s trailer-mounted, TM2500+ ‘aero derivative’ gas turbine for an on-site power project in Canada.
GE Oil & Gas has won a $500 million contract for the supply of turbomachinery to Petrobras.
Genscape and Progressive Fuels have announced that their ‘renewable identification numbers’ (RIN) support for voice-negotiated trading is to transition to an electronic, ‘more transparent’ channel.
GasSecure has installed 20 GS01 wireless gas detectors on Statoil’s Gullfaks C platform.
Atlas Pipeline has chosen IFS Applications as its new enterprise solution.
Larsen & Toubro and Aveva are to offer owner operators and EPCs information management solutions and associated asset data cleansing services.
Mark Oil has selected KSS Fuels’ PriceNet Cloud to support fuel pricing across its network of 17 retail locations.
A McDermott unit has been awarded a $230 million turnkey contract on Pemex’ PB-Litoral-A production platform.
Oildex has announced that ‘one of the US’ largest independent oil and gas exploration companies’ has invested $3 million in its Spendworks hosted invoice workflow and approval solution.
OSIsoft and Schneider Electric are to provide a ‘comprehensive energy management solution,’ combining Schneider’s energy management with OSIsoft’s PI System. Schneider will market the solution through its Telvent unit.
Philadelphia Energy Solutions has implemented Ceridian’s Dayforce payroll and workforce management solution.
Skyline Products’ PriceAdvantage unit and OPIS are teaming to provide real-time fuel pricing and competitive data analysis to petroleum operators.
Pricelock’s new partnership program provides energy management consultants with access to its suite of energy transaction products and a large network of energy suppliers.
Badger Meter and ESC Services have joined Rockwell Automation’s ‘Encompass’ third-party product referencing program.
Shell UK has awarded Technip an EPC contract for its North Sea Gannet F reinstatement project. Shell’s Malaysian Sabah unit also awarded Technip an EPC contract for a tension leg platform on its Malikai Deepwater Project.
Yokogawa has been awarded a contract by GDF Suez E&P UK for the supply of an integrated control and safety system on its North Sea Cygnus development.
The Open Geospatial Consortium, OGC reports ‘strong interest’ in its web services phase 9 (OWS-9) testbed from a meeting last month at ESRI’s Redlands campus. OWS-9 sponsorship totaled $2.65 million with an extra $5 million of ‘participant in-kind’ contributions. OWS-9 documents are in final review prior to public release on the OGC website.
PPDM and Energistics have kicked off a joint work group to investigate mapping between PPDM’s relational database and Energistics’ Witsml well data transfer standard. The work group is also to investigate standardizing depth-calibrated scanned well log images.
The Oil & Gas Producers association OGP has created an Earth Observation subcommittee in its Geomatics group to promote take-up of the technology among OGP members and to develop guidelines, good practices and specifications for the use of earth observation and related products.
The Pipeline open data standards body has released PODS Open Spatial 5.1 for member comment. The release includes scripts to add optional spatial columns to several tables in the PODS 5.1 data model and scripts for sample data loading. Sample data includes a fictitious pipeline with ILI, pipe segment and valve attributes populated.
Next June, FracFocus 2.0 (a new XML format) will become the only method for submitting records to FracFocus. In the interim, IOGCC and the Ground water protection council maintain the legacy Excel-based submission capability to allow companies to prepare for the change. Check out the beta test site. More from FracFocus.
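For companies moving from the legacy spreadsheet to XML, the shift looks broadly like the sketch below. The element names and values here are hypothetical placeholders chosen for illustration; the actual FracFocus 2.0 schema should be taken from the beta test site.

```python
# Illustrative sketch only: building a structured XML disclosure
# record from what used to be Excel rows. Element names are
# placeholders, not the real FracFocus 2.0 schema.
import xml.etree.ElementTree as ET

disclosure = ET.Element("FracDisclosure")
ET.SubElement(disclosure, "APINumber").text = "42-123-45678"  # fictitious well
ET.SubElement(disclosure, "WellName").text = "Example 1-H"
ingredient = ET.SubElement(disclosure, "Ingredient")
ET.SubElement(ingredient, "CASNumber").text = "7732-18-5"     # water
ET.SubElement(ingredient, "PercentHFJob").text = "88.5"

print(ET.tostring(disclosure, encoding="unicode"))
```

The practical gain over Excel is that an XML submission can be validated against a schema before it is filed, catching malformed API numbers or missing CAS numbers up front.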
The UK Chemical Industries Association has expressed ‘disappointment’ at the European Commission’s announcement of a review of its Registration, evaluation and authorization of chemicals legislation, Reach. CIA director Joanne Lloyd said, ‘This review fails to provide the clarity we need to tackle the regulatory minefield and will have a disastrous effect on smaller companies as the next round of Reach is implemented.’ More from the CIA.
The Austin, TX-based construction standards body Fiatech has announced new projects for 2013. One, ‘integrated workface planning and control’ will leverage previous best practices from the Construction Industry Institute’s Research Team 272 and the Construction Owners Association of Alberta’s workface planning committee. Fiatech plans to develop a data model of industry information resources.
Another project builds on the ISO 15926 standard to capture equipment data requirements and assess conformance (ERDC). This will ‘establish a common understanding’ of ISO 15926 usage in Fiatech and develop an assessment methodology for software conformance to ISO 15926 data structures. This effort will leverage the 2012 oil and gas interoperability pilot, a joint effort from Fiatech, Mimosa and the Norwegian POSC/Caesar Association.
Another ISO 15926-based project addresses piping data exchange with a ‘neutral file format compatible with ISO 15926’ for piping specifications, geometry and material standards and symbology.
Fiatech is also envisaging a cloud-based server for US building codes and standards with Compu-tecture’s ‘Madcad.com’ platform as a starting point. Finally, Fiatech’s regulatory committee and the American institute of architects are to collaborate on guidelines for state and local governments in the use of digital seals on construction documents. More on Fiatech’s projects.
The Fieldbus Foundation is to conduct a demonstration of its remote operations management (ROM) technology at Petrobras’ Cenpes* R&D facility. Petrobras is trialing the technology with a view to its deployment on the upstream and downstream projects that constitute its record three year $224 billion capital spending plan. The tests will be conducted on a distillation pilot plant to evaluate the use of ROM-based wireless devices for remote applications. Cenpes researcher Miguel Borges said, ‘The Fieldbus Foundation’s ROM solution is attractive to us. We want access to diagnostic information from devices installed on our offshore platforms and other remote sites.’
Fieldbus Foundation CTO Dave Glanzer added, ‘This is the first demonstration of ROM’s capability prior to its specification in commercial projects. ROM lets end users remotely diagnose the condition of their automation assets and optimize preventive maintenance strategies.’
ROM is described as an ‘open’ digital infrastructure for asset management applications across tank farms, terminals, pipelines and offshore platforms. ROM integrates ISA 100.11a, HART and H1 protocols into a standard data management environment, extending Fieldbus’s scope into a ‘single source’ of data management, diagnostics, alarms and data QC. Other ROM tests are planned in India, Japan, the Middle East and Europe. Fieldbus members sponsoring the demonstrations include Emerson, Invensys, Honeywell and Yokogawa. More from Fieldbus.
* Centro de Pesquisas Leopoldo Américo Miguez de Mello.
The US National energy technology lab has commissioned a half petaflop supercomputer from SGI to support its carbon capture and storage initiative. The system is being installed by URS Energy and Construction. The machine, an SGI ‘ICE Cube Air’, comprises 378 SGI Rackable servers with a total of 24,192 Intel Xeon E5-2670 processors, 72 terabytes of RAM and 9 petabytes of storage. As befits a compute behemoth addressing ‘green’ issues, energy use is a priority.
An ‘Eco-logical’ system of fans and four-stage cooling achieves a power usage effectiveness of under 1.06. Interconnect is provided by Mellanox’ ConnectX InfiniBand adapters and IS5600 648-port switches. SGI’s professional services unit is to provide project management, installation support and training to URS and DOE staff. In a previous existence, URS E&C, as Washington Construction, built the US’ Minuteman missile silos and NASA’s Kennedy Space Center. More from NETL.
Following Husky Oil’s successful trials of Resman’s tracer technology last year (Oil IT Journal October 2012), Hess reports successful deployment on the Norwegian North Sea Valhall field. Resman’s tracers are embedded in the outer body of a sliding sleeve built into the well architecture. Units are placed at strategic locations in the well. As production proceeds, the chemicals leach into the production stream and are detected at the surface. Since each location has a different chemical signature, they provide insight into which zones are producing and in what quantity.
Conny Gilbert, subsurface team leader with Hess Norway said, ‘It is like having a continuous production log without having to run a tool into the well. It’s a very elegant solution.’ Hess reports that ‘the cost of installation is low and the quality of information is high. Resman’s technology has a longer productive life span than conventional chemical tracers and is more environmentally friendly.’ Hess is now evaluating the technology for deployment at other locations worldwide. More from Resman.
Aliso Viejo, CA-based Telogis has announced a location based intelligence platform for oil and gas. Telogis for Oil & Gas (Togs) is a cloud-based service for tanker fleet operators that increases driver safety and fuel efficiency by monitoring a vehicle’s systems and sending alerts for unbuckled seat belts and aggressive driving (hard-braking, acceleration and speeding). Alerts are assembled in safety scorecards and enterprise dashboards and can be transmitted to supervisors by email or SMS if immediate action is required.
Togs also provides mobile workforce applications, monitors vehicle idle time, off-road mileage, power take-off use, miles driven and fuel used. Operators can import map layers and information on fixed assets such as wells, pipelines, tanks and leases to monitor arrivals, departures and the use of approved routes.
Telogis has also partnered with Ford Motor to share diagnostic codes covering seat belt status, airbag deployment, system status faults, and tire pressure warnings. Togs also offers real-time work order management, dynamic routing, navigation and telematics for mobile workforces. Last year, Telogis teamed with FleetCor on remote monitoring of fuel card use (Oil IT Journal June 2012). More from Telogis.
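The aggressive-driving alerting described above boils down to simple rules over a stream of vehicle samples. The sketch below shows one such rule, hard braking, detected from consecutive speed readings; the threshold is an illustrative assumption, not a Telogis value.

```python
# Rough sketch of a telematics hard-braking rule over speed samples.
HARD_BRAKE_MPH_PER_S = 8.0  # assumed deceleration threshold

def hard_brake_alerts(samples):
    """samples: list of (timestamp_s, speed_mph); returns alert timestamps."""
    alerts = []
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        decel = (v0 - v1) / (t1 - t0)  # mph lost per second
        if decel >= HARD_BRAKE_MPH_PER_S:
            alerts.append(t1)
    return alerts

track = [(0, 55), (1, 54), (2, 43), (3, 42)]  # one reading per second
print(hard_brake_alerts(track))  # [2]: 11 mph lost in one second
```

In a production system such alerts would feed the safety scorecards and, when severe, the immediate email/SMS escalation to supervisors that the article mentions.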
An independent evaluation of pipeline leak detection technology conducted by the Pipeline Research Council International and Pacific Gas and Electric has proved the validity of Picarro’s pipeline leak detection technology. First announced last year (Oil IT Journal February 2012), Picarro’s ‘Surveyor’ field sensing system and ‘P-Cubed’ cloud-based processing platform detect the presence of minute traces of methane using patented ‘cavity ring-down spectroscopy’ techniques.
Tests on two controlled sites plus two real-world locations determined Picarro’s leak detection to be 1,000 times more sensitive than legacy methods. Comparison criteria included leak sensitivity, productivity and reporting capability. Two in-field trials located previously undetected leaks including seven grade 1 leaks that required an immediate response. P-Cubed analysis includes wind variation and atmospheric stability allowing operators to ‘sweep’ upwind for leaks.
Picarro Surveyor is an instrumentation and geo-informatics combo that operates from a moving vehicle, continuously transmitting data to the cloud. The system was originally developed for greenhouse gas detection and is used by atmospheric research institutions such as NASA and NOAA. Picarro’s isotopic technology is claimed to distinguish between natural gas and other sources of methane, such as sewers and landfills. More from Picarro and from PRCI.
Oil country tubular goods specialist Scan Systems has announced work order and billing modules for its Tubular data systems (TDS) package. New features include a web app that allows customers to log in and check inventory status prior to shipping pipe to a well site. TDS now tracks pipe movement in real-time providing work order progress and pipe status along with invoice generation, tracking and documentation.
Scan Systems VP Matt Rutledge said, ‘Operators need to track tubing and casing from reception to delivery. These new modules provide up to the minute information on footage shipped for a particular work order, what’s left in the work order along with billing for the services rendered. Inventory controllers and plant supervisors can manage API and customer specific inspection criteria throughout the workflow.’ Along with its software, Scan Systems provides pipe inspection equipment and automatic tally solutions. TDS was first released in 1994. More from Scan Systems.
SAIC has announced Digital Edge, the first component of Critical Insight, a suite of ‘big data’ optimization solutions for the enterprise. Critical Insight targets inter alia the ‘energy’ industry with a business intelligence solution that provides real-time alerts for use cases including cyber security threats, financial fraud detection and other ‘digital anomalies.’
Digital Edge enables capture of streaming real-time data and provides processing tools and ‘data enrichment’ i.e. merging of data sources. A ‘billions of data records per day’ bandwidth is claimed for the system. Enriched data is stored in a large, open source NoSQL platform which can be located in various cloud environments.
A couple of years back, SAIC divested its oil and gas consultancy to Wipro (Oil IT Journal May 2011) so we asked if Digital Edge might signal a return to the vertical.
SAIC senior VP engineering JT Grumski told us, ‘Our big data solutions for oil and gas include gas sampling related to shale drilling, long-term environmental asset management and land use data management.’
‘Our technologies provide clients with web-based (including geospatial) solutions and data warehouses which organize and relate large data sets for long-term use and storage. We also provide complete lifecycle traceability of work performed for environmental projects.’ Brace yourself for a ‘big jargon’ onslaught and read the SAIC release.