Last December I had such a bee in my bonnet about standards that I forgot to do my annual ‘year in review’ piece. So here is a double edition spanning 2012 and 2013. In January 2012 our visit to the Houston Digital Plant event turned out to be a semantic surprise. The show was hijacked by the Fiatech/Jord community. Total’s 2 petaflop supercomputer was announced. In February, the Standards leadership council (SLC) was formed. In April we asked, ‘Is Microsoft MURA anything more than the SharePoint fan club?’ And BP’s HPC hit the petaflop. Re-reading my May editorial on kaizen and the cluttered desktop I have to confess that we have not got very far since then. There are still piles of paperwork all over the place and (now) four or five different computers on the go. I live in hope though. We now have a larger recycling bin for the waste and Microsoft’s switch-off of the venerable Windows XP means a forced retirement of at least two machines ‘real soon now.’
May brought the ‘internet of things’ (IoT) onto our radar. June’s editorial investigated the ‘sanctity’ of the company presentation and its counterpart, ‘reproducible’ research—a topic that I feel is worth a bit of a crusade. Hadoop got a mention at the 2012 PNEC even if it was just a teaser.
July carried a shameless plug for www.oldsports.com (oops, there’s another one!) and saw some more apropos developments at the EAGE in the field of computer-aided interpretation. In September we proudly featured a letter to the editor from none other than SAP’s Hasso Plattner, who took issue with our review of his book on Hana. The review is up on Amazon by the way, where it has been judged the book’s ‘most helpful critical review.’ Fame at last! Hadoop got a slightly more credible plug in our report from UtiliPoint.
In October my ‘training the data managers’ editorial asked ‘whatever happened to programming?’ which seems to be a good question judging from the subsequent controversy. We also extended our coverage of process safety with reports from the UK Institution of mechanical engineers’ event (Buncefield) and the US Chemical safety board (Macondo). Seismic trace counts ‘went wild’ with 100,000 deployed by Aramco and a million trace system mooted by HP/Shell/PGS. November saw Chevron follow BP down the data virtualization route and BP catch up with Total in the petaflop stakes. We also reported (rather importantly I think) that the ‘semantic’ ISO 15926 standard is ‘based on a misunderstanding.’ All of 2012 is now in the public domain on www.oilit.com in case you are not yet a subscriber.
And so to 2013, which kicked off with a statement from Emerson CEO Larry Irving to the effect that oil and gas is ‘a technology leader for other industries to follow.’ GE was also plugging its ‘industrial internet.’ We belatedly tracked down the final report from Norway’s Integrated operations in the high North to learn what many had suspected all along, that its semantic technology was ‘immature and hard for domain specialists to use.’ Lundin virtualized its IT.
Our February lead was the first of several to jump on the ‘big data’ bandwagon with a report from Shell’s Cold Lake. We reported on the demise of the W3C oil and gas semantic group. Honeywell offered a new slant on process control virtualization. In March, SAP announced Hana for oil and gas and Premier virtualized too. There was yet another attempt to get to grips with big data and Hadoop—with our best shot so far from a GE expert. We also learned that GE’s industrial internet ‘will never be a protocol,’ which seemed curious. At the SPE Digital Energy show we heard Chevron CIO Louie Ehrlich sit carefully on the fence of the lead/lag debate, ‘While oil and gas is a technology leader and already gets a lot from IT, we are leaving value on the table because of the mixed reception given to the partnership of engineering and IT.’
In April, semantics was back with a vengeance in our review of ‘Shared earth modeling,’ from a team of researchers at IFP Energies Nouvelles. There was more safety and risk management coverage as the US Chemical safety board gave Chevron a roasting over poor refinery design and safety documentation. Rather alarmingly, the CSB’s findings ‘apply to all refineries, plants and industry.’ In an unrelated, but timely announcement, DNV Software updated its ‘EasyRisk’ risk management solution.
In May I produced my infamous ‘bunkum’ editorial on data management! In June, inspired by the young faces at the London EAGE I suggested that it was time to retire the ‘technology laggard’ story to avoid continued embarrassment. In July we reported from the Belsim user group which proved an eye-opener on data quality in the process industry—a technology lead if ever there was one. September included a report from PODS’ first EU meet showing the vigor and focus of the pipeline IT community. Thereby hangs a tale which I will save for a later date.
In October we reported from the New Orleans SPE on ‘fatal flaws in the risk matrix’ and heard more on GE’s Industrial Internet which is seemingly deployed by BP in its Advanced collaboration environment (ACE). Elsewhere we heard that the Industrial Internet is the same thing as the IoT!
Despite the lovey-dovey SLC’s efforts, November brought an attack by the OMG on the OPC’s UA protocol.
So whither now? Well, when you are through reading this, the last issue of 2013, you will have heard from UK-based researchers Ovum that oil and gas is an information technology laggard. You will have observed that just when you thought it was all over ... semantics are back with a philosophical-ontology-based swords-to-ploughshares initiative behind Tullow’s master data. Also this month, Hadoop comes of age in the upstream as a component of Landmark’s ZetaAnalytics.
A great 2014 to you all!
For oil and gas focused attendees, the highlight of the IRM UK Enterprise data and business intelligence conference held last month in London was the presentation on a ‘robust common master data foundation for oil and gas’ made by Tullow Oil’s Mesbah Khan and Chris Partridge of Boro* Solutions. The upstream presents a complex landscape of vendors, contracts and assets. Each contractor has a different system and interoperability is a problem. Tullow’s approach has been to focus on consistent terminology and clean data across drilling, development, operations, finance and legal.
The Boro methodology is a ‘Clean’ and ‘Pure’ data architecture developed for the oil and gas vertical. ‘Clean’ describes the overall process of data consolidation, loading, enrichment, assimilation and ‘novation.’ Further deconstruction of the awkward acronym reveals a sequence of data cleansing and repackaging into a single ‘Pure’ model sans duplication—the holy grail of a single source of the truth.
‘Pure’ in turn breaks out as a ‘precise, unambiguous, re-usable and extensible’ view of the data world that seeks to build bridges between islands of transactional data. This is achieved using a semantic layer and a foundational ontology. The latter builds on Matthew West’s work for Shell on extensional 4D (time based) data modeling such that, for instance, petroleum agreements can be modeled as objects that vary in time and spatial extent.
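The 4D extensional idea can be illustrated with a minimal sketch (the class and field names below are hypothetical illustrations, not Boro’s or Tullow’s actual model): an agreement is represented not as a single static record but as the set of its spatio-temporal ‘states,’ each valid over a time interval and a spatial extent.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class AgreementState:
    """One spatio-temporal 'slice' of an agreement (4D extensional view)."""
    valid_from: date
    valid_to: Optional[date]   # None = open-ended
    blocks: List[str]          # spatial extent, e.g. licence block IDs
    partners: List[str]

@dataclass
class PetroleumAgreement:
    """The agreement *is* the whole set of its states, not one record."""
    name: str
    states: List[AgreementState] = field(default_factory=list)

    def state_at(self, when: date) -> Optional[AgreementState]:
        """Return the slice valid at a given date, if any."""
        for s in self.states:
            if s.valid_from <= when and (s.valid_to is None or when < s.valid_to):
                return s
        return None

# Usage: the same agreement has different partners and acreage over time.
psa = PetroleumAgreement("Block-7 PSA")
psa.states.append(AgreementState(date(2010, 1, 1), date(2013, 6, 1),
                                 ["B7"], ["OpCo", "PartnerA"]))
psa.states.append(AgreementState(date(2013, 6, 1), None,
                                 ["B7", "B8"], ["OpCo", "PartnerA", "PartnerB"]))
```

Queries against such a model naturally answer ‘who were the partners on this date,’ which a single flat master record cannot.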
Further ontologies describe complex relationships between joint venture partners, social entities, business units and group legal structures.
After his talk we quizzed Partridge on how Boro integrates the standards space. He pointed us to a presentation showing the evolution of Boro’s industrial ontology from roots in ISO 15926 Part 2, ‘Ideas,’ the International defense enterprise architecture specification, and the US ‘DoDaf’ Department of defense architectural framework.
Ontological modeling is not for the fainthearted. Prior work for Shell on the Boro website has it that ‘ontological understanding needs to be separated from epistemological and implementational gloss’ and that ‘collaboration between conceptual information systems modelers and those involved in philosophical ontology is potentially fruitful.’
Tullow’s ontology derives more prosaically from IBM Maximo. Other industry data models (Witsml, PPDM, Norsok) appear to have been found wanting. The Tullow development was a four person-year project. The IRM-UK presentation was made with Prezi, a snazzy alternative to boring old PowerPoint. Visit IRM-UK and Boro Solutions.
* Boro—Business object reference ontologies.
Halliburton has rebranded its recently-acquired UReason solution environment (USE) for real-time analytics as Zeta-Analytics. Zeta is a ‘big data’ platform architected for drilling and completions. Zeta captures Witsml feeds to a data warehouse (qualified solutions include Teradata, EMC Greenplum and IBM/Netezza). The toolset acts as a bridge to more static data in Landmark’s engineering data model, EDM. An Apache Hadoop cluster is deployed for real time model scoring.
The Zeta philosophy is that much of the real-time data streaming from the digital oilfield is discarded after drilling even though it has the potential to improve future operations. The Hadoop-based framework provides a unified data model with embedded drilling knowledge. Models can be tuned to new drilling environments and deployed across multiple wells.
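The ‘real-time model scoring’ concept can be illustrated with a generic sketch (this is not Landmark’s actual API; the feed and thresholds are invented for illustration): incoming time-indexed drilling measurements are scored against a rolling baseline as they arrive, flagging anomalous readings for attention.

```python
from collections import deque

class RollingScorer:
    """Score a streaming drilling measurement against a rolling baseline.

    A stand-in for 'real-time model scoring': flag readings that deviate
    from the recent mean by more than `k` standard deviations.
    """
    def __init__(self, window=50, k=3.0):
        self.window = deque(maxlen=window)
        self.k = k

    def score(self, value):
        flagged = False
        if len(self.window) >= 10:  # need some history before scoring
            mean = sum(self.window) / len(self.window)
            var = sum((v - mean) ** 2 for v in self.window) / len(self.window)
            std = var ** 0.5
            flagged = std > 0 and abs(value - mean) > self.k * std
        self.window.append(value)
        return flagged

# Usage: a steady rate-of-penetration feed with one anomalous reading.
scorer = RollingScorer()
feed = [30.0 + 0.1 * (i % 5) for i in range(100)] + [75.0]
alerts = [i for i, v in enumerate(feed) if scorer.score(v)]
```

A production system would of course use trained models rather than a simple statistical baseline, but the streaming score-as-you-go pattern is the same.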
UReason’s technology has previous oil and gas form, in process control and intelligent alarm management. The technology is claimed to provide operators with more time to react to ‘incipient abnormal situations.’ UReason underpins Invensys’ WAM alarm management. More from Landmark on www.landmarksoftware.com.
What’s your background?
Energy and utilities. I was CIO for hydrocarbons and chemicals with Koch Industries. Around the year 2000, the company was looking into its e-commerce strategy, trying to figure out which standards were for real—around the time of the dot com bust! We adopted the CIDX* protocols, which were based on EDI at the time. But there was a fever catching on around the new XML technology. I was then recruited by Sheridan Production to develop its IT capability. I took on the role of PIDX chair last August.
How did that come about?
Following PIDX’s independence from the API last year, the organization realized it needed full time leadership. I was hired to focus on increasing adoption of the standards and to invigorate the volunteer base.
How does PIDX fit with its member companies and e-business hubs?
There are three member types, buyers, suppliers and trading partners. The latter could be hubs or solution providers. We try to balance the three with a focus on commercial transactions and supply chain interaction. We stay clear of geoscience. It is important to define the boundaries and build on what are already well-defined EDI standards. There is good take-up in the downstream.
EDI is something of a legacy technology?
Yes, but it is still used a lot even though XML has been around for ten years. It is up to the users to choose.
Is the XML side of PIDX struggling?
There is good take up in North America, less so globally—which is why we have embarked on our outreach program.
What would be a flagship XML project?
The TDXS terminal data exchange and global bill of lading/right to lift initiative is getting a lot of attention today. As is the supplier KPI project which is developing a data dictionary and automated data transfer between suppliers and operators. But the AS2 and RosettaNet protocols remain current. They are strong and secure. What the future holds is still a matter for discussion.
Are standards less necessary today than a decade ago, now that apps understand so many formats?
Not at all! We still want to promote standards as enabling better supply chain integration. Chevron for example directs billions of dollars worth of transactions across PIDX.
Do you see members mandating PIDX use in contracts?
We don’t tell our members how to do their business.
* The chemical industry data exchange was wound up in 2009 and its IP is now managed by the Open applications group.
Speaking at the 2013 Pipeline open data standard user conference in Sugar Land, TX earlier this year, Susie Sjulin (DCP Midstream) provided the keynote on breaking down the data silos with GIS and the Pods/Esri spatial data model. Pods has provided DCPM with a common data language and a single version of the truth. Sjulin’s silos (commercial, operations, land and compliance) have been bridged by using Pods for both records management and mapping. The Pods/Esri framework is also a foundation for third party applications. Documents are housed in a Xerox transactional content management database with links to the Pods master. Delorme’s XMap has proven a popular application with 1,500 users across engineering, operations and compliance. XMap provides a robust way of synching field operational data and the Pods master. Other key apps include IDV’s Visual Fusion, MapSearch Envision and oil country software from New Century and Coler and Colantonio. Executive backing has enabled DCPM to acquire data of the requisite accuracy and the effort has built a trusted data source and an open data environment for decision support.
Further support for the Pods/Esri combo came from BP with presentations from Craig Hawkins and Narmina Lovely. Hawkins’ presentation was covered in our September 2013 report from the EU Pods Vienna meet. Lovely built on this with spectacular imagery of a complex subsea development, the result of a 2½-year program implementing the Pods/Esri spatial model in BP’s major projects. The Pods model has been extended to cater for offshore jumpers and flowlines with a ‘pipe in pipe’ structure. Risers, umbilicals and subsea structures are all accurately modeled in 3D. The resulting enhanced Pods/Esri spatial model is now used to capture subsea inspection data and to provide detailed maps of assets along with detailed bathymetry. The latter has been captured leveraging the OGP’s subsea data model—the subject of a further presentation by Lovely.
Eagle Information Mapping’s Tracy Thorleifson provided a limpid introduction to the Pods/Esri spatial data model co-authored by Lucas Hutmacher of Willbros Engineering. Pods/Esri spatial is the Pods data structure implemented in a ‘geodatabase.’ The geodatabase is a software product from Esri that extends the standard relational database to store geometries and allow the rapid display of geographical information. Thorleifson outlined some of the compromises and differences between relational and spatial editions of Pods to conclude with a strong endorsement for the spatial version. This provides ‘out of the box’ functionality for editing, geoprocessing, web and mobile deployment. ‘With Pods relational you bolt on a GIS, with Pods/Esri spatial your data is the GIS.’
David Harrison (Pacific Gas and Electric) provided a compelling case history of Pods deployment that began with a ‘developing storm’ following the 2010 San Bruno incident, when a 30 inch pipeline failure occurred in a residential neighbourhood near San Francisco, leading to strengthened legislation from the PHMSA/NTSB. PG&E have addressed their data issues with a ‘pipeline features list’ (PFL) spreadsheet that acts as a staging post for data load to the Pods master. The PFL has captured many complex data ‘quirks’ relating to pipe fittings data that PG&E is to share with the Pods organisation. Read the Pods presentations here.
A new unit of Acacia Research Corp., Dynamic 3D Geosolutions, has partnered with the owners of patented technology for 3-D geosciences modeling. The patents cover ‘methods and systems for performing dynamic, 3-D geological and geophysical modeling used in oil and gas exploration and production.’ Acacia declined to say whose technology and patent was in play but VP Charlotte Rutherford told Oil IT Journal, ‘This is the first of many opportunities in our roll-out of a world-class energy sector platform. Our mission is to empower companies and inventors to realize revenue via the licensing of intellectual property. The digital oilfield market is a key area of focus.’
A Business Week article described Acacia as a ‘non-practicing entity,’ a polite way of saying ‘patent troll.’ (Acacia has elsewhere been described as ‘the mother of all patent trolls.’) Patent trolls use the courts to enforce patents that they own. If your modelers are feeling the heat you may want to talk to Yetter Coleman.
Emerson’s Roxar unit has announced RMS 2013 (sic), the latest edition of its flagship reservoir/geomodeling package. The new release heralds a ‘move’ into seismic interpretation along with ‘model driven’ interpretation capabilities and tools for quantifying geologic risk early in the interpretation process.
RMS 2013 automates the model building process by generating thousands of models from estimates of uncertainty in an interpretation. These are then clustered into statistically significant ensembles that best characterize the underlying data. ‘Uncertainty maps’ of key risks in the prospect can be generated to pinpoint areas for further study.
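The generate-then-cluster workflow can be sketched in a few lines (a toy illustration with invented parameters and a plain 1-D k-means, not Roxar’s actual algorithms): perturb uncertain interpretation inputs, compute a response per realization, then group realizations into a handful of representative ensembles.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical uncertain interpretation inputs (illustrative only):
# a top-surface depth shift (m) and a net-to-gross ratio.
n_models = 5000
depth_shift = rng.normal(0.0, 15.0, n_models)   # m
ntg = rng.uniform(0.4, 0.8, n_models)

# Toy response per realization, standing in for a full geomodel build:
# shallower tops and higher net-to-gross give larger in-place volumes.
volume = (100.0 - 0.5 * depth_shift) * ntg      # arbitrary units

def kmeans_1d(x, k, iters=50):
    """Plain 1-D k-means; a real workflow would cluster richer descriptors."""
    centers = np.quantile(x, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            mask = labels == j
            if mask.any():
                centers[j] = x[mask].mean()
    centers = np.sort(centers)
    labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
    return labels, centers

labels, centers = kmeans_1d(volume, k=3)
# The cluster centers approximate low/mid/high cases for further study.
```

Mapping each cluster back to its input parameters is what turns the exercise into an ‘uncertainty map’ of which interpretation choices drive the spread.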
Roxar MD Kjetil Fagervik said, ‘Quantifying risk has never been more important than with today’s complex geologies and marginal prospects. RMS’ model driven interpretation and uncertainty management make for streamlined, validated workflows where the interpretation itself is the model.’
As revealed in an earlier issue of Oil IT Journal.
Teradata, along with partners Tibco/Spotfire and Hortonworks, invited Oil IT Journal along to their EU roadshow last month. Tibco’s John Guthrie kicked off the proceedings with a reprise of a Spotfire Energy Forum presentation on Chevron’s water flood surveillance. This mashed data from operations and finance into a holistic asset view. Spotfire has applications everywhere—from E&P to O&M. The latest release (V6) provides metrics on handhelds, multi-layer maps and event stream analytics. In the non-conventional space, Guthrie cited a recent webinar in which Tim Yotter (Encana) discussed big data analytics for production surveillance and completions optimization. More from Spotfire.
Next up was Hortonworks’ Ben Rudall on the business value of Hadoop. Hadoop is an open source data architecture tuned for ‘big data’ analytics. The tool has been in regular ‘at scale’ use chez Yahoo since 2008. Hortonworks was formed in 2012 with the objective of marketing and supporting ‘enterprise’ Hadoop with a RedHat-style business model. Rudall positioned Hadoop as a component of a data architecture alongside the RDBMS, Teradata and data warehouse. With Hadoop, it is apparently possible to ‘manage seismic data in under 15 minutes!’ More from Hortonworks.
Teradata’s Niall O’Doherty continues in his quest to convince the upstream of the need for a better handle on its big data. Oil and gas has been a consumer of big data for years, especially seismic. But with the PC revolution, much data is now in silos (read Petrel). Other verticals (such as retail and telcos) are leveraging novel architectures for their big data. Oil and gas has made the first steps, with better handling of metadata, but it has not yet made the move to the data warehouse. Today, interest in ‘big data’ has revitalized O’Doherty’s crusade to sell Teradata into oil and gas. ‘We are at a fork in the road and need to decide whether we need more big applications or better analytics.’ The true potential of better data availability is in facilitating deeper analytics on new and different data types. O’Doherty presented a case study performed for an EU major on 4D seismic data management that leveraged a combination of Hadoop and Teradata to accelerate seismic processing between successive surveys. The project leveraged the Mahout machine learning application to classify and score seismic anomalies. This represents a new paradigm for seismic data management—breaking down data and functionality into small chunks. There are also applications of the technology in production monitoring, which today is ‘like watching TV.’ Here Teradata has developed an integrated data environment for hauling, production surveillance, maintenance and digital oilfield that is being trialled by an Eagle Ford operator. This enables queries across 12 domains and 120 data sets. Within a year the project gave a 90% reduction in shut-in wells and $6 million per month in savings due to better logistics and production management. ‘You don’t have to throw away your investment in (say) Petrel, just build a unified data architecture with Teradata, Spotfire and Hortonworks.’ More from Teradata.
Eliis has announced a data link between PaleoScan and Schlumberger’s Petrel. Both Petrel and PaleoScan data are displayed in the tree view.
The 4.3 release of Recon’s eponymous geological interpretation toolset targets multi-disciplinary unconventional reservoir interpretation within a stratigraphic framework. Recon integrates seismic, log, core, frac and production data adding strata-slicing technology for analysis of prospect potential.
The new Pason rig display is a ruggedized touch screen computer for use outdoors and in hazardous locations. The display allows the rig crew to view the Pason EDR, directional system and third-party drilling applications from anywhere on the rig. The Class 1, Division 2 hazardous area certified unit is IP66 water and dust resistant. The unit runs Windows 7 on a 19 inch touch screen.
Safe Software and Caris have teamed to provide an interoperability solution for gridded bathymetry and elevation data. Safe’s FME 2014 now offers support for the Caris spatial archive (CSAR) raster format offering ‘cutting edge’ data storage technology for high volume raster data. More from FME and Caris.
Oildex has announced enhancements to its online invoicing application for suppliers submitting electronic invoices through the Oildex SpendWorks platform, including a new user interface and account features that improve interactions between suppliers and operators. The company has also released Spendworks Mobile Approval, a mobile invoice solution for E&P operators.
Exprodat has released its Petroleum data assistant for ArcGIS Desktop, an addition to its Team-GIS suite of oil and gas software. The assistant supports data transfer between oil and gas data formats and ArcGIS.
American Industrial Systems has introduced a range of intrinsically safe, 15” stainless steel monitors for deployment in oil and gas Atex zone 2 hazardous areas. The displays are compliant with the UK dangerous substances and explosive atmospheres regulations 2002 (Dsear) and the corresponding US Occupational safety and health administration regulations.
CGG’s Hampson-Russell unit has announced HRS-9/R2 with early release versions of ProAZ, an azimuthal frac data analysis module, and a new software development kit. A Petrel data link is also planned.
Gems’ new electro-optic liquid level sensors are designed for use in flammable and explosive atmospheres.
Tessco unit Ventev Wireless Infrastructure has announced a line of outdoor wireless enclosures for Scada, wireless backhaul and broadband communications. The preconfigured units include battery backup and are tailored to the oil and gas industry’s challenging environments.
Startup Mobile Fueling Solutions’ ‘virtual pipeline’ sets out to make compressed natural gas more accessible in the US. What is a ‘virtual pipeline?’ A convoy of specially engineered trucks!
Some 200 attended the 2013 Dome Exhibitions’ oil and gas ICS cyber security forum held earlier this year in Abu Dhabi. For ADCO’s Riemer Brouwer, the secret to ICS security success is a risk-based approach. While this is generally recognized in IT, it is often overlooked in Scada systems. In one review it was found that ‘pantries are often better protected than control rooms!’ To IT, Scada systems are complex and can be hard to understand. But the reality is that they are relatively simple and only a few core components are security-critical. On the other hand, several commonly held beliefs regarding control systems are actually misconceptions that expose companies to unnecessary risks. For instance Scada systems are not separate from corporate IT, they are not necessarily well protected from unauthorized access and the knowledge required to effect an attack is not all that hard to obtain.
Brouwer recommends the ISO 27001/2 standards as a framework for security and risk management. While developed for corporate IT, the same principles can be applied to control systems. Implementing the standard also facilitates a fruitful dialog between IT and control system specialists.
Ayman Al Issa (ADMA-OPCO) tempered Brouwer’s optimism, warning that rendering industrial automation control systems secure has failed because of the diversity of control systems and the use of components of various vintages. Three years on from Stuxnet, we are aware of the problem but have made little or no progress, ‘We still don’t know what we want to do!’ Automation vendors cannot provide defence-in-depth solutions. While cyber security vendors do better, they are not yet truly ‘on board.’ Securing existing systems is hampered by an over reliance on procedures and guidelines. Controls are weak and there is a risk of conflict between automation systems and cyber security solutions. While vendors may offer a solution, the big question is how it can be implemented and supported throughout the plant’s life. This is a ‘gigantic problem.’ Al Issa advocates engaging a main cyber security contractor—à la main automation contractor—to deploy and support cyber security across the plant’s lifecycle. The new MCSC should be under the MAC’s responsibility and only deploy tested components and fixes.
David Alexander, who is with EADS unit Cassidian, told an industrial detective story involving control system forensics. Forensics can be performed on databases, log files and mobile devices to detect and investigate intrusion. Such investigations aim to identify the duration and mechanism of an intrusion and, most importantly, how to stop it from happening again. Investigations must comply with legal constraints to be admissible in court. Alexander outlined some basic investigative principles—data must not be altered, all electronic evidence needs an audit trail and all investigative personnel need formal certification. Cassidian has developed a methodology for collecting, analysing and presenting incident-related data in a compliant manner. Alexander presented the results of a real-time investigation into a hack on a Siemens traffic light controller. More from Dome.
PIDX, the petroleum industry data exchange standards body describes itself as the global forum for oil and gas e-business. PIDX was spun out of the American Petroleum Institute in 2010 and is now an independent organisation.
Statoil CIO Sonja Chirico Indrebø kicked off the session with a presentation on creating business value through standardization and process automation. This is a delicate balancing act as ‘you can’t force the business to be standard.’ Previously Statoil had a 3-5 year IT project cycle time but this ‘did not give what we wanted.’ Projects now are allocated 12 months max with prioritized goals. Statoil is at the forefront of big data but, warns Indrebø, digitization per se creates a lot of data but no value. Statoil is now working on algorithms to realize the value, leveraging information standards to correlate data across domains. Regarding cyber security, Statoil is frequently targeted—for financials, intellectual property and business information. ‘People will steal anything.’ Key here is analytics on log files and network activity and a risk based approach. The ‘bring your own device’ (byod) situation is interesting—for senior managers, IT means mobile and they like the latest gear, but these need to be used in a secure way. In the Q&A, Indrebø observed that taking away manual data activity is good as it improves data quality and frees people up to do higher value stuff. ‘It makes sense to digitize as far as possible.’ If, as some believe, oil and gas is a twilight industry, sustainability can only be achieved by a continued focus on lowering costs. IT is to make a significant contribution to Statoil’s goal of over 2.5 mm boepd by 2020.
Supply Chain Insight editor Pete Loughlin observed that 90 days payment terms could be tough on smaller companies that pay high interest rates on working capital or around 20% factoring fees to their banks. Meanwhile a large buyer may be lucky to get a 1% return on cash on hand. The difference—the 19% ‘leaks’ into the banks’ pockets! But supposing the customer is the supplier’s banker—with help from a trusted third party. The only problem is that companies’ purchase to pay (P2P) and e-invoicing systems do not in general support such a holistic workflow. But there is help from third party service providers like Loughlin’s favourite, an unnamed technology provider that brought 95% on time payment to a US utility along with $46 million from the win-win ‘dynamic discount program.’
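The arithmetic behind early-payment discounting can be sketched as follows (the 2% discount and day counts are illustrative, not figures from the presentations): a modest one-off discount for early payment annualizes to a return well above what the buyer earns on idle cash, and costs the supplier far less than factoring.

```python
def annualized_rate(discount_pct, days_early):
    """Annualize an early-payment discount.

    Paying (100 - d)% now instead of 100% in `days_early` days is a
    return of d/(100 - d) earned over that period; scale to a year.
    """
    period_return = discount_pct / (100.0 - discount_pct)
    return period_return * 365.0 / days_early

# Illustrative example: a 2% discount for payment on day 10 instead of day 90.
buyer_return = annualized_rate(2.0, days_early=80)   # roughly 9% annualized
# For the buyer this beats ~1% on cash on hand; for the supplier, 2% once
# is far cheaper than ~20% annual factoring fees on the same working capital.
```

This spread between the buyer’s cash return and the supplier’s factoring cost is exactly the ‘leak’ into the banks’ pockets that dynamic discounting tries to recapture.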
Those curious to know who were the unnamed utility and service provider did not have long to wait as Bertram Meyer, Taulia CEO and co-founder, took the stage to explain his company’s ‘dynamic discounting’ technology. The utility in question was Pacific Gas & Electric, which decommissioned its accounts payable website last year and has partnered with Taulia to provide its suppliers with a free online tool for invoice submission along with the early payment discounts. Other oil country clients include National Oilwell Varco, Halliburton and Johnson Controls.
Jan-Erik Pihl presented on Statoil’s supply chain execution, planning and collaboration. Statoil has some 10,000 ‘cargo carrying units’ (CCU) on hire making around 14,000 port calls per year. The company is working to change its mindset to just in time—with a vision of where everything (supply base, trucks, offshore) is located. Suppliers are required to deliver ‘on time in full.’ Goods need to be held on base until everything is there. Goods are tracked with RFID/bar code tags and via the Logistics Hub (more below). Supply chain event management monitors activity continuously. The system flags late delivery and offers penalties and bonuses as appropriate. Statoil leverages tracking solutions from Identec Solutions. These include RFID CCU tracking with ISO 18000-6 Gen2 certified kit and GPS devices on high value assets. Here Statoil is an active member of the Norwegian arm of the GS1 standards body. The problem is that RFID is a technology, not a standard. Implementations use different numbering/naming conventions, frequencies and IT interfaces. Some areas (drill pipe, containers) have their own standards for frequency and codes. Statoil is working towards full asset visibility for CCU owners, facility and fabrication yards across transport, supply base and offshore installation.
Thore Langeland, strategic advisor with Norway’s Epim (E&P IM) organization, presented the EPIM LogisticsHub (ELH), a knowledge base for cargo carrying units (CCUs) and offshore equipment. Norway’s integrated operations project, which ran from 2005 to 2010, begat the automatic identification and data capture standard AIDC and the oil and gas ontology (OGO). These leverage the W3C’s semantic web and the ISO 15926 generic information model, a.k.a. the GIM reference data library. The GIM RDL covers ‘pretty well everything,’ from HSE, through seismic to drilling and completions—offering a ‘holistic view of asset management.’ On the RFID front, the Norwegian oil and gas association has issued guideline 112, which presumably relates to GS1 above. EPIM’s logistics hub provides an event-based conceptual data model that leverages other data sources including the Brønnøysund company register, the NPD Fact Pages and more. The only catch is that users have to install equipment at all their sites. But this is ‘a very good investment for the industry.’
Exhibitor Christian Hjorth-Johansen gave a good pitch for CMA Contiki’s system for the management of formal contracts and commitments. Contiki’s contract management system (CMS) has been around since 1981. French major Total has been a user since 1997 ‘and we are still friends.’ Much key contractual information is informal—kept in emails, minutes, letters of certification and bank guarantees. Often these are ‘organized’ in personal folders all over the shop and stuff gets moved and/or deleted. Enter Contiki CMS, which is used to write contract-related letters and emails. Contiki structures and manages the process of capturing all contract information and commitments. More from CMA Contiki.
As for PIDX, current hot projects include the supplier key performance indicator (KPI) initiative, now in its second phase. The KPI group is working to standardize KPIs such as total recordable injury rate and non productive time. Read our interview with the PIDX CEO on page 3 of this issue and access the PIDX presentations here.
Speaking at the SAS day at Texas A&M university, SAS’ Keith Holdaway offered three case studies on the use of analytics—specifically, SAS’ Semma methodology. Semma, ‘sample, explore, modify, model, assess,’ has been applied to complex upstream problems by Shell, Saudi Aramco and an unnamed ‘major Barnett Shale operator.’ The commonality between these case histories is that they all involve multi-variate optimization and are therefore amenable to a statistical approach.
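For readers unfamiliar with the acronym, the Semma loop can be sketched in a few lines of Python. This is purely illustrative: the synthetic frac-stage data, field names and numbers below are invented, not taken from the SAS case studies, and SAS’ own implementation is of course far richer.

```python
import random
import statistics

random.seed(42)

# Synthetic population: proppant volume (tons) vs. first-year production.
# The 0.8 slope and noise level are invented for illustration.
population = []
for _ in range(5000):
    v = random.uniform(50, 500)
    q = 40.0 + 0.8 * v + random.gauss(0, 30)
    population.append((v, q))

# Sample: draw a working subset rather than mining everything at once.
sample = random.sample(population, 500)
volumes = [p for p, _ in sample]
prod = [q for _, q in sample]

# Explore: basic descriptive statistics on the sample.
print(f"mean volume {statistics.mean(volumes):.1f}, "
      f"mean production {statistics.mean(prod):.1f}")

# Modify: center the predictor so the intercept is interpretable.
v_mean = statistics.mean(volumes)
x = [v - v_mean for v in volumes]

# Model: ordinary least squares fit, production ~ proppant volume.
sxx = sum(xi * xi for xi in x)
sxy = sum(xi * yi for xi, yi in zip(x, prod))
slope = sxy / sxx
intercept = statistics.mean(prod)

# Assess: coefficient of determination on the sample.
pred = [intercept + slope * xi for xi in x]
ss_res = sum((yi - pi) ** 2 for yi, pi in zip(prod, pred))
ss_tot = sum((yi - statistics.mean(prod)) ** 2 for yi in prod)
r2 = 1 - ss_res / ss_tot
print(f"slope {slope:.2f}, R^2 {r2:.2f}")
```

The point of the exercise is the workflow shape, not the model: sample first, look before fitting, transform, then quantify how much of the variance the chosen variables actually explain.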
Shell used Semma to identify the most significant variables in a set of ‘fragmented, unreliable, sparse data’ from 211 wells in the Pinedale anticline field. Some 2400 frac stages were studied to provide Shell with an understanding of what distinguished poor from exceptional wells. The program also allowed the company to develop new models for pressure depletion and to eliminate unnecessary completion stages. Overall, optimization opportunities were identified in 25% of the wells studied. The results of this study have been presented as SPE 135523.
The unnamed Barnett Shale operator was confronted by a large number of interacting parameters including the impact of proppant volume on production, the difficulty of isolating significant variables that impacted the fracking process and how multiple facets of the region’s complex geology interacted with its engineering approach. Semma was applied to an 11,000 well data set to show which variables made the most important contribution to key performance metrics such as drilling cycle times and well lifetime production. Semma allowed the operator to develop ‘consistent and repeatable’ workflows and to identify opportunities for future cost reduction. The study brought a 30% reduction in proppant costs and a new modeling methodology for de-risking new plays.
Operating a mature field, Saudi Aramco realized the need for a change in its business process to combat production deferments. Previously, multiple domain silos created process inefficiencies, slowed analytical efforts and led to inconsistent data quality. The tools available to analyze the large numbers of wells were limited and results from ‘deterministic’ well studies were inconsistent. Again, SAS’ technology was leveraged to achieve a 25% reduction in deferred production and improved forecast accuracy. Aramco has now speeded up data collection with standardized and automated production surveillance. This presentation was extracted from SPE 141110.
Josh Wills described how ‘data science’ is performed using SAS and Cloudera’s Hadoop implementation. Data science falls somewhere between statistics and software engineering. For big data, whether from click streams or oilfield sensors, the value is only realized when data scientists can access all the data at once. Here the tools of the trade are SAS’ Lasr and Cloudera-ML. Read the SAS day presentations here.
Speaking at the 2013 IRM-UK Enterprise data governance and business intelligence conference in London last month, in what was billed as a joint presentation with PDO, Walid el Abed (Global Data Excellence) described data governance as a means of enabling ‘value driven leadership.’ In drilling and production, it can be hard to agree on key metrics. But it is necessary to settle on some tangible business objectives. El Abed advocates using a ‘key value indicator’ to judge overall data quality and usefulness. This can be rolled-up from data quality and volume metrics. Typical KVIs for the upstream are lost and deferred production and non productive time. When using past data to predict the future, el Abed recommends working from a ‘near real time through tomorrow’ time-frame to build a data predictivity time table.
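El Abed describes the key value indicator only loosely, as a roll-up of data quality and volume metrics against a business objective. One plausible reading is a weighted aggregate of per-source quality scores. The sketch below is hypothetical: the source names, metrics and weights are invented for illustration and do not represent Global Data Excellence’s actual method.

```python
# Hypothetical KVI roll-up: each data source scores on completeness and
# accuracy (0..1); sources are weighted by their business importance to
# the target KVI (here, deferred production). All figures are invented.

quality_scores = {
    "production_allocations": {"completeness": 0.95, "accuracy": 0.90},
    "downtime_events":        {"completeness": 0.80, "accuracy": 0.85},
    "well_tests":             {"completeness": 0.70, "accuracy": 0.92},
}
weights = {
    "production_allocations": 0.5,
    "downtime_events": 0.3,
    "well_tests": 0.2,
}

def kvi(scores, weights):
    """Weighted roll-up: each source contributes completeness * accuracy."""
    total = sum(weights.values())
    return sum(weights[s] * m["completeness"] * m["accuracy"]
               for s, m in scores.items()) / total

score = kvi(quality_scores, weights)
print(f"KVI = {score:.3f}")
```

Trended over time against an operational metric such as deferred production, a single figure like this gives management the ‘value driven’ view el Abed advocates without drowning them in per-field quality reports.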
Ian Pestell (Cisco unit Composite Software) reported that using extract, transform, load (ETL) technology to load the data warehouse is ‘not the best agile approach.’ For one client, adding a single column took six months! Moreover, half of all data warehouse projects fail and, where they do work, around 70% of data is ‘dark,’ i.e. never queried. The conventional approach suffers when there are changes to operating systems. It also fails to incorporate many external data sources like weather and social media feeds. By combining data virtualization (thanks to Composite) with Cisco’s query optimization technology, a truly agile data environment is possible.
Lee Edwards of Data Management International (Dama) described how to ‘thrive’ in global data management. This involves combating many silos—cultural, budgetary and physical. The cultural silo is defended with the ‘not invented here’ approach and the fact that all use the same words to mean different things. Dama’s approach involves listening to and observing users, looking out for common themes. Other tricks include anticipating requirements and delivering a first-cut data model early. Later, key data sources can be identified, refined and embedded in a preliminary physical model. Edwards advises repeating the questions ‘how do you know?’ and ‘what is it you need to understand?’ This is because ‘businesses are good at hiding their intelligence.’
Exhibitor Information Builders told us that they have been developing business intelligence solutions for Petrobras, Vale and Upstream Professionals. Tableau Software’s visual analytics, although not industry specific, is used in oil and gas. MarkLogic’s technology underpins BP Trading’s ‘Where’s my ship?’ application for tracking vessels along with weather data and tweets. More from IRM-UK.
Sally Gould has taken retirement from The Data Room. She plans to continue occasional conference attendances for Oil IT Journal.
Bryan Sabegiel has joined EnergyIQ as chief solutions architect. He was previously with Chesapeake Energy.
Ed Loven is the new CEO of LoneStar Geophysical Canada. He was previously with Sand Exploration Ltd.
Ariane Jayr is the new VP marketing chez Merrick Systems. She was previously oil and gas industry solutions director for Microsoft.
Chris Smith is now president and CEO and Gerry Conroy senior VP at Detechtion Technologies. Founder Brian Taylor moves over to become chairman. Smith was formerly CEO of Cygnet Software. Gerry Conroy hails from P2 ES.
Paul Verhagen has been appointed to the Fugro board of management succeeding CFO André Jonkman who is to step down. Verhagen was previously with Philips.
Lynn Chou has resigned from the PIDX board to assume larger roles in API and Chevron. Amy Absher has been elected to the PIDX board. Absher is general manager of the strategy, planning, services and control unit of Chevron’s information technology company. Peter Allwinton (BP) has been elected as chair of the membership committee. Tom Cave (SparesFinder) is chair of the catalogue and classification workgroup. Daryl Fullerton (Actian) heads-up the marketing committee and the supplier KPI workgroup. Jaime McCarthy is executive coordinator and Brittany Stone administrative coordinator. César Váldes is working on PIDX project management standards and guidelines workgroups.
Clarence Cazalot has been nominated to the Spectra Energy board. He was previously with Marathon Oil Corp.
Gary Morris has joined the board of US Seismic Systems. He hails from Paradigm Geophysical.
SPOC Automation has hired Chuck Milks as field service engineer for Texas. He was previously with Manek Energy Services.
The Americas Petroleum Survey Group announces the death of Jim Cain (Cain & Barnes) from cardiac arrest at the age of 70. More from www.apsg.info.
Philippe Malzac (Total) is the new chair of the Energistics board. Mark Mao is senior technical advisor and Jeremy Simpson technical analyst. Mao hails from Halliburton, Simpson from Kongsberg.
Intsok’s new oil and gas advisor to Saudi Arabia is Osama Kamal.
Dan McGonegle has joined process safety specialist Chilworth Technology as business development manager, Walter Kessler as senior process safety consultant, and Anthony Glorioso as controller.
Ken Tornquist heads-up ION Geophysical’s new GX Technology seismic processing center in Oklahoma City.
GE has selected Oklahoma City for its new global oil and gas center of innovation. Mike Ming takes the role of general manager of the new facility which will create 130 high-tech jobs.
A new report from Markets and Markets values the oil and gas automation and instrumentation market at ‘$31.24 Billion by 2020.’
Jørgen Peter Rasmussen will be CEO and Odd Christopher Hansen chairman of EQT’s new well services business. Rasmussen hails from Archer Ltd.
Mark Stappard is to head-up Metso’s new service center in Qatar, scheduled to open Q2 2014.
Following our article on the award of Diskos to CGG/Kadme last month, Landmark’s Dale Blue writes,
‘The statement ‘The new PetroBank will combine Kadme’s Whereoil platform and CGG’s Trango seismic data management’ is incorrect. PetroBank remains a Landmark solution and we have no plans to integrate Whereoil or Trango into PetroBank!’
Our apologies for conflating PetroBank, Landmark’s product, with Diskos, the Norwegian data store.
Aker Solutions is selling its well-intervention services business area to Swedish private equity fund EQT for NOK 4 billion. The deal includes an earn-out provision whereby Aker receives 25% of any returns over 12%. Aker is selling because of ‘limited synergies’ with its core deepwater and subsea businesses.
Eco-Stim Energy Solutions has closed its ‘reverse merger’ with FracRock International. EcoStim provides field management technologies and well stimulation and completion services to the international shale market.
Fugro is to acquire Advanced Geomechanics of Perth, Australia, a supplier of geotechnical and geophysical engineering and consulting services to oil and gas.
Jacobs Engineering Group has acquired the assets of Costa Mesa, CA-based Marmac Field Services, a pipeline engineering and design service company.
New Source Energy Partners has acquired MCE in a $43 million cash and paper deal. MCE is an oilfield service company specialized in ‘increasing efficiencies and safety’ in drilling and completion.
Detechtion Technologies has received a ‘significant investment’ from private equity unit Element Partners. Detechtion’s flagship ‘Enalysis’ package supports performance optimization and fleet management technology on natural gas compression units.
TekSolv has acquired Crawford Technical Services, developer of the Hydrowatch shale gas Scada system.
Bureau Veritas has acquired Maxxam, a provider of analytical services to the oil, gas and environmental verticals, in a CAD 650 million deal.
Two developments this month on the upstream asset management and decision support front as Portfolio Decisions and Caesar Systems form a strategic alliance while Visage Information Solutions has teamed with Rose & Associates. The first alliance sees the combination of Caesar’s PetroVR project simulator and Portfolio’s Perspectives portfolio management application. The new data link provides information exchange between a company’s asset development plan and its portfolio optimization. More from Caesar and Portfolio.
The other hook-up combines Visage’s big data analytics with Rose’s industry risk assessment methodologies. Rose will leverage Visage’s toolset in its training and consulting engagements and is to ‘heighten the role’ of Visage in its risk analysis. More from Visage and Rose.
Ikon Science has partnered with FUSE IM on the RokDoc MetaStore, an ‘open’ knowledge base of rock properties, relationships and analytics. MetaStore’s development was primarily funded by Ikon’s major customers, who helped scope the project.
MetaStore captures rock property data together with interpretations generated in Ikon’s flagship RokDocQED tool along with other sources. MetaStore is based on Fuse’s XStreamline platform and includes an ‘advanced metadata engine’ and GIS-based search and filter tools. The Java-based application provides cross platform Windows/Unix access to public and private cloud-based hosted data from a web browser.
Fuse director David Holmes told Oil IT Journal, ‘XStreamline runs on top of JBoss and the JBPM business process management toolset. We are neutral to the customer’s GIS engine and support publishing through PostGIS and ArcGIS Server. We also recommend the OpenGeo Suite from Boundless which brings together a number of enterprise proven open source GIS components into a single easily manageable configuration. RokDoc users faced two challenges—managing multiple project files (like the Petrel data management problem!) and building a corporate rock physics repository from a collection of spreadsheets.’
Future enhancements will capture more granular rock property data and it will be possible to use MetaStore independently of RokDoc. More from Ikon and Fuse.
Norcross, GA-based Precyse Technologies has introduced a suite of intrinsically safe ‘remote entity awareness and control’ (Reac) solutions for oil and gas and other verticals. Remote entities can be people or high value items such as vehicles or equipment.
Three products in Precyse’s Smart Agent and Beacon wireless asset tracking portfolio are now certified as intrinsically safe according to the US OSHA certification for hazardous areas (Class I, Division 1).
Precyse’s Drew Bolton said, ‘Companies operating in hazardous environments can quickly deploy a turn-key package that improves worker and environmental safety. Our solution set includes mobile Smart Agents, wireless network infrastructure and real-time operations and control room tools to proactively monitor and manage the workplace and assist emergency response efforts.’
Precyse’s toolset includes patented, ‘assisted GPS’ technology to help emergency responders find and assess the well-being of personnel who have not reported to muster points. Accelerometer-based ‘man-down’ detection automatically alerts control room personnel when workers fall and a panic button reports incidents in real time. More from Precyse.
IDC Energy Insights’ guesses, sorry predictions, for future oil and gas IT spend are as follows. Companies will put more focus on ‘resiliency, security and innovation.’ Drilling risk mitigation and production optimization will drive big data adoption for analytics, along with new models for machine-to-machine connectivity and mobility.
The ‘smart pipeline’ (qu’est-ce que c’est?) concept is to gain momentum and the cloud will see an ‘uptick in adoption.’ Finally, upstream IT spend is to increase to ‘$49.4 billion’ by 2016—a figure so precise that it will definitely be wrong. Visit Energy Insights and listen to the webcast.
Statoil has awarded the US National aeronautics and space administration (NASA) a contract to trial technologies developed for space exploration in an oil and gas context. The five year contract is to be ‘effectuated’ at the Jet Propulsion Laboratory (JPL) in Pasadena, California. Statoil currently spends $550 million on research, development and innovation.
BP’s global projects organization has awarded AEG Power Solutions a five year global framework agreement for the supply of IEC compliant AC and DC uninterruptible power supply systems.
Statoil has awarded Aker Solutions a NOK 3 billion extension to their offshore maintenance and modification contract. The deal includes engineering, procurement, construction and installation modification work and corrective maintenance and studies.
US energy management specialist ACES has gone live with Allegro Development Corp.’s Allegro 8 trading and risk management solution.
Petroleum Development Oman (PDO) has extended its contract with CGG for the provision of seismic imaging services from a dedicated processing center in Muscat, Oman for an additional four years.
Cybera reports that its Cybera One flagship is now deployed at over 10,000 Shell retail outlets in the US. Shell has migrated its payment card traffic from its own VSAT network to Cybera’s broadband solution.
Kuwait National Petroleum is to deploy Honeywell’s process control, safety and alarm management systems in its new build, 615,000 bopd Al Zour refinery complex. Honeywell is also doing the project’s front-end engineering design. Honeywell also received an order from Sinopec for business management and automation technology to revamp aging petrochemical plants in Guangdong province, China.
Lukoil has deployed NCR Corp’s point of sale solution at some 170 Belgium retail outlets.
Inpex has appointed Asset Guardian Solutions to provide a custom engineering IT asset management tool set to enhance management and security of software used on the $34 billion Australian Ichthys LNG project.
Technip is to provide Sasol with FEED studies on a new build gas-to-liquids facility at Westlake, La. Final project cost is in the $11-14 billion range.
Unique System FZE is to supply an oceanographic and meteorological monitoring system to the Abu Dhabi Oil Co. The unit provides real-time wave and current information from an acoustic doppler current profiler, relayed to ADOC’s operations centre on Mubarraz Island.
BG Group unit QGC has awarded Jacobs Engineering a concept study and pre-FEED contract for an Australian coal seam gas to LNG project.
Talisman Sinopec has awarded Lloyd’s Register a 25 million contract for integrity management services.
BP has awarded MRC Global a contract for maintenance, repair and operating goods and services at its US downstream operations.
Canadian oil and gas logistics specialist Mullen Group has extended its 2011 contract with Amalto for a cloud-based e-invoicing solution to its PeBen and Withers business units.
CNPC Jidong Oilfield and Sinopec Southwest Branch Co. have awarded Recon Technology new automation system contracts with a total value of around $1.2 million.
A joint venture of Petrofac and Daelim has been awarded a $2.1 billion refinery improvement project by Oman Oil Refineries and Petroleum Industries Co.
Petrofac and partner Taleveras Energy Resources have signed a five year deal for the provision of funding, technical support, training and asset management support with the Nigerian Petroleum Development Co. The initiative supports NPDC’s aims to increase its indigenous capacity and technical capabilities.
VeriFone Systems’ VX 680 and VX 675 portable payment acceptance systems are being deployed by Egyptian payment gateway service provider E-Finance in the state-sponsored transition of Egypt’s gasoline and petroleum products retail market to an electronic payment system. An estimated 18,000 VeriFone units will be installed throughout the country.
The Open geospatial consortium and the Pipeline open data standards association are to work together to enhance geospatial interoperability in the PODS data model.
The PLCopen association of IEC 61131-3 control manufacturers and the OPC Foundation are teaming on ‘Industry 4.0’ strategies—the combination of information and communication technologies with industrial automation. New ‘cyber-physical systems’ underpin the ‘smart factory.’ OPC UA is to play a critical role, as will ‘semantic interoperability.’
The geomatics committee of the UK Oil and gas producers’ association, OGP has released a generalized P6 geographic information systems data model P6/GIS to supplement the 2012 P6/11 seismic bin grid data exchange format. The standard comprises an Esri geodatabase template, data dictionary, symbology stylesheet and an example dataset along with a guideline document. All are downloadable from the OGP geomatics minisite.
Energistics reports that it is developing an application programming interface that is to be the common transfer protocol for all Energistics standards.
London-based KBC Advanced technologies claims to have ‘reinvented’ process simulation with the latest release of its PetroSim package. PetroSim 5 adds scenarios for lifecycle facility modeling and decision support tools to test investment and capacity hypotheses against changing conditions such as declining reservoir feeds in the upstream. The toolset claims to release value in capital and operating efficiency that is currently obscured by process simulator limitations.
PetroSim now embeds Microsoft Windows Workflow Foundation technology so that users can customize modelling with their own design standards or methods.
This technology will decrease simulation time dramatically, allowing more time for the analysis of facility performance and plans. The simulator also includes third party KBC/Infochem multiflash technology in its thermodynamics module for hydrate mitigation studies in upstream and midstream applications.
KBC CTO Andrew Howell said, ‘For the upstream production facilities market we deliver facility models that understand your field plans. In refining we are the only provider of refinery-wide modelling using your own standards.’ More from KBC.
Todd Bush’s startup Energent group is marketing the software building blocks for mobile data integration in the digital oilfield. The Energent mobile software development kit (SDK) includes access to public oil country data along with Energent’s own research. Developers can connect to in-house and cloud-based data and services. Target workflows include completions, fracking, reporting and safety.
Bush told Oil IT Journal, ‘Based on my experiences at Chevron, oil and gas data challenges are vast and expensive. Things are even more difficult when you want to let field workers access information from different devices.’
The SDK was used to create Energent’s own ‘Field first’ drilling and completion reporting app. The app provides permit, completion and well data delivered from operators in Eagle Ford, Cline, Permian, and other plays. Data can be filtered for depth, county and date. Well completions data is also available for competitor analysis. The app tracks the whole drilling and completion lifecycle from permitting to abandonment. Download Energent's FieldFirst app.
Two interesting discussions are ongoing on LinkedIn which should be of interest to those struggling to standardize their upstream information technology. First, an existential debate rages on the ISO 15926 LinkedIn group as to why, after over 20 years, take-up remains poor. Subsequent postings offer a potted history of the lifecycle information standard from its early days in Shell, POSC/Caesar and the STEP project and its more recent morphing into a semantic web project.
Creating the reference data, a.k.a. the taxonomy, was described as a ‘Herculean’ task that has not yet finished. One poster criticised the ISO 15926 material as ‘cumbersome’ and inconsistent. Another user defended the technology as a source of value. In general the debate splits between those on the ‘inside’ who have cracked the code and those on the outside who appear to struggle with understanding and implementing the complex novel technology.
A mirror of the ISO 15926 discussion can be observed over on the OPC-UA LinkedIn group where similar scepticism is expressed as to the benefits of moving from ‘classic’ OPC to UA. Here the problem is also one of take-up in commercial products, an even greater issue for OPC-UA since the older standard is still working reliably and is well supported. OPC-UA has some strong defenders though, with enthusiastic support as the platform of choice for future developments.
A new white paper from Moore Industries illustrates the complexity of an apparently simple piece of equipment, a tank overflow protection device. Those familiar with the Buncefield, UK tank farm explosion and fire will appreciate the importance of overflow protection.
Reducing such risks relies on accurate and reliable safety instrumented systems (SIS) whose levels of functional integrity are set out in IEC 61511-1. This has been adopted as the worldwide standard for such systems in the process industry. SIS integrity requirements span sensors, controllers, logic solvers, actuators and valves. A key part of the loop is the logic solver, which actually takes the final decision as to whether to activate the overfill protection.
The white paper explores some of the different approaches to logic solver design, along with some examples of system topologies and the associated safety integrity level calculations. Insights from the paper include the fact that the instrumentation used in the SIS should be independent from the tank gauging system to avoid interference and/or common failure modes. Because sensors degrade over time it is also judicious to deploy multiple devices with a different technology from the tank gauges. Register and download the white paper.
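To give a flavor of the safety integrity level (SIL) arithmetic the white paper discusses, the sketch below uses the standard simplified formulas for average probability of failure on demand (PFDavg) in low-demand mode: λDU·TI/2 for a single (1oo1) channel and (λDU·TI)²/3 for redundant 1oo2 voting. This is a deliberately simplified illustration: it ignores common-cause failures and proof-test coverage, which a real IEC 61511 verification must include, and the failure rate and test interval are assumed values, not figures from the Moore Industries paper.

```python
# Simplified PFDavg for a low-demand overfill protection loop.
# lambda_DU = dangerous undetected failure rate (per hour), TI = proof-test
# interval. Common-cause (beta factor) and test coverage are ignored.

HOURS_PER_YEAR = 8760.0

def pfd_1oo1(lambda_du, test_interval_yrs):
    """Single channel: PFDavg ~= lambda_DU * TI / 2."""
    return lambda_du * test_interval_yrs * HOURS_PER_YEAR / 2

def pfd_1oo2(lambda_du, test_interval_yrs):
    """Redundant 1-out-of-2 voting: PFDavg ~= (lambda_DU * TI)**2 / 3."""
    return (lambda_du * test_interval_yrs * HOURS_PER_YEAR) ** 2 / 3

def sil_band(pfd):
    """Map PFDavg to a SIL claim (IEC 61508, low-demand mode)."""
    if pfd >= 1e-1:
        return 0   # no SIL claim possible
    if pfd >= 1e-2:
        return 1
    if pfd >= 1e-3:
        return 2
    if pfd >= 1e-4:
        return 3
    return 4       # below the SIL 3 band (claims are capped at SIL 4)

lam = 2e-7   # assumed dangerous undetected failures per hour
ti = 1.0     # assumed annual proof test
for name, pfd in [("1oo1", pfd_1oo1(lam, ti)), ("1oo2", pfd_1oo2(lam, ti))]:
    print(f"{name}: PFDavg = {pfd:.2e}, SIL {sil_band(pfd)}")
```

Even this toy calculation shows why topology matters: with the same instrument, moving from a single channel to 1oo2 voting improves the achievable band by orders of magnitude, which is exactly the kind of trade-off the white paper’s worked examples explore.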
Rock Flow Dynamics CTO Kirill Bogachev has kindly provided us with some benchmarks comparing performance of RFD’s tNavigator reservoir fluid flow simulator on Intel’s new Xeon E5 v2 (Intel has a minority stake in RFD). The trials compared performance on the new 24 core dual Xeon E5-2697 v2 with previous 12 and 16 core editions. Single core performance was comparable across the three generations but tests using all cores showed super-linear scaling. A job that took 1 hour 54 minutes on the 12 core chip took only 49 minutes on the 24 core version. The extra speed-up is likely due to the higher memory bandwidth of the newest chip. Bogachev concluded that the new chips ‘set records in performance and parallel scalability for the reservoir modeler.’ Comparable performance can now be achieved at around half the hardware cost.
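The ‘super-linear’ claim is easy to verify from the reported run times. Doubling the core count would, at best linear scaling, halve the run time; the measured ratio is higher:

```python
# Sanity check of the reported scaling: 1 h 54 min on 12 cores vs
# 49 min on 24 cores, i.e. 2x the cores.

t12 = 1 * 60 + 54          # minutes on the 12-core configuration
t24 = 49                   # minutes on the 24-core configuration

speedup = t12 / t24        # ~2.33x measured
core_ratio = 24 / 12       # 2.0x would be perfectly linear scaling
efficiency = speedup / core_ratio   # > 1.0 means super-linear

print(f"speedup {speedup:.2f}x vs core ratio {core_ratio:.1f}x, "
      f"parallel efficiency {efficiency:.0%}")
```

A parallel efficiency above 100% is unusual but plausible here, consistent with the article’s explanation that the newer part’s higher memory bandwidth, rather than the cores alone, delivers the extra gain.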
In another test, RFD compared different accelerators with the dual Xeon to show that while Intel’s MIC exceeds Nvidia’s Tesla performance, it is 2-3 times slower than a dual Xeon E5 workstation. Western hemisphere RFD clients include Oxy, Pioneer and Tullow. More from email@example.com.
An article in The Vault, the Siemens-sponsored minisite housed by Automation World Magazine, describes how systems integrator Crawford Technical Systems (CTS), safety systems specialist TekSolv and drilling data manager Pason Systems have used Siemens’ TIA Portal programming framework to develop TekNet, a gas detection and safety system. TekNet provides an ‘open’ protocol that connects to equipment from thousands of manufacturers.
Internet and smart phone capabilities are used to alert personnel to hazardous situations through alarms, emails and text messages. The system monitors emissions and tracks gas sensor filter changes and calibration compliance reporting. Early system tests gave a surprisingly high number of alarms—but these were not false positives rather ‘real-world, potentially hazardous situations.’
System components include RAE Systems’ MeshGuard gas detection system and Siemens S7 1215 controller. The TIA Portal exposes a consistent programming interface across Scada and PLC systems along with Siemens controllers and Modbus communications. Programming and configuration is achieved with drag-and-drop. TekNet is presently active at 25 Pason locations, with a dozen more in the works. More from TekNet.
Houston-based eCorp Stimulation Technology got a boost last month with recognition by Time magazine of its propane-based fracturing technology as one of the ‘best innovations of the year.’ eCorp’s ‘pure propane stimulation’ is claimed to be ‘an environmentally friendly alternative to hydraulic fracturing.’
eCorp has developed a non-flammable fluorinated form of propane that suppresses the flammability risk of LPG. According to the company, fluorination is achieved ‘without chemical additives.’ eCorp CEO John Thrash claimed that using non-flammable propane avoids well sites being subject to Seveso classification.
eCorp recently presented its technology to representatives of the hitherto skeptical French parliament which has now issued a comprehensive study on the benefits and risks of future non conventional exploration in France. Government scientists came down favorably on the ‘need for further research into exploitation of non conventionals.’ Exploration and production of non conventional hydrocarbons is a ‘potentially manageable activity and there are many possibilities to improve fracking and other technologies. Some alternatives are further advanced than is commonly believed and hydraulic fracking itself is continually being improved.’ Unfortunately for industry, president Hollande declared (before the report was released) that no unconventional exploration would take place under his mandate. More from eCorp and Time Magazine.
The upstream oil and gas sector has received a dressing down from UK-headquartered market researcher Ovum. In its annual ICT enterprise insights survey, Ovum found that a majority of companies in the upstream are lagging in their use of IT. While they invest more in IT than other verticals, most E&P companies are ‘stuck with legacy practices’ and fail to exploit cloud computing and analytics.
The number of oils planning to increase IT spend by over 6% this year has doubled since 2012 and 40% of respondents plan to replace or overhaul IM systems in the next 18 months. Target areas for expenditure include enterprise performance management, data management and integration and data warehousing systems. However, ‘a majority plan only minor enhancements or simply to maintain their existing tools.’ This, according to Ovum’s Warren Wilson, ‘is not a recipe for long-term success.’
Digitization and analytics are changing E&P. Companies need to re-examine their partnership strategies, prepare for strategic IT and put analytics centre stage. Vendors can support this by understanding that they are playing to a mixed audience and tailoring their products and value propositions accordingly. Warns Warren, ‘Preparing for strategic IT means eliminating paper records, modernizing IT platforms, and consolidating disparate application and data silos.’ More from Ovum.