March 2013


SAP HANA for O&G

Ex-Schlumberger engineer works to tune SAP’s ‘in memory’ compute engine for new digital oilfield workflows, applying ‘big data’ analytics and a predictive approach to MRO and workover optimization.

Speaking at the 2013 SPE Digital Energy conference this month in The Woodlands, TX, SAP Labs’ Ken Landgren described some oil and gas ‘big data’ trials that SAP has been running on its novel ‘Hana’ architecture (OITJ 12/2010). Using a large synthetic data set based on real scenarios, SAP has investigated use cases in well construction, maintenance management and production optimization.

Landgren told Oil IT Journal, ‘My previous work with Schlumberger sparked an interest in the digital oilfield. But I felt that prior efforts lacked traction because they failed to engage both operations and the business. For instance, well construction involves lots of supply chain issues as does equipment diagnostics and maintenance. Much information relating to such use cases is already captured in ERP and MRO systems. So I approached the problem from the other direction, starting with the ERP system and SAP. Just after I joined, SAP came out with a new ‘in memory’ database, Hana, tuned for this kind of analytics which has great potential for digital oilfield and intelligent operations.’

One demonstrator use case is gas production in the face of liquid loading. Landgren’s team used the ‘R’ statistical language on an in-memory Hana database to forecast production, revenues and cost for various scenarios. SAP’s visual intelligence tool was used to find the optimum time along the decline curve to intervene with a swabbing operation. The optimized program dovetails neatly with other maintenance management SAP functionality. Workover plans are pushed to mobile workers and documents signed-off with SoftPro’s SignDoc.
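
The article does not show the underlying R-on-Hana code, but the decline-curve logic behind ‘when to swab’ can be sketched independently. The Python below is a minimal, hypothetical illustration: it assumes exponential decline, models a swab as restoring the rate to a fraction of the initial rate, and brute-forces the intervention day that maximizes cash flow. All rates, prices and costs are invented.

```python
# Minimal, hypothetical sketch: exponential decline, a swab restores the rate to
# `uplift` times the initial rate, and we brute-force the intervention day that
# maximizes cash flow over the horizon. All figures are invented.
import numpy as np

def exponential_decline(q0, decline, t):
    """Gas rate (Mcf/d) after t days of exponential decline."""
    return q0 * np.exp(-decline * t)

def optimum_swab_day(q0, decline, gas_price, opex_per_day, swab_cost, uplift=0.9, horizon=720):
    """Return the swab day (and cash flow) that maximizes revenue minus costs."""
    days = np.arange(horizon)
    best_day, best_value = None, -np.inf
    for d in days:
        pre = exponential_decline(q0, decline, days[:d])                       # rate before the swab
        post = exponential_decline(uplift * q0, decline, days[:horizon - d])   # rate after the swab
        cash = gas_price * (pre.sum() + post.sum()) - opex_per_day * horizon - swab_cost
        if cash > best_value:
            best_day, best_value = d, cash
    return best_day, best_value

print(optimum_swab_day(q0=800, decline=0.004, gas_price=3.5, opex_per_day=250, swab_cost=15000))
```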

According to Landgren, Hana avoids the need to ‘massage’ data into a hypercube by direct connection to native data sources. A second use case investigated compressor failure. Here key information was tucked away in an obscure text field of an MRO database. Using Hana it was possible to correlate equipment failure with gas production losses. Root cause analysis identified clogged fuel filters as the culprit and a smart maintenance program was kicked off, addressing high value wells at risk of imminent failure.
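
As an illustration of the kind of correlation described (not SAP Hana functionality), a few lines of pandas can flag MRO work orders whose free-text notes mention fuel filters and total the gas deferred around each failure. Column names, the keyword and the figures are assumptions.

```python
# Hypothetical sketch: flag MRO work orders whose free-text notes mention fuel filters,
# then total the gas deferred around each flagged failure. Not SAP Hana functionality.
import pandas as pd

work_orders = pd.DataFrame({
    "equipment": ["C-101", "C-102", "C-101"],
    "date": pd.to_datetime(["2012-03-04", "2012-05-11", "2012-08-20"]),
    "notes": ["replaced clogged fuel filter", "routine inspection", "fuel filter blocked again"],
})
losses = pd.DataFrame({
    "equipment": ["C-101", "C-101", "C-102"],
    "date": pd.to_datetime(["2012-03-05", "2012-08-21", "2012-05-12"]),
    "deferred_mcf": [1200, 900, 150],
})

failures = work_orders[work_orders["notes"].str.contains("fuel filter", case=False)]
# Match each production loss to the most recent flagged work order within two days.
merged = pd.merge_asof(
    losses.sort_values("date"), failures.sort_values("date"),
    on="date", by="equipment", tolerance=pd.Timedelta("2D"), direction="backward",
)
print(merged.dropna(subset=["notes"]).groupby("equipment")["deferred_mcf"].sum())
```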

So will SAP’s foray into the digital oilfield get traction? It’s not the first time the company has tried. We reported in our February 2007 edition on a previous ‘digital oilfield’ drive. Back then, SAP floated the idea that ‘increased systems integration and cross discipline collaboration will enhance production, improve economics and reduce costs.’ At the time we asked, ‘who, among the many stakeholders, will hold the keys to the digital oilfield?’ Maybe now the answer is, ‘whoever holds the keys to the data center.’ More on Hana in our review (OITJ 06/2011) of Hasso Plattner’s book on the topic and his response (09/2012).


Premier virtualized

ISN migrates legacy Windows servers to VMware vSphere-based data center. NetApp upgrade and new Cisco VLANs performing ‘as promised.’

Following its remote desktop development last year (OITJ 07/2012), Premier Oil has extended its virtualization effort, executed by UK-based ISN Solutions, with the virtualization of its corporate data center. ISN has consolidated Premier’s legacy Windows servers onto a high-availability cluster running VMware’s vSphere. Premier can now dynamically move workloads around the new environment to allocate resources according to business needs.

Offsite replication and failover provide disaster tolerance, while snapshot, archiving and imaging technology improves performance and manageability. Premier’s NetApp storage was upgraded with a new tiering capability that stores critical data and virtual machine files on the fastest, most reliable media. Dedicated VLANs were assigned to subsystems to optimize throughput and backup. ISN managed to install the system without disrupting Premier’s day-to-day operations.

Premier CIO Dave Edwards said, ‘The solution has delivered everything it promised. If we open a new office or need to bring a new site online it gives us the potential to provision new systems and be sure they’re protected all under the umbrella of our centralized data centre.’ More from ISN.


Big data, Hadoop, the hype … and a compelling use case explained

Neil McNaughton reports from Digital Energy on ‘big data’ where his skunk works detector was showing red. Can Hadoop best years of established technology and confer a real ‘competitive advantage?’ A conversation with GE shows what Hadoop is good for and where the hype takes over.

Judging by the plethora of ‘big data’ presentations at the SPE Digital Energy (DE) conference this month, the industry is about to be taken by storm by Hadoop. Not just our industry, but, according to a new book from the Open Technology Institute, ‘How we live, work, and think.’ Vendors at DE intimated that several majors had Hadoop trials running on seismic processing and real time data analysis—and indeed that these were so successful that the said majors were not speaking about these tests because of the major ‘competitive advantage’ they were gaining. My hype radar was beeping loud—after all, when nothing leaks from a project, it could be because it is a skunk-works flop.

IDC’s Jill Feblowitz did a good job of fanning the flames of the Hadoop blaze with some chapter and verse: a ‘Seismic Hadoop’ project initiated by Cloudera, and oil and gas data mining R&D performed at Stavanger University. Neither, from a cursory look, would appear to offer much ‘competitive advantage.’

So what is all the fuss about? Some good introductory material is to be had in a recent paper from Atos which offers a short history of the technology’s evolution from Google’s Big Table, along with a reasoned taxonomy of big data solutions, what Hadoop, MapReduce and NoSQL are really about and a league table of other parts of the ecosystem.

On the hype side of the equation, the mantra is the four ‘Vs,’ i.e. data volume, velocity, variety and … value. Volume and velocity are fairly self evident. Think lots of high bandwidth real time data. Variety is the promise of being able to process across multiple sources—documents and databases. But the ‘value’ part of the equation is more subtle. For some this refers to the high value that the ecosystem promises. For others, the ‘big data’ movement derives information from low value data. Think of all those log files that accumulate on your systems before being deleted unread or stuff disappearing down the ‘data exhaust’ of an offshore platform (OITJ 04/2011).

From the Atos paper it appears that the technology is best at doing fairly dumb stuff, but on very large sets of somewhat inaccessible data. Seismic data sort is one use case that has been suggested. What is curious here is that for Hadoop to bring a competitive advantage, it would mean that decades of geophysical research targeting the exact problem have been bested by a serendipitous generic approach. This to my mind seems unlikely.

In the same vein I thought, how come decades of research into the data historian could be bested by big data? After his talk on Hadoop at the SMi E&P data conference last month I put this question to GE’s Brian Courtney who is well placed to answer as GE provides both a data historian (Proficy) and a new Hadoop-based solution. He said, ‘Historians are great at storing and retrieving time-series data. Proficy can capture millions of tags per second. Historians are also excellent for operational queries—provided you are asking for tags in time sequence order.

Historians are not so good at ‘ad-hoc’ queries such as ‘have I seen this five second start up pattern before?’ Here you would need to query across all your data looking for the key tags on the same equipment. In a historian this can a) return more data than you can handle and b) take a very long time.

Hadoop is more like a warehouse than a historian. Moving data from the historian to Hadoop is very time consuming. Typically you only send data once it is clean and complete. A typical time frame might be every three days. Hadoop takes the data and spreads pieces of it over lots of computers in the cluster. Note that if you had to update a record, you would need to delete the entire archive. Updates are very expensive in Hadoop.

The other challenge with Hadoop is that it takes maybe thirty seconds or more to process a query, figure out where you put the data, move the query to the right nodes, process the query, concatenate the results, and return them. In other words, Hadoop is not good for rapidly changing data or for querying small amounts of data in near real-time.

What Hadoop excels at is a query such as ‘For all temperature settings for all turbines, determine the average temp setting five minutes before and after a particular type of alarm.’ Again, you couldn’t do this with a typical Historian. Hadoop can do this very quickly as it can decompose the query, send it to hundreds or thousands of nodes in the cluster and process the request in parallel. We have run a query like this that returned 12 signals in a 4 terabyte data set in 2½ minutes. The same query running on a historian crashed our clients and the server!

Hadoop is great as a data warehouse and allows you to do deeper, more meaningful analytics, data mining and discovery. Historians are best used as operational data stores—storing data in near real-time and for querying trending data.’
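
Courtney’s turbine/alarm example maps naturally onto the map and reduce phases. The sketch below is plain Python rather than Hadoop: it simply mimics how one shard’s mapper would emit partial (sum, count) pairs for temperature readings within a window of an alarm, with a reducer combining them into per-turbine averages. The record layout and the five-minute window are assumptions.

```python
# Plain-Python sketch of how the turbine/alarm query decomposes under MapReduce.
# Each mapper sees only its shard of (turbine, timestamp, tag, value) records and emits
# partial sums; the reducer aggregates per turbine. Layout and values are hypothetical.
from collections import defaultdict

def mapper(records, alarm_times, window=300):
    """Emit (turbine, (value, 1)) for temperature readings within `window` seconds of an alarm."""
    for turbine, ts, tag, value in records:
        if tag != "temperature":
            continue
        if any(abs(ts - a) <= window for a in alarm_times.get(turbine, [])):
            yield turbine, (value, 1)

def reducer(pairs):
    """Combine partial (sum, count) contributions into an average per turbine."""
    totals = defaultdict(lambda: [0.0, 0])
    for turbine, (value, n) in pairs:
        totals[turbine][0] += value
        totals[turbine][1] += n
    return {t: s / c for t, (s, c) in totals.items()}

# One 'shard' of data and the alarms of interest (hypothetical values).
shard = [("T-7", 1000, "temperature", 412.0), ("T-7", 1290, "temperature", 430.5),
         ("T-7", 5000, "temperature", 401.0), ("T-9", 1100, "temperature", 388.0)]
alarms = {"T-7": [1200], "T-9": [9000]}
print(reducer(mapper(shard, alarms)))   # {'T-7': 421.25}
```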

Well that cleared things up for me, I must say. Looking back over the four ‘Vs’ it would appear that the volume part is well handled by Hadoop, which is after all based on a highly scalable file system. Velocity? Courtney’s analysis is rather nuanced here. The data may be streaming into the historian at great speed, but the ‘warehouse’ approach means that the data mining is an off-line process, although once you’ve found your critical pattern, it could be embedded in a real time monitoring process inside the historian. Variety? Not in the GE use case, which is dealing with standard historian data. And Value? This is a good example of adding value to relatively low value data.

So what about seismics? A lot of the Map/Reduce data manipulation sounds rather like the sorting and re-ordering performed in seismic imaging. Does Hadoop offer anything new here? If you know the answer, we’d love to hear from you.

@neilmcn


Standards Leadership Council first EU meet

Standards bodies gather to compare notes and seek gaps, overlaps and points of intersection. Metadata, units of measure likely candidates for cooperation. PPDM to Witsml mapping put forward. But with XML, RDF, and other protocols on display, IT convergence may prove tricky.

The Standards Leadership Council (OITJ 09/2012) held its first EU gathering at London Heathrow last month and heard keynote speaker Malcolm Fleming of the UK’s Common Data Access upstream data portal speak on professionalizing E&P data management. Fleming observed that data managers are perceived as the Cinderellas of the E&P business and that data in general is undervalued. Geoscientists are ‘entirely application driven, with a focus on tools such as Finder and Petrel. We need to develop generic skills and to identify commonalities.’

Last year, CDA and Schlumberger found that ‘28% of business value comes from data and its management.’ Data ought to be a corporate asset; mining and seismic companies already treat it as such. PGS has $300 million worth of data on its balance sheet. There is also a need to professionalize the discipline with certification, ethics and ongoing development. CDA has proposed a ‘canonical’ training program, building on the DAMA philosophy, and a data competency portal which has just gone live. The results of the initial trial are to be presented at next May’s PNEC conference. Fleming concluded with a few remarks on the scope for standards in professionalizing data management. One of these, the EU’s Inspire standard-cum-regulation for geographic information (32 specifications, each over 100 pages long), he described as ‘a nightmare!’

Convener Jerry Hubbard (Energistics) described the SLC initiative as a move away from standards competition. Initial reactions from the constituent bodies have been positive. The question now is, ‘what do we do next, what are we trying to accomplish?’ The current game plan is to collaborate to benefit industry, working on gaps and points of intersection and making it easy to use multiple standards. A second objective is to collaborate on common challenges like sustainability, retention and IT best practices. Hubbard went on to enumerate some current Energistics projects that may inform future SLC work: the ‘unified approach to metadata standards*’ project, an industry profile of ISO 19115, a protocol for the discovery, evaluation and retrieval of structured and unstructured information. Energistics is also putting its comprehensive units of measure standard into the SLC pot. The UoM standard has a venerable history, from API RP66 through Epicentre, and has latterly been leveraged in the latest tape standards from the Society of Exploration Geophysicists (see below). An SLC work group of all 10 organizations is setting out to update and expand the UoM list—volunteers are needed for this ‘very complicated issue.’

Trudy Curtis (PPDM) outlined a push for a standard well log raster calibration file format. This seemingly minor issue spawned a meeting in Houston that ‘packed the room out.’ The project includes embedding depth values in TIFF images, GCM and PDF files and core photos. The now defunct WellLogml protocol lost its calibration information when it was subsumed into Witsml. Another ‘intersection’ project seeks to map PPDM’s ‘data at rest’ to Witsml’s ‘data in motion.’ Various users are extending PPDM and Witsml—but currently ‘all implement differently.’ More from PPDM.

Alan Johnston described the oil and gas interoperability pilot, an ISO TC 184 WG6 specification jointly developed by Mimosa and POSC/Caesar Association (PCA). The spec covers physical asset management—particularly the construction and handover of major facilities, plants and platforms. ISO 15926, the SPE drilling systems automation technical section (DSATS) and Open O&M also ran. Johnston wound up asking ‘Did I confuse you?’ He did.

PCA’s Nils Sandsmark offered ‘something different’ in the form of Ontologies for oil and gas. OOG has support from Mimosa, PCA and PODS, and an expression of ‘interest’ from PPDM and Energistics. The use of ontologies promises to add ‘semantic context’ to engineering data along with constraints and relationships. It is ‘all about limiting ambiguity in the model.’ The ontological approach in Norway has begotten a plethora of ‘products’: License2Share, EquipmentHub, Environmental Hub, ReportingHub and LogisticsHub, an RDF triplestore that communicates with users via web service delivery of XML documents.

Tom Burke’s presentation of the OPC Foundation’s involvement in oil and gas also revolved around the Society of Petroleum Engineers’ DSATS initiative (OITJ 11/2012). DSATS started in 2007 to accelerate uptake of automation in drilling. DSATS has produced a reference document and is now working on best practices and guidelines rather than ‘standards.’ Suppliers ‘hate standards’ and would prefer us to ‘buy everything from one supplier.’ But the world is (fortunately) not like that. DSATS is working on a ‘Drill-a-stand’ use case addressing reliability and safety. OPC, Energistics, Mimosa and the IADC’s advanced rig technology group are involved. The initiative addresses automation and machine to machine (M2M) interactions on new builds and existing rigs. The idea is for an overall automation architecture that applies to drilling companies and their suppliers. There is a ‘lot of money’ going into the initiative, which is to deliver a generic API for automation using OPC-UA as transport along with Prodml and Mimosa components. In the ensuing discussion it was observed that automation generates much more excitement than data management.

Richard Wylde enumerated the recent additions to the OGP’s spatial data formats, P2 and P6. Prior to the SLC, an Energistics SIG was set up to look at seismic velocity formats. Wylde is working with Energistics to ensure that there will not be a new format for seismic velocity, which is already covered by the new SEG-D and OGP P-formats. We need to ‘make sure we don’t tread on each other’s toes and re-invent the wheel.’

On behalf of the SEG, Jill Lewis described the SEG-Y Rev 2 initiative now underway. This has involved reverse engineering the existing standard to align the spatial aspects with OGP and pay more attention to units of measure. Rev 2 addresses high capacity tape, sample rate, trace length, precision and real time applications. The new spec is not XML but will ‘handshake’ with OGP, PPDM and Energistics UoM.

* It is our understanding that this refers to geographic metadata standards, but the omission has been made so often that there is clearly a subtext here. Dublin Core, a document metadata standard, likewise tends to omit this scoping detail.


Exprodat on GIS strategy, maturity

White paper offers instruction and benchmark metrics for GIS implementers.

A new 24-page publication from Exprodat Consulting, ‘GIS strategy review (GSR): approach and methodology,’ investigates the rationale behind geographic information system strategy and its design and implementation. GSR also proposes a GIS maturity model and provides some anonymized benchmark metrics from a recent survey of Exprodat’s clients.

GIS is an ‘inherently horizontal technology’ with the capacity to link all parts of the business. But sound planning is required for success. Many projects stall early on because of poor initial scoping or the lack of a GIS roadmap.

Enter Exprodat’s GIS strategy model, a technology independent, scalable approach that starts with ‘functionality value gradient’ analysis. This runs from low level data management through to high value activities such as spatial analysis and prediction. GSR describes how Exprodat goes about determining the functionality required by a client and how to involve all stakeholders in the process.

While GSR does not delve into technology issues, it does provide an overview of GIS potential. A good target audience might be management unaware of GIS’ potential or complexity. One thing is for sure, most oil company GIS deployments fall way short of realizing GIS’ full potential, as is borne out by the benchmarks. To a large extent, fancy geoprocessing and GIS intelligence remain the stuff of ESRI PUG demos.


iOra GeoReplicator for SharePoint 2013

SharePoint replication specialist trials 2013 Microsoft product release.

UK-based Microsoft SharePoint specialist iOra reports that its ‘Geo-Replicator’ technology, ‘with a few tweaks,’ can replicate SharePoint 2013 and Office 2013. Said tweaks will be included in the 2013 release, due out ‘real soon now.’ This will include a new GUI and new capabilities for monitoring replication both locally and remotely. Geo-Replicator is used to synchronize data from oil and gas platforms and other remote facilities. Patented replication and compression technology synchronizes content enterprise wide. Use cases include delivery of seismic surveys, drilling data and data logs between rig and HQ. Users include Shell, Odfjell and Chevron. Worldwide, some 65,000 machines are licensed for iOra replication.


New TemisFlow modeling package from Beicip-Franlab

IFP Energies Nouvelles and French atomic energy agency’s Arcane compute infrastructure leveraged.

A new version of Beicip-Franlab’s ‘TemisFlow’ oil and gas reservoir fluid flow simulator, developed by the French Petroleum Institute IFP Energies Nouvelles (Ifpen), has just been released. The latest version embeds a numerical platform, ‘Arcane,’ co-developed by Ifpen and the French atomic energy agency CEA. Arcane, under development since 2000, was initially conceived under France’s ‘Simulation’ program that set out to replace real-world nuclear weapons testing with numerical modeling. Ifpen has been collaborating with the CEA on Arcane since 2007.

Arcane is a high performance computing (HPC) platform that promises to free developers from the burden of low-level parallelism, memory management, I/O and 3D unstructured mesh management. Arcane optimizes code for performance across massively parallel HPC clusters with thousands of cores. An object oriented design provides ‘modularity and agility’ to developers. The CEA/Ifpen collaboration focused on the management of 3D meshes and graphs and on multi-core best practices. The result is that TemisFlow now accommodates more complex geological structures along with special case simulations such as carbon dioxide sequestration.


DMG Events’ Marrakech E&P data governance workshop

Total leverages Innerlogix in major upstream data cleanup. Saudi Aramco’s DMBoard initiative.

The inaugural DMG E&P data governance and information quality workshop held in Marrakech earlier this year heard from Jean-Michel Amouroux on Total’s experience with Schlumberger’s Innerlogix (ILX) data quality management toolkit. Total has used ILX along with in-house developed technology on several multi-year well and seismic data clean-up and management projects. Projects included harmonising data across reference projects, master projects and the working environment and consolidating to a single well master. A seismic project targeted speeding up data loading and the identification of duplicate navigation data.

ILX proved successful at cleaning, de-duplicating and consolidating well data for Total’s new ventures department, providing data quality KPIs, job scheduling during off-peak hours and automatic updates of corporate data stores. ILX’ scripting function was used in the seismic project to automate line matching on a statistical basis. The technique will likely feed back into future ILX functionality.
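
Innerlogix’ matching logic is not published; as a rough illustration of statistical de-duplication, the sketch below pairs candidate duplicate wells on name similarity and surface-location proximity. Thresholds and well records are hypothetical.

```python
# Not the Innerlogix implementation: a minimal sketch of statistical matching that pairs
# candidate duplicate wells on name similarity and surface-location proximity.
from difflib import SequenceMatcher
from math import hypot

wells = [
    {"id": "W1", "name": "AMDAL-3", "x": 451200.0, "y": 6781400.0},
    {"id": "W2", "name": "AMDAL 03", "x": 451210.0, "y": 6781395.0},
    {"id": "W3", "name": "BEKKEN-1", "x": 462900.0, "y": 6790100.0},
]

def likely_duplicates(wells, name_threshold=0.8, max_offset_m=50.0):
    """Return pairs whose names are similar and whose surface locations are close."""
    pairs = []
    for i, a in enumerate(wells):
        for b in wells[i + 1:]:
            name_score = SequenceMatcher(None, a["name"], b["name"]).ratio()
            offset = hypot(a["x"] - b["x"], a["y"] - b["y"])
            if name_score >= name_threshold and offset <= max_offset_m:
                pairs.append((a["id"], b["id"], round(name_score, 2), round(offset, 1)))
    return pairs

print(likely_duplicates(wells))   # [('W1', 'W2', 0.8, 11.2)]
```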

Much of the rest of the Marrakech event was devoted to high level information governance activity led by IPL’s Chris Bradley. The event also hosted a gathering of the new, Saudi Aramco-sponsored DMBoard, an energy special interest group focusing on E&P data management. The DMBoard plans to coordinate efforts towards a comprehensive and aligned E&P data management strategy and to influence E&P data management best practices. Interested oil and gas company data managers are invited to join up. The next DMBoard meet will be at PNEC.


Software, hardware short takes

Blueback, UpstreamProfessionals, IDS, Katalyst, Baker Hughes, Palisade, eLynx, Geologix.

V 4.0 of Blueback’s Petrel project tracker now issues rule-based alerts on dodgy data management practices affecting project integrity. A new seismic file manager speeds searches across network locations.

Upstream Professionals has added a ‘Production’ module to Total Asset Manager, its central repository for integrating information across drilling, production, land and other sources. TAM provides dashboards, scorecards and BI tools for insight into business performance.

IDS has announced a new component of its DataNet2 suite, TourNet Pro. TNP provides reporting and analytics to drilling contractors to track and analyze operational performance. TNP is a customizable web-based system and ‘collaborative database’ spanning daily reporting, maintenance, downtime analysis and an electronic IADC report.

Katalyst (previously Kelman) has announced SeismicZone, an e-commerce website for North American seismic data. The system will streamline seismic data trade and purchase.

Baker Hughes has enhanced JewelSuite with a 3D geomechanics capability targeting complex field development. The system combines JewelSuite with Dassault Systèmes Abaqus finite element analysis package to simulate non-linear stress, compaction and subsidence during production.

Palisade’s @Risk/DecisionTools Suite 6.1 supports Excel 2013, Project 2013, and Windows 8 (32 and 64-bit). A speedier calculation engine for Microsoft Project and improved localization complete the picture.

eLynx Technologies has released native iPhone and iPad apps for the oil and gas market. Most of the functionality of its ScadaLynx flagship is now available from mobile devices.

The 7.0 release of Geologix’ GEOSuite log authoring package adds pore pressure monitoring, horizontal well and image log displays, and shale gas calculations. The company also offers a drilling data hosting service, WellXP.


GE Oil & Gas International Annual Meeting, Florence

More on the ‘Industrial Internet’ and ‘smart’ machines. Data—when is enough enough? Going through the CIO’s cyber security hoops. BP, ‘Who owns the digital oilfield?’

The 2013 GE Oil and Gas international annual meeting, held earlier this year in Florence, heard Brian Palmer put flesh on GE’s ‘Industrial Internet’ concept (OITJ 01/2013). The Industrial Internet is not a specific protocol, ‘you’ll never be able to write a spec for it.’ But just as the 19th century industrial revolution and the 20th century internet revolution have changed economic paradigms, ‘so will smart machines feeding collections of data and analytics.’ A modern aircraft collects around a terabyte of information per day. How do we store, let alone make sense of, such data volumes? In oil and gas, instrumented subsea systems are streaming data from acoustic monitoring systems and multiphase flow meters into the cloud, where it is available for the same type of analysis as in avionics.

BP’s Steve Beamer asked ‘Who thinks they’re getting the most out of monitoring?’ Maybe one hand went up in an audience of a couple of hundred! Data gathering has to lead to actions. Early equipment anomaly detection lets BP plan outages and order parts. On the venerable Magnus field in the North Sea (30 years old and about to produce its billionth barrel), BP has a Bently Nevada 3500 system, Smart Signal, System One and a remote monitoring contract with GE. Smart Signal’s statistical modeling identified a worn-out thrust bearing even though the instability was below the alarm detection threshold. A debate ensued as to the use of several different GE products covering related domains. GE’s Brian Palmer observed that while a common platform and GUI was desirable, GE did not want to lose the individual tools’ functionality. Developers have been instructed to offer a common look and feel while keeping tools’ specifics. Users can customize the interface and integrate information at the console. There was also a lot of discussion as to when enough data is enough. More and more data leads to practical problems of storage and analysis. It can also pose a security risk. Palmer acknowledged that security is a concern for the whole industry. Client IT departments have various philosophies and operating system preferences. GE has to ‘go through the CIO’s cyber security hoops.’
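
SmartSignal’s internals are proprietary, but the principle of catching degradation that sits below a fixed alarm setpoint can be illustrated with a simple residual check against a model of expected behavior. Everything in the sketch below (signal, drift, thresholds) is synthetic.

```python
# Synthetic illustration only (not SmartSignal's algorithm): a residual check against a
# model of expected vibration flags a slow wear trend while the raw reading stays below
# the conventional alarm setpoint.
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(2000)
expected = 2.0 + 0.1 * np.sin(hours / 24)               # model-predicted vibration (mm/s)
actual = expected + rng.normal(0, 0.05, hours.size)
actual[1500:] += 0.002 * (hours[1500:] - 1500)          # slow wear-related drift

ALARM_SETPOINT = 4.0                                    # fixed alarm never trips
residual = actual - expected
window = 48
rolling = np.convolve(residual, np.ones(window) / window, mode="valid")
threshold = 3 * residual[:500].std()                    # learn normal scatter from early data

first_flag = int(np.argmax(np.abs(rolling) > threshold)) + window - 1
print("conventional alarm trips:", bool((actual > ALARM_SETPOINT).any()))
print("residual model flags drift at hour ~", first_flag)
```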

The alarm issue struck a chord with many as the status quo of alarm systems was deemed sub optimal. Palmer agreed and asked that clients ‘push for more information.’ Beamer saw a time when BP would bring the equipment monitoring function in-house. Asked ‘who owns the digital oilfield?’ Beamer replied that BP’s business is really about the big picture—from producing oil from the reservoir to refining. But ‘terminology, people, and tags don’t match between the reservoir and the facility.’ This was seen as both a problem and an opportunity, although here, GE’s role is hampered by its focus on rotating equipment. Palmer summed up, ‘We would like to do more but today we do not have the domain knowledge of the reservoir and the well. Ultimately we believe the same approach will be applicable.’

BP’s Fawaz Bitar traced the company’s journey along the road to operational excellence. BP has backtracked from its prior decentralized asset-based organization to a more functional structure with standardized, safe, reliable and compliant operations. BP’s operating management system covers corrosion risk, emergency shutdown and flaring systems. Production is now owned by multiple functions, facilitating investment in turnarounds and defect elimination. This has brought a 50% reduction in unplanned outages.

Melody Meyer, now president of Chevron Asia Pacific, described the Gorgon LNG Australian megaproject. This ‘fantastic jigsaw puzzle’ represents the summit of the blend of art and science that is supplier relationship management. Gorgon is located on Barrow Island, described as Australia’s ‘Ark,’ a hugely sensitive environment. Process safety is at a premium, with great attention to leaks, spills and corrosion prevention—all targeting the elimination of the low probability, high consequence event.


SPE Digital Energy, The Woodlands, Texas (Part I)

Is oil and gas at the bottom of the digital league table? Or is it the ‘greatest show on earth?’ Former Texas railroad commissioner on ‘getting regulation right.’ IDC on big data’s potential. BP, Chevron, Devon, Intelligent Solutions and USC on data mining. IBM, Cloudera, PointCross bang the Hadoop drum.

Around 600 attended the 2013 edition of the Society of Petroleum Engineers’ Digital Energy conference and exhibition. An almost chance remark in Louie Ehrlich’s keynote set the tone for much of what followed. A ‘survey’ from Gartner seemingly places the oil and gas industry at N° 38 out of 39 on ‘adoption of digital strategies’ as compared with other industries. Many (including ourselves) questioned the veracity of the Gartner finding and there was a suggestion that Gartner should be invited to next year’s Digital Energy to set the record straight.

Chairman Phiroz Darukhanavala (BP) traced the recent turnaround in the US oil and gas landscape from a ‘grim’ picture only five years ago to the current boom and promise of energy independence within a decade. Regarding the decade-old digital oilfield concept, industry has made slow and steady progress toward the vision. We now have more instrumented operations and sensors ‘everywhere,’ generating enough data for us to drown in! Daru twisted the Ancient Mariner’s words thus: there is ‘data, data everywhere, but not a bit to use.’ Hence the focus of the 2013 Digital Energy show, ‘from information to insight.’

Chevron CIO Louie Ehrlich set out to be provocative, emphasizing that we need to be better at combining the strengths of engineering and IT. The healthcare, retail and financial services sectors recognize the critical nature of IT and do a better job of embedding IT into their businesses. While oil and gas is a technology leader and already gets a lot from IT, ‘We are leaving value on the table because of the mixed reception given to the partnership of engineering and IT.’ Digital technology has a lot to offer in support of projects, execution and capability with the multiplier of the strategic partnership. Our ability to gain insights from information is increasing and we’ve only just begun. We have made progress in the fields of production reliability, equipment status, reservoir condition and more, but are still not fully leveraging the IT multiplier. Oil and gas information is growing exponentially and we should be using all of it, along with our knowledge, experience, tools and technology. There is light on the horizon. Today’s petroleum engineers are much more digitally literate—they are digital ‘natives’ as opposed to digital ‘immigrants.’ The natives are leading the paradigm shift from traditional IT to ‘integral IT’ by acting as ‘grand connectors’ and ‘data-driven’ engineers. In the Q&A, Ehrlich was asked if field automation should reside in IT or operations. He replied that it doesn’t matter. What is important is to be clear on how it will work and to enable information and work flows. Looking to the future, Ehrlich sees a world where the big game changer will be the impact of IT on knowledge workers. Computers may get to be better than humans—‘that’s what we need to watch.’

Halliburton’s Duncan Junor moderated a debate on the challenge of regulation in the face of a booming industry. Former Texas Railroad Commissioner Elizabeth Ames Jones (now with Patton Boggs LLP), who describes herself as a ‘recovering regulator,’ paid homage to George Mitchell whose company, Mitchell Energy, was first to frack the Barnett shale and also came up with the idea of a planned community at The Woodlands. Jones congratulated the audience on progress driven by digital and petroleum engineers who have created the ‘greatest show on earth.’ Jones is determined that regulation keeps up with the activity and avoids destroying jobs. ‘You are either with us or against us. I believe in energy security, energy is essential for our quality of life. We had better start being one issue people now.’

Michael Krancer of the Pennsylvania Department of Environmental Protection observed that Halliburton now has 3,000 employees in Pennsylvania, wisecracking that he sees Houston as ‘the Pittsburgh of the Gulf Coast.’ Krancer doubted the 38th out of 39 rating, comparing energy with the space program’s blend of engineering and IT. Krancer is no liberal: ‘the power to tax is the power to destroy.’ In Pennsylvania, regulation is based on ‘sound fact and sound science—not on the opinion of Hollywood movie stars who come here and tell us they know more about it than we do.’

Darren Smith (Devon Energy) observed that ‘digitization and data can alleviate green attacks on industry.’ Devon’s move into non conventionals, with the acquisition of Mitchell Energy, has not gone unnoticed. Some of the new stakeholders are hostile and would like to ban fracking. Concerns revolve around water contamination, methane migration, air quality, flaring and induced seismicity. Devon counters this with a data driven social license to operate. Data transparency can counter ‘factoids,’ i.e. tidbits of misinformation. The fracfocus.org chemical disclosure registry now has some 39,000 reported wells and has been adopted by eight states. A balance needs to be struck between reporting and confidentiality. From the regulator’s perspective freedom of information requests mean that everything can become public.

Bob Moran reported from Washington, where he represents Halliburton. Halliburton has hired 13,000 people in the US on the strength of the non conventional boom and the offshore. What will it take to make shale go global? Local culture is key. In Texas and Pennsylvania the activity is well received. Elsewhere, less so. Tax regimes, expertise, market and favorable geology are also key. Texas is good on all fronts, as is Australia. In Ohio, the Utica oil shale is hampered by regulations. There are 14 federal agencies trying to get a toehold in fracking. The States should be in charge. Moran agreed with Ames Jones, ‘We need to turn up the heat on elected officials who are not on message.’

If non conventionals dominated the politics and economics of the conference, the main technology theme was the old chestnut of data mining. Data mining, a.k.a. analytics, predictive optimization and so on has been around for years but has been rejuvenated by the buzz and hype surrounding the ‘big data’ movement.

One of the best turn-outs of the show was for Jill Feblowitz’ (IDC Energy Insights) look at the ‘big deal in big data.’ Feblowitz traced the big data movement from its origins with Google and Amazon’s requirements for massive ‘ingestion’ and analysis. This has sparked a ‘new generation of technologies and architectures running on commodity hardware.’ Big data is characterized by the three (or maybe four—see this month’s editorial) Vs—volume, variety and velocity. Volume as in a wide azimuth seismic survey, variety as in the jumble of documents and data sources that support a company’s decision making process, and velocity as in real time data streaming from a production platform or drilling bit, perhaps via a wired drill pipe. IDC has surveyed 144 companies to find, unsurprisingly, that ‘big data’ is not a ‘familiar concept’ in oil and gas. However, some trials have been reported. Chevron is using IBM InfoSphere BigInsights (a Hadoop-based environment) in its seismic data processing. Shell is piloting Hadoop in a private Amazon cloud. Researchers at the University of Stavanger have demonstrated the application of ‘Chukwa,’ a Hadoop sub-project devoted to large-scale log collection and analysis, to oil and gas data mining. Cloudera has launched a ‘seismic Hadoop’ project and PointCross has Hadoop-based solutions for well and seismic data. Feblowitz suggests that use cases include data mining 3D seismic volumes for particular attributes, geospatial analytics on non conventional acreage, drilling data anomaly detection and production forecasting. So where should you start? By recognizing the value of your untapped data asset, performing a gap analysis for technology and staff and formulating a big data strategy.

Shawn Shirzadi outlined BP’s data mining and predictive analytics effort, part of its ‘field of the future’ program. Since 2010, upstream data and applications have changed significantly with the advent of ‘high velocity’ real time data, unstructured data, ‘supercomputers’ on desktops and the sensor ‘explosion.’ All of which present data-driven analytical opportunities. In the field of pipeline corrosion threat management, BP estimates that software could contribute $4 million/year in risk reduction by leveraging in-line inspection data. Seven wells in the Gulf of Mexico are now equipped with autonomous production allocation, a 24/7 reconciliation of sales meter data with production using 95% accurate virtual gauging. A third data workflow around waterflood optimization uses a ‘top-down’ approach, combining data from the historian and performing injector-producer connectivity analysis with a ‘capacitance-resistance model.’ This uses a non-linear solver to estimate injector allocation factors and to compute a ‘value of injector water’ factor used to rank injection opportunities in the face of a shortage of injection capacity. For Shirzadi, real time data-centric decision making has only just begun in the upstream.
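
The capacitance-resistance approach is not spelled out in the talk; as a rough, steady-state illustration of injector-producer connectivity analysis, the sketch below estimates connectivity gains by non-negative least squares, ignoring the time-constant term the full model carries. Rates and gains are invented.

```python
# Simplified, steady-state sketch of the capacitance-resistance idea: estimate
# injector-producer connectivity gains by non-negative least squares. The full model's
# time-constant (capacitance) term is omitted; all rates are hypothetical.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
inj = rng.uniform(500, 1500, size=(12, 3))          # monthly rates for three injectors
true_gains = np.array([0.6, 0.3, 0.0])              # injector 3 is not connected
q = inj @ true_gains + rng.normal(0, 20, 12)        # observed producer rate plus noise

gains, _ = nnls(inj, q)
print("estimated connectivity gains:", np.round(gains, 2))
# Low-gain injectors are candidates for re-allocating scarce injection capacity.
```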

Derrick Turk described Devon’s venture into ‘analytics,’ which he defines as ‘the discovery and communication of meaningful patterns in data,’ opening up the field for classical statistics, computing, AI, machine learning and domain expertise. As the industry is now drilling large numbers of wells in poorly understood reservoirs, a ‘big data’ approach is appropriate. Devon encountered analytics at the 2011 Digital Energy conference and engaged a consultant on a proof of concept analysis of the key drivers for a major new play. This produced enough encouragement for Devon to bring the evidence-based decision making capability in-house. The technique has evolved into what is now the Devon analytics cycle, moving from what data is available, to hypothesis testing, and finally to predictive analytics and machine learning. The technique, still at the pilot stage, is being trialed on KPI drivers in a resource play. To date the main impact has been in spotting data quality issues. It can be hard to demonstrate quick wins.

Lisa Brenskelle described research into ‘advanced streaming data cleansing’ (ASDC) conducted by Chevron and the University of Southern California. The result is a scalable system that intercepts data from control systems and detects erroneous or missing data, flatlining readings, spikes and bias. Data is read in from the historian, cleansed, reconstructed and stored back with a new tag number. The system was developed because nothing was commercially available for dynamic data. The technology stack includes OSIsoft’s PI System, Microsoft’s StreamInsight complex event processing tool and custom code from Logica (now CGI). The cleansing algorithm is ‘empirical’ and uses multivariate statistics and dynamic principal component analysis. One questioner asked how the system distinguishes between bad data and a real process fault. This is currently an unresolved issue that the researchers are working on.
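
The ASDC algorithm itself is not described in detail; the sketch below is a much-simplified, univariate stand-in that screens a stream for two of the error classes mentioned, flatlining and spikes. Window lengths and thresholds are arbitrary.

```python
# Much-simplified stand-in for the ASDC idea: screen a stream of readings for flatlining
# (no change over N samples) and spikes (beyond k sigma of a trailing window). The real
# system uses multivariate statistics and dynamic PCA; this is univariate only.
from collections import deque

def screen(stream, flat_n=10, k=4.0, window=60):
    history = deque(maxlen=window)
    flat_run, last = 0, None
    for i, x in enumerate(stream):
        flags = []
        flat_run = flat_run + 1 if x == last else 0
        if flat_run >= flat_n:
            flags.append("flatline")
        if len(history) >= 10:
            mean = sum(history) / len(history)
            var = sum((h - mean) ** 2 for h in history) / len(history)
            if var > 0 and abs(x - mean) > k * var ** 0.5:
                flags.append("spike")
        if flags:
            yield i, x, flags
        history.append(x)
        last = x

readings = [50.0 + 0.1 * (i % 5) for i in range(100)]
readings[40] = 95.0                 # injected spike
readings[60:75] = [50.2] * 15       # injected flatline
for issue in screen(readings):
    print(issue)
```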

In his second day keynote, Greg Powers (Halliburton CTO) observed that the next trillion barrels will be really hard to access, will come in smaller amounts and will be more costly and risky for oils. Exploring and producing the new resources (hydrocarbons) will be performed with fewer experienced resources (people). Half of all discoveries are now in deepwater, where spend is rising at 18% per year. In terms of value creation, Petrobras is ‘off the scale,’ BG is N° 2 and Chevron N° 3. ‘The harder we look, the more we find.’ But, ‘You don’t get to drill deepwater without a safe, consistent and sustainable process.’ To assure this, we need more automation leveraging real time data. We are not there yet because we’ve not got the telemetry and communications bandwidth. Powers bought into the Gartner rating, ‘We are at the end of the train and we cannot stay there.’ Remote operations centers represent a sea change for industry, offering the best expertise available remotely. Dual fuel vehicles and rigs running on natural gas are greening the non conventional drive, as is ‘drinkable’ frac fluid. Oil and gas is piggy-backing on the miniaturization of electronics with better, higher capacity units—but ‘We’re doing it downhole’ in a much tougher environment. ‘Try putting your iPhone in the oven overnight!’ The fiber optic revolution promises many new applications. Fiber resolution is now down to atomic levels. A video showed how just talking into one end of a fiber can be picked up kilometers away. Drilling automation will be enabled by sensors, telemetry, actuators and ‘a lot of math.’ Autonomous machines will perform better than humans. This will greatly enhance capital efficiency—but will also force us to rethink many of our assumptions as to what wealth creation is about. All the rules of the game are changing. In the Q&A, the cyber security question was raised. Powers responded that all communications from the well site are encrypted and transit via a private satellite network. ‘Everything is secured by ourselves, we assiduously stay clear of public networks.’ Powers was also asked if he was comfortable with the idea of machines taking over. He answered, ‘Did you fly here? If it was a short hop, your pilot did not fly the plane!’

Shahab Mohaghegh (West Virginia University and Intelligent Solutions) returned to the data mining theme with a presentation on synthetic geomechanical log generation for Marcellus shale development. Mohaghegh’s Petroleum Engineering and Analytical Research Lab (PEARL) specializes in ‘data driven analytics’ (DDA). DDA derives relationships directly from large data sets with no attempt made to understand the underlying physical processes—which are often too complex or poorly understood to quantify. Geomechanical properties are key to designing frac jobs and completions. But the factory drilling paradigm means that geomechanical logging is not a common practice. Following calibration against the few existing wells with geomechanical logs, neural networks and sequential Gaussian simulation are used to convert regular (sonic, density, gamma ray) log data to geomechanical properties over the whole region.
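
As a bare-bones stand-in for the workflow described (and omitting the sequential Gaussian simulation step), the sketch below trains a small neural network on ‘calibration’ wells that have geomechanical logs and predicts a property from conventional sonic, density and gamma ray curves. The data are synthetic.

```python
# Bare-bones stand-in for the workflow described: train a small neural network on the few
# wells that have geomechanical logs, then predict a property (a made-up 'Young's modulus')
# from conventional curves elsewhere. The geostatistical step is omitted; data are synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
n = 500
sonic = rng.uniform(60, 110, n)        # us/ft
density = rng.uniform(2.2, 2.7, n)     # g/cc
gamma = rng.uniform(40, 160, n)        # API
youngs = 80 - 0.5 * sonic + 20 * density - 0.05 * gamma + rng.normal(0, 1, n)  # GPa, invented

X = np.column_stack([sonic, density, gamma])
scaler = StandardScaler().fit(X[:400])
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(scaler.transform(X[:400]), youngs[:400])          # 'calibration' wells
print("holdout R2:", round(model.score(scaler.transform(X[400:]), youngs[400:]), 3))
```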


Folks, facts, orgs ...

AGR, BGS, CSC, Chevron, Energistics, Energy Navigator, Ensyte, IBM, Clean Harbors, Exco, ExxonMobil, KSS, Mustang, Oildex, WellDog, Glori Energy, Haztek, EIC, MIT, NDB, Noah, DAMA, OGP, OPC Foundation, Panopticon, Petrosys, PII Pipeline, Rapid Response Institute, Westheimer, Rockwell, Mahindra Satyam, Senergy, SGI, Sigma3, Triple Point, Tsunami, Woodside.

David Grant heads-up AGR’s new subsea project unit in Perth, Australia.

Mike Stephenson is now CTO of the British Geological Survey.

CMG VP finance and CFO, John Kalman, is to retire in August.

CSC has named Dan Hushon as CTO. He was formerly CTO with EMC Corp. and Sun Microsystems.

Segun Oyebanji, general manager of technical computing for Chevron and CIO of Chevron’s Energy Technology unit has been elected to the Energistics’ board.

Janet Tremblay, VP international business development is to spearhead Energy Navigator’s Asia Pacific expansion.

Lili Fischl has joined Ensyte as senior consultant for its natural gas practice.

Amit Bhandari is now associate partner, business analytics and optimization with IBM. He was previously with PointCross.

Laura Schwinn has been appointed president, oil and gas field services for Clean Harbors. She was formerly with Halliburton.

Hal Hickey is now president and COO of Exco replacing Steve Smith who is to retire. Mark Mulhern moves up to CFO.

Tom Walters is now president of ExxonMobil Production Company, succeeding Rich Kruger, who moves over to president and CEO of Imperial Oil.

Aaron McHugh has joined KSS Fuels as VP sales for North America. He comes from Skyline Products.

Charles Durr has joined Wood Group Mustang as VP LNG. He joins from KBR.

Michael Corbett has joined Oildex’s executive management team as VP Marketing.

James Walker has joined WellDog as EVP, Commercial Development. Walker was previously with Baker Hughes.

Michael Pavia is now CTO with Glori Energy. He succeeds Thomas Ishoey, chairman of the science advisory board.

Haztek International has joined the Energy Industries Council.

MIT’s Ernest Moniz has been named US Secretary of Energy.

Faten Halit has joined NDB as senior geophysical consultant. Geophysicist Umair Khan has also joined and will replace Patrick Sullivan at North Energy. Sullivan is off to a new contract with VNG.

Stewart Nelson of Noah Consulting has been elected President of the Houston Data Management Association (DAMA) Chapter.

Patrick Toutain has joined OGP as Safety Director.

The OPC Foundation has recruited Tim Donaldson as new Marketing Director. He hails from ICONICS.

Nadeem Chowdhry has joined Panopticon as senior VP. He was previously with QlikTech.

Chris Lund has joined Petrosys in Glasgow as support geoscientist. Laura Anderson is now accounts administrator at the Australia HQ.

PII Pipeline Solutions has appointed John Collis as sales manager for ANZ ops.

Brad Donovan is to head-up the Rapid Response Institute’s new office in Guadalajara, Mexico.

Stephen Jepps and Kevin Reid are managers of Westheimer Energy’s data centers in London and Houston.

Tom O’Reilly is now president of Rockwell Automation Asia Pacific. Joseph Sousa presides over Latin America.

Mahindra Satyam has appointed Bobby Gupta as Head of ANZ operations.

Riser specialist Emily Eadington has joined Senergy Development Solutions. Clarke Shepherd and Colin Wilson have joined the project services group.

Bob Braham has joined SGI as senior VP and chief marketing officer. He hails from EMC.

Sigma3 has appointed David Abbott global director of operations for microseismic and borehole seismic. Todd Deering has been appointed as director of software quality. Deering hails from JP Morgan Chase.

Triple Point Technology has established its new Latin American headquarters in Rio de Janeiro, Brazil.

Tsunami Development has recruited Nathan Sahlin to product support.

Leah Barker is procurement manager with Woodside.


Done deals

Aker, AspenTech, Axon, Barco, Chemical Flooding, OPIS, Fugro, Intercore, SAIC, Leidos.

Aker Solutions has acquired Managed Pressure Operations from NGP Energy Technology Partners.

Aspen Technology has acquired Refining Advantage’s pipeline and dock scheduling system software unit.

Axon Energy Products has acquired Merrick Systems RFID technology and related assets.

Barco has acquired Norway-based ProjectionDesign.

Mid-Con Energy unit Chemical Flooding Technologies has acquired Tracer Technologies International.

Oil Price Information Service (OPIS) has acquired GasBuddy. Horizon Partners served as financial advisor to the deal.

Fugro received some EUR 700 million in cash for the sale of its Geoscience division to CGG.

InterCore Energy has acquired Alertness Detection Software (ADS) developer SRG.

SAIC is to split into a technical services and IT business, which will continue under the SAIC name, and a new ‘Leidos’ unit addressing national security, health and engineering.


Kadme Whereoil for Lukoil and Norway Scout Web portal

Platform provides access to geological, geophysical and field data from single interface.

Stavanger, Norway-based Kadme has announced two significant sales this month, one to Russia’s Lukoil Overseas Holdings, the other to the Norwegian oil and gas association (Norsk olje og gass), formerly OLF. Kadme’s Whereoil platform will provide Lukoil with access to geological, geophysical and field data from a single interface. Whereoil pulls information from files, interpretation projects such as Petrel and Kingdom, databases such as IHS IRIS and spatial data in ArcGIS from IHS’ web services. The system has been integrated with Microsoft’s Active Directory, providing single sign-on user identification and access control. Following the initial roll-out, a second project is in the offing. Lukoil’s Vitaly Voblikov said, ‘We are planning to further expand the project geographically, as well as to increase the number of information sources connected to Whereoil, such as application systems and databases.’

In a separate announcement, Norsk olje og gass (formerly OLF) has engaged Kadme to develop a well-data and scouting portal for its scout group. The new ‘Scout Web’ portal will be integrated into the Association’s geodata trading operator (GTO) unit, which oversees the scouting functions for registering and monitoring members’ well data and trades. Scout Web, also a Whereoil development, provides registration, follow-up and reporting of trade status. The system is linked to the NPD’s Bluebook and Diskos’ Petrobank. Association rep Bob Johannessen (Total) said, ‘Scout Web simplifies information flow between oil companies and frees-up scouts from the time-consuming manual tracking of data availability.’ Kadme will be presenting Scout Web at the 2013 PNEC in May.


Earth Cube—$5 million funding to marry geoscience and IT

US National Science Foundation debates ‘proliferating’ standards at OGC Redlands meet.

The US national science foundation (NSF) met at ESRI’s Redlands campus earlier this year to further progress on ‘EarthCube,’ an attempt to bring together geoscience and IT standards. The meeting was co-located with an Open Geospatial Consortium technical committee. The EarthCube initiative, a.k.a. NSF program 13-529, sets out to develop a ‘community-driven data and knowledge environment for geosciences.’ Up to seven awards totaling $5 million are up for grabs.

At the outset, concern was expressed as to the risk of adding to proliferating standards—hence the teaming with OGC. OGC reference architecture and governance will be the model for EarthCube development. More generally, cooperation with other standards development organizations (SDO) is to be encouraged as they have been ‘working on cyber interoperability for 20 years.’ Current focus is on OGC Hydro, Met/Ocean and GeoSciML work group activity.

In particular, the adoption by WaterML of an OGC basis was seen as a good example of a standard evolving from community development to an SDO base. Cryptically, the EarthCube group observed that the prior geo standard GeoSciML ‘may not be on a similar path.’


SGI for DownUnder GeoSolutions

330 teraflop SGI Rackable cluster and five petabytes of disk storage for Aussie geophysicist.

Perth, Australia-headquartered DownUnder GeoSolutions (DUG) is to upgrade its geophysical data processing center with ‘new server technology’ from SGI. SGI has been providing DUG’s high performance computing resources since its 2003 launch.

Today, the DUG cluster comprises some 330 teraflops of SGI Rackable servers with ‘tens of thousands’ of cores. Nodes are equipped with both Intel Xeon E5 CPUs and AMD Opteron 6200 series processors. Some five petabytes of storage are ‘supported.’

The new servers have reduced floor space requirements and run DUG’s CPU-intensive code 20-30% faster. Energy consumption is down 10% and the total cost of ownership of the infrastructure over a three-year period is expected to be 25% less than the previous infrastructure.


Mahindra Satyam back in oil and gas

Operational excellence and Oracle E-Business Suite offering.

Following its near-death experience in 2008 when an accounting scandal forced the ejection of its CEO, Satyam, now Mahindra Satyam, is back on an even keel and is offering custom Enterprise Resource Planning (ERP) solutions for oil and gas companies, leveraging Oracle’s E-Business suite. Mahindra Satyam claims an operational excellence focus. This is achieved by breaking down business processes to an ‘atomic’ level and identifying key cost drivers.


Sales, contracts, partnerships and deployments

Asset Guardian, Badger Exploration, CGI, Emerson, Entero, Elsevier, IFS, ISS Group, Electronic Visualization Lab, Pandell, Cortex Business Solutions, Pratt & Whitney Rocketdyne, Kelvin Top-Set, Oceaneering, Petrofac, Doris Engineering, Jacobs Engineering, Technip.

Asset Guardian Solutions has been awarded a contract by Woodside to provide a programmable systems management tool set for management of critical software assets used to operate the Karratha Gas Plant, Pluto onshore and offshore facilities, as well as six additional facilities.

Wintershall has joined as sponsor of the Badger Explorer autonomous drilling system. Other sponsors are Chevron, ExxonMobil and Statoil.

Neste Oil has extended CGI’s IT infrastructure management contract for two additional years. Statoil is likewise extending its service desk contract with CGI through 2016 with an optional one year extension.

Emerson Process Management has received a $5.8 million order for ultrasonic gas leak monitoring systems for deployment at various Australian LNG plants. Emerson also reports a $33 million contract from Statoil for upgraded safety and automation systems on the North Sea Visund platform.

Entero Corp.’s ‘Mosaic’ reserves and evaluation system has been selected by a ‘leading North American E&P company.’

OMV E&P Middle East and Caspian unit has deployed Elsevier’s Geofacets.

Norwegian engineering contractor Apply Sørco has selected IFS Applications to ‘standardize and integrate its business applications.’ IFS was also selected by Sevan Drilling.

PTTEP has selected ISS Group to supply a production data management system on its Bongkot development. The $2.8 million deal includes BabelFish products and support services.

Mechdyne Corporation has licensed the CAVE2 hybrid reality environment developed by the Electronic Visualization Laboratory of the University of Illinois at Chicago.

Sure Energy has migrated to an e-invoicing environment with an ‘APNexus’ solution from Pandell Technology and Cortex Business Solutions.

Pratt & Whitney Rocketdyne has teamed with Kelvin Top-Set to combine the latter’s incident investigation skills with its engineering expertise.

Oceaneering International has received $40 million worth of contracts from Transocean to provide three subsea blowout preventer control systems.

Petrofac’s engineering and consulting services unit and Doris Engineering of Houston have been awarded a project management contract by Pemex for technical assistance and supervision of the deepwater Lakach project.

Jacobs Engineering has been selected by Shell Australia for its Clyde terminal conversion project in Sydney, Australia.

Technip has received a five-year contract from BP Angola for engineering work on Greater Plutonio area FPSOs.


Standards stuff

OGP, American Petroleum Institute, CEN, CENELEC, ETSI, American National Standards Institute, National Spatial Data Infrastructure, jWitsml, OGC, GeoSciML and semantic annotation.

The UK-based Oil and gas producers association (OGP) and the American petroleum institute (API) have created a task force to develop a ‘single set of industry standards’ that will be ‘globally accepted and widely used.’ Scope is regulation, safety, legal and environmental issues in the upstream.

Meanwhile the European standards organisations (CEN, CENELEC and ETSI) and the American national standards institute (ANSI) are to ‘maintain and intensify’ collaboration on aligning their standards to facilitate trade in goods and services between Europe and the USA.

The US Federal geographic data committee’s national spatial data infrastructure (NSDI) held a leaders’ forum at the department of the interior this month to ‘gather input to help build the foundation for a new strategic plan.’ The intent is to finalize the plan by year-end 2013.

jWitsml is working on version 2.0 of its Java Witsml library and is seeking sponsors prepared to contribute ‘live, real-time operational data for testing.’ jWitsml was launched by Statoil (OITJ 09/2010) and is independent of Energistics and the main industry software providers. The Java interface provides compatibility with Android-based mobile devices. V 2.0 will have ‘fully-fledged’ CRUD capability and support Witsml V1.4.

The Open Geospatial Consortium (OGC) has formed a standards working group to progress the GeoSciML data model and schema for geoscience information interchange to the status of an OGC standard. Previously GeoSciML was owned by the International Union of Geological Sciences. Public comment is invited until 8 April 2013.

The OGC has also approved a best practice document for semantic annotation of OGC standards. Semantic annotation involves the inclusion of descriptive metadata, using the W3C’s resource description framework (RDF), in OGC web service environments (as far as we can tell from a rather wordy release).
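
By way of illustration (our own sketch, not an extract from the OGC best practice document), the snippet below uses the open source Apache Jena library to attach descriptive RDF metadata to a hypothetical OGC web service layer and serialize it as Turtle. All URIs are placeholders.

import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.vocabulary.DC;

// Illustrative sketch only, not taken from the OGC best practice.
// Annotates a hypothetical WFS feature type with RDF metadata via Apache Jena.
public class SemanticAnnotationSketch {
    public static void main(String[] args) {
        Model model = ModelFactory.createDefaultModel();

        // Hypothetical OGC web service layer to be annotated (placeholder URI).
        Resource layer = model.createResource(
            "http://example.com/wfs?typeName=geosciml:GeologicUnit");

        // Attach a subject concept from a placeholder vocabulary, plus a description.
        layer.addProperty(DC.subject,
            model.createResource("http://example.com/vocab/SedimentaryRock"));
        layer.addProperty(DC.description, "Geologic units, GeoSciML encoding");

        // Serialize the annotation as Turtle.
        model.write(System.out, "TURTLE");
    }
}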


Simulation-based training in oil and gas

GSE Systems white paper argues that simulation can mitigate a ‘camelback’ workforce.

GSE Systems has just published a white paper arguing the ‘Case for simulation-based training in the oil and gas industry.’ GSE CEO Jim Eberle observed that US universities are producing only about 20% of the engineering graduates they did 20 years ago, while industry needs to train recruits faster on systems that are more complex than ever.

Citing a Deloitte study, the white paper describes a ‘camelback workforce,’ with a majority of workers over age 50, a dearth in the 30 to 50 year-old range and a slow influx of younger workers. Another study, by UK-based Visiongain, put a $2 billion value on the global oil and gas simulation and virtual reality market in 2011.

Simulators don’t come cheap. While an entry-level system with limited functionality can be had for $10,000 or less, a full-scope simulator capable of modeling the complexity of a platform or refinery may cost $2–3 million. The white paper sets out to demonstrate that this is a good investment, putting dollar values on the operational cost savings that accrue from well-trained operators and on the reduced likelihood of an incident or emergency. The paper concludes that ‘structured, experiential learning builds competency most efficiently and effectively, from entry level to experienced personnel, as well as in cross-training environments.’


eLynx and Akamai team on optimization as a service

SCADA specialist leverages cloud service provider’s Terra enterprise solution.

eLynx Technologies and Akamai have teamed on a software as a service offering for eLynx’ ScadaLynx flagship oil and gas monitoring and field automation system. Under the new agreement, Akamai’s Terra enterprise solution (TES) will host eLynx’ web-based monitoring, alarming and field automation services.

eLynx CIO Ryan McDonald said that enhanced service levels were behind the move: ‘In oil and gas, the speed at which you receive data and the ability to act quickly impact productivity, profitability and safety. Working with Akamai means that potential internet problems do not affect our applications.’ Implementing TES has produced sub-one-second average response times for users.

Akamai VP Mike Afergan added, ‘Hosted software performance can be degraded when the Internet is used as a delivery platform. TES was designed to eliminate this concern by making applications run as they would over a dedicated wide area network.’


TouchStar’s ‘Xpress bundled’ water tickets

Electronic field ticketing system offers crude and water haulers a ‘lunch pail-sized’ mobile solution.

TouchStar Solutions has announced ‘Xpress Bundled’ (XB), a low cost, electronic run ticketing system for crude oil and water haulers. XB is delivered in a self-contained, ‘lunch pail’ sized form factor. No in-cab installation is required. TouchStar’s CrudePac mobility solution and Enable middleware are included in the package. HyperDocs imaging technology captures field ticketing data electronically. Ruggedized hardware includes Intermec’s CN-50 handheld computer and scanner and PB-51 portable 4” thermal printer.

TouchStar provides secure hosting of the bundle including a wireless communications plan. TouchStar’s Dave Fredericks said, ‘XB reduces start-up time and costs. For crude and water hauling customers this is a fast track to low cost fleet automation and full-featured electronic ticketing.’


RPS develops safety case for driller Ensco

HSE and risk management specialist deploys CGE’s BowTieXP software and methodology.

RPS reports that its HSE and risk management team in Perth, Australia has been working with Ensco, one of the world’s largest offshore drilling contractors, since 2007 to develop safety cases for the Ensco 7500 and 8500-series of mobile offshore drilling units. The 7,500 ft water depth-capable unit is a dynamically positioned semisub that has worked for Chevron in Australia and in the Gulf of Mexico.

RPS safety cases allow operators to demonstrate that all risks associated with the operation of these rigs have been reduced to ‘as low as reasonably practicable.’ Safety case development is conducted using a qualitative risk assessment approach involving extensive participation of the rig crew to meet local regulatory requirements. RPS uses the ‘bow tie’ approach to safety case work.

RPS is a licensed provider of CGE Risk Management Solutions’ BowTieXP software, training and consulting. The BowTie methodology is widely used to perform risk assessment on hazardous facilities. BowTieXP provides a visual, qualitative or semi-quantitative approach to assessing risk along with an understanding of the ‘top event’ and associated controls and responsibilities. Other safety cases developed by RPS include fire and explosion, evacuation and rescue, emergency systems survivability and documentation of Ensco’s safety management system.
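
As a rough guide to the structure involved (our own illustration, not BowTieXP’s data model), the Java sketch below captures the elements of a bow-tie: a hazard, a ‘top event,’ threats with preventive barriers on one side and consequences with mitigating barriers on the other. The well control example is hypothetical.

import java.util.List;

// Minimal bow-tie structure, our own illustration, not CGE's BowTieXP model.
public class BowTieSketch {
    record Barrier(String description, String owner) {}
    record Threat(String description, List<Barrier> preventiveBarriers) {}
    record Consequence(String description, List<Barrier> mitigatingBarriers) {}
    record BowTie(String hazard, String topEvent,
                  List<Threat> threats, List<Consequence> consequences) {}

    public static void main(String[] args) {
        // Hypothetical well control bow-tie.
        BowTie wellControl = new BowTie(
            "Hydrocarbons under pressure",
            "Loss of well control",
            List.of(new Threat("Kick while drilling",
                List.of(new Barrier("Mud weight monitoring", "Driller")))),
            List.of(new Consequence("Blowout",
                List.of(new Barrier("BOP activation", "Subsea supervisor")))));
        System.out.println(wellControl.topEvent() + ": "
            + wellControl.threats().size() + " threat(s), "
            + wellControl.consequences().size() + " consequence(s)");
    }
}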


CiSoft working on intel-derived applications in oil and gas

Domain vocabulary extraction and transduction trial for Chevron-backed R&D unit.

Researcher Craig Knoblock of the Center for Interactive Smart Oilfield Technologies (CiSoft, a joint venture between the University of Southern California and Chevron’s Center of Excellence for Research and Academic Training) has received a grant from SRI International to advance research in the field of ‘domain vocabulary extraction and transduction plus auto-induction of layout’ a.k.a. Dovetail.

Dovetail uses intel-based techniques to extract meaning from ‘unanticipated, multiple and varied data sets’ by ‘alignment’ of data models and the application of ‘advanced analytic algorithms.’ SRI’s researchers leverage scientific knowledge and expertise from R&D groups to develop new applications for natural gas, improve safety of oil and gas infrastructure and to monitor and mitigate potential disasters.

SRI is also active in the field of cyber security, where its infrastructure security program works to harden the nation’s energy infrastructure. SRI also supports the US Department of Homeland Security’s LOGIIC (Linking the oil and gas industry to improve cyber security) consortium.


ABB on the economic value of integrated operations

Information and communications technology from nuclear and power gen leveraged in oil and gas.

A new white paper from ABB investigates the economic impact of ‘fully integrated operations’ (IO). ABB’s John Oyen observes, ‘It’s hard to over-estimate the potential of integrated operations which can change the economics of an oil platform. Many Gulf of Mexico operators have implemented always-on videoconferencing, moving subject matter experts onshore, but this is just the beginning.’

Full IO requires a multi-disciplinary effort and the widespread use of advanced information and communications technology and real-time data. Concepts like ‘smart fields,’ the ‘digital oilfield’ and ‘intelligent energy’ all share the same goal of using ICT and streamlined work processes to improve operational performance.

ABB’s hitherto unsung foray into IO stems from work with Statoil in deepwater North Sea operations. Here ABB leveraged experience of integrated operations gained in the power gen and process industries. These face similar challenges to oil and gas, such as an aging workforce and a shifting landscape of regulation, safety and environmental issues.

IO means getting people from different disciplines working together in the same software environment, making decisions based on real-time data. While some IO benefits, like safety and an improved work environment, don’t lend themselves to a simple cost/benefit analysis, elsewhere the value of IO is significant. Operators of a ‘typical’ 40kbopd North Sea field can expect an ‘average yearly return on investment’ of the order of $33 million*.

* The actual units quoted are ‘$MUSD.’ We are assuming millions.


OGP rolls-out worldwide safety database

Post Macondo Global Industry Response Group builds incident database and ‘SWRP’ capping system.

The UK-based Oil & Gas Producers association (OGP) has published a summary of how the industry has implemented recommendations made by its global industry response group (GIRG). The GIRG was set up after Macondo to focus on major incident prevention, intervention and response. The new document, ‘Offshore safety: Getting it right now and for the long term,’ provides an update on the group’s activity in three areas: improving drilling safety, well intervention and spill response.

Actions include the creation of an industry-wide well control incident database, assessment of blow-out-preventer reliability, improved training and the development of international standards for well design and operations. OGP has established a wells expert committee and a task force for each of these priorities. One GIRG spinoff is ‘SWRP,’ a subsea well response project that has already built a capping system. GIRG has also recommended the formation of an oil spill response joint industry project with participation from the American petroleum institute, the US marine spill response corp. and other key stakeholders.


CMG developing ‘simulator of the future’ for Shell, Petrobras

Dynamic reservoir modeling systems project sees ‘limited commercial release’ by year end.

CMG reports progress on the Dynamic reservoir modeling systems (DRMS) project, a joint venture between CMG, Shell and Petrobras to develop a next generation combined reservoir and production system simulator. The project, which has been underway since 2006, recently produced a beta version and is ‘expected to continue until ultimate delivery of the software.’

Last year CMG promoted Rob Eastick to VP and he is to take on the project manager role for DRMS. A ‘limited commercial release’ of the software is expected before the end of calendar 2013 at which time CMG will have exclusive rights to commercialize the software while the other partners will have unlimited software access for internal use. CMG’s 37% stake in the project was estimated at $4.0 million ($2.2 million net of overhead recovery) for the current fiscal year. CMG plans to continue funding its share of the project costs from internally generated cash flows.

