Oil IT Journal: Volume 20 Number 1


Acacia goes after Petrel

Patent troll accuses Schlumberger, Landmark, Roxar and other geoscience software vendors of infringing an Austin Geomodeling patent - citing, inter alia, ‘visits [...] to the AGM website.'

The sale of Austin Geomodeling to SeisWare (see below) was an opportunity for us to revisit the company that was established as a vehicle to monetize AGM’s patents. Dynamic 3D Geosolutions (D3DG) was established in December 2013 by Acacia Research and AGM’s owners in order to defend two patents for ‘dynamic, 3-D geological and geophysical modeling used in oil and gas exploration and production.’ The essence of the patent appears to be the ability to correlate well logs and automatically update and display a geological model in 3D.

In mid-2014, Acacia went into action with a series of lawsuits naming Emerson/Roxar, LMKR, IHS, Paradigm, Halliburton/Landmark and Schlumberger in respect of claimed infringements of D3DG’s US patent number 7,986,319 B2. The patent, applied for in 2007 and awarded in 2011, covers a ‘method and system for dynamic, three-dimensional geological interpretation and modeling.’

Acacia’s claim is that certain functionality of geological modeling software such as Schlumberger’s Petrel, Landmark’s DecisionSpace, Roxar’s RMS, IHS Kingdom and LMKR Geographix infringes on D3DG’s patents.
The complaints have been filed in Austin, Texas, alleging infringements both within the State and, by virtue of the Texas ‘long arm statute,’ elsewhere.

Acacia’s claim is that the patented technology, first released in AGM’s Recon 3.0, was ‘quickly adopted’ across the industry and had become ‘the new de facto industry standard for geological interpretation software.’
Other vendors ‘began [..] selling licenses to their own software tools that unlawfully appropriated Recon’s patented inventions.’ AGM, lacking the resources to defend itself, turned to Acacia and the rights to the patent were assigned to the new D3DG company.

In support of its claim that others copied AGM’s technology, Acacia cites evidence from AGM’s web log files that allegedly show that the defendants ‘visited AGM’s website.’ One, Roxar, has allegedly visited the AGM site ‘over fifteen times’ since the patent was issued.
Acacia, in a quaintly termed ‘prayer for relief,’ has asked the court to stop defendants marketing their tools pending a trial by jury.

Comment—Whatever the merits of the case, and whether a visit to a competitor’s website can be taken as signifying anything at all, Acacia is a powerful adversary. In the past year it succeeded in actions against Autodesk, Blackberry, IBM, Garmin, GE, Nokia, and many others. The company had fiscal revenues of $131 million in 2013, down from a 2012 high of $251 million. More from Acacia.


SeisWare bags AGM

Austin Geomodeling acquisition adds patented 3D geomodeling to SeisWare’s seismic interpretation software. No involvement in patent tussle.

Calgary-based SeisWare International (formerly Zokero) has acquired Austin Geomodeling along with the rights to its patented Recon 3D geological interpretation technology.

SeisWare CEO Ed VanWieren told Oil IT Journal, ‘AGM is now a wholly owned subsidiary of SeisWare. Our goal is now to integrate the application and its workflows tightly into our interpretation suite. This will leverage AGM’s patented 3D interpretation and Cascade technologies. Regarding the patents, SeisWare has the rights to use the patented technologies in all our products. We have not initiated any litigation, nor do we intend to.’

Recon was originally developed for Saudi Aramco to correlate and build a stratigraphic framework over the Ghawar field, the largest in the world. Its developers, Robin Dommisse and Tron Isaksen, were previously involved in the design of Stratamodel and Z-Map Plus. Along with connectivity to Landmark’s OpenWorks/DecisionSpace environment, Recon is available as a plug-in to Schlumberger’s Petrel. AGM’s enterprise clients include BP, BHP Billiton, Chevron and ConocoPhillips. More from SeisWare.


The Internet of Things - what the heck is it anyway?

Having held off for months, Neil McNaughton decides it’s now or never for his attempt to understand the Internet of Things. With a little help (and amusement) from Wikipedia he enumerates some of the plethoric ‘standards’ and ‘initiatives’ to conclude that the answer is probably a lemon.

Sometimes Wikipedia really comes up with some good stuff. I have been collecting references and items in preparation for a piece on the ‘Internet of Things’ (IoT) for months now, but have held back as it all sounded, how should I say, a bit nebulous.

I will spare you the pain of yet another long-winded description of how ‘things’ are going to ‘connect’ and change our lives, because I am sure that you have heard enough of this stuff already from vendors, the popular press (not that we are not popular) and folks propping up the bar.

Having said that, for those fresh back from a trip to Mars, the IoT is, to quote Wikipedia (but not the good bit), ‘the interconnection of uniquely identifiable embedded computing devices within the existing Internet infrastructure. Typically, IoT is expected to offer connectivity of devices, systems, and services that goes beyond machine-to-machine communications (M2M).’ Of course the IoT has to ‘go beyond’ M2M otherwise it would be something that already existed and of no interest to the marketing department. Quite how it ‘goes beyond’ is unclear.

So what is the IoT really? On the one hand there is the ‘consumer’ IoT, with its poster child ‘Nest’. Nest allows you to play at being a process controller in your own home, switching lights and heating on and off from your smart phone. Having said that, if the endpoint is you, stroking your iPhone’s screen, then this is not really an internet of things.

If I might be allowed a philosophical digression, defining the IoT involves an inversion of Berkeley’s subjective idealism. Bishop Berkeley, reflecting on what ‘reality’ really is, wondered what happened to the tree in his college quad when he looked away. This led him to speculate that a ‘thing’ only exists when it is observed. I propose that the real IoT only exists when it is not observed!

The other IoT, the non-consumer variety, has been variously called the ‘industrial internet’ and ‘Industry 4.0.’ Such systems are neither far-fetched nor new. Automation and control systems in plants have lots of stuff that operates unobserved and autonomously. For them to be a part of the new IoT though, they have to be connected and interacting via the internet—as opposed to being on a local area or process control network. To make things more interesting, this should/has to (according to your viewpoint) work across multiple vendors’ systems.

Now we have heard this one before. Vendors already provide proprietary systems for doing this so their involvement in initiatives that set out to remove the advantage that this bestows needs to be taken with a large pinch of salt.

Most examples of the IoT today are little more than rebranding exercises. If you google IoT and conferences you come up with M2M conferences. OSIsoft has re-baptized its annual PI System hackathon as the ‘Internet of Things hackathon’. GE came out with its Industrial Internet with no apparent change in its product line-up. Kongsberg Maritime has supplied offshore fleet operator Floatel with onshore to offshore satellite data communications from Attunity using, guess what, Microsoft’s ‘Internet of your things’. Connectivity between SAP Hana and SK Solutions sensors on cranes and construction vehicles has been re-buzzed into SAP’s own IoT. Schneider Electric’s recent deal with ioBridge has brought us the ‘internet of small things!’ And Intel has announced the IoT Gateway.

The IoT has begotten more initiatives and organizations than you can imagine. While some seem to attempt to consolidate or evolve existing interoperability protocols into a working IoT, others, notably the Open Group’s Industrial Internet consortium, have declared their intent to ‘define and develop a reference architecture [...] for interoperability.’ A noble if nebulous goal.

Next up we have the NIST-backed Ontolog’s contribution, an IoT ‘Summit’ working towards ‘smart networked systems and societies.’ Several IoT initiatives address the ‘smart grid,’ the connection of devices in the home to a utility’s computers, with some credible usages (fine-grained use of off-peak power) and some less so (selling electricity you generate driving your Prius around town back to the utility). Nest, by the way, was acquired by Google last year and has its own IoT protocol, Thread. Other competing groups include AllSeen/AllJoyn (LG, Sony, Cisco and many more including Microsoft—interesting because the AllSeen Alliance is a Linux Foundation project). Yet another, the Open Interconnect Consortium with its IoTivity protocol, is also a Linux Foundation grouping, with a real protocol and backing from Cisco, GE (again) and Intel. And then there is the IPSO Alliance. In fact there are even more. If you are interested read Martin Warwick’s insightful piece on TelecomTV.

To get back to Wikipedia, I always check the ‘talk’ tab behind the main article. Here on the IoT talk page an anonymous contributor asks, ‘Does every marketing cliché/buzzword need its own Wiki page? […] for heaven’s sake delete this crap please!’

New Oil IT Sponsor

We welcome a new website sponsor this month in the shape of ‘business to real-time’ solutions providers to the oil and gas vertical, IT Vizion. Please check out the website at www.itvizion.com.

While I’m on the topic, there are a couple more sponsorship slots available on oilit.com (which by the way received nearly 4 ½ million hits in 2014). Ping me on neil@oilit.com if you want more information.

@neilmcn


Book review - the Drilling Data Vortex

Carlos Damski sets out to ‘put drilling data management in a business context.’ He argues against single vendor solutions with a soft sell of Genesis Petroleum Technologies’ toolset.

In the introduction to the Drilling Data Vortex* (DDV), author Carlos Damski (Genesis Petroleum Technologies) seeks to bridge the gap between the physical and digital worlds with a management-level exposé on data and data management. DDV attempts to put data management into a business context, unlike ‘endless self-centric publications and conferences which regard data management as an end in itself’ (who could he be thinking of?). Data becomes valuable only when it is leveraged in a business context to influence decision making.

A brief discussion of E&P databases concludes that, ‘No single vendor can provide all solutions in one database, although some are trying to sell this idea.’ It is better to use ‘magic mapping’ techniques such as OpenSpirit. For analysis, Damski deprecates the spreadsheet and advocates domain aware tools such as Spotfire and Genesis’ own iVAQ toolset for statistics and Monte Carlo data analytics.

There is brief coverage of data QC where Damski disses database stored procedures for actions like checking data ranges and validity. Instead we should be using tools like Datavera and (another plug) Genesis XCheck. Data visualization likewise requires its own tools such as Spotfire and Genesis XPlot (and another).

A chapter on drilling data introduces Deming’s ‘plan, do, check, act’ approach. Daily drilling reporting is highlighted as key to subsequent analysis and improvement. Some sections, particularly drilling optimization, the ‘statistical AFE,’ maps and logs, do little more than introduce the subject. Rather extensive coverage of Energistics standards contrasts with only a brief aside on PPDM. Material in the case history chapter is covered in a superficial manner.

A final chapter on drilling’s future concludes that the ‘future is now’ and offers evidence in the form of digital oilfield-type tools such as ProNova and Verdande DrillEdge, before closing with an introduction to the SPE DSATS initiative.

DDV is an interesting, brief (33k words) and idiosyncratic collection of reflections on drilling data. A companion website is to provide an ‘extended discussion’ of the book’s content. All in all, DDV is reasonable value at $7 in eBook form.

* 2014, Genesis Publishing. 138 pages. Available on Amazon.


2014 Palisade @Risk conference, New Orleans

EpiX Risk Optimizer accelerates oil major’s drilling. Revay & Associates on risk in project management. Woodside’s ‘cost confidence’ modeling tool. Palisade’s working model of GoM asset.

Huybert Groenendaal (EpiX Analytics), speaking at the Palisade Risk Conference in New Orleans late last year, offered some insights into building ‘decision-focused’ @Risk models based on his experience with a large multinational client. First, at the design stage, a model needs to be appropriately sized for computability while still offering the desired output resolution. Also, every iteration must be a possible future scenario. Often, modelers fail to take account of relationships between inputs, which may bias the Monte Carlo results. To help keep modelers on-track, EpiX offers ‘Risk Optimizer,’ a free @RISK add-on that helps modelers choose the best method for characterizing relationships between variables. Failure to do so typically results in the under-estimation of risk.
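To see why ignoring relationships between inputs under-estimates risk, consider two positively correlated cost items. A minimal numpy sketch (ours, not EpiX’s tool; the lognormal distributions and the 0.8 correlation are invented for illustration):

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Naive model: two lognormal cost items sampled independently.
    independent = np.exp(rng.standard_normal(n)) + np.exp(rng.standard_normal(n))

    # Better model: the same two items with 0.8 correlation between them.
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=n)
    correlated = np.exp(z[:, 0]) + np.exp(z[:, 1])

    # The correlated model shows a fatter upper tail, i.e. more cost risk.
    print("P90 independent:", np.percentile(independent, 90))
    print("P90 correlated: ", np.percentile(correlated, 90))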

One large multinational oil and gas company called on Groenendaal when they found that their attempts to optimize a large drilling program were taking three days to run and that ‘the answer wasn’t that optimal!’ Model constraints included budget, rigs, human resources and access (some sites could only be reached in the winter). The company’s large optimizer included Monte Carlo simulation for uncertainty evaluation. Risk Optimizer allowed the client to check if model constraints were being met early on in a simulation, avoiding unnecessarily long run times, and to identify and capture scenarios that were producing better results. The approach greatly reduces the number of runs compared with a brute force approach.
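The mechanics are simple to sketch. A schematic Python stand-in (not Palisade’s actual API; the schedules, costs and constraints are invented) showing cheap feasibility checks run before any expensive Monte Carlo evaluation:

    import random

    # Hypothetical drilling schedules: a few wells each, with cost and rig assignment.
    candidate_schedules = [
        [{"cost": 30.0, "rig": r, "npv": 50.0} for r in range(k)]
        for k in (2, 3, 5)
    ]

    def feasible(schedule, budget=100.0, rigs=4):
        # Cheap deterministic checks, run before any simulation.
        return (sum(w["cost"] for w in schedule) <= budget
                and max(w["rig"] for w in schedule) < rigs)

    def expected_npv(schedule, n_iter=1000):
        # Expensive Monte Carlo evaluation, reserved for feasible candidates only.
        total = sum(sum(w["npv"] * random.lognormvariate(0.0, 0.3) for w in schedule)
                    for _ in range(n_iter))
        return total / n_iter

    best = max((s for s in candidate_schedules if feasible(s)), key=expected_npv)
    print("best schedule has", len(best), "wells")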

Mark Krahn (Revay & Associates) presented his company’s work on a variety of oil and gas projects such as the Keystone XL pipeline, the Fort Hills and Horizon oil sands and Sunrise SAGD projects. Risk management is ‘the fastest growing area of interest in project management.’ Krahn’s presentation included some spectacular images of what happens when things go wrong—from the 2007 Dubai Infinity Tower flood.

Sujay Karkhanis showed how Woodside uses @Risk to improve ‘cost confidence’ in major projects. Woodside has developed an @Risk-based methodology for new projects. The Woodside cost confidence modeling tool (CCMT) has been used to build an international exploration portfolio characterized by ‘materiality and depth,’ with an emphasis on emerging petroleum provinces. Novel locations and projects make for limited control over market conditions and mean that ‘cost and schedule certainty will be paramount.’ Conventional single point estimates will be inadequate for decision making. The CCMT process has been successfully deployed on projects ranging from $20 million to ‘mega.’

A presentation from Palisade, including a working spreadsheet, showed how oil offloading from an oil rig in the Gulf of Mexico can be optimized in the face of weather disruptions. The facility serves as an inventory stocking point for several other platforms. On a daily basis oil may be shipped to the coast, sent through a pipeline or stored at the rig. A downloadable Excel spreadsheet lets users play with the model using a weather behavior generator that leverages historical data. Access the Palisade presentations.
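The gist of the model is a daily inventory balance perturbed by weather. A stripped-down Python analogue (the spreadsheet itself is Palisade’s; the volumes and the flat 10% storm probability here are invented stand-ins for its historical-data weather generator):

    import random

    def simulate_year(capacity=2.0, inflow=0.25, cargo=1.0, p_storm=0.10):
        # Daily balance at the hub: production arrives, tankers lift when weather permits.
        stock, deferred = 0.0, 0.0
        for _day in range(365):
            stock += inflow
            if stock >= cargo and random.random() > p_storm:
                stock -= cargo                 # a tanker lifts a cargo
            if stock > capacity:               # tanks full: production is deferred
                deferred += stock - capacity
                stock = capacity
        return deferred

    runs = [simulate_year() for _ in range(5_000)]
    print("mean deferred volume per year:", sum(runs) / len(runs))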


Exprodat’s recipe for rapid SPEE Monograph 3 shale reporting

Exprodat’s Unconventionals Analyst speeds shale reserves reporting to SEC.

Exprodat blogger Chris Jepps has authored a guide to ‘one hour Monograph 3’ reserves reporting. The title refers to the Society of Petroleum Evaluation Engineers’ (SPEE) guidelines to reserves estimation in resource plays such as shale. Monograph 3 is the bible for SEC reporting but it can be ‘complex and time consuming’ without the right tools. So what are the right tools? For Exprodat they are GIS of the ESRI variety along with its own ‘Unconventionals Analyst’ extension to ArcGIS for Desktop.

Jepps shows how ten iterations of the M3 workflow can be achieved in under an hour. The technique is applicable to mature resource plays as it requires existing well and production data. A random selection of ‘anchor wells’ is used to calibrate the model. Statistics from test wells are compared for consistency. The approach involves concentric buffer zones and polygons defined by well locations—both decidedly GIS-type activities. SPEE Monograph 3 is a snip at $70 from the SPEE webstore.
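The buffers-and-polygons workflow is classic vector GIS. A toy Python/shapely sketch of the mechanics (random points stand in for real well data; this is not Exprodat’s code):

    import random
    from shapely.geometry import Point
    from shapely.ops import unary_union

    # Hypothetical well locations in map units (metres).
    wells = [Point(random.uniform(0, 10_000), random.uniform(0, 10_000))
             for _ in range(200)]
    anchors = random.sample(wells, 20)  # random 'anchor well' selection

    # Concentric buffer rings around the anchors, dissolved into two zones.
    inner = unary_union([w.buffer(500) for w in anchors])
    outer = unary_union([w.buffer(1500) for w in anchors]).difference(inner)

    # Wells falling in each ring can then be compared statistically with the anchors.
    n_inner = sum(1 for w in wells if inner.contains(w))
    n_outer = sum(1 for w in wells if outer.contains(w))
    print(n_inner, "wells in inner zone,", n_outer, "in outer zone")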


‘Uber-like’ pipeline mapping system

Geospatial Corp. leverages Salesforce.com API in crowdsourced pipeline data collection.

Geospatial Corp. has announced a new mapping support service combining its GeoUnderground location-based infrastructure management system with the Salesforce.com API in what is described as an Uber-style ‘strategic service provider’ (SSP) program. The SSP provides access to Geospatial’s data acquisition technologies along with operational assistance and project management services.

Geospatial CEO Mark Smith said, ‘With well over 3 million miles of underground pipeline infrastructure yet to be accurately mapped, Geospatial’s SSP program will seek to enlist a substantial portion of the 60,000-plus service companies currently collecting various types of infrastructure data. Ideal candidates are existing surveyors, engineers, private utility locating companies, various pipeline inspection companies and a vast assortment of specialized channel partners.’

Coupling the Salesforce.com API with GeoUnderground provides a geo-referenced view of the SSPs’ market areas and will allow Geospatial to establish operation centers in major cities across the US and globally. GeoUnderground, built atop the Google Maps Engine and API, lets users securely gather, share, view and edit geo-referenced information from the field on laptops, tablets or smart phones. More from Geospatial.


IBM Redbook outlines components of digital oilfield solution

Websphere-based ‘integrated operations’ uses semantic model to tame the data deluge.

IBM has just published a new RedBook titled ‘Improving upstream oil and gas operations with the IBM integrated operations (IO) solution.’ The 13-page pamphlet enumerates the components of IBM’s IO/digital oilfield offering that is claimed to provide visibility into upstream processes and analytics-based insights. Overall, the offering gathers data from equipment in the field and provides business intelligence and advanced analytics on the data collected. Use cases include well and reservoir performance monitoring, maintenance repair and operations support and more.

IBM’s WebSphere integration bus provides connectivity with a range of industry-standard data endpoints including data historians (OSIsoft PI System and Aspentech IP21). OPC-UA connectors are available in the unlikely event that any such devices are deployed.

An ‘Intelligent operations center’ (IOC) provides data visualization with a GIS-based GUI, and a ‘rules engine’ to support decision making. The IOC further leverages a ‘semantic model’ that can ‘show the relationships between a well, seismic information, logs, sensor measurements, video streams and work orders’!
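A semantic model of this kind is, at heart, a graph linking entities. A toy Python rendition (entity names and links invented; IBM’s actual model is not published in the Redbook):

    # Adjacency-list toy of a digital oilfield semantic model.
    model = {
        "well/W-101":     ["log/GR-7", "sensor/P-22", "workorder/WO-9"],
        "log/GR-7":       ["well/W-101"],
        "sensor/P-22":    ["well/W-101", "video/V-4"],
        "workorder/WO-9": ["well/W-101"],
        "video/V-4":      ["sensor/P-22"],
    }

    def related(entity, depth=1):
        # Breadth-first walk: everything reachable from an entity in `depth` hops.
        seen, frontier = {entity}, [entity]
        for _ in range(depth):
            frontier = [n for e in frontier for n in model.get(e, [])
                        if n not in seen]
            seen.update(frontier)
        return sorted(seen - {entity})

    print(related("well/W-101", depth=2))  # logs, sensors, work orders, video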

To make sense of the data deluge you will also need some of IBM’s SPSS-based data science offerings. If you like you can enter the ‘big data’ arena with InfoSphere BigInsights, IBM’s Hadoop engine. IBM’s high end toolset runs on 64-bit Red Hat Enterprise Linux and (optionally) System x servers with quad core processors.


Ecopetrol takes Skua on test flight

Paradigm’s interpretation flagship’s ‘pillar-less’ gridding captures Colombia’s complex geology.

A new white paper from Paradigm describes how Colombian Ecopetrol has used the Skua geomodeling tool to build a static model of a mature field in the tectonically complex Middle Magdalena basin. Following a recent drilling campaign, Ecopetrol believed that some reservoirs had been inaccurately mapped, leading to sub-optimal exploitation.

Paradigm’s Skua uses ‘pillar-less’ technology that allows complex faulting and stratigraphy to be accurately captured ‘with no need to simplify the data to fit technologies that cannot handle high levels of complexity.’

Data from 185 wells, 23 stratigraphic units and seismics were modeled providing an accurate representation of reservoir complexity including reverse faults. A conceptual deposition model was used to build a 3D facies cube. Paradigm Jacta was also deployed to assess uncertainties in the final model.

Ecopetrol’s head of EOR said, ‘Skua provided us with new and accurate information on reservoir volumetrics. The use of advanced technology that took all the data into account has enhanced confidence in our drilling decisions.’


Software, hardware short takes

Canary Labs, Esri, Battelle, Cisco, Flotek, New Century Software, Palisade, Tibco, Blackhawk Specialty Tools, Yokogawa Electric, Headwave.

Canary Labs has announced new functionality in the ‘Store and Forward’ module V10.1 of its process data historian. More in the video and on use in shale ops on the Canary website.

Esri has announced the ‘next generation of GIS,’ or, in other words, ArcGIS 10.3. A new ArcGIS Pro 64-bit desktop app lets users design and edit in 2D and 3D, work with multiple displays and layouts, and publish maps directly to ArcGIS Online or Portal for ArcGIS, making them available on any device. Other new functionality includes 3D, real-time data and enhanced geoprocessing.

Battelle has demonstrated its HorizonVue 360° camera deployed on a remotely operated vehicle (ROV). The unit can provide a 360-degree live video feed at water depths of up to 4,500 meters. Watch the video or visit Battelle.

Cisco has a new edition of its C240 M4 grid engine with the latest Intel Haswell E5-2600 V3 processors and Nvidia’s GPU-based virtual desktop infrastructure VDI 2.0.

Flotek Industries has announced a Canadian edition of FracMax, its patent-pending, advanced analytics software leveraging data from 7,700 Canadian wells. The platform features Flotek’s CnF chemistries. The US edition holds data from over 85,000 wells. FracMax is marketed as a service to clients. Its closed architecture provides for ‘consistency and integrity of the data and processes.’

The latest 5.0 release of New Century Software’s Facility Manager enhances pipeline attribute and centerline data management with closer integration with ArcMap, re-station/reverse routes, bulk edits and work order-centric events management.

Palisade has announced @Risk/DecisionTools Suite 7 with a new data viewer that extends spreadsheet data with @Risk charts and graphs. Also new is efficient frontier analysis for optimizing project ROI with respect to risk and ‘copulas,’ tools for correlating uncertain variables. A new ‘BigPicture’ diagramming and mind mapping tool for Excel lets users organize thoughts and ideas, or create dynamic maps from spreadsheet data.

Tibco has announced ‘Recommendations’ for Spotfire Cloud, a ‘jumpstart to self-service data analytics.’ Recommendations lets business users select the best visualization for data discovery and storytelling. Recommendations is delivered from the Tibco Spotfire Cloud.

Blackhawk Specialty Tools has developed ‘Hawkeye,’ a wireless top drive cement head that enhances speed, efficiency and safety in cementing operations. The unit removes the need for a hydraulic umbilical and operating console and promises ‘the fastest plug, dart and ball reloads in the industry.’ Radio signal filtering blocks interference from Wi-Fi and Bluetooth for increased reliability. Hawkeye is currently in use in both land and offshore operations.

Yokogawa Electric Corp. has enhanced GA10, the data-logging software for its SmartDac plus data acquisition and control system. GA10 R2.0 comes with optional real-time calculation and reporting functions and enhanced host system connectivity.

Correction

Hue’s Michele Isernia pointed out an error in our November 2014 report from the SEG. Neither Headwave nor HUE uses Nvidia Index technology.


SMi Oil and Gas Cyber Security 2014, London

GDF Suez on socially engineered threats. CERT-UK and cyber information sharing. ENI’s framework for assessing cyber security. Europol’s Cybercrime center and the global cybersecurity index.

Speaking at the SMi Oil and Gas Cyber Security conference in London late last year, GDF Suez’ Phil Jones spoke on social engineering threats to the industry. Social engineering refers to psychological manipulation of people into performing actions or divulging confidential information. This is an easier option for the hacker than trying to break into the system. Phishing is an example of SE—opening an Excel file titled ‘recruitment plan’ cost RSA $63 million! USB keys dropped in the parking lot are another good way into the network. Individuals are also at risk when they divulge personal information on social media sites and in providing answers to ‘security questions’ to third parties.

Chris Gibson introduced the CERT-UK organization which has a close working relationship with the oil and gas sector. The system was recently put to the test with the discovery in September 2014 of the Shellshock Unix vulnerability, with alerts and mitigation advice communicated to stakeholders in under 24 hours. CISP, a joint government/industry cyber security information sharing service, has been established and CERT-UK now issues quarterly activity reports. One oil and gas company member recently took part in the ENISA cyber security exercise.

Alessandro Marzi described ENI’s work on an assessment framework for cyber security. The digital oilfield is bringing convergence of IT and operations. While this is driving efficiencies it brings risks of ‘sophisticated complex’ attacks on facilities. ENI’s IT department has been tasked with extending its scope to provide secure digital processes. Enter the ICT security maturity model, a set of tools and processes to provide risk-based, business-driven security. Security is proving to be a bridge between the IT and OT worlds.

Other presentations of note included Troels Oerting from the Europol Cybercrime Centre on the EU’s response to threats directed at critical infrastructure and ABI Research’s Michela Menting who introduced the Global Cybersecurity Index which ranks countries’ cybersecurity capabilities.

More from SMi Conferences.


ESRI EU PUG 2014

Tool of choice for shale (Total). GIS data models (Willbros). Big data and Netezza (DNV). Bridging the engineering - GIS gap. CAD to GIS interoperability and AEGIS (CBI). The hidden costs of GIS deployment (Total). Tullow’s common operating platform. Fugro on pipeline inspection. Shell - avoiding disaster with APDM. GIS seismology (Statoil). Trans Adriatic Pipeline GIS.

Presentations made at the ESRI European Petroleum User Group (PUG) illustrated how widely GIS is now used in a variety of upstream and midstream contexts. Esri’s GIS has become a popular development platform for a range of projects from shale play evaluation to real time vessel tracking and situational awareness. But it is not an out-of-the-box solution and project costs greatly exceed licensing fees.

Total’s Arthur Gayazov showed how GIS has proved to be the tool of choice in the context of shale new ventures in Russia. Evaluating the potential for non-conventional (shale) plays involves a very wide range of data types. These include sedimentology, shale petrophysics, kerogen maturity and more. Russia’s West Siberian basin covers some 2 million km2, larger than any current US play. There is proven overpressure and recoverable reserves in the 30-70 billion barrel range. Map-based play assessment starts with a selection of basins and geological proxies for modeling and mapping in ArcGIS and Petrel. Common risk segment and common volume segment maps were created using Exprodat’s TeamGIS extension for ESRI’s Spatial Analyst. GIS works as an integration platform as it provides quick access to multiple in-house data sources. Oil in place has been calculated and displayed along with cultural data (satellite imagery, published maps) to get an idea of accessibility. All can be viewed as ‘effective area’ and ‘chance of success’ maps using TeamGIS. The results look promising with up to 4.5 million bbl/km2 for the area of interest. TeamGIS Unconventional Analyst was also used to investigate drill spacing and timing constraints. ‘GIS is the best tool for extensive unconventional drilling programs.’
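Common risk segment mapping is, computationally, cell-by-cell multiplication of per-element ‘chance of success’ rasters followed by classification. A minimal numpy analogue (TeamGIS/Spatial Analyst do the same on real geospatial grids; values and thresholds here are random stand-ins):

    import numpy as np

    rng = np.random.default_rng(0)
    # Toy chance-of-success rasters for three play elements on a common grid.
    source, reservoir, seal = (rng.uniform(0.2, 1.0, size=(100, 100))
                               for _ in range(3))

    cos = source * reservoir * seal            # combined chance of success
    crs = np.digitize(cos, [0.125, 0.4])       # 0 = high risk, 1 = medium, 2 = low

    print("low-risk cells:", int((crs == 2).sum()), "of", cos.size)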

Jeff Allen (Novara) offered an insight into different pipeline data models. Different segments of the industry, from gathering through midstream to transmission and distribution, deploy variously APDM, PODS relational, PODS Oracle spatial, PODS Esri, ISAT or proprietary vendor models. The APDM model has morphed into UPDM (utility and pipeline distribution model) which includes ‘ALARP*’ risk data. UPDM is ‘pipeline centric’ and geodatabase-based. In the ‘moderately normalized’ model, each component is explicitly represented as a single database table object.

Data from tankers’ automatic identification systems (AIS), a VHF signal captured by a worldwide network of receivers, is ‘big.’ Karl-John Pedersen showed how DNV GL has kitted up to track and use such data in oil and gas. AIS data provides detailed tracks of vessel movements around offshore facilities or busy ports and loading facilities. Analytics of such data sets provides risk reduction to DNV’s oil and gas clients. AIS shipping data is captured to an IBM Netezza appliance, a ‘big data’ solution that includes Esri spatial. The Netezza data warehouse is coupled to DNV’s Cognos environment. Data is accessed via ArcToolbox query or Python. Maps of ship tracks are served from ArcGIS Server. Esri Maps for Cognos also ran. There are some limitations: ArcGIS can’t write to Netezza, there is a lack of raster support and a limited user base. The system has improved AIS data availability and enables non-GIS users to perform analytics.
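Python access to such an appliance is typically plain SQL over ODBC. A hedged sketch (the DSN, credentials, table and column names are all invented; DNV’s actual schema is not public):

    import pyodbc  # assumes a Netezza ODBC driver and DSN are configured

    conn = pyodbc.connect("DSN=AIS_DW;UID=analyst;PWD=secret")  # hypothetical DSN
    cur = conn.cursor()
    cur.execute(
        """SELECT mmsi, ts, lat, lon, sog
           FROM ais_positions            -- hypothetical table
           WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?
             AND ts >= ?""",
        (53.0, 54.0, 3.0, 5.0, "2014-06-01"))
    for mmsi, ts, lat, lon, sog in cur.fetchmany(10):
        print(mmsi, ts, lat, lon, sog)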

Catherine Hams showed how Cairn Energy has built an emergency response web portal for its Moroccan venture leveraging a hybrid of ArcGIS Online and ArcGIS Server/desktop. The portal, a ‘logistics game changer,’ is now available directly from the Cairn Energy portal alongside data sources including Petroview, Tellus and FRogi.

A presentation from GXMaps addressed challenges in map management, in particular the thorny problem of version control and how to avoid decisions based on out-of-date maps. The answer is to use online, quality assured web maps and mobile endpoints. If you have to use paper, make sure there is a QR code for version checking.

Willbros’ Peter Veenstra wants to make GIS work for folks ‘without a GIS inclination’ by bridging the gap between GIS and engineers. Which means linking to documents, ‘the ultimate repository for ‘real’ information.’ Engineering information is meticulously collected and assembled but as it transitions to the GIS, much is ‘lost in translation.’ Terminology is different, data model footprints may not match. Engineers say, ‘just give us our data’ and ‘where is the easy button?’ The answer is to use the ESRI model builder to provide hyperlinks from the GIS to engineering documents in Maximo, Documentum, Sharepoint, Scada and corrosion management systems. ‘GIS does not need to store these items,’ ‘everything is the new data model!’

Keith Winning has taken a similar approach in CBI’s advanced engineering GIS ‘AEGIS,’ for pipelines. Winning is lead pipeline engineer on the BP operated Baku-Tbilisi-Ceyhan pipeline which is using an extended PODS model to bridge the gap between engineering/CAD and GIS. AEGIS integrates CAD and GIS systems by data sharing.

Sylvain Bard-Maïer warns that in a large GIS project such as Total’s online catalog, ‘E&P Maps,’ the direct cost of development is just the tip of the iceberg. For such a large GIS project, perhaps 15% of costs go on software licenses while 85% is ‘hidden’ involving data QC, standardization and GIS portal administration. ArcMap Document (.mxd) data files embed corporate standards for layers and symbology. Here, Total has developed an FME-based toolset to help and has established workflows and training for users. Maps can now be delivered across the Total network to mobile users.

The concept of a common operating platform (COP), as presented by Tullow’s Colleen Abell, was derived from a US Government requirement for a single overview of operations, such as oil spill mitigation, that gives on and off-scene personnel the same information and view. Tullow’s crisis management team used to rely on Google Earth and paper maps. Tullow is now developing a proof of concept COP with ArcGIS Online. A test scenario involves a blowout on a West African deep water well followed by a spill. The COP dashboard shows ship positions and ROV video of the BOP. The map shows locations of hospitals and shelters, vessels, aircraft, relief well locations and dispersant stockpiles. The COP also provides KPIs for executives and a post-exercise analysis and review (PEAR) function. Storytelling templates have been authored for communication with the public.

Boudewijn Possel showed how Fugro has used GIS to manage risks of offshore infrastructure. One case history involved a site survey and risk analysis of client Taqa’s loading terminal offshore Netherlands. GIS lets Fugro blend site survey data with pipeline features and rock dumps. The terminal management system includes 10 years of historical data on pipeline depth of burial, free spans, sandwave migration and more. Fugro offers analytics and a risk matrix for events such as dropped/dragged anchors and trenching. A ‘reciprocal of risk’ map was used to re-evaluate survey design. Data is also available remotely using web map services. ‘PINS,’ an acoustic pipeline inspection methodology originally developed for BP, has also been deployed.

Berik Davies outlined Shell’s implementation, with help from Willbros, of the ArcGIS pipeline data model (APDM) for use in its integrity management program. What problem is APDM solving? Davies showed some scary photos of the 2014 Kaohsiung, Taiwan gas pipeline explosion which killed 32. ‘We want to be able to say our pipelines are safe and to keep the hydrocarbons in the pipe.’ This is being achieved by using APDM as a single system of record that fits with other tools. APDM is an auditable, global ‘public’ standard for Shell. The data model is neither too light (as per the default APDM 6 template) nor too heavy (as in APDM 4). The idea is to store external document references in an APDM table to ‘avoid geodatabase bloat.’ The model needs to tie with Shell standards for symbology, geomodeling, enterprise GIS and with IT guidelines. The model embraces SAP functional location tags and document numbering to tie to the engineering data warehouse and to the LiveLink DMS. The solution, built on the latest APDM 6.0 release modeled in Enterprise Architect, will be Shell’s system of record for all pipeline activity, notably including pipe integrity management and risk-based assessment with w-PIMS. Data will be accessible through the Shell ‘MyMaps’ portal and, for engineers, as a quick view engineering report. APDM is ‘strong and fit for purpose.’ Moreover there is a ‘strong pull from the business to make this happen.’ The solution is to be deployed on the troubled Kashagan project where production was halted last year following a pipeline leak!

Renaud Laurain (Statoil) has used ArcGIS to characterize and filter seismic interference from other seismic vessels operating simultaneously in the neighborhood. During the summer of 2014 in the North Sea Tampen area, up to 10 vessels were in operation. The technique involves modeling seismic arrivals from ‘foreign’ shots which can then be filtered. The approach has proved effective and Statoil is now developing similar functionality in ArcObjects.
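The underlying computation is straightforward: predict when a foreign vessel’s shot will arrive at your spread and flag the contaminated records. A back-of-envelope Python sketch (positions, times and the nominal water velocity are invented; the real workflow models arrivals in ArcGIS from vessel logs):

    import math

    WATER_VELOCITY = 1490.0  # m/s, nominal

    def interferes(shot, receiver, record_start, record_len=8.0):
        # Direct-arrival time of the foreign shot at our receiver.
        dist = math.hypot(shot["x"] - receiver["x"], shot["y"] - receiver["y"])
        arrival = shot["t0"] + dist / WATER_VELOCITY
        return record_start <= arrival <= record_start + record_len

    foreign_shot = {"x": 0.0, "y": 0.0, "t0": 100.0}   # fired 6 km away at t=100 s
    receiver = {"x": 6000.0, "y": 0.0}
    print(interferes(foreign_shot, receiver, record_start=102.0))  # True: flag it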

Mark Hoogerwerf from Netherlands-headquartered engineer Royal Haskoning DHV showed how GIS is being used across the design and build phases of the 870 km long Trans Adriatic Pipeline. GIS, alongside engineering document management, underpins specialist tools and activities. TAP’s IT philosophy is to (mostly) configure COTS** tools and add some development. These include Esri-based mapping tools from Conterra for security and mapping, and Jira for workflow. These have been linked to documents and photos in a land workflow to calculate landowner compensation. This is no small task as the complex project spans three countries with different languages and legislations.

Exhibitor Geocento was showing its EarthImages online search engine for worldwide satellite imagery. EarthImages allows users to discover what satellite imagery is available for their area of interest from suppliers all over the world and from a vast range of sensors.

Read the presentations from the EU PUG here.

* As low as reasonably practicable.

** Commercial off-the-shelf.


PPDM data management conference 2014, Kananaskis

Concho’s well hierarchy. 451 Data Solutions - what is a regulation? OpenSpirit’s workflow palettes.

The annual Professional petroleum data management (PPDM) symposium was held in Kananaskis, Alberta late last year with around 150 in attendance. The eponymous PPDM data model—now at version 3.9—has grown to some 2,700 tables, 71,000 columns and 26,000 constraints. PPDM’s ‘what is a well’ initiative has blossomed into a well hierarchy that EnergyIQ has deployed for Concho. The hierarchy is now a foundation for lifecycle data integration.

Marc Fine (451 Data Solutions) provided an introduction to the intricacies of well status and classification. Terminology—even that imposed by a regulator—can be confusing. An ‘active’ well in Arkansas is one that is capable of producing hydrocarbons; in California it is any drilled and completed well; and so on. The PPDM well classification and status workgroup has mapped codes from 39 regulators in Canada and the US. The agencies have been contacted to check definitions and so far, 1,500 codes and definitions have been mapped to PPDM ‘facets.’ Facets allow multiple types of classification to be defined separately, re-using the same information. Examples include the regulatory life cycle and wellbore status facets.
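Facets are easy to picture as a lookup from (regulator, code) to several independent classifications. A toy Python illustration (the codes and facet values here are invented; the workgroup’s real mappings live in the PPDM model):

    # (regulator, status code) -> independent classification facets
    FACETS = {
        ("AR", "ACTIVE"): {"regulatory_life_cycle": "capable of production",
                           "wellbore_status": "completed"},
        ("CA", "ACTIVE"): {"regulatory_life_cycle": "drilled and completed",
                           "wellbore_status": "completed"},
    }

    def classify(regulator, code):
        return FACETS.get((regulator, code), {"regulatory_life_cycle": "unmapped"})

    # The same 'active' code means different things in different jurisdictions.
    print(classify("AR", "ACTIVE"))
    print(classify("CA", "ACTIVE"))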

Clay Harter from event sponsor OpenSpirit introduced its pre-built ‘palettes’ that have been leveraged in various workflows. These include data sync across a corporate well master and Schlumberger’s Studio/Petrel environment. OpenSpirit is even used to sync between ProSource and Studio (both from the Schlumberger stable!). Another palette connects Petrel to Landmark’s engineering data model EDM so that perforations, casing and tubing data and drilling events can be viewed. A subsurface search option is claimed to ‘make finding and using subsurface data as easy as finding a restaurant on Google maps.’


Folks, facts, orgs ...

Alberta Energy Regulator, Apache, API, Attunity, Blueback Reservoir, Cegal, Blu LNG, BP, CH Robinson, Dassault Systèmes, DNO, Elite Control Systems, Energy Navigator, Flotek, GE, McDermott, Georex, Geovariances, Global Hunter, IHS, Ikon, Intsok, JDR, J-W Energy, OGC, One Source, PESGB, RRC, Sierra Hamilton, TAP, Taulia, Venture Global, Willbros.

The Alberta Energy Regulator is investigating what makes a best-in-class regulator with help from the University of Pennsylvania’s Penn Program.

Pete Ragauss has been appointed to the Apache board. He retired from Baker Hughes last year.

The American Petroleum Institute has named Robin Rorick as head of its new midstream department.

David Collins has been appointed regional VP of Attunity’s North American sales team. Collins joined Attunity from Intel.

John Sayer heads-up Blueback Reservoir and Cegal’s new office in Aberdeen. Sayer hails from Earthworks Reservoir.

Blu LNG has named James Edward Burns as president. Burns was previously with Fortress Energy Partners.

BP has appointed Spencer Dale as chief economist. He comes from the Bank of England.

Kent Stuart and Corey Gear have joined Navisphere, a newly formed CH Robinson unit that is to develop supply chain solutions for the oil and gas industry. Stuart is a CH Robinson veteran. Gear comes from BDP International and Deutsche Post DHL where he established and led oil and gas units.

Gian Paolo Bassi is CEO of Dassault Systèmes’ Solidworks 3D design software brand. Former Solidworks CEO Bertrand Sicot moves over to VP sales at its Value Solutions channel.

Det Norske Oljeselskap has appointed Olav Henriksen to SVP Projects. Henriksen hails from ConocoPhillips.

James Eadie heads-up Elite Control Systems’ new Aberdeen office. He was previously with Safety Technologies.

Michael Jegen is the new executive VP of Energy Navigator.

Flotek has appointed Steve Marinello as director of CnF Applied Technology. Marinello was previously with Shell.

GE and McDermott have jointly launched IO Oil and Gas consulting. Dan Jackson is CEO, Mark Dixon, CTO and Tony McAloon, COO. IO is to offer ‘front-end solutions’ for offshore fields.

Georex and STC-Boussens are building a petroleum analysis laboratory at Boussens, France.

Olivier Bertoli is now MD of Geovariances. He replaces Yves Touffait who becomes special adviser.

Ken Sill has joined Global Hunter Securities as MD and senior oilfield services analyst in its Houston office. Sill was previously CFO at US Well Services.

Carlos Pascual has been appointed senior VP at IHS. He was previously US ambassador to both Mexico and Ukraine.

Matt Bell is president of Ikon Science Americas. He was previously with his own consultancy, IN2 Oil and Gas.

Dave Keating is Intsok’s new oil and gas advisor for the Canadian market and Nargiz Mehdiyeva is advisor for Azerbaijan. Mehdiyeva was previously with Statoil Apsheron in Baku.

JDR has appointed David Currie as CEO and member of the board. Currie hails from Aker Solutions.

J-W Energy has promoted David Miller to president. Richard Clement now heads-up the midstream company.

Scott Serich has been appointed OGC director, interoperability programs. Before joining OGC, Serich was at the IJIS Institute.

Greg Casey has joined One Source Networks as VP and general manager, oil and gas. He was previously with the Texas Energy Network (TEN).

Hamish Wilson is now president of the Petroleum Exploration Society of Great Britain.

The Railroad Commission of Texas has named Lori Wrotenbery director of the oil and gas division.

Sierra Hamilton has appointed John L. Morgan as chief executive officer and member of the board of directors. Patrick Drennon becomes president.

Ian Bradshaw is the new MD of Trans Adriatic Pipeline. He joins from BG Group.

Taulia has hired Rik Thorbecke as CFO. He was previously with Meltwater Group.

Graham McArthur has joined Venture Global LNG, as VP, president and treasurer.

Harry New has rejoined Willbros as president of Oil and Gas.


Done deals

Target Oilfield Services, Fuse IM, Aveva, 8over8, Dawson Geophysical, TGC Industries, Spectris, ESG Solutions, Superior Energy Services, Prospect Flow Solutions.

Oman-based consultant Target Oilfield Services has acquired a controlling interest in Fuse Information Management, provider of upstream workflow and information management solutions. Target is to combine its service offering with Fuse’s XStreamline workflow manager and Expedite data management solution.

Aveva has acquired 8over8 and its ProCon risk management platform for connecting owner operators and engineering companies. ProCon is used on major capital projects either deployed locally or over a private cloud.

Dawson Geophysical and TGC Industries are inviting shareholders to vote ‘yes’ to a merger proposal. A ‘definitive joint proxy statement’ has been filed with the SEC in connection with the proposed merger.

UK-based Spectris has acquired microseismic specialist ESG Solutions, developer of FracMap. ESG will join Spectris’ Test and Measurement segment.

Superior Energy Services has acquired Prospect Flow Solutions. Prospect will join Superior’s Wild Well Control unit. The companies are partnering on a joint industry project to develop large-scale field testing of subsea gas releases to validate computer models and ensure safe placement of a capping stack during critical offshore operations.


Microsoft’s big data offering for oil and gas

Big takeaways from big data, or how HDInsight, Microsoft’s Hadoop offering, might impact oil data analytics.

A position paper* authored by Microsoft’s Egbert Schröer looks at the potential for big data in the digital oilfield. Of all the technologies used in oil and gas, big data represents ‘one of the most disruptive and elusive competitive advantages.’

There are many definitions of big data, a term which now encompasses ‘all advanced data analytics.’ For Microsoft, big data encompasses ‘strategic planning, advanced mathematical analysis and collaboration and reporting.’ The aim is to produce ‘tangible takeaways’ (like a Big Mac?) from massive amounts of data.

For oil and gas this means deploying both high-tech and mathematically cutting-edge tools on the server along with an easy-to-learn, human interface ‘that allows decision-makers to view, assess and take action anytime from anywhere.’

On the server or rather, in the cloud, Azure HDInsight provides a Hadoop-based big data solution that scales ‘from terabytes to petabytes,’ processing clickstream, log and sensor data. For clients wary of the cloud, HDInsight can also be delivered through the analytics platform system (APS), an appliance for on-site deployment. APS also includes a SQL-server parallel-data warehouse and PolyBase for blending structured and semi-structured data.

At the client end, Microsoft’s ‘self-service’ business intelligence solution Power Pivot along with the ubiquitous Excel ‘allow users to develop data models and calculations’ without the need for IT specialists and developers.

Use cases include reducing non-productive time in drilling and production through machine learning-based predictive maintenance of critical components such as electric submersible pumps, and helping to reduce HSE incidents.

* Big Takeaways from Big Data.


Saipem deploys engineering fluid dynamics in offshore design

Mentor Graphics’ FloEfd ‘virtual lab’ provides an ‘engineering-oriented’ approach to semi-sub design.

Mentor Graphics reports that Eni’s engineering and construction unit, Saipem, has used its FloEfd ‘virtual lab’ to design the Saipem 7000, the world’s second largest semi-submersible crane vessel.

Mentor’s FloEfd embeds engineering fluid dynamics within computer aided design (CAD) to provide an ‘engineering-oriented’ approach to design. FloEfd was used to compute towing force on various mobile offshore units and to verify that temperature limits were not breached for crane operation in proximity to a flare stack, a pre-requisite for safe operations and certification.

Saipem has also used the tool to aid in the design of deepwater, dynamically positioned pipe laying vessels. These deploy powerful thrusters to generate the large towing forces required to cut deep trenches. But this brings the risk of seabed scouring when used in shallow water operations. FloEfd simulations have established safe operating limits that have been validated with seabed surveys. More from Mentor Graphics.


Texas Railroad Commission upgrades its GIS

New ArcGIS-based mapping system offers enhanced public access to oil and gas activity.

The Texas oil and gas regulator, the Railroad Commission has unveiled a new public mapping system for online viewing of oil, gas and pipeline data. The new system, built with Esri’s ArcGIS, is said to ‘increase transparency of oil, gas and pipeline data.’

RRC chair Christi Craddick said, ‘The enhancements the Commission has made to the GIS viewer will allow the public to better keep up with the oil and gas industry, and allow those in the industry to more easily and efficiently access information critical to their businesses.’

The new map provides well data by county, API numbers or street addresses, an enhanced GUI and a radius tool that lets Nimbys figure out how far they are from a well.
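A radius tool of this kind reduces to a great-circle distance. A back-of-envelope Python equivalent (coordinates invented; the RRC tool of course works against its own well database):

    import math

    def distance_km(lat1, lon1, lat2, lon2):
        # Haversine great-circle distance; plenty accurate at county scale.
        r = 6371.0  # mean Earth radius, km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    # Hypothetical: a house in Austin and a wellhead a few km away.
    print(round(distance_km(30.27, -97.74, 30.30, -97.70), 2), "km")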


Red Hen Systems’ year-end cock-a-doodle-doo

Methane emissions detector, laser mapping bundle, $15 Android mobile mapper.

Mobile mapping specialist Red Hen Systems has been busy upgrading its product line. Of interest to oil and gas is a new methane emissions detection solution that multiplexes data from both a Boreal Laser GasfinderAB and a FLIR infrared camera. The tool works from a helicopter or a vehicle. Data can be captured to video and viewed in Google Earth.

A new digital mapping reconnaissance bundle is ready for shipping, comprising a custom configuration of cameras, laser rangefinder, GPS unit, and software all linked through the Red Hen VMS-333 multiplexing system.

A more lightweight solution is the MediaMapper Mobile, a $15 Android mobile mapping app with multiple uses including, for shale operators, ‘counting and geotagging prairie dog dens on public land.’ Captured data can be stored encrypted with the MediaMapper, either on site or in RedHen’s cloud.


Sales, deployments, partnerships …

Mellanox Technologies, EMGS, KBC, Metro Oil, Kalibrate, Tall Oak, ESI, FMC Technologies, Star Deep Water Petroleum Limited, GSE Systems, Jacobs Engineering, BP, GE, Meridium, Safran, Novara, Khatib & Alami, Helix, OneSubsea, Schlumberger, STW, TRE & Associates, Technip, ONGC, MAAZ, Thermo Fisher Scientific.

EMGS has deployed Mellanox’ low latency 40 gigabit ethernet switch to connect 264 servers in its datacenter.

KBC has been awarded a £3.3 million, seven-year contract from a major EU oil field services company for its upstream simulation portfolio. The deal covers the Feesa Maximus wellbore and pipeline modeler, KBC Multiflash and Petro-SIM.

Metro Oil has purchased the Kalibrate cloud pricing solution for roll-out across its retail locations in the Philippines.

Tall Oak Midstream has selected ESI’s GasStream software for its volume and transaction accounting.

Chevron unit Star Deep Water Petroleum has ordered subsea equipment for operations in the Agbami field, offshore Nigeria, from FMC Technologies.

GSE Systems has signed a three-year agreement with a major LPG supplier to perform electrical engineering, instrumentation, control and automation in the UK.

Helix Energy Solutions, OneSubsea, and Schlumberger have signed a definitive agreement for a ‘non-incorporated alliance’ to develop technologies and deliver equipment and services to optimize subsea well intervention systems.

BP has extended a contract with Jacobs Engineering for strategic supply of global mid-cap engineering work.

GE Measurement & Control and Meridium have announced ‘Production Asset Reliability,’ an integrated asset performance management solution for oil and gas.

SAP’s Integration and certification center has awarded the ‘powered by SAP HANA’ accolade to Norwegian Safran for its Project 5 project management solution.

Novara GeoSolutions is teaming with Khatib & Alami to market Novara’s Intrepid pipeline management solution in the Middle East.

Mohammad Abdullah Al Azzaz Co. has implemented Thermo Fisher Scientific’s SampleManager laboratory information systems (LIMS) to support clients in the oil and gas and other verticals throughout the Middle East.

Technip has been awarded a €100 million contract by Oil and Natural Gas Corporation Limited for a six million m3/day onshore terminal at Odalarevu in Andhra Pradesh, India.


Standards stuff

HDF5 in oil and gas special. Following adoption in Resqml, we talk to HDF Group CEO Mike Folk about other oil and gas usage and his plans for new tools and a software-as-a-service/cloud offering.

Our standards special this month homes in on use of the HDF5 format for the storage of large oil and gas data sets. HDF, the hierarchical data format, was originally developed at the National Center for Supercomputing Applications at the University of Illinois and today is managed by the HDF Group. The HDF Group was spun out of the University of Illinois in 2004 as a not-for-profit tasked with the continued development of HDF5.

HDF Group CEO Mike Folk told Oil IT Journal ‘HDF5 is well suited to meet the increasing demands in oil and gas to handle ever bigger and more complex data. Current industry implementations that we are aware of include Energistics’ Resqml standard for reservoir data and EMGS’ H5EM-TS exchange standard for field EM data which is also used by Statoil. The Passcal Instrument Center at New Mexico Tech has developed PH5, an alternative to SEG-Y for seismic data archival based on HDF5, and New Zealand’s Globe Claritas uses the format in its seismic processing software.’
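For a feel of what these implementations get from HDF5, here is a minimal h5py sketch (the file layout and attribute names are invented for illustration): chunked, compressed trace storage with self-describing metadata and efficient partial reads, the pattern behind payloads like Resqml’s and PH5’s.

    import numpy as np
    import h5py

    traces = np.random.randn(500, 2000).astype("float32")  # 500 traces x 2000 samples

    with h5py.File("survey.h5", "w") as f:
        dset = f.create_dataset("line_01/traces", data=traces,
                                chunks=(50, 2000), compression="gzip")
        dset.attrs["sample_interval_ms"] = 2.0   # self-describing metadata
        dset.attrs["datum"] = "MSL"

    with h5py.File("survey.h5", "r") as f:
        d = f["line_01/traces"]
        subset = d[100:110, :]   # partial read: only the needed chunks hit disk
        print(subset.shape, d.attrs["sample_interval_ms"])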

Folk elaborated on the HDF Group’s status as a not-for-profit. ‘As a non-profit, our goal is to promote and support HDF. HDF5 is open source in the sense that we publish the source code for the basic HDF5 library and tools, and it has a BSD-like license. But it’s not developed and maintained by a large community of people the way most open source software is. Most development and maintenance is done in-house. Today, all our revenues come from two organizations to whom HDF is critical.’

‘We are now investigating an expansion of our business model, developing tools that we might sell and maybe a ‘pro’ version of the library. We also have a project to investigate HDF software-as-a-service, particularly within a cloud environment.’

Those of you located in Houston can catch up with HDF5 at the Rice Oil & Gas HPC Workshop in March.


Carbon capture and storage hits the buffers

CATO and AFTP events reveal dwindling public support and geological gotchas for CCS.

It’s not just the oil and gas industry that is having a hard time these days. The carbon capture and storage business (if it is a business) is suffering from a swath of obstacles, political, economic and technical. The idea is simple enough: capture the CO2 emitted by electricity generation or cement manufacture and ‘sequester’ it underground in deep reservoirs.

At last year’s Cato symposium in Amsterdam, Brad Page of the Global CCS Institute observed that, ‘Fossil fuels will be important for a long time. Huge reserves (134 GW) of coal capacity were added in 2013, double any other fuel.’ Most of this was in undeveloped countries ‘where it will be developed whatever we think.’ So CCS is essential if the world is to stay within the 2° temperature limit. Renewables are not enough. But worldwide, projects are being shelved. Political objections arise because, despite the green credentials, local populations see the activity as pretty much as objectionable as fracking. Economics are problematical as the carbon trading market has collapsed and (at least in the EU) the economy at large is not conducive to such experiments. The IEA’s Fatih Birol has been quoted as saying, ‘I don’t know of any other technology that is so important for the planet and yet for which we have so little appetite.’

But what of the technology itself? We caught up with a couple of French experiments at a meeting of the AFTP this month where the results of the Lacq test project were presented. This has demonstrated the feasibility of CCS in a depleted gas reservoir. The biggest problem was with the nimbies and a rigorous environmental monitoring program is now underway to demonstrate the long term integrity of the reservoir.

Other numerical evaluations of CCS projects in saline reservoirs showed that it is very hard to find a target that matches all of the desired parameters. In general, sequestrable volumes shrink as long term migration risk to aquifers and caprock integrity concerns are considered.

Things are looking better for CCS in the US where the Energy Department reports that its Illinois Basin-Decatur project has sequestered one million tonnes of CO2 since 2011 in a saline aquifer. The CO2 is captured from an ethanol-production facility and is injected into a sandstone formation at 7,000 ft. Reservoir pressure remains ‘well below’ regulatory limits.

For more, especially on the social license to operate aspects, read the excellent 2014 CATO report—‘Linking the chain,’ a free download from the Cato website.


OpenText buys Actuate, Informative Graphics

BIRT development toolset for predictive analytics to link with enterprise information management.

Document management, sorry, ‘enterprise information management’ (EIM) solutions provider OpenText is in acquisition mode, having acquired both Informative Graphics and Actuate Corp. in a $330 million cash transaction. Actuate’s predictive/analytics technology is to be integrated with OpenText’s existing offering. Actuate develops BIRT, an open source integrated development environment, and iHub, a platform for developing business intelligence applications.

Actuate claims that the BIRT IDE is ‘the only top-level Eclipse project focused on business intelligence and reporting.’ The company’s ‘hybrid’ strategy combines open source software with enterprise development. The toolset includes access to data stored in a Hadoop file system thanks to a deal with Cloudera. Actuate claims over 3.5 million BIRT developers.

The Informative Graphics Corp. (IGC) acquisition brings secure mobile access, document mark-up and collaboration to OpenText. IGC is a long-time OpenText partner and claims over 300,000 seats for its ‘Brava’ viewer for OpenText’s content server. The toolset will augment OpenText’s new engineering solutions offering. IGC clients include BHP Billiton, BP America, Chevron, Marathon Oil, Petro-Canada and Saudi Aramco.

Actuate’s oil and gas credentials include CGI’s oil and gas division, which, according to lead developer Bimal Thomas, uses the toolset to ‘improve our application’s usability, presentation and interactivity.’ More from OpenText.


GE Predictivity for Sabine Pass LNG export

Cheniere’s terminal now turned around from import to export with help from GE.

Last time we reported on Cheniere Energy’s Sabine Pass LNG terminal (OITJ April 2007) it was just about to open … for import. A lot has happened since then. Cheniere has just announced that the facility is to open (for real?) and that it will be exporting US shale gas to the more lucrative world market. The company has entered into a 20-year, $1 billion contract with GE for the provision of a range of supplies and services on the first four LNG trains currently under construction. Each train will have six gas turbines and is expected to have a nominal capacity of approximately 4.5 million metric tons of LNG per annum.

GE’s ‘Oil & Gas Services 2.0’ package leverages its Predictivity technology for equipment monitoring and pre-emptive maintenance, maximizing uptime. The project will also benefit from the expertise of the Houston GE Oil & Gas iCenter, which holds more than 11 million hours of machine data and is one of three global centers set up in different time zones to ensure 24/7 monitoring and diagnostics services for GE Oil & Gas’ installed fleet.

The Cheniere story featured in Gregory Zuckerman’s entertaining book, The Frackers. The shale gas (including, most recently, LNG export economics) story is also regularly covered by the FT’s shale skeptic-in-residence John Dizard.


Forecasts and fantasies

Citi, Booz Allen, IDC, Markets & Markets, TechNavio: the crystal balls are in overdrive.

A graphic from Citi researcher Ed Morse puts Saudi Arabia’s ‘fiscal breakeven’ production price at $98/bbl and the Eagle Ford/Bakken’s at a mere $40! Apples and oranges?

Booz Allen’s top energy sector trends for 2015 warn of a ‘Cyber Macondo’ attack on the energy sector. Industry is confronted with an ‘alphabet soup’ of complex regulations. There will be a new focus on reputational risk. HSE and capital investments will both ‘go predictive.’ The silver-hair ‘tsunami’ leaves companies with shortages of key workers. Risk management ‘must move from the backroom to the boardroom.’

IDC Energy Insights’ predictions see oil and gas companies ‘reengineering processes and systems to optimize logistics and hedge risk.’ 40% of major oils and all of the service sector will ‘co-innovate’ on technical projects with IT professional service firms. Companies’ IT environments are to evolve to a ‘3rd platform-driven architecture.’ 50% will have advanced analytics capabilities by 2016.

Markets and Markets forecasts that the reservoir analysis market will grow from $13 billion in 2014 to ‘nearly $22.4 billion’ by 2019, a growth rate of 10.6% per year.
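As a sanity check, the compound annual growth rate implied by the two rounded endpoint figures quoted here is

\[ \left(\frac{22.4}{13.0}\right)^{1/5} - 1 \approx 11.5\% \]

slightly above the quoted 10.6%, which suggests the forecast works from an unrounded 2014 base nearer $13.5 billion.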

But that pales when one considers the 56% annual growth that TechNavio is forecasting for the global big data market in oil and gas through 2018. TechNavio also puts the global oil and gas Scada market’s growth at 7.45% over 2015-2019.


Aptomar teams with Transas on ‘advanced’ oil spill simulator

Honningsvåg, Norway facility claimed to be world’s most advanced.

Norwegian marine security specialist Aptomar has teamed with Cork, Ireland-headquartered Transas on an upgrade of what is claimed to be ‘the world’s most advanced’ oil spill simulator. The facility was developed with support from Eni Norge, Statoil, GDF Suez, OMV and the Norwegian Coastal Administration and is located at the North Cape maritime training center in Honningsvåg, Norway.

The upgrade extends the unit with communication between onshore and offshore resources, offering a range of countermeasure and navigation scenarios using multiple vessels and other assets. Aptomar’s tactical collaboration and management system (TCMS) is now integrated with the simulator, offering connectivity with TCMS operations rooms worldwide for second-line response and operations management training.

Aptomar founder Lars Solberg commented, ‘The North Cape simulator allows individuals or groups to develop this skillset without affecting an operator’s day-to-day operations.’ The North Cape center and programs will be qualified according to DNV GL’s SeaSkill certification. More from Aptomar and Transas.


Wiley Journals map contents for ArcGIS, Petrel, Google Earth

Deal brings over 100,000 maps from 26 journals to Geofacets information mapping platform.

A deal between publishers Wiley and Elsevier sees over 100,000 maps from the former’s journals geotagged and embedded into the latter’s Geofacets mapping platform. The deal includes geological, petroleum geoscience, geophysical and other maps from Wiley’s 26 journals published on behalf of society partners such as the American Geophysical Union. It is anticipated that by year-end 2015 the Geofacets content portfolio will extend to over 500,000 maps extracted from almost 200,000 associated articles.

Geofacets provides georeferenced geological maps from published sources, making them available to users of tools such as ArcGIS, Google Earth and Schlumberger’s Petrel. The new deal provides users with access to content from journals such as Basin Research, the Journal of Petroleum Geology, Sedimentology, Geophysical Research Letters and more. Previous deals gave Geofacets users access to publications from the Geological Society of London, the SEPM/Society for Sedimentary Geology, the Society of Economic Geologists and the Geological Society of America.

It’s not clear if these societies’ members will get access to their own content via Geofacets.


More on ConocoPhillips’ new intranet

SharePoint specialist BrightStarr wins award for The Mark intranet re-vamp.

UK-headquartered Microsoft SharePoint developer BrightStarr has successfully leveraged its ‘Kinetica’ methodology to deliver ‘The Mark,’ a re-vamped intranet, to ConocoPhillips. Kinetica uses surveys, interviews and workshops to gather user requirements at the design phase and is claimed to provide a ‘personal’ intranet experience while maintaining a strong connection to the business. BrightStarr SVP Glen Chambers said, ‘A solution has to deliver a balance of form and function. We have worked with ConocoPhillips to deliver an engaging user experience that promotes adoption.’

The new intranet provides 18,400 employees in 27 countries with collaboration tools and access to business information. The Mark offers a new ‘Mega Menu’ with three sub-categories, ‘Our Company,’ ‘My Work & Collaboration,’ and ‘My Life & Career.’ These provide a clean homepage that ‘brings essential corporate information to the fore.’ The intranet can be accessed from any device, providing staff with anytime, anywhere access to essential resources. The Mark won BrightStarr one of the 2014 Nielsen Norman awards for best intranet design, its third in four years. More from BrightStarr.

