I just got a new mobile phone. The love affair with the rugged but, in the end, crap Sonim is over (another ill-considered purchase made while hanging around in an airport). I am now, rather late in the day, the possessor of a ‘proper’ smart phone, although this is not strictly my first: my old Nokia 6600 (circa 2003) seemed smarter than the Sonim.
I have now spent a few weeks living in the Android ecosystem. In other words, I have been sharing an alarming and uncontrollable amount of personal data with Google, Samsung and who knows who else! I have loaded multiple apps for stargazing, reading the news and getting updates on public transport. Usually these require accepting terms and conditions, which are never read, and being forced to allow the app to access all sorts of other stuff like one’s own contacts. I fail to see why the RATP, the Paris underground system, needs my Google+ contacts to provide me with the train times. It is, as Google’s ex-CEO Eric Schmidt might have said, creepy1.
Back in the day, when Oil IT Journal was Petroleum Data Manager, we reported on an entertaining talk by one Gary Hamel (OilITJ March 19982). Hamel asked, ‘What is the point of a behemoth?’ He likened large corporate mergers to ‘tying two drunks together’ and observed that ‘it doesn’t make them sober.’
This aphorism came to mind when I heard that Apple and IBM were to team. The rationale apparently is that corporate users of iPhones will be able to access IBM’s Watson-style big data analytics. Well that’s OK I suppose, but both Apple and IBM are already rather big and one wonders what sort of a behemoth they might make if they were to combine. Of course the deal is for the moment just a ‘partnership’ and I am not saying that either Apple or IBM is in any way a ‘drunk.’ But both companies have their issues. Apple’s Tim Cook is under scrutiny for apparently not having Steve Jobs’ genius touch and, so far, Ginni Rometty has failed to return IBM to its former glory.
So what is the new teaming going to deliver? A report in the New York Times3 provided one use case for the new partnership viz: ‘An airline pilot tapping [on an iPhone] a calculation [on an IBM Watson] of updated fuel use and flight paths as weather conditions change.’ Reading this I had visions of the pilot, sitting in front of a constellation of gauges and computers, fumbling around for his iPhone, poking and stroking away to lose his current game of Angry Birds, locate the ‘Watson’ app and then laboriously enter the new flight data into the autopilot. This is, as they say, not going to happen!
Highlighting such an improbable example makes it appear that either Cook and Rometty were sleepwalking into the deal or that their combined marketing departments must have been smoking some of that medicinal weed when they came up with this. So I thought I would offer a few words of advice to Apple and IBM and, at the same time, work on the outline of a chapter on the book-I-will-probably-never-write on how to market technology.
In a previous existence I used to be in more of a buying role. The power that this brings tends to make people arrogant SOBs and I am ashamed to admit that I was no exception. I scorned vendors’ lame explanations of what their products could do for my business. Eventually I realized that a poor marketing use case can be effective: a weak or unrealistic pitch gets clients to think and to imagine more telling uses for themselves. It lets the buyer feel smart at having seen through the dumb or inappropriate stuff and ‘invent’ something really useful.
But such dumbed-down marketing is better executed by someone in a relatively junior position—an enthusiastic communicator who just hasn’t got all the facts in his or her possession. This is not a job for the CEO. Even less so for two CEOs of two of the largest companies in the world.
So what is the real rationale behind Apple and IBM’s teaming? Well, how about a nice conspiracy theory, one that has the merit of being a much more realistic use case than the iPhone on the flight deck. What makes the world wide web go round? They used to call it ‘content,’ as in ‘content is king.’ Today, ‘content’ has morphed into ‘big data.’ Google of course has boatloads of it, harvested from my Android phone and from search. This makes for insights that we are assured stay ‘just on the right side of creepy.’ Facebook uses mountains of personal stuff and does not care a fig if it is being creepy or not!
IBM has no content to speak of but it does have analytics. You might like to re-read our report from the 2011 Houston Digital Plant event (OITJ December 20114) on IBM’s ‘massively parallel probabilistic evidence-based’ Jeopardy-winning machine. As IBM’s Katherine Frase explained, Watson and its DeepQA natural language processing engine can ingest and ‘reason’ with data from databases, sensors and what was then the ‘next frontier,’ the 80% of information that is ‘unstructured,’ i.e. text.
Apple though has mountains of text (or content or ‘big data’ or whatever you like to call it) streaming in from its iPhone/iTunes/AppStore ecosystem. Applying Watson to Apple’s big data would be a pretty good business case. Why was this not put forward instead of the flying iPhone? Probably because the benefit would be the insights it brings to Apple rather than the end user. And also probably because IBM doesn’t do ‘creepy.’ Well, not yet anyhow.
1 oilit.com/links/1407_0301
Speaking at the IRM-UK conference on enterprise architecture and business process management (EA/BPM) last month, Statoil’s Torbjørn Stølan and Nick Cherrie presented a case study on the use of the Aris methodology to map and organize Statoil’s cross discipline work processes. Aris stands for the architecture of integrated information systems, a means of modeling and improving enterprise work processes.
A tenet of the EA/BPM movement is that it is not (just) about IT. As another speaker put it, ‘An enterprise has an architecture, even if it doesn’t have electricity1.’ Statoil uses Aris to understand and communicate the complex interactions that take place during activities such as acreage evaluation, drilling, geophysical data acquisition and more. Just about every activity that the $90 billion, 20,000-employee company is involved in.
The starting point for the case study was the Statoil ‘Book,’ a 75 page top level outline of the key policies and requirements for the whole group. The book is the foundation of Statoil’s business, setting standards for behavior, leadership and what is expected of its employees. Statoil uses a ‘capital value process’ (CVP) to identify and execute projects—moving through a series of stage gates from a business opportunity to ‘the most profitable operation for the total value chain.’
Drilling down from these top level requirements, the Aris case study leveraged Software AG’s Aris Express to capture the CVP processes in terms of technical requirements, best practices and resources required. A process requirements document defines the CVP and sets out the requirements for moving through a decision gate or approval point.
First attempts at business process modeling met with reticence as staff saw work processes as bureaucratic, causing people to stop thinking and stifling creativity. The ‘box ticking’ approach was also perceived as slowing down work and decision making. It proved hard to capture the iterative, non-linear work processes used in the upstream. The Aris practitioners decided to keep things simple, proposing instead an outline, non-mandatory sequence of events.
The approach proved successful in making team members aware of their own and others’ roles in the overall process and in revealing skills gaps. Work processes help plan and perform everyday tasks and improve individuals’ awareness of their roles and required documentation. More from IRM-UK and Software AG.
1 Colm Butler, Open Group IT Architecture Practitioners conference 2005.
Speaking at the 2014 Schlumberger investor conference, CTO Ashok Belani announced the establishment of a new Schlumberger software and innovation center in the (San Francisco) Bay area that is to become a ‘visible part’ of the Silicon Valley ecosystem. Belani highlighted the role of software in Schlumberger’s ‘digital future’ observing that if Schlumberger was a software company, it would rank in the top 50 in the world. Schlumberger spends 33% of its research and engineering budget on software development and employs some 2,000 coders.
Belani envisages a cloud-based ‘new enterprise architecture’ with contributions from some well-known partners (SAP, Google, Microsoft and Nvidia) and some not so well known ones (Wearable Intelligence and Chaotic Moon). All this is to be delivered from a new Software Technology (SWT) organization to complement the two other ‘pillars’ of hardware and research. The move to the Bay is a back-to-the-future event. Schlumberger closed its Palo Alto technology research facility in 1998, relocating its developers to the Schlumberger laboratory for computer sciences in Austin, Texas. More presentations from Schlumberger.
International Human Resources Development Corporation (IHRDC) has expanded its partnership with the Society of Exploration Geophysicists for the distribution of its library of oil and gas e-learning courses. The courses will be made available to the Society’s 33,000 members from an upgraded SEG professional development site, SEG On Demand. New courses include synthetic seismogram modeling, controlled source electromagnetics and microseismics.
The American Petroleum Institute (API) has contracted with Petrofac’s Oilennium training arm to provide e-learning modules to its membership. Courses include an introduction to oil and gas, occupational safety and health administration (OSHA) and health, safety, security, the environment and quality (HSSEQ!) and the fundamentals of drilling. The courses are delivered through the API-University portal.
Petrofac has also teamed with ESI International, a project management training specialist, to establish the Petrofac project management academy. The PPMA will provide training to oil and gas organizations in the Asia-Pacific region. ESI’s clients include Shell, ExxonMobil, Chevron and Halliburton. Courses will be delivered at Petrofac’s Chemical Process Technology Centre in Singapore. More from ESI.
Epsis has signed a strategic partnership agreement with Norwegian control room systems integrator Eldor AS to market Epsis’ ‘TeamBox’ collaboration solution in Stavanger and to enhance Eldor’s offering. Founded by Bernt Eldor in 2006, Eldor has designed and built several control rooms and collaborative work environments for Norwegian continental shelf operators.
The flagship deployment of the combined Eldor/Epsis technologies is in the future control room of Total’s Martin Linge development. The field is located some 170km from the shore and innovates with a subsea electrical cable providing power to the rig (and reducing its CO2 footprint) and by moving the control room onshore. The onshore operations center takes integrated operations to the ‘next level.’ While the main control takes place onshore, a second offshore control room mirrors the onshore facility to ‘ensure security and operational reliability.’
Epsis’ TeamBox lends itself to the mirrored control room paradigm, providing collaboration management and workflow control. The unit allows management of video screens, meeting and workflow orchestration and video conferencing. ‘Complex’ control systems such as touch screens, matrix switches and display controllers are eliminated. Patented workflow management is used to manage and share information across remote sites. Eldor said, ‘Epsis TeamBox provides an intelligent solution to content management in the control room. The partnership enables us to provide more flexible solutions in customers’ meeting rooms, collaborative work environments and control rooms.’ More from Eldor and Epsis.
Oslo, Norway-headquartered HueSpace has partnered with PC manufacturer Lenovo, Nvidia and hardware specialist Magma on an appliance for high-end seismic visualization development. Hardware comprises a Lenovo ThinkStation D30 workstation, Nvidia Tesla K40 GPU accelerators and Magma’s GPU expansion technology. The HueSpace API is used to render large data volumes such as pre-stack seismics. The 64-bit Linux/Windows environment is capable of handling terabyte ‘and even petabyte’ data sets.
HueSpace has been using GPUs, ‘even before Cuda was invented,’ for ‘general purpose computing on graphics processing units’ as opposed to their use for display. Connectivity between the ThinkStation and the Teslas is provided by Magma’s ExpressBox 3600 GPU expansion system. The ExpressBox can house up to 9 K40 GPUs providing some 13 teraflops of double precision compute bandwidth.
HueSpace exposes a plug-in environment for in-house and third party developers of upstream data visualization offering a range of domain specific functions (such as automatic parallelization across different accelerators), data management and I/O. The solution runs on Linux and Windows. Linux is preferable as it is better at handling a larger number of GPUs.
HueSpace’s core engine runs optimized, multi-threaded C++ throughout the pipeline, using GPUs wherever it can, for instance while transparently compressing and decompressing seismic data. General purpose computation is performed on the HueSpace compute plug-in. The HueSpace API is offered as .NET/C++/Java on Windows, and C++/Java on Linux. The announcement is a foretaste of Lenovo’s move upscale with its acquisition of IBM’s x86 business. More from HueSpace.
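HueSpace’s transparent compression is proprietary, but the idea of hiding compression behind a plain array interface can be sketched in a few lines of Python. The class and data below are invented for illustration (the real engine does this in multi-threaded C++ and on GPUs): samples are stored compressed and decompressed on access, invisibly to the caller.

```python
import zlib
import numpy as np

class CompressedTrace:
    """Toy sketch of transparent seismic compression: samples are
    held as a zlib-compressed blob and inflated only on access."""
    def __init__(self, samples):
        arr = np.asarray(samples, dtype=np.float32)
        self._blob = zlib.compress(arr.tobytes())
        self._nbytes = arr.nbytes

    @property
    def samples(self):
        # Decompress on demand; callers see an ordinary float32 array
        return np.frombuffer(zlib.decompress(self._blob), dtype=np.float32)

# Highly compressible (constant) trace shows the storage saving
trace = np.zeros(10000, dtype=np.float32)
ct = CompressedTrace(trace)
ratio = ct._nbytes / len(ct._blob)
```

Real seismic amplitudes compress far less well than this constant trace, which is why production systems use lossy, wavelet-style schemes rather than zlib.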
Speaking at the 2014 SciPy conference in Austin, TX this month, Joe Kington (now with Chevron) performed a geophysical fireworks display using nothing other than open source Python code. As a part of his PhD project at the University of Wisconsin, Kington was studying sedimentation and tectonics in the Nankai Trough, Japan. Kington found that the software used on interpretation workstations lacked the ability to produce display quality vector graphics for publication. Moreover, industry standard workstations are ‘walled gardens’ and it is hard to get data out into plotting or other data manipulation environments such as Python.
Kington rolled up his sleeves, reverse-engineered Geoprobe’s file formats and came up with ‘python-geoprobe,’ a Python module to read and write horizons, volumes and faults. Once outside of Geoprobe, many high end functions can be achieved with a few lines of code—corendering coherence and amplitude, blending multiple data volumes and stratal slicing. Kington has it that data manipulation in Python makes it easier to get the interpreter’s point across. It is easy to do spectral decomp in Python and you can emulate tools like Geoteric or Voxelgeo in ‘just a few lines of code.’ The Python ecosystem provides building blocks for powerful seismic processing and imaging. Watch the SciPy talk and download python-geoprobe.
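The claim that spectral decomposition takes ‘just a few lines’ is easy to illustrate. The sketch below is not python-geoprobe itself: it builds a synthetic trace and uses SciPy’s short-time Fourier transform to extract a single-frequency amplitude slice of the kind used in spectral decomp displays.

```python
import numpy as np
from scipy import signal

# Synthetic seismic trace: 500 samples at 4 ms, a 30 Hz burst
# under a Gaussian envelope centered at 1.0 s
dt = 0.004
t = np.arange(500) * dt
trace = np.sin(2 * np.pi * 30.0 * t) * np.exp(-((t - 1.0) ** 2) / 0.01)

# Spectral decomposition via the short-time Fourier transform
f, tau, Zxx = signal.stft(trace, fs=1.0 / dt, nperseg=64)

# Amplitude slice at the frequency bin nearest 30 Hz
i30 = np.argmin(np.abs(f - 30.0))
amp_30hz = np.abs(Zxx[i30, :])
```

On a real volume the same call would be applied trace by trace (or vectorized along one axis) to produce a frequency-slice volume; `nperseg` trades time resolution against frequency resolution.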
Fort Worth, TX-based The Strickland Group (TSG) has announced WellMetrics, a well information management solution that leverages technology from Tampere, Finland headquartered M-Files Corp. WellMetrics provides access to structured and unstructured well-related information for geoscientists, production engineers, land and operations personnel. WellMetrics provides an Esri GIS front end to find, view and manage well data and document collections. The system provides document control capabilities and integrates with third party tools such as Aires, Excalibur, LandWorks and Rio.
The M-Files enterprise information management (EIM) system provides configurable connectivity with existing databases and systems and is claimed to ‘eliminate information silos’ and provide access to approved content from core business system and devices. M-Files can be operated as an on-premise, cloud or hybrid solution, reducing demands on IT. M-Files users include SAS, Elekta and EADS. More from M-Files and Strickland.
IHS has announced Kingdom 2015, the latest release of the geoscience workstation it acquired from the then Seismic Micro Technology back in 2011. The release includes patent-pending ‘Illuminator’ 3-D analytical technology for fault interpretation and fracture identification. Illuminator helps unconventional operators identify areas that are unfractured for possible development, and to ‘compare fracture monitoring results against areas located away from the immediate borehole.’ New geosteering capabilities help drillers stay in the sweet spots.
Another new functionality is ‘dynamic depth conversion’ of time-based data that allows interpreters to build and maintain a ‘virtual’ velocity model. Depth conversion is no longer a time consuming and error prone process. Geosteering operators now have real-time, up-to-date depth conversion for accurate well positioning. IHS Energy geoscience portfolio lead Clifford Kelley said, ‘The depth conversion process used to be a special, static process conducted by an expert geophysicist. Kingdom 2015 makes it a shared workflow involving geologists, geophysicists and engineers.’ More from IHS.
Version 5.0 of dGB Earth Sciences’ Open dTect seismic interpretation package will be released at the SEG in October this year. Open dTect is delivered under a ‘freemium’ model with an open source base release augmented with commercial extensions from dGB and third parties. Development is also supported by sponsors. The 5.0 release sees a move of the documentation to HTML5 using the Madcap Flare authoring tool and the addition of a connection to Matlab.
A new ‘directional texture’ plugin from Austria’s Joanneum Institute allows calculation of grey-level co-occurrence matrix energy in all directions. Attributes have been ‘pre stack enabled.’ BG Group has sponsored the development of a base map/contouring package while a new ‘finger vein’ fault tracker was backed by Marathon. A ‘Geo-Data-Sync’ developed by Ark CLS provides direct access to Petrel from Open dTect and vice versa—both sans data duplication. More from dGB.
APS Technology’s new SureShot through formation EM downhole MWD/LWD telemetry system offers data rates of 1 to 12 bits per second at measured depths of 4,400 feet.
Atek Access Technologies’ new service platform provides remote monitoring of its TankScan wireless tank level sensors. Parameters such as battery life, uptime and cellular data usage are tracked and corrective measures taken as required.
New functionality in Halliburton’s Cypher 2.0 seismic-to-stimulation service uses Landmark’s DecisionSpace earth modeling suite to optimize frac jobs and enhance well performance. Cypher 2.0 also helps operators evaluate well placement and completion design options.
The latest release (10.2) of iAsset, Interica’s upstream data management solution, includes new bulk data loaders, audit trails, enhanced map generation and improved catalogue maintenance.
Saddleback Geosolutions’ new Attribute Workbench is a plugin to INT’s INTViewer seismic analysis platform.
A new ‘native’ edition of the Norwegian Petroleum Directorate’s Oil Facts app is now available for the iPhone, Android and Windows smartphones. The app, developed in collaboration with Capgemini, provides access to Norwegian oil and gas data.
The latest release (1.2.1) of Meridium’s Asset Answers (AA) cloud-based asset performance diagnostics solution provides cross-company and cross-industry comparative analytics and visibility of operational data. AA is updated monthly with anonymized data to deliver industry-standard metrics such as average corrective work cost, mean time between failure and more. Users can drill down through the equipment hierarchy and filter by manufacturer or part number.
P2 Energy Solutions has released P2 Land Broker, a mobile solution connecting land brokers in the field with back office teams. Lease data entered in the field is routed to the corporate land system for review and approval by analysts. Approved leases appear on an embedded map, along with leasehold positions, Pugh Clauses and expiration dates.
Schlumberger’s new reservoir geomechanics software includes the Visage finite element modeler for intact, faulted, and naturally fractured rock and a Petrel module for model creation, pre and post processing and run management.
Pegasus Vertex has announced the imminent release of Mudpro 3.0.3, its API compliant mud reporting package. New features include a new GUI, platform and database, upgraded volume calculations and improved graphs and reports.
Jeff Allen (Coler & Colantonio) showed how a large transmission operator has GIS-enabled its SAP systems of record using Geo.E, a joint SAP/Esri development that implements an Esri geodatabase in SAP, and SAP’s linear asset management extension. C&C’s own toolset adds a PODS pipeline database. Oracle’s enterprise service bus links the systems together to support various pipeline maintenance workflows and audits. The service bus orchestrates synchronization across the different repositories according to operator-defined rules.
New Century Software’s Chris Nichols showed how ArcSchematics is used to manage chemical treatment across a pipeline network. This data-driven tool offers a simplified view of a pipe network that shows the relationships between components as opposed to a spatial representation. A split screen shows a map and ‘smart tree’ view providing a clear representation of e.g. scale and biocide treatments and corrosion inhibitors.
Scott Sitzman presented a benchmark study of ConocoPhillips’ GIS systems performed using Exprodat’s ‘proven GIS review methodology’ of surveys, interviews and workgroups. CP applies GIS in many upstream domains and maturity levels are in the mid to high range. The company has a structured process for assessing and propagating GIS best practices in areas such as exploration play fairway analysis and land management.
geoLOGIC systems’ Tim Downing presented a case history of a Canadian oil sands operator that has applied geoprocessing to reporting requirements that integrate GIS with complex data models. Canada’s oil sands tenure regulations make up a 36 page document that stipulates, inter alia, a minimum level of evaluation such that at least one well must be drilled per section in an ‘even and uniform’ pattern. SQL/spatial queries show the intersection of leases and expiry dates, wells and core data to identify sections sans wells and sections where wells’ core ages don’t match land lease ages. The current project targets data clean-up and regulatory reporting but will likely extend to the identification of third party leases that have not fulfilled their regulatory requirements.
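The section-compliance check lends itself to a simple sketch. The toy example below uses invented data, with unit squares standing in for one-mile sections: it flags sections with no well at all and sections where a well’s core age disagrees with the lease age, mirroring the SQL/spatial queries described above.

```python
# Hypothetical data: sections keyed by (row, col) with a lease age,
# wells as (x, y, core_age) points on the same grid
sections = {(r, c): {"lease_age": 2008} for r in range(3) for c in range(3)}
wells = [(0.5, 0.5, 2008), (1.5, 0.5, 2011), (0.5, 2.5, 2008)]

def section_of(x, y):
    # A well falls in the section whose unit square contains it
    return (int(y), int(x))

wells_by_section = {}
for x, y, core_age in wells:
    wells_by_section.setdefault(section_of(x, y), []).append(core_age)

# Sections with no well (non-compliant with one-well-per-section)
empty = [s for s in sections if s not in wells_by_section]

# Sections where a well's core age does not match the lease age
mismatched = [s for s, ages in wells_by_section.items()
              if any(a != sections[s]["lease_age"] for a in ages)]
```

In practice the point-in-polygon test and the lease/well intersection would be done by the spatial database (ST_Intersects and friends) rather than in application code.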
Andrea Le Pard provided insights into Nexen’s envisioned move to the cloud, i.e. whether or not to move to ArcGIS Online. Various deployment patterns are now possible, from full cloud through hybrid (data in-house, hosted portal) to everything in-house. Current ArcGIS Online solutions see Esri playing a middleman role between a client/subscriber and the cloud proper, Amazon or Azure. For maximum flexibility a portal is still best. ArcGIS Online lacked two key Nexen requirements: isolated data tenancy and an MPLS direct connection for security. Whatever you choose, be prepared for lots of IT gotchas and myths. Active Directory in the cloud requires work for IT. Licensing hybrid solutions is restrictive and security remains an issue: ‘oils are afraid of the cloud for good reason.’
Rich Priem (Priemere Geotechnology) has used geoprocessing to evaluate the probability of success in the Marcellus shale play. He combined grids of shale thickness, depth, overpressure and vitrinite reflectance from over 1,000 Pennsylvania wells in the public domain. Areas of interest can be identified by plotting sweet spots more than 5 miles from existing wells and using a pivot table to target land effort. Play fairway analysis is enabled with Priemere’s Power Tools—said to be ‘better than tedious and complicated use of native ArcGIS desktop.’ The PUG presentations are available from the Amazon cloud.
The plenary EAGE forum, ‘Experiencing the energy, doing more with less’ offered panel members an opportunity to opine on pretty much any subject of their choosing. Which they did. Mike Daley (BP/Oxford University) introduced the panel saying that the oil and gas industry was engaged in ‘feeding the world with energy.’ We are at an ‘interesting moment’ with growth in oil and gas consumption but an increasing dislike of industry from investors and society. There is also a notion that everything is harder than it used to be. But actually ‘things have always been difficult.’
Woodmac’s Andrew Latham observed that costs have risen sharply. It now takes $4 to add a barrel to company reserves (it used to be $1) and full cycle returns are down to some 12%. Investors don’t like this when compared to the risk involved. We are seen as an industry that is struggling to perform. As conventionals get harder (pace Mr. Daley) unconventionals are where there is most growth. But this comes at the price of intense effort and our license to operate is not a given. Unconventionals in the sweet spots of the best plays are very attractive. Elsewhere (i.e. for the majority), the economics are ‘pretty marginal at best.’ Companies are wondering how to address complex marginal plays without hiring ‘legions of geoscientists’ because there is not the income growth to support this. Yes, that is what he told the geoscientists to their faces!
Wouter van Dieren (Instituut voor milieu en systemen analyse, a Dutch environmental think tank) added that complexity is not just about technology but about policy too. We depend on fossil fuels but society’s new value system is becoming dominant and challenging.
Shell’s Ceri Powell suggested that it’s all about brainpower. We need more innovation and technology. She proposed a novel metric—not so much petaflops but brain cells/barrel. We need early derisking before drilling with, for example, CSEM or country-wide 3D seismic surveys. Shell’s ‘rejuvenate opportunities now!’ workshops eschew high tech for old-style Mylar and colored pencils and are ‘hugely successful.’
José-Luis Alcover (Repsol) proposed the 3P/3C squeeze on the business as in price volatility, pressure from investors, politics and contracts, competition and costs. We are up against US natural gas prices, short term investors and nimbys. Industry may be more competitive but there is always the risk that we are destroying value. Many aggressive exploration and development bookings result in write-offs. Perhaps it will be the smaller, nimble companies who will survive.
Marc Blaziot (Total) agreed that today, the problem was one of resource quality in terms of finding and producing costs but also its social and environmental footprint. The industry is less accepted than it used to be, ‘even to our own, younger colleagues.’ Doing more with less is a pivotal question for the majors where costs are too high and returns too low. Some recent statistics from Petoro showed that average rate of drilling penetration had doubled in the last 30 years.
Mario Ruscev (Baker Hughes) revealed that 60-70% of completions are uneconomic suggesting that ‘we don’t know what we are doing.’ The biggest social issue in the US is the amount of trucks going round, ‘people can’t take it anymore.’
In the rather contrived Q&A (anonymous written questions filtered by the moderator) someone suggested that unconventionals mainly delivered returns to ‘service companies and acreage speculators.’ Ruscev responded, ‘I wish!’ In one recent frac job, Baker had ‘20 guys bidding against us.’ ‘We bid against folks who don’t need to pay for guys like me!’
van Dieren has worked with Shell on social acceptability of oil and gas projects in sensitive areas. He has little time for Greenpeace, who are ‘dangerous people.’ A decade ago, exploration in Holland’s estuary area was halted by the conservationists. van Dieren managed to convince stakeholders that their perceptions were wrong, presenting the opportunities that production would bring to communities. He also told Shell to ‘remain silent’ as it was not a credible party. His actions worked and there is now gas production from the area and an €800 million fund for ‘sustainable biodiversity.’ The trick is for oils to ‘outsource their credibility.’ Oils need to create a new common value system that embraces wind and solar. Belief in climate change in industry is low even though, ‘It is coming and all major institutes believe this. You are running into a wall you cannot see.’
Notwithstanding all of the above, industry ploughs on with some pretty fancy technology. A talk from Mark Thompson showed how Statoil is doing quantitative 4D/time lapse seismic analysis, storing its massive datasets in a Teradata appliance. Statoil’s 4D seismic reservoir monitoring group has acquired a huge amount of data in the last 17 years. And the volumes are increasing constantly, with repeat surveys now carried out every six months. Data management is a big headache. Thompson got some time on a Teradata ‘big data’ appliance and managed to standardize and automate 4D data ‘munging.’ The parallel shared nothing architecture and SSD storage are set to ‘make data managers unemployed.’ Now, instead of computing and storing attribute volumes, a user defined function creates them on the fly, right from the data warehouse. Many ad-hoc investigations are now feasible, from survey repeatability analysis to seismic reservoir pressure studies. ‘Put all your data in the same place and it should be easier to integrate people. Wake up, it’s the 21st century.’
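The compute-on-the-fly idea is straightforward to illustrate. The sketch below computes NRMS, a standard 4D repeatability attribute, directly from base and monitor traces rather than from a stored attribute volume. The data is synthetic and the function is an illustration, not Statoil’s actual user-defined function, which runs inside the Teradata warehouse.

```python
import numpy as np

def nrms(base, monitor):
    """Normalized RMS difference between two vintages, in percent.
    0% means identical surveys; 200% means perfectly anti-correlated."""
    diff_rms = np.sqrt(np.mean((monitor - base) ** 2))
    sum_rms = np.sqrt(np.mean(base ** 2)) + np.sqrt(np.mean(monitor ** 2))
    return 200.0 * diff_rms / sum_rms

# Synthetic base trace and a monitor with a small 4D change
rng = np.random.default_rng(0)
base = rng.standard_normal(1000)
monitor = base + 0.1 * rng.standard_normal(1000)
repeatability = nrms(base, monitor)
```

Computing such attributes at query time means only the raw prestack data need be stored; any windowed or per-trace variant is just a different function over the same warehouse tables.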
Schlumberger’s Darrell Coles’ talk was likewise on the topic of big data. A modern seismic survey can have hundreds of millions of shots and trillions of samples. Schlumberger is working on a tool to mine the ‘data deluge’ using guided Bayesian statistics to maximize the information in recorded data. The approach can be used to plan a survey or to cherry pick the most informative data for processing.
Alistair Crosby (BP) observed that our capacity to simulate now outstrips our ability to build the models required for seismic imaging. Modeling is a hand-crafted activity that is holding back progress. BP has been working on a new method for rapid model development by generating synthetic stratigraphy from facies templates and structural morphing. Multiple realizations give different facies distributions. Running the process tens of thousands of times gives a synthetic seismic section that is then morphed into the true geological structure.
Much more fun was Paolo Dell’Aversana’s (ENI) presentation on seismic data analysis with digital music technology. Dell’Aversana thinks that by transforming seismic recordings into audible MP3 files we can leverage Shazam-like digital music technology to perform pattern recognition. Combining sound with imagery should lead to better interpretation. Our brains are made for multi-sensory perception. SEG-Y files are converted to Midi and frequency shifted. We listened to the deep rumble of boiling Hawaiian lava. Next the Midi file was played back through a digital piano, sounding like a rather good morceau of Stockhausen.
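The frequency shifting can be illustrated with the simplest form of audification: replaying seismic samples at an audio rate, which scales every frequency by the ratio of the sample rates. The numbers below are illustrative, not taken from the paper.

```python
import numpy as np

# A 30 Hz seismic signal sampled at 250 Hz (4 ms), inaudibly low
fs_seis = 250.0
t = np.arange(0, 2.0, 1.0 / fs_seis)
trace = np.sin(2 * np.pi * 30.0 * t)

# Audification by time compression: replay the very same samples at
# an audio rate; every frequency scales by the ratio of the rates
fs_audio = 44100.0
shift = fs_audio / fs_seis      # 176.4x speed-up
audible_hz = 30.0 * shift       # the 30 Hz event becomes ~5.3 kHz
```

Writing `trace` to a WAV file at `fs_audio` (e.g. with the standard `wave` module) would make the shifted signal audible; a MIDI rendering, as in the talk, additionally quantizes the shifted spectrum onto musical pitches.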
The morning session on high performance computing for geoscience was probably not very representative of mainstream oil and gas HPC. A presentation from the SEISCOPE II consortium demonstrated reverse time migration on the open source Valhall 3D dataset on an IBM BlueGene/Q at France’s IDRIS/CNRS HPC center. No geophysical gathering would be complete without a presentation on the use of field programmable gate arrays (FPGA) to accelerate seismic imaging. The paper from the SINBAD II consortium found that, ‘while further development is needed in order to realize the potential for acceleration inherent in the platform, our preliminary results give us reason to be optimistic.’ Not very compelling when you think that Oil IT Journal reported back in 2003 that the now defunct Starbridge’s FPGA-based supercomputer was to provide a ‘100 fold speed up over conventional microprocessor-based machines.’
We attended an informative demo of Schlumberger’s CoreFlow service. This uses argon ion tomography to capture core structure. Digital simulations can then be performed to estimate stuff like unsteady state relative permeability and plan development. We asked if enough was known about the rock physics of unconventionals to support such an approach and received a gratifying, ‘Your question is to the point.’ The semi quantitative analysis can give relative values. But, ‘Industry doesn’t really understand the physics of flow in these nanometer-scale reservoirs.’
Our trawling around the exhibition floor turned up the following. Paradigm continues work on its ‘Epic’ open data infrastructure (OilITJ October 2013) now with interfaces to Witsml, Prodml, Ppdm and Resqml data sources. A prime time release is planned for 2015. Oracle’s offering in the big seismic data space was unveiled on the Westheimer booth where its ISDS data management system is coupled to an Oracle 12c/Exalogic appliance for performant access to seismic trace data. Landmark was showing its Zeta Analytics technology on an EMC data appliance that runs a Pivotal/Greenplum database (1304). Fraunhofer has renamed its high-end parallel GFS file system ‘BeeGFS’ and spun out this activity into a new ‘ThinkParq’ unit. BeeGFS is a head-on competitor for Hadoop. Meanwhile the latest release of Fraunhofer’s GPI 2.0 HPC development environment adds support for Nvidia GPUs. GPI removes the complexity of adapting programs for different parallel environments. Ikon Science was showing ‘JiFi,’ its joint impedance and facies seismic inversion. JiFi offers a quick way of determining low frequency background models using all available data. The ‘mixed discrete and continuous inversion’ approach is said to correctly capture the physics of the inversion problem. UK-based Big Data Partnership helped Teradata configure its seismic big data solution, used by Statoil to access full prestack data sets (see our May 2014 lead). Shell was loudly touting its own in-house developed interpretation software, 123 DI/Geosigns. One ‘selling point’ is that in-house development offers ‘control over the whole IT infrastructure, to ensure that all the right flavors of Windows, Linux and Oracle work together.’ More from the EAGE’s EarthDoc site.
Arne Røed Simonsen set the scene with results from a 2012 cybercrime survey conducted by the Næringslivets Sikkerhetsråd business and security council. This found a big gap between cyber threats and the preventive measures actually in place. Companies are increasingly dependent on critical IT systems and often deploy new technology without risk assessment. There is also a decrease in management awareness of such issues.
Siv Hilde Houmb of Gjøvik University College’s NISlab introduced plans for a Norwegian center for cyber and information security along the lines of the US NIST cyber security framework. The unit would issue Cert advisories to oil and gas operators on the Norwegian continental shelf.
Jacques Sibue described the evolution of GDF Suez’ IT with the creation of a security operations center and computer security incident response team. This has leveraged a generic risk management methodology developed under its Asphales program. A tool has been developed to inventorize Scada, IT and data assets and link these to cyber risks and potential consequences.
Total E&P UK’s Ewen MacDonald thinks that the technological solutions to cyber risk can be improved with a few simple practical steps. Inventorizing assets is again key, preferably using intelligent drawing tools like Visio and iPDF. Train personnel on the safe use of their PCs, especially at home! Enemy number 1 is the USB key that users bring to work. Mobile devices likewise present risks as does third party access. One contractor got infected by the Zemra virus that came from a control systems server! MacDonald advocates reducing the number of Microsoft devices on the industrial network. If you have to deploy such, put them in a demilitarized zone.
Damiano Bolzoni presented Security Matters’ ‘SilentDefenseICS’ (SDI) solution that builds on the ESCSWG’s cybersecurity procurement language for energy delivery systems. SDI was built from the ground up with ICS/SCADA in mind and includes self-learning, automatic whitelisting, deep inspection and more. Bolzoni also gave a heads-up regarding the EU Densek project, an open-source information sharing and situational awareness platform.
Phil Legg of Oxford University’s Cyber security center introduced the CITD Project. This combines psychology, criminology and a range of IT/analytics to monitor users for anomalous behavior and potential misuse. The system builds an employee profile, learning from usage data in real time. ‘Understanding the human aspect is key to detecting and preventing attacks.’
Gal Luft of the Institute for the analysis of global security warned that energy cyber security is far behind IT security. Large assets such as an FPSO’s dynamic positioning are ‘still connected to external networks.’ More from SMi.
Peter Lwin heads-up CGG’s ‘open’ imaging center in Yangon, Myanmar.
Circulation Solutions has appointed Zachary Grichor to its sales team. He hails from DrillChem.
Karin Breitman heads-up EMC’s new big data R&D center in Rio de Janeiro, Brazil.
Benjamin Schuman leads Drexel Hamilton’s coverage of the oilfield services sector. He was previously with Pacific Crest Securities.
Drillinginfo has named Rich Herrmann senior VP and manager of E&P products. Herrmann hails from IHS.
The UK Energy Industries Council (EIC) has named Azman Nasir head of its Asia Pacific branch in Kuala Lumpur.
Omega Well Monitoring has hired Philippe Legrand as fiber optics and engineering manager. Legrand was previously with Baker Hughes.
Amalto Technologies’ Bryan Pederson has been named Canadian country ambassador for the Petroleum Industry Data Exchange (PIDX International).
The Society of Petroleum Engineers has named Nathan Meehan (Baker Hughes) as its 2016 president.
Electric submersible pump manufacturer Borets has named Cheri Vetter business development manager in Houston, Stan Herl sales manager in Oklahoma City, Chad Hamilton district manager in Kilgore, Texas and Jim Bacon senior adviser in Tulsa, Oklahoma.
Brian Boulmay has moved to the newly created role of global geospatial data lead in BP’s upstream business.
Energy Ventures has added Saad Bargach and Desmond Kong to its partner network. Bargach hails from Lime Rock Partners, Kong from Halliburton.
Sigma Cubed has appointed Alan Cohen as CTO and Mark Bozich as Houston engineering manager. Cohen comes from Rock Solid Images, Bozich from FTS.
Svenn Paul Jacques heads-up Emerson’s new flow loop test facility in Stavanger.
Brad McDonald has joined Entero as senior account executive in the Calgary office. He was previously with Scott Land & Lease.
Richard Wylde is now geodetic specialist and geomatics lead at ExxonMobil.
Mauricio Alberto Cardona heads-up Gaffney Cline’s new office in Bogota.
Ram Srivastav is now chief metrics officer at Geotrace Technologies.
Greg McIntosh has joined Iron Mountain as senior VP and general manager, Canada. He was previously with D+H.
John Luongo has joined the board of Space-Time Insight.
NetApp has named Randall Runk, formerly with Infor Global Solutions, as executive VP field operations.
Ioannis Charalambous is now VP and CIO with Occidental.
OFS Portal has elected Weatherford’s Shaun Skene to its board of managers.
OilPrice has introduced a newsfeed aggregator of financial results from oil and gas companies in the US, Canada, UK & Australia.
The Pipeline Research Council International is building a technology development center in Houston to test and promote new technologies for pipeline integrity management.
Kirk Byles is VP sales and marketing with Rajant Corp. He hails from Firetide Inc.
Marcelo Comarin heads-up Weatherford’s new laboratory in Bogota, Colombia.
A federal appeals court has upheld the 2012 injunction against M3 Technology in favor of Aspen Technology regarding trade secret misappropriation, copyright infringement and ‘tortious interference.’ In the original judgment, M3 was ordered to pay AspenTech $11.3 million, since reduced to $10.8 million.
Aker Solutions is to split into two companies and write down the value of assets in its Oilfield Services unit. One unit will retain the Aker Solutions name and will take over the company’s activity in subsea, umbilicals, maintenance, modifications and operations and engineering (ENG). The other, Akastor, will house drilling technologies, process systems, surface products and oilfield services. A NOK 1.6 billion impairment has been recognized in respect of the assets and goodwill of Akastor’s oilfield services unit.
Fugro has acquired Africa-based Geofor, a geotechnical consultant and service provider. Geofor has some 600 employees and had €25 million revenues in 2013.
Petrofac has entered into a framework agreement with First Reserve to create PetroFirst Infrastructure Partners. The new venture is to acquire assets from Petrofac’s Integrated Energy Services division as well as in new energy infrastructure projects that utilise Petrofac’s development capability.
SNC-Lavalin is to acquire Kentz Corp., a global oil and gas services company, with 14,500 employees. The deal values Kentz at 1.2 billion, a 33% premium to the share price before the transaction. The new company will have around 44,500 employees of which some 18,500 will operate in the oil and gas sector.
Wood Group is acquiring Agility Group’s Agility Projects unit for NOK 1 billion. Agility will operate in Wood’s Mustang unit.
Following its 2009 ‘redomestication’ from Bermuda to Switzerland, Weatherford International is on the move again. Its new home is corporate tax haven, Ireland.
ION Geophysical now owns 100% of ocean bottom seismic specialist OceanGeo.
Kongsberg took up an undisclosed number of the shares in KBC Advanced Technologies’ recent placement.
Fiber optic cable is seeing uptake in a variety of oil and gas domains. Careful measurement of backscattered light can provide information on pressure and/or temperature along the length of the cable. Speaking at the EAGE last month, Flavio Poletto of Italy’s OGS research establishment showed early results from tests of Silixa’s intelligent distributed acoustic sensor (iDAS). DAS technology replaces point receivers (geophones) with continuous recording of a seismic pressure field along the cable and has application in massive borehole arrays and conventional seismic recording. Tests comparing iDAS with single and multicomponent geophones were said to ‘confirm the quality of the iDAS signals for seismic purposes.’
A recent deal between Stavanger-based Ziebel AS and ConocoPhillips heralds the use of distributed fiber optic (DFO) acoustic and thermal real time monitoring of horizontal unconventional wellbores. The truck mounted Z-System will be deployed to visualize unconventional wellbores while drilling.
Researchers from the University of Pittsburgh have demonstrated the use of fiber for gas flow rate measurements. The active ‘smart’ optical fiber sensor uses an all-optical high-temperature flow sensor based on ‘hot-wire anemometry.’ Reliable gas flow measurements were demonstrated between 0.066 m/s and 0.66 m/s from room temperature up to 800°C.
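Hot-wire anemometry infers flow rate from the power needed to hold a heated element at a constant over-temperature, classically via King’s law, P/ΔT = A + B·vⁿ. A minimal sketch follows; the calibration constants A, B and exponent n are hypothetical illustrations, not the Pittsburgh team’s values.

```python
def velocity_from_power(P, dT, A, B, n=0.5):
    """Invert King's law for a hot-wire anemometer.

    The power P (W) needed to hold the wire dT (K) above the gas
    obeys P/dT = A + B * v**n; solve for the flow velocity v (m/s).
    A, B and n are rig-specific calibration constants.
    """
    return ((P / dT - A) / B) ** (1.0 / n)

# hypothetical calibration: A=1, B=2, n=0.5
# P/dT = 50/10 = 5, so v = ((5 - 1)/2)**2 = 4 m/s
v = velocity_from_power(P=50.0, dT=10.0, A=1.0, B=2.0)
```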
Baker Hughes reports that its CoreBright optical fiber has been reliably transmitting downhole data for four years in a Canadian steam assisted gravity drainage well. Baker’s SureView system has been monitoring an electrical submersible pump and supplying condition data to the company’s Scada network. CoreBright is claimed to be particularly resistant to hydrogen darkening, a potential downgrader of optical performance over time.
A new white paper from Energy Solutions International (ESI) investigates software methods for pipeline leak and theft detection. Recent litigation, increasing theft and the drive for safety and environmental protection have raised awareness of the need for detection systems. Realistically, leak sizes of the order of twice the accuracy of the flow meters are considered detectable, with location accuracy of around 5 to 10%. The detailed white paper includes advice on alignment with the API 1155 leak detection standards along with (naturellement) insights into deployment of ESI’s flagship PipelineManager software.
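Software-based leak detection typically rests on a compensated mass or volume balance; the rule of thumb above, that leaks of around twice meter accuracy are detectable, translates into an alarm threshold along the following lines. This is an illustrative simplification, not ESI’s PipelineManager logic.

```python
def leak_alarm(q_in, q_out, meter_accuracy=0.01):
    """Flag a leak when the in/out flow imbalance exceeds twice the
    meter uncertainty -- the point at which a leak is realistically
    distinguishable from metering error."""
    imbalance = q_in - q_out
    threshold = 2 * meter_accuracy * q_in   # detectability floor
    return imbalance > threshold

# 0.5% imbalance on a 1000 m3/h line stays below the 2% threshold
quiet = leak_alarm(1000.0, 995.0)    # no alarm
# a 3% imbalance trips the alarm
alarm = leak_alarm(1000.0, 970.0)    # alarm
```

Real systems add line-pack, temperature and pressure compensation and statistical filtering on top of this raw balance to cut false alarms.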
An alternative means of leak detection has been proposed by researchers from the Massachusetts Institute of Technology and Saudi Arabia’s King Fahd University of Petroleum and Minerals. The novel approach uses ‘PipeGuard,’ a robotic system that detects leaks by sensing pressure change at leak locations. PipeGuard was unveiled last month at the International Conference on Robotics and Automation in Hong Kong.
You might think that filling an oil storage tank was straightforward. Not so, as the 2005 Buncefield, UK tank farm fire demonstrated. After the fire, investigators determined that ‘Tank filling management systems at Buncefield were both deficient and not properly followed.’
Buncefield has since been the subject of intense study and has informed a ‘radically changed’ edition of the API 2350 standard. The Complete Guide to Version 4.0 is authored by Johan Sandberg of Emerson’s Rosemount division. Sandberg traces the evolution of tank gauging technology from the 1940s to the latest radar devices and high-level alarms. The latter can be point-level, discrete devices or provide continuous measurements. Point level devices may be easy to deploy but the question is whether they work or not, as they usually lack communications and diagnostics.
Today’s best practices are for continuous measurements of tank levels and remote proof-testing to verify proper operation. ‘2-in-1’ radar devices are also recommended for redundancy and safety. More from the informative paper and the Rosemount Tank Gauging minisite.
The latest edition of the TOP500 list of high performance computers has an upgraded edition of ENI’s machine leading the oil and gas computing stakes. ENI’s ‘HPC2’ IBM iDataPlex is n° 11 at 3 petaflops. This pips Total’s 1 petaflop Pangea for the top industry slot and BP’s 2 petaflop machine (not in the Top500). At n° 43 is Saudi Aramco’s Faris HP cluster with 0.8 petaflops.
Practically all operating systems in the TOP500 are Unix/Linux-based. There are no new Windows-based machines since 2012, when the Azure/Faenov machine came in at n° 165 (now n° 309). China’s Dawning/Magic Cube (0.15PF), which entered the Top500 at n° 11 in 2008, is now down to n° 237. Lots more in the Top500 lists.
Shell has awarded Accenture a multi-year managed services contract to support its upstream and downstream engineering, production and process control operations.
Petronas Carigali has awarded AGR a $1 million software contract for the supply of its P1 well operations planning tool and the CT cost tracker.
SAP specialist Absoft has helped Dolphin Drilling achieve DNV-GL certification for the maintenance management system on the new Bolette Dolphin drillship.
Eni Norge is to deploy Aptomar’s marine surveillance and oil spill detection solution as a component of what is claimed to be ‘probably the most advanced oil spill detection and management system on the Norwegian continental shelf.’
Malaysian engineering firm D3SCom has adopted Aveva Everything3D and Laser Modeler for its brownfield projects. Aveva’s tools have also been adopted by KBR in its new integrated engineering and design environment.
Belsim has signed a partnership with Wipro for deployment and support of its Vali data validation and reconciliation package.
Cortex Business Solutions has partnered with Cendec to extend the latter’s finance and accounting offerings to its current and prospective customers.
Exprodat has partnered with Novara GeoSolutions (formerly Coler & Colantonio) to provide implementation services for Novara’s Intrepid pipeline management system throughout the EU.
FFA has signed a multi-year technical assistance agreement with the Instituto Mexicano del Petróleo (IMP) for the provision of its GeoTeric software and consulting services. The deal is an element of Mexico’s Conacyt-Sener shale evaluation project.
Yokogawa is to distribute GasSecure’s ISA100 wireless gas detectors through its global sales network. Reciprocally, GasSecure is to market Yokogawa’s field wireless devices alongside its systems.
Genscape is to combine its petrochemical supply chain data acquisition technology with The Petrochemical Standard’s expertise in commodity price assessment and analytics to provide ‘unique intelligence’ on petrochemical trading opportunities around the world.
Halliburton has signed an agreement with the SPT Energy Group affiliate, Petrotech (Xinjiang) Engineering for the establishment of a joint venture Xinjiang HDTD Oilfield Services. The new unit will provide hydraulic fracturing and production enhancement services in Xinjiang, China.
Suncor Energy has selected Honeywell as the main automation contractor for its multi-billion dollar Fort Hills oil sands project in Alberta, Canada. The deal includes process control software, safety systems, alarm management and simulation.
Norwegian oil services company Apply Sørco has rolled out IFS Applications’ ERP solution to its 900 on and offshore users. IFS has also partnered with Deloitte for implementation of IFS Applications in the oil and gas and maritime industries in the Benelux region. In yet another deal, IFS announces that Larsen & Toubro unit L&T Infotech has joined its partner network and is to resell IFS Applications to customers around the world.
Kongsberg Oil & Gas Technologies has signed a four-year, multi-million dollar drilling data contract with OMV Group for its global drilling activities. OMV’s offices and rigs will be equipped with Kongsberg’s SiteCom real-time operations support solution. Kongsberg is also to collaborate with KBC Advanced Technologies on improved oil and gas production solutions.
Merrick is partnering with Aventa Systems to provide an ‘all-encompassing’ production management solution to European upstream operators. The deal combines Aventa’s oil country IT expertise with Merrick’s software suite.
Noah Consulting has extended its content management service offerings in a partnership with OpenText. The deal includes Livelink and OpenText’s portfolio of SAP-centric content solutions.
OSIsoft has partnered with Metrix to embed PI System in Metrix’ Setpoint condition monitoring package.
Cote d’Ivoire’s state oil and gas corporation Petroci has deployed P2 Energy Solutions’ Ideas accounting package.
IGas Energy has implemented Palantir Solutions’ ‘Plan’ package for its rig scheduling.
Petrobras is using SAS Analytics to manage geoscience data, improve well placement and extend the life of mature fields.
Gibraltar Software is helping Serafim enhance the user-interface of its Future integrated asset modeler.
V1.0 of Energistics’ unit of measure standard, comprising a dictionary and the symbol grammar, has been finalized. The SEG is to refer to the new standard’s UoM ID in its upcoming SEGD3 and SEGY2 formats. PPDM is working to enhance the UoM subject area in its industry data model to align with the new standard.
Merrick Systems is working with Energistics to develop a North American oil and gas production reporting standard.
The Open Geospatial Consortium has approved the Open modelling interface standard V2.0 (OpenMI). The standard lets independently developed computer models of environmental or other processes exchange data and interact as they run.
A white paper from The Open Group introduces its Open Platform 3.0. OP3 follows in the footsteps of OP1 (Unix) and OP2 (the web), promising an amalgamation of ‘cloud computing, social computing, mobile computing, big data and the Internet of Things.’
The American Petroleum Institute (API) has ‘expressed its support’ for a new Oil and natural gas information sharing and analysis center (ONG-ISAC), which sets out to ‘help protect infrastructure from cyber-attacks.’ The Washington-based unit is headed-up by Curt Craig, manager of integrated systems and information security at Hunt Consolidated, Inc. The idea for an ONG-ISAC was floated in a 2001 US-government backed report ‘Securing oil and gas infrastructures in the new economy.’ Seemingly this has taken a while to put into place.
The US NIST standards body has just published a Cryptographic standards and guidelines development program briefing book. The free 44 page publication covers algorithm specifications, guidance on the use of cryptography, standards for the personal identity verification card, public key infrastructure and testing according to Federal information processing standards.
The June 2014 quarterly threat report from Intel’s McAfee unit opines that while 64-bit Windows introduced much new security, no system is bulletproof. McAfee expects an increase in attacks from valid, digitally signed malware, obtained from stolen digital certificates. The count of ‘suspect’ websites hit a new record at over 18 million (a 19% hike). 68% of these are located in N America.
A new publication from Unisys and the Ponemon Institute addresses ‘Critical infrastructure: security preparedness and maturity.’ Ponemon interviewed some 600 execs for the report, finding that utilities, oil and gas companies and others are ‘high profile targets’ for security exploits, where at-risk ICS/Scada systems face ‘potentially enormous’ damage.
The International Association of Oil and Gas Producers (OGP) and IPIECA have just released an operating management system framework (OMS) for controlling risk and delivering high performance in the oil and gas industry. The OMS offers advice on defining and achieving performance goals and stakeholder benefits, while managing ‘the broad and significant range of risks inherent in the oil and gas industry.’
The 44 page worthy but wordy document advocates a ‘plan-do-check-act’ approach, ‘which has been adopted by ISO and others,’ through the application of four fundamental principles and ten structural elements that establish the OMS and its expected outcomes. A 60 page supplement, ‘OMS in Practice,’ drills down into the framework with examples of industry-specific processes and practices.
Metso Automation has received a US patent for automated root cause analysis (Arca) technology developed at its ExperTune unit. The patent (US N° 8762301) uses time-series data from a ‘primary reference variable’ and cross-correlates it with data from other process variables to determine the degrees of correlation and their time shifts. Highly correlated variables with the smallest time shifts are considered to relate to the most likely root cause.
The ‘big data’ approach is claimed to eliminate ‘costly and time-consuming’ steps in the problem-solving process. Metso’s George Buckbee, inventor of record, said ‘Clients see the benefits in energy savings, production increases and quality improvements. With training, a single engineer can perform like a large team.’ The patented technology is available in the current release of PlantTriage, a component of Metso’s control performance business solution. Buckbee observes that as the technology does not rely on process models it is applicable across large process plants with complex interactions ‘where accurate dynamic process models may not be available.’ More from Expertune.
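The patent’s core idea, cross-correlating a primary reference variable against other process variables over a range of time shifts and ranking candidates by correlation strength and lead time, can be sketched in a few lines of numpy. This is our illustrative reconstruction from the patent’s description, not Metso’s code; the variable names in the demo are invented.

```python
import numpy as np

def rank_root_causes(primary, candidates, max_lag=50):
    """Cross-correlate a primary reference variable with other process
    variables. Highly correlated variables with the smallest (leading)
    time shift rank as the likeliest root-cause candidates."""
    n = len(primary)
    p = (primary - primary.mean()) / primary.std()
    scores = {}
    for name, series in candidates.items():
        x = (series - series.mean()) / series.std()
        best_r, best_lag = 0.0, 0
        for lag in range(-max_lag, max_lag + 1):
            if lag < 0:   # candidate leads the primary variable
                r = np.dot(p[-lag:], x[:n + lag]) / (n + lag)
            else:         # candidate lags the primary variable
                r = np.dot(p[:n - lag], x[lag:]) / (n - lag)
            if abs(r) > abs(best_r):
                best_r, best_lag = r, lag
        scores[name] = (best_r, best_lag)
    # strongest correlation first, ties broken by shortest time shift
    return sorted(scores.items(),
                  key=lambda kv: (-abs(kv[1][0]), abs(kv[1][1])))

# demo: a hypothetical 'valve_temp' leads the primary by three samples
rng = np.random.default_rng(0)
base = rng.normal(size=203)
primary = base[:200]
ranked = rank_root_causes(primary, {"valve_temp": base[3:203],
                                    "noise": rng.normal(size=200)})
```

The top-ranked entry is the candidate that moved first, which is the essence of the ‘no process model needed’ claim.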
Precyse Technologies has announced a major deployment of its wireless ‘remote entity awareness and control’ (Reac) system at ten CO2 recovery plants across Texas operated by an oil and gas major. The Reac system is used to track some 1,000 workers operating in potentially dangerous facilities several hundred acres in size. Operators are equipped with ‘xAgents,’ wireless badge transmitters that provide information on worker location, activity and ‘man-down’ detection.
A combination of A-GPS, static beacons and active RFID technologies ensures bi-directional communications throughout complex industrial environments. Software further monitors workers for falls, a panic button press or entering an unauthorized area. In such an event, the system can send emails and text messages to emergency responders, control room operators, security personnel and/or supervisors.
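Detecting entry into an unauthorized area from badge positions is, at bottom, a point-in-polygon test. A minimal ray-casting sketch follows; the zone coordinates are hypothetical and this is not Precyse’s implementation.

```python
def in_zone(point, polygon):
    """Ray-casting point-in-polygon test: is a badge's (x, y) fix
    inside an authorized-zone polygon?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # does this edge straddle the horizontal through the point?
        if (y1 > y) != (y2 > y):
            # flip 'inside' if the crossing lies to the point's right
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

yard = [(0, 0), (100, 0), (100, 50), (0, 50)]   # a 100 x 50 m plot
ok = in_zone((10, 10), yard)        # worker inside the authorized area
breach = in_zone((120, 10), yard)   # outside: raise an alert
```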
The system provides real-time and historical reporting of personnel movement during normal operations. In the event of a muster, the application displays head count and location of workers as they move to safe points. An OSHA-compliant ‘post muster’ report is generated. More from Precyse.
Lloyds Register reports that its BOP risk model, developed in response to the Deepwater Horizon disaster, represents a first step towards risk-informed decision-making for safer drilling and a tool for consistent communication with stakeholders in the event of a subsea BOP failure. The BOP Risk Model combines expertise from Lloyds Register’s consulting and drilling units. The tool provides a fault tree risk model developed with Lloyds’ RiskSpectrum PSA1 and delivered via the online RiskWatcher front end.
The BOP risk model captures the risks and potential consequences of BOP component failure and provides advice on ‘when to pull the BOP for surface inspection and repair.’ A block diagram represents key BOP functions and the logic that connects the different assemblies and sub-components. Failure mode analysis is performed across all components. RiskSpectrum is then used to map out fault trees. BOP risk levels can then be assessed in RiskWatcher by comparing the remaining available redundancy of the BOP capabilities with the minimum requirements in company policy, industry standards and regulatory requirements, rolled up into ‘pull’ or ‘no pull’ signals.
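The roll-up from remaining redundancy versus policy minima to a ‘pull’/‘no pull’ signal can be sketched as follows. The component names and policy values are hypothetical illustrations, not Lloyds’ model.

```python
def pull_decision(available, minimum_required):
    """Compare remaining redundancy per BOP function against the
    policy minimum; any shortfall rolls up into a 'pull' signal."""
    degraded = {fn: n for fn, n in available.items()
                if n < minimum_required.get(fn, 0)}
    return ("PULL", degraded) if degraded else ("NO PULL", {})

# hypothetical policy minima and post-failure component status
policy = {"shear rams": 1, "pipe rams": 2, "annulars": 1}
status = {"shear rams": 1, "pipe rams": 1, "annulars": 2}
signal, shortfall = pull_decision(status, policy)
# pipe rams fall below the policy minimum of 2, so the signal is PULL
```

The real tool derives the ‘available’ figures from fault-tree analysis of failure modes rather than a simple count, but the comparison against policy minima is the same final step.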
Lloyds is now working on a web version of RiskWatcher to allow for collaborative work across a corporate intranet. A future release will add event history and details of out-of-service equipment. More reliability metrics will allow for fine grain compliance with safety integrity levels and conditional probabilities for a defence-in-depth approach. More from Lloyds.
1 Probabilistic Safety Analysis.
A fire, quickly extinguished, at Statoil’s Eisenbarth, Monroe County, Ohio well last month was an opportunity to observe how the company’s public-facing incident response operates. Statoil has set up a website that links through to its Ohio operations incident response site offering updates, a map and an opportunity to sign up for information on the incident.
The minisite leverages Witt O’Brien’s Public information for emergency response (Pier) system, a platform that helps companies prioritize and execute crisis communications tasks. Pier is a web-based, virtual crisis communications center for internal and external communications and preventing ‘misinformation’ during crises. Earlier this year, Witt O’Brien teamed with Everbridge to augment Pier with ‘unified critical communications,’ a.k.a. ‘scalable bi-directional, multi-modal communication.’ Everbridge adds notification templates, polls and messaging and language localization to Pier. More from Statoil and Witt O’Brien.
As we reported last month, Roxar is to leverage Calgary Scientific’s PureWeb technology to support multi-endpoint manipulation of large cloud-based data sets. PureWeb offers what is claimed to be a lightweight API that enables interaction with large data sets on remote servers from laptops, tablets and smartphones. The technology was on show on the HueSpace booth at the EAGE and, as we reported last month, is used in Roxar’s RMS to Go model-in-the-cloud solution.
Seismic stratigraphic interpretation specialist Eliis likewise uses PureWeb to transform its PaleoScan desktop software into a ‘secure, interactive online version accessible anywhere and anytime.’ The tool also provides licensing flexibility and eliminates the need for high-end hardware on the desktop.
Although Seisware wasn’t at the EAGE, it was an early adopter of PureWeb to reduce the time and cost of supporting multiple devices and smaller screens.
We tried out the technology at the EAGE. Calgary Scientific’s Gary Mendel provided an URL and in seconds we were playing with an RMS model on our MacBook. PureWeb runs on iOS, Flash, HTML5 and Silverlight. Calgary Scientific claims good take-up in the medical field, where it is used in remote diagnostics of CAT/MRI imagery. Another poster child is Cyberska.org, a ‘Facebook’ for radio astronomers. More from Calgary Scientific.
Gulfport, Miss.-based Horizon Energy has announced the Intellipumper, a rod pump control that claims to provide a ‘low-cost option with high-tech capabilities.’ The Intellipumper targets the digital oilfield with a ‘ready to use’ self-contained unit and wireless Scada communications.
President and CEO Robert Bludorn said, ‘Intellipumper improves the performance of every pump stroke, maximizing recovery and lowering mechanical stress to minimize rod breakage and extend equipment life.’ Intellipumper can be deployed as a standalone unit or as a component of a ‘complete digital oilfield solution.’
Horizon’s primary activity is hands-on E&P in conventional and unconventional onshore US properties. Scalable, low-cost opportunities are developed through the application of modern, proprietary technology (like the Intellipumper) and expertise in finding ‘unconventional solutions.’