Jim Boots’ new book, ‘BPM Boots on the Ground*,’ provides many fascinating insights into his 30-year career with Chevron, where he was latterly manager of Chevron’s BPM efforts. It is an unusual, possibly unique account, combining in-depth domain knowledge and opinion with chapter and verse from the transformation of Chevron’s fuel supply chain and the development of an upstream process model.
Boots on the Ground is a well-written narrative that begins with Boots’ earlier experiences of total quality management, six sigma, ERP and supply chain management initiatives. These included Chevron’s project development and execution process in the 1990s and the 2001 operational excellence push, now ‘an important part of Chevron’s culture.’ A key milestone in Chevron’s BPM journey occurred in 2004 when its shipping unit identified Nimbus Control’s BPM technology as bringing ‘a whole new level of rigor and engagement to process management.’
Boots believes that pinning down business processes so that they cannot be circumvented by short cuts would help avoid disasters. ‘In ten years or so, disasters due to process failures, like the Deepwater Horizon, will not happen because computerized verification will ensure that each compliance step has been completed in the proper sequence.’ A process model helps articulate and pinpoint shortcomings ‘in ways that are simply not possible when relying on language alone.’
Curiously, Boots, while acknowledging the Object Management Group’s notation (BPMN) as a route to cross-vendor standardization, considers its semantics ‘befuddling’ and prefers Tibco/Nimbus’ proprietary Universal Process Notation. Microsoft SharePoint (but not BizTalk) is recognized as ‘an important enabling platform for process management.’ Chapters include a scorecard of Chevron’s maturity in each field. Today, Chevron uses mostly custom code and commercial off-the-shelf applications. Boots predicts that ‘within five years, almost every Fortune 500 company will be utilizing a workflow platform.’
Boots on the Ground provides an informative world view from the standpoint of a BPM practitioner. It could benefit from clarification of the relationship between BPM, workflow, enterprise architecture and other contributing technologies. Visit Boots’ base camp and Nimbus.
* BPM Boots on the Ground. ISBN: 9780929652160. M-K Press.
CGGVeritas has acquired Fugro’s geoscience division in a ‘transformative’ deal for the high-end integrated geology, geophysics and reservoir market. The agreement includes a new seabed acquisition joint venture, a commercial agreement for the resale of Fugro’s multi-client data and a mutual preferred supplier agreement.
The deal is worth €1.2 billion in cash and was financed one third from equity and two thirds from debt and the sale of shares. A subsequent rights issue brought in some €414 million through the issuance of 24 million shares.
The deal is said to accelerate CGGVeritas’ move to the ‘less cyclical, less capital intensive’ geoscience consulting business. The addition of Fugro’s reservoir characterization and modeling software and services will complement the Hampson-Russell unit. Some 2,500 Fugro employees will transfer, including the legacy Robertson and Jason brands.
The new seabed joint venture will be owned 60% by Fugro and 40% by CGGVeritas with the latter contributing shallow water, ocean bottom cable and permanent reservoir monitoring services. Fugro brings its ocean bottom node business—along with a €225 million cash contribution. More from CGGVeritas.
Speaking at last month’s ECIM data management conference, Malcolm Fleming of the UK’s Common Data Access* advocated ‘professionalizing’ data management with a program of training and accreditation. The idea is to develop a universal ‘competency map’ for E&P data managers. The map describes the core skills required and sets minimum levels for different degrees of competency.
The idea is for a uniform reference tool to aid in education, hiring and career development. The intention is to develop the competency map into an accreditation governed by a ‘certifying authority.’ The joint CDA/PPDM/ECIM initiative has established requirements for well and geophysical data management and, at last month’s ECIM, kicked off a new ‘map’ for geospatial data management.
All of this is to be available in a ‘portal’—to be rolled out by year-end 2012. Future users will be able to sign up and register as ‘a data manager’ and create their own profiles through self assessment—which can then be verified by a third party. Fleming intimated that this might be the UK Geological Society or possibly the Science Council—which already awards certificates to ‘chartered scientists.’ A pilot is underway with five oil company sponsors, three service companies and around fifty users.
~
Recently I started receiving curious messages from LinkedIn saying that so-and-so had endorsed me for competency in various fields—Petroleum, Geology, Energy and so on. I was then invited—as I am sure many of you have been—to endorse them for similar accreditation. I clicked mindlessly on a few of these before I realized that LinkedIn would have gone on offering me more and more opportunities to ‘endorse’ until I had run through my ‘500+’ connections. While this was entertaining, it was not as entertaining as playing solitaire. I did wonder if this kind of cross-endorsement might evolve into a future academic qualification—perhaps along the lines of a ‘third party assessment.’ Maybe someday folks will speak with pride of their degree from the University of Facebook.
I have to say that I am not at all sure that the CDA competency map is the right way of going about things. It grinds the subject finer and finer without indicating what teaching occupies each slot. That would be a rather harder thing to do because upstream data management is a mixture of some very dull stuff and some very complicated stuff, all embedded in a rather disorderly way of working.
I’d like to know what a ‘self assessed’ spatial data manager would do when confronted with a large project where half the data has been loaded with the wrong projection—would he or she spot the error? Would he or she be able to fix it? The reality is that such issues require the sort of specialist knowledge that comes from in-house domain experts (oil companies used to employ surveyors) or a third party. Logs and seismics likewise hide a host of non-trivial problems for the data manager which are hard to solve without in-depth domain knowledge.
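To make the problem concrete (a sketch only, with an invented file layout, and no substitute for a surveyor), data loaded with the wrong projection often betrays itself in coordinates that fall outside the project area. A crude first-pass check:

    # flag points outside the expected easting/northing window;
    # the bounds are illustrative and should come from the license area
    awk -F, '$2 < 400000 || $2 > 600000 ||
             $3 < 6000000 || $3 > 6300000 { print "suspect, line " NR ": " $0 }' survey_points.csv

A clean run proves little. A screenful of suspect lines is a strong hint to call in the specialist.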
The CDA approach focuses on the geoscience side of the equation which is clearly key. But it is not the whole story. My personal view is that you will not be able to do much good in data management without a pretty good grasp of the fundamentals of IT as well. Even if your company has a strict demarcation line between data management and IT, you still should have a good idea of what IT has to offer so that you are a competent buyer/user of its services.
You need to be aware of the many different ‘solutions’ that IT offers to a particular problem. One is to make simple tasks unbelievably complicated—witness the famous ‘hello world’ program in just about any modern language. But it is not all bad. Back in the 1970s, there was a general realization that IT—then dominated by IBM—was getting far too good at carving out more and more work for itself, making things unnecessarily complicated and obliging folks to spend more and more time on operating system upgrades. Fast forward to October 2012 and we have, not IBM but Microsoft, releasing Windows 8—with yet more upheaval. The process continues unabated.
So what did those guys in the 1970s do to ‘solve’ the unnecessary complication and constant change imposed by the vendors? They observed that most all IT tasks involve common problems that can be solved with a set of standard tools. These involve reading and writing files, looking for and editing strings, piping data around a workflow, sorting, counting and so on. All the stuff that makes up data management in fact!
The work done by these pioneers is the one part of the IT canon that should be in every data manager’s ‘competency matrix.’ If you did not guess, the toolset is the Unix/Linux Shell. This has stood the test of time because it addresses the unchanging fundamentals of IT—rather than its ephemeral manifestations like the shiny new graphical user interface of Windows 8.
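By way of a single, hedged illustration (file names hypothetical), here is the kind of one-liner that earns its keep in day-to-day data management, flagging duplicate well identifiers across a directory of comma-delimited files:

    # pull the first (well ID) column from every CSV file,
    # sort, and print any identifier that appears more than once
    cut -d, -f1 *.csv | sort | uniq -d

Reading files, extracting fields, sorting, counting: the fundamentals in action.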
Most all of my own data management experience involved programming the Shell, which I learned from books and a few courses. While many may be a bit out of date, some are still pretty valid. Anyhow I thought that a ‘useful books’ page would be a good addition to the Oil IT website. Check it out on the link above. Feel free to email me (info@oilit.com) with your own, maybe more recent, suggestions. You want to be a data manager? Read a book. Oh, and by the way, sixteen years ago, a publication called ‘Petroleum Data Manager’ hit the newsstands. It is now called Oil IT Journal and it includes a significant ‘body of knowledge’ on upstream data management. OK, it could be better organized—another item for the ‘to do’ list I guess.
* Visit CDA’s competency page.
At the 2012 edition of the UK Institution of Mechanical Engineers’ Process Safety event in London this month, Eddie Moreland spoke with authority on two counts. As CEO of the UK’s Health and Safety Lab (HSL), Moreland is tasked with ‘crime scene investigation’ of process and industrial accidents. The HSL’s own plant contains hazardous materials used in its investigations and so is a (regulated) duty holder in its own right. The bad news for process safety in the UK is that things have gone wrong and are still going wrong—HSL averages one site visit per day. The good news is that ‘goal-based regulation offers guidance, encourages thinking and an ongoing search for learning and improvement.’ Moreland stressed that process safety should not be confused with conventional HSE, which focuses on stuff like folks holding handrails.
The HSL’s remodeled Control of major accident hazards (Comah) rules presuppose competency from both the regulator and the regulated. These derived from HSL’s investigation of the UK Buncefield oil tank farm fire. This was a puzzler initially, as computations failed to explain the size of the explosion. HSL conducted an experiment, spilling hexane to re-calibrate the code, and found the big bang was caused by a very still day that allowed a huge vapor cloud to build up.
HSL’s ‘Safety climate tool,’ powered by Snap, has been made available to industry and exposes an eight-point plan for assuring safety culture across the organization. The tool helps measure the ‘safety gap’ between management, where things may appear OK, and the shop floor, where they may not! All the same, the UK has made ‘tremendous progress in stopping people getting killed at work,’ with fatalities falling from some 5,000 per year in 1900 to around 170 per year today (all industries).
Colin Dennis of the UK Rail Safety and Standards Board described some of the perverse effects of safety performance indicators (SPI). In 2004 Network Rail’s attempt to drive down reportable incidents ‘worked’ in that reporting was greatly reduced—but this did not result in a comparable reduction in accidents. What was happening was that the new regime had caused significant under-reporting of injuries, with some 34% of accidents going unreported. Such ‘unintended consequences’ resulted from ‘real and perceived pressure’ from management. Network Rail has since introduced a safety leadership program and is researching the optimal use of leading and lagging indicators. A guidance/good practice document ‘should be applicable to other industries.’ SPIs, properly handled, ‘appear to genuinely help manage risk’ and can complement existing safety management systems and processes. The reporting conundrum can be solved with anonymous reporting of ‘close calls’ without fear of discipline.
Ann Metherall (Burges Salmon) observed that only one in six accidents is due to a breach of regulations. The legal basis for process safety revolves around ‘reasonable practicability.’ Operators must take ‘proportionate’ safety measures—but what is ‘proportionate?’ Legal precedent suggests that industry good practice sets the standard. Buncefield was a wake-up call re manning levels, employee competency and leadership. Lessons were also learned by the legal profession in terms of where, in a joint venture, compliance responsibility actually lies. Corporate manslaughter legislation can expose the parent company to a higher fine than a smaller subsidiary.
Amec’s Howard Thompson returned to the Texas City incident, observing that plant performance and occupational safety were target areas but that, at the time, process and design safety metrics were relatively new. Even today, refinery damages paid are trending up and there is concomitant increasing public risk aversion. Thompson argues plants would benefit from inherently safer design (ISD) to ‘eliminate or reduce hazards completely.’ The Bombay High field fire involved a complex chain of events that began with a self-inflicted injury to a cook. The subsequent medevac went badly wrong when a supply ship ran into the platform, causing a fire and 22 deaths. Thompson thinks that better design would have avoided this—for instance putting the risers deeper inside the structure. Amec now has an ISD workshop process that augments ‘imperfect’ traditional hazard and risk management. More from Amec and Imeche.
At a recent public hearing on process safety performance indicators, safety expert Andrew Hopkins of the Australian National University in Canberra observed that ‘there is a need for indicators specifically related to the risks of a blowout.’ His reasoning comes from a study of the Texas City disaster where he spotted a need for a focus on process as opposed to personal safety. Following Texas City, the API issued its Recommended Practice 754 on reporting indicators. Hopkins has revisited these in the light of offshore drilling. Current drilling reportable indicators include well kicks, flammable gas leaks, fires and explosions and other factors but not all gas releases. Neither does the US require well kicks to be reported. Both kick and release reporting is mandatory in Norway and Australia. A kick on the Deepwater Horizon, a month before the explosion, went unnoticed for over half an hour.
Ian Whewell, formerly of the offshore division of the UK Health and safety executive’s hazardous installations directorate, has produced a ‘definitive guide’ to KPIs in major hazard industries. KPIs won’t control major offshore risks on their own. Industry leaders at board and senior management levels must use the indicators and associated data to inform decision-making at all levels. The Texas City findings resulted in the introduction of two new KPIs—the backlog of safety critical maintenance and the current status of safety critical equipment. Elsewhere it was observed that oil and gas attaches too much importance to personal safety measures such as lost time injury frequency ‘despite their well-known weaknesses.’ More from the CSB.
Toronto-headquartered geophysical software specialist Geosoft is now offering potential field inversion services using a secure data connection into the Microsoft Azure cloud. Geosoft’s Voxi earth modeling (VEM) package generates 3D voxel models from airborne or ground gravity and magnetic data. VEM is available through Geosoft’s Oasis montaj desktop application.
For larger inversion jobs, the Azure cloud is invoked. Data is converted at the client to a ‘package’ containing the information required for the inversion. Data is transferred encrypted, which means that binary Voxi objects are readable only by the Voxi service. Intrinsic Azure security offers an additional layer of protection during inversion.
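The general pattern here, package then encrypt then upload, can be sketched in a few lines of shell. This is an illustration of the concept only; Geosoft’s actual packaging, key handling and transport are its own:

    # bundle the inversion inputs and encrypt with a symmetric key;
    # only the key holder (here, the inversion service) can read the blob
    tar czf voxi_job.tgz model_inputs/
    openssl enc -aes-256-cbc -salt -pass file:job.key \
        -in voxi_job.tgz -out voxi_job.enc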
Once the job is in the cloud, the Azure service instantiates the appropriate number of cores and a virtual cluster is formed to run the inversion. The inversion package and model results are stored in the cloud only for as long as is needed to complete the job and download the results to the client. All data is then immediately deleted from the Azure service. More from www.geosoft.com.
A new white paper from French geostatistical specialist Geovariances outlines its drill hole spacing analysis (DHSA) methodology, originally developed for the coal mining industry. DHSA may be of interest to companies developing and assessing unconventional resource plays where the ‘mining’ paradigm is in use. DHSA is a means of optimizing borehole spacing by providing an objective, geostatistical assessment of resource confidence. DHSA also provides unbiased assessments of resource size for reporting purposes.
Many deposits, variables and, most importantly, a range of geological settings have been analysed using DHSA to obtain the optimal drill spacing information to support resource classification decisions. DHSA has been implemented internationally by Geovariances for over 12 years. In Australia, DHSA has been applied to coal operations and projects in the Bowen, Surat and Gunnedah Basins.
DHSA assessments are used to support changes in resource classification and to prioritise targets by identifying areas requiring increased or reduced sampling density. According to Geovariances, geostatistics is a recommended best practice in the mining industry. The results may differ significantly from ‘conventional’ reporting guidelines. Read Geovariances’ white paper here.
The latest V14 release of Blue Marble Geographics’ Global energy mapper (GEM) features an enhanced site pad placement tool developed in partnership with Spatial Energy. GEM now provides access to online data from Spatial Energy with one-click data ordering. A terrain analysis function offers contour grid generation, watershed generation and tools for locating ridge lines and calculating volumes between surfaces. Blue Marble president Pat Cunningham said, ‘This release was a great opportunity to combine two great software companies while expanding the power of affordable GIS software.’
For pipeline analysts, GEM offers raster management tools and geospatial data management functionality co-developed with the US Geological Survey (USGS). The new release also adds support for seven types of spatial databases and the ability to consume web feature services and web map tile services. Other new formats include Garmin JNX, ArcSDE, personal and file geodatabases, Oracle spatial, PostGIS, MySQL Spatial and more. GEM has an entry-level price of under $400.
In a recent blog post, UK-based ETL Solutions’ Richard Cook offered some advice to users of the Professional petroleum data management association’s data models. While standards like PPDM are important to the upstream, they present challenges both at adoption and when transitioning between versions. Databases grow in size and complexity as volumes increase, as metadata is added and, especially, when interpretation results are captured. ETL uses the example of how checkshot data is captured to a variety of PPDM databases. The 3.2 edition represented checkshot surveys simply, with little metadata. 3.3 added many new checkshot attributes and foreign keys to reference tables where more data can be stored.
Intermediate PPDM flavors introduced subtle, albeit important, semantic changes, but there was a major shift with 3.6. The checkshot table has gone and checkshots are now considered seismic data. 3.7 added yet more attributes to record the lifecycle of these objects and 3.8 introduced ‘potentially far-reaching’ changes. Several tables’ use is now deprecated. Others have moved, and the precision of some numeric values has been increased, which could cause tests that use them to fail.
For those scratching their heads, ETL’s Transformation Manager offers a structured approach to data migration, separating underlying data structure from implementations. TM is claimed to be ‘the key to handling data stored in PPDM and other highly normalized models.’ TM generates rule-based Java integration components to manage complex data translations.
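A practical corollary for migration testers: with numeric precision changing between PPDM versions, post-migration checks should compare values with a tolerance rather than test for string equality. A minimal sketch, assuming hypothetical two-column (key, value) extracts from the old and new databases:

    # report only genuine drift between old and new values
    awk -F, 'NR == FNR { old[$1] = $2; next }
             { d = $2 - old[$1]; if (d < 0) d = -d
               if (d > 1e-6) print $1 ": " old[$1] " -> " $2 }' old.csv new.csv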
The 2012 edition of Baker Hughes’ JewelSuite reservoir modeler targets unconventional plays with a comprehensive 3D display of the reservoir. Blue Marble’s geospatial technology is now included as is data mining of large well datasets.
Drilling Info’s DI Pro is an E&P business intelligence toolset for identifying and evaluating unconventional plays. The subscription-based service includes land, geology and analytics modules.
The updated unconventional reservoir simulator, Comet3 from Advanced Resources, includes a triple-porosity/dual-permeability option, multi-component sorption and a ‘robust’ permeability model.
ESG Solutions has developed a hybrid downhole/near-surface microseismic solution for detection of seismicity induced during hydraulic fracturing and steam injection activities. The system has a -4 to +4 Richter dynamic range.
Caesar Systems’ Fiscal systems modeling framework, an optional component of its PetroVR package, offers ‘fast, transparent auditable’ evaluation of international investment options.
V5.0 of Invensys operations management’s Avantis provides role-based asset management analytics and reporting. Avantis DSS includes data aggregation and visualization components and Microsoft SQL Server integration services for extracting source data.
Epsis’ TeamBox 5.0 captures meeting documentation and other data sources for wall display during meetings. The software runs on any screen or projector.
New Century’s Centerline Browser 3.1 adds geolocation support and enhanced interaction with PODS tabular data and pipeline events.
OGsys has announced OGdash, a web-based dashboard providing access to information in its eponymous accounting package. The dashboard will include AFE, accounts receivable and payable, lifting costs, vendor and owner ‘gadgets.’
SPT Group’s Olga 7.2 adds shut-in functionality, new multiphase pump models and an enhanced well editor and library. Simulation speed and stability are also improved.
V6.0 of Open IT’s IT resource monitor offers Active Directory integration via an ‘industry standard’ database. Users can employ standard reporting tools or roll their own Excel dashboards.
The 6.3 release of Pen-Lab’s mud logging and drilling control software adds comments and new gauge controls to the real time data feed along with support for ‘Bloodhound’ gas chromatography data over a WITS data link.
Symmetricom has released ‘Quantum,’ a chip scale atomic clock with potential underwater seismic applications.
Technical Toolboxes has embedded Seikowave’s structured light 3D technology in a new hardware and software bundle, the 3D Toolbox, providing ‘repeatable and accurate pipeline defect measurement.’
The new release of McLaren Software’s FusionLive project collaboration solution adds many enhancements and runs from McLaren’s ‘On-Air’ cloud infrastructure of secure data centers located in the USA, UAE and UK.
Halliburton introduced its ‘Knoesis’ service, a ‘family of software applications’ used by Halliburton’s stimulation technical advisors to optimize frac jobs and completions, at the SPE ATCE in San Antonio this month. Knoesis currently includes design and analysis services, ‘Foray’ for microseismic fracture matching and ‘Delve,’ a technical data mining and analysis service.
Foray generates a three-dimensional representation of fractures from microseismics, offering a novel planar display rather than the more usual ‘bubble’ plots of microseismic events. These are generated automatically using ‘advanced mathematics’ that is claimed to offer unbiased, objective fracture characterization. Delve ‘mines’ historic and current job data and operating experience to optimize stimulation design and execution. Delve offers access to terabytes of historical frac data from around the world. Other Knoesis components, ‘Savvy’ for complex fracturing and ‘Melt’ for acid treatment design, will be available ‘real soon now.’
ISN blogger Akmal Shah has been checking out the 2012 edition of Microsoft Windows Server and sees potential for oil and gas IT shops. The new graphical user interface is based on the ‘Metro’ Windows 8 style GUI. Although Metro has been ‘much criticized,’ its use in Server 2012 is ‘less contentious’ as the interface defaults to the command line—making the new GUI optional.
A new Hyper-V virtualization engine promises to offer competition for Microsoft’s big rival VMware. Hyper-V’s scalability is much improved with, at the top end, 64 nodes capable of hosting up to four thousand virtual machines. An increase in virtual CPUs means that Hyper-V is now better suited to intensive workloads such as those seen in oil and gas companies.
Another key differentiator for oil and gas, according to ISN, is Server 2012’s Storage Spaces disk I/O performance. This is ‘not far off native speeds’ and provides a cost effective means for adding and managing capacity and high availability.
Networking is enhanced with ‘DirectAccess,’ a VPN-like secure tunnel from any endpoint back to the corporate network without the overhead and performance hit of a true VPN. Shah believes that this functionality ‘will have valuable yet economical remote working benefits for our oil industry customers.’ Read Shah’s blog here.
The Society of Petroleum Engineers’ 2012 Annual Technical Conference and Exhibition opening session debated this year’s titular theme—’making unconventionals conventional.’
Halliburton CEO David Lesar observed that unconventionals elicit strong opinions and ‘outright falsehoods’ which need to be addressed head-on with better communication of the economic benefits. These include a trillion dollar boost to the US balance of payments, $100 billion in income taxes and two million jobs created at a $90k average salary. Next, industry needs to demonstrate ‘sustainable’ production by delivering long lived unconventionals reserves. Today, there is not enough production history to convince the skeptics. On the delicate topic of hydraulic fracking and water use, there have been ‘many false statements.’ While industry is not faultless, there have been ‘zero cases of fracs impacting the water table.’ On the topic of frac additives, Lesar described Halliburton’s CleanStim as entirely made from foodstuffs, adding ‘I’ll even drink it for you’—which he did, to applause, even though ‘it doesn’t taste so good!’ Today, unconventionals are a business free-for-all—a huge lab experiment where some operators own equipment and service companies may own land! But this chaotic process is generating a productive outcome and has become an intellectual storehouse allowing near instantaneous development of a discovery. Foreign companies see the US as an unconventionals university that they have to attend.
ExxonMobil’s Mark Albers observed that while unconventionals used to be synonymous with uneconomical, today shale oil, shale gas and oil shales are becoming essential to world energy supply. There are substantial prospects for going global. Government needs to put a framework in place and industry needs to increase understanding of the novel plays by educating the public on the risks and how they are handled and engaging policy makers and regulators.
Pioneer’s Tim Dove described how conventional exploration can turn into unconventional. Pioneer’s conventional exploration of the South Texas Edwards cretaceous reef play led to the targeting of the Eagle Ford shale. Early vertical wells were not so good—but things took off with horizontal drilling, backed up with 3D seismic that had been acquired for the Edwards. Pioneer’s Midland basin Spraberry/Wolfcamp shale was discovered in the 1940s and thought to be ‘the largest uneconomic US oil play!’ This is changing now. Dove opined that ‘factory drilling,’ a.k.a. ‘carpet bomb drilling,’ has actually missed many plays. Pioneer estimates there are 50 to 100 million barrels per 960-acre section, making for a ‘brave new world’ for the Permian basin. Wolfcamp development will require some 55 wells and an investment of around $180 million per section. This is a massive investment but at low risk, as it is essentially ‘mining’ oil and gas. Dove concluded saying ‘the game has not started yet really.’
Steve Holditch of the Texas A&M Energy Institute agreed that the US is becoming a ‘university’ for unconventionals. There is a huge amount of information available on resources and on technology. Texas A&M is using this data to re-calibrate the resource assessment. The previous assessment by Hans-Holger Rogner is now considered conservative. The US government’s ‘90 day report’ on shale gas is ‘very balanced.’ Industry needs to put more information out there, leveraging the FracFocus resource. Industry also needs to reduce diesel emissions—notably from ‘all the trucks running past folks’ houses.’ Some of these are appropriately switching to natural gas. Holditch wound up with a plea for more R&D funding and a suggestion that ‘States should be the primary regulator.’
The groundwater question was raised in the Q&A. Albers observed that the oil industry has a good track record of protecting groundwater all over the planet. ‘We need to tell the tale through organizations such as the API and the Gas Association.’ Also, water needs putting into context—oil and gas uses less than 0.2% of water in most states. Lesar observed that ‘it is hard to win an emotional argument with facts.’ We need to get out there with the economics. Water and ‘stuff in the ground’ are emotional topics—we need to ‘walk the talk,’ the battle is not yet won. Albers acknowledged that there have been ‘isolated incidents’ with some operators. Some folks ask ‘why not police yourselves like nuclear?’ But this highly competitive industry needs state regulators and competent enforcers. One questioner asked if imports will disappear. Albers observed that while Governor Romney has suggested self sufficiency by 2020, this ‘may not be what we want.’ What is required is security of supply, ‘We don’t have food independence!’
John Lee (formerly with Ryder Scott) presented the SPE’s revised Petroleum resources management system (PRMS) guidelines, developed by the SPE joint committee on reserves evaluation training (Jcoret). The original idea was to provide ‘real world guidance, not a manual of geology or petroleum engineering.’ The Jcoret group tried to get examples from industry to illustrate the guidelines but ‘industry said no.’ The 220 page document includes a new chapter on unconventionals reserve estimation. In the Q&A, Lee offered that this was ‘much more concise’ than the SPEE’s monograph on evaluating resource plays.
Marathon’s Marcy Woods continued with the unconventional evaluation theme, walking through the labyrinthine new SEC reporting rules as applied to the Anadarko basin’s Woodford play. The SEC recently extended what are considered acceptable evaluation methods to include ‘reliable technology’ and probabilistic analyses. Marathon has leveraged the methodology as outlined in the SPEE monograph. This includes the concept of a ‘performance area,’ decline curve and rate transient analyses with the ‘expanding concentric radii method.’ The results are used to define the ‘proved’ area. A five year simulated drilling program is bolted on and all is rolled up into proven undeveloped reserves. Woods believes that this is the first public application of the SPEE methodology.
The Digital Energy session on collaboration technology heard from Sami Alneaim (Saudi Aramco), who opined that there is now ‘quick take-up of digital—there is no need to justify every project.’ Digital is key to Aramco’s efforts in mega fields, well test, production optimization and novel facilities. Intelligent, digital technology ‘will be the main source of incremental oil production in the near future.’
Mike Hauser traced Chevron’s now rather mature i-Field effort. I-Field success boils down to three things: ‘leadership, leadership and leadership!’ 80% of the effort is ‘navigating people and processes’ rather than technology. Change is best seen as a positive experience in the transition to a future state that works for end users. The approach is working well as individuals gain understanding and involvement. Chevron has now developed many ‘soft’ building blocks—for competency development, leadership and value creation.
Adel Al-Abbasi (KOC) presented the Kuwait integrated digital field (KwIDF). E&P organizations are too reactive and would benefit from better collaboration. The IDF is less about the digital oilfield than about change management, which needs to be ‘not too fast and not too slow.’ The IDF ‘dynamic decision making world’ sits alongside the storage/big data mining environment. The IDF challenge is to transform KOC from an asset-focused to a collaborative organization.
Husky Oil’s Andrew Montes presented results from trials with Resman’s novel permanent tracers for inflow control valves. Resman’s tracers are strips of a material placed at strategic intervals in production casing. Different tracers leach in the presence of oil or water. Tests on Husky’s offshore Newfoundland White Rose field demonstrated that a ‘pseudo production log’ could be obtained from the tracer data, identifying one level with water breakthrough. All tracers were functional and were detected over the 10 km flow line.
An enthusiastic Allan Rennie showed off Schlumberger’s ‘Ipzig’ at-bit imaging gamma ray tool which, used with Schlumberger’s own PZS pay zone steering software and real time mud pump data transmission, keeps unconventional and conventional wells clear of bed boundaries.
The SPE is quietly remodeling the ATCE. Regretfully, we note the demise of the Fun Run. Editor Neil McNaughton was an occasional participant in this event, slipping back from a place in the first ten to an ignominious ‘finisher’ status in later life. The SPE’s own ‘show daily’ has bumped-out the Harts equivalent. The SPE has also introduced an ‘SPE TV’ channel to the program. There are now LinkedIn, Facebook, Twitter and YouTube pages and, for attendees, ‘free’ iPads with the SPE’s own ‘App.’
Speaking at the Pipeline open data standard association’s 2012 user conference in Houston earlier this year, PODS director Janet Sinclair welcomed Chevron and BP into the PODS fold and observed that attendance, 267 from 9 countries with a 50/50 split between operators and service companies, had doubled since 2010.
PODS president Paul Herrmann outlined the organization’s accomplishments and future strategy. 2012 saw the finalization of the 5.1.1 release of the joint PODS/ESRI spatialized database. PODS has also released a major new version, V6.0, of its model and is working on an ‘open’ (i.e. not necessarily ESRI) spatial model. The strategic directions see further growth of the data model, with extensions to distribution and gathering and support for international users. The model is also to be modularized to mitigate its growth and facilitate domain-specific deployment. PODS is also developing sample data sets and best practice guides.
Sinclair reported from PODS work group activity. The gas distribution model workgroup is evaluating the feasibility of incorporating its gas distribution model into the main PODS model. PODS has also been working with the Geneva, Switzerland headquartered International pipe line and offshore contractors association (Iploca) on data standards for new construction. The idea is to develop a new PODS construction module for use in the front-end engineering design and construction phases of a new onshore or offshore pipeline and to support the transfer of critical ‘as-built’ information from the construction phase into an operator’s PODS database. The initiative would inject pipeline stationing concepts into the design and build phases and allow the linkage of procurement data, field survey and other information across the pipeline lifecycle.
Tim Williams of Boardwalk Pipelines presented the PODS open spatial project that builds on the Open geospatial consortium’s (OGC) standard portfolio. For operators, storing data in a standard, open way means access to a larger pool of potential service and application providers. Whereas the current flavor of PODS Spatial requires GIS specialists for its management, PODS open spatial is Microsoft and Oracle compatible and can be managed by ‘standard’ IT resources. This potentially means that GIS functionality like proximity-based searching is available to non-GIS developers using an extended SQL library.
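What proximity searching from plain SQL might look like, assuming a PostGIS-style back end (database, table and column names are hypothetical):

    # find pipe segments within 500 meters of a given point
    psql -d pods -c "
      SELECT pipe_id
      FROM   pipe_segment
      WHERE  ST_DWithin(geom::geography,
                        ST_SetSRID(ST_MakePoint(-95.36, 29.76), 4326)::geography,
                        500);"

No GIS toolkit is required on the client, just a SQL connection.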
Gary Hoover described ‘Project 2020,’ Enterprise Products’ ground-up redesign of its midstream assets data model. The 2020 objectives include an engineering stationing-less deployment, an embedded open source database and open geospatial data types covering all midstream facilities, and a service-oriented architecture. Hoover observed that with the current highly normalized PODS model, a query for a single joint of pipe involves 30 tables, 46 table joins and some 49 records. Even moderate denormalization means a 90% reduction in the number of joins and records required. Migrating from the proprietary spatial model to OGC data types can eliminate data reformatting for mapping. The prototype also promises a 90% reduction in storage needs for inline inspection data. Overall Hoover reports a 10x reduction in model opex and ‘better alignment with current GPS survey techniques.’ Read the PODS presentations here.
A £10 million Innovation Fund has been opened in Scotland for improving the integrity and reliability of assets in the oil and gas industry.
Former MD of Roxar, Mark Bashforth, has joined Fugro-Jason as General Manager for Global Sales and Operations.
Rodolfo Hernandez recently joined the Texas Bureau of economic geology to develop the seismic interpretation program for the State of Texas advanced oil and gas resource recovery project.
Fernando Moraes heads-up dGB Earth Sciences’ new office in Rio de Janeiro.
Caesar Systems has appointed Jim Thom as VP Client Development to expand the use of its PetroVR software.
Former Director of operations for the US Cyber Command David Lacquement has joined SAIC’s cyber security business unit.
The UK Energy Industries Council has appointed Paul Mitchell as chairman of its board.
Experient Group has recruited Joe Osbourn as Executive VP and Butch Benford and Feryal Hendricks as Vice Presidents.
Jeff Miller has been promoted to Executive VP and COO of Halliburton.
Ron Masters has joined Headwave as geoscience advisor. He was formerly with Shell. Rekha Patel has also joined the company as SVP of strategic business relationships. She hails from Ikon Science.
Ikon Science has entered into an agreement with the Instituto Alberto Luiz Coimbra de Pós-Graduação e Pesquisa em Engenharia to research the field of geoscience software and related subjects for oil companies operating in Brasil. Ikon also opened an office in Calgary headed-up by Rob Dudley.
Former Constellation Energy CRO, Brenda Boultwood, has joined MetricStream as VP of Industry Solutions.
Mechdyne Corporation has opened a new office in Abu Dhabi (UAE) and hired Hamdi Aslan as new Business Development Associate for the region.
Jonathan Kissane has joined NetApp as chief strategy officer and SVP, and Jay Kidd has been appointed CTO and senior VP. Kissane hails from CA where he was SVP, Corporate Development.
Guy Gueritz, formerly with Bull, has joined Nvidia as Business Development Manager, Oil & Gas, taking over from Olivier Blondel who has moved to another position.
Tradeshift and OFS Portal have enabled data exchange between their respective e-commerce platforms.
Parker Drilling has appointed former VP of global sales for Baker Hughes, Gary Rich, as president and CEO, replacing Robert L. Parker Jr., who will remain executive chairman.
Former executive VP Fred Richoux is now president of Ryder Scott, replacing John Hodgin, who has joined the SEC.
Lance Cole of the PTTC has moved to The Society of Exploration Geophysicists as Inter-Society Collaboration Manager.
Chris Peeters is now director of Schlumberger Business Consulting’s EMEA region. He hails from McKinsey & Co.
Bernard Montaron heads-up the new Schlumberger China Petroleum Institute in Beijing. The institute includes data, geoscience and petroleum engineering services.
Retired ConocoPhillips executive John E. Lowe has been named senior advisor of Tudor, Pickering, Holt & Co.
Fugro has sold its stake in Electromagnetic Geoservices (EMGS) for some NOK 444.4 million (€60 million).
Bentley Systems has acquired Ontario-based Ivara Corporation, a provider of asset performance management software solutions.
Geoforce has acquired Sypes Canyon Communications.
ENGlobal has contracted Simmons & Co. as financial advisor to explore strategic alternatives for its long-term future. These could include raising capital, a sale or merger.
FMC Technologies has completed the acquisition of Pure Energy Services for approx C$282 million ($285 million).
GroundMetrics has raised $1.2 million in financing to develop its electric field sensing technology. The company also recently received a Phase I SBIR grant from the Department of Energy to conduct research into carbon capture and storage applications.
Viking Oil Tools has been formed with a ‘significant’ equity investment from NGP Energy Technology Partners.
Pansoft has gone private in a merger with Timesway Group and Genius Choice Capital. Shareholders will receive cash payments of $4.15 per share.
RSI has raised £3.2 million and obtained shareholder approval for issuance of warrants which could raise a further £2.6 million within the next two years.
Genesis Oil and Gas Consultants has signed an agreement to acquire Suporte Consultoria e Projetos, a Brazilian pipeline and structural engineering company based in Rio de Janeiro.
Wood Group is to acquire maintenance, installation and fabrication service provider Mitchell’s Oil Field Service for an initial consideration of $135 million plus future ‘earn out’ payments.
Kirk Coburn, founder and MD of the Houston-based ‘Surge’ energy software accelerator, reflecting on its first year of operations in his blog, believes that entrepreneurs are best placed to react to changes in the cultural, technological and legal context. When the context evolves, entrepreneurs create revolutionary ideas. Surge offers early stage capital funding for energy software developers from its network of venture capitalists and corporate development groups.
Surge tracks contextual changes in real time—one of which is the generational crew change in the energy industry, which has created a ‘massive gap’ in experience between the baby boomers who are about to retire and a majority of young, as yet inexperienced employees.
Oil and gas companies were early innovators in supercomputing and data visualization and analysis. But most of these technologies were built by baby boomers using old platforms ‘like Fortran.’ Today, entrepreneurs are ‘challenging the old paradigms and developing new concepts—if the incumbents cannot keep up, new start-ups will.’ Apply to Surge or email melanie@surgeaccelerator.com.
FFA CTO Jon Henderson demoed the company’s new ‘GeoTeric’ interpretation technology at the recent SEG IQ Earth Forum. GeoTeric, which combines image processing with ‘adaptive geobodies’ technology co-developed with Lundin Norge, runs on Nvidia Maximus workstations. ‘Adaptive geobodies’ is described as a ‘data driven, interpreter guided’ approach to geobody extraction and characterization.
The data driven side of the equation, geobody ‘region growing,’ is driven by a multi-dimensional classifier whose dimensions represent attributes of the input volume under study. Seed picks define the data statistics of interest and the contrasting surrounding matrix. FFA’s image processing for greyscale, spectrum and ‘heat’ does the rest. The process not only provides insight into geobody architecture, but also computes metrics such as areal closure. View Henderson’s SEG IQ Earth Forum presentation here and visit GeoTeric.
The inaugural meet of the Society of Petroleum Engineers’ PD2A technical section was held in San Antonio’s Iron Cactus Grill this month. PD2A promotes the application of data-driven modeling, data mining and predictive analytics R&D in the upstream.
Keynote speaker Matt Denesuk (IBM) set the scene, describing the ‘three big things’ in data today as 1) a meeting of the physical and digital worlds, 2) data-driven analytics and 3) heterogeneity and integration. The data-driven approach, a.k.a. machine learning, means ‘doing stuff with data without understanding the underlying processes.’ ‘Big data’ is essentially a marketing term. What is more interesting is the relationship between data complexity and size. This separates the data boys from the men—with Google fine for low-complexity, high-volume data and IBM’s Watson coming into its own at the other end of the spectrum.
IBM’s offering to industry actually combines the data-driven approach with domain knowledge. This is helping overcome resistance to previous attempts at machine learning which generated too many false positives.
Adco’s Fareed Abdulla then presented several case histories of Adco’s use of machine learning which we will report on in next month’s Oil IT Journal. The SPE PD2A meet was sponsored by BP, ExxonMobil and Hess. More on the embryonic section here.
Houston-based Merrick Systems has launched a production management and hydrocarbon accounting platform, Merrick Production Manager (MPM). Merrick VP production Clara Fuge said, ‘Production teams face a growing number of wells to manage and increasing volumes of field data to analyze, especially in unconventional operations. Reporting requirements at state and federal levels add to the burden with tasks like tracking produced water and managing massive volumes of run tickets. MPM addresses all these needs, optimizes operations and assures compliance.’
MPM components include eVIN for data management, ProCount for hydrocarbon accounting and Carte, a web-based production dashboard.
According to Merrick, its production operations and accounting software is used on one in five US oil and gas wells by over 90 companies and some 7,000 field operators. MPM was developed on Microsoft’s Silverlight web-enabled interface. More from info@MerrickSystems.com.
WorleyParsons has renewed its commitment to Aveva with multi-year agreements for Aveva Plant and Enterprise. WorleyParsons also signed a five-year agreement with Intergraph, covering the SmartPlant portfolio, CADWorx, CAESAR II and more.
Aker Solutions has been selected as management contractor for Brunei Shell’s offshore construction and maintenance. The five year contract is worth NOK 2.3 billion.
Vopak has awarded Emerson Process Management a three-year global framework agreement as provider of process automation systems and services for its liquid terminal facilities.
Emerson was also awarded a $21 million contract for integrated control and safety systems for BP’s new FPSO.
Anton Oilfield Services Group and Schlumberger have signed a joint venture agreement to provide integrated project management services to China’s onshore oil and gas projects.
Aspen Tech has acquired the PSVPlus software product from Softbits Consultants.
Dassault Systèmes and Tata Consulting Engineers have announced a strategic partnership to deliver solutions to the energy, process and construction sectors.
Ensyte Energy Software has partnered with Houston-based Centre Technologies to offer enterprise consulting and cloud computing services.
GE Oil & Gas and Petrobras have signed a subsea wellhead production contract, worth nearly $1.1 billion. GE Oil & Gas is also to supply gas compression trains for Cheniere Energy’s Sabine Pass liquefaction expansion project in Louisiana.
Petrofac has chosen Intergraph SmartPlant Enterprise for major international projects.
Aveva and Infosys have signed a partnership agreement to deliver engineering information management solutions to process plant and power operators.
Oryx Petroleum Services has adopted Paradigm’s well planning and drilling engineering product suite Sysdrill 2012 as its corporate standard for drilling operations. Paradigm has joined the NetApp alliance partner program.
Petrosys and FaultSeal have signed a global reseller agreement for FaultRisk software.
Process Systems Enterprise has joined Rutgers University’s partnership for advanced process modeling.
Mahindra Satyam has signed a Memorandum of Understanding with University of Petroleum & Energy Studies, Dehradun. The MoU aims at facilitating joint research and development projects and consultancies for mutual benefit.
Senergy and Apache have announced a technology transfer agreement covering Apache’s patented ‘PetroSleek’ geosteering software. The technology will be used within Senergy’s Interactive petrophysics and Oilfield data manager packages.
Elsevier and the Society for Sedimentary Geology have announced the integration of more than 18,000 geological maps from SEPM into Elsevier’s web-based research tool, Geofacets, which now houses over 225,000 maps.
Shell has selected Fluor as the engineering, procurement and construction contractor for its flagship carbon capture and storage ‘Quest’ project at the Athabasca oil sands development.
Telvent has partnered with Industrial Defender to enhance the security of Telvent’s solutions.
Tradeshift has signed a new interoperability agreement with OFS Portal, enabling OFS Portal Members to adopt Tradeshift as a platform for exchanging data and transactions with their oilfield customers through the use of open, non-proprietary standards.
Provider of real-time cable-less seismic acquisition systems, Wireless Seismic has sold its first RT System 2 seismic data acquisition system to Michigan-based Bay Geophysical.
Yokogawa Saudi Arabia and Saudi Aramco have signed a corporate procurement agreement for process automation.
The Global reporting initiative has launched a research project into sector-specific guidance including the oil and gas vertical.
The Open geospatial consortium (OGC) membership has approved the Enhanced data model extension to the OGC Network Common data form (netCDF) core encoding standard. OGC has also announced a work group to advance the GeoPackage standard for disconnected mobile GIS.
The new modularized edition (V6.0) of the Pipeline open data standard association’s data model has been released for member comment.
The W3C has published R2RML, a language for expressing customized mappings from relational databases to RDF datasets.
DNV has published a recommended practice (RP) for the life cycle of shale gas extraction, based on risk management principles. The reference document supports independent verification and is intended to be a global standard for safe and sustainable shale gas extraction.
The new ECCMA Corporate dictionary manager allows members to create and manage corporate dictionaries.
A new Fiatech project, ‘Harmonizing industry standards to exchange equipment data’ will leverage electronic data exchange to integrate and automate the processes and software used to manage pumps, valves and control systems. The intent is to blend models from ISO 15926, AEX and HI EDE 50.7.
The .NET Standards DevKit, developed by ExxonMobil and licensed to Energistics for ongoing maintenance and support, is in the process of being updated to support WITSML v1.4.1.1, PRODML v1.2.2 and RESQML v1.1.
According to the IEC, ISO and ITU standards organizations, October 14th was world standards day—we missed it!
Speaking at a press gathering in Paris this month, Dave Wheeldon, Aveva CTO, outlined a new ‘lean’ philosophy of engineering data management. The kaizen-ish idea is to eliminate data ‘waste’ by building on previous software investment rather than introducing new products. Aveva’s developments are designed by engineers and can be customized by the client without developer involvement. Plant information is managed in a ‘neutral’ model that integrates with third party software. Poster child for Aveva’s engineering software use is Shell’s ‘Prelude’ floating liquefied natural gas vessel, the largest offshore platform in the world at 488x74 meters and 600,000 tonnes displacement. An EPC joint venture between Technip and Samsung Heavy Industries is using Aveva’s PDMS 3D modeler in the Prelude FEED.
This month saw the consecration of Aveva’s 3D/lean philosophy with the roll-out of ‘Everything3D,’ Aveva’s next generation plant design platform that embraces innovations in mobile computing, cloud computing and laser scanning. This was demoed on a Samsung Slate running Windows 8, connected to a data center in the cloud. The Slate provides GPS, compass, camera and security to constitute a ‘portable digital plant.’ The E3D GUI leverages Microsoft’s Fluent (ribbon) interface. More from Aveva.
Colorado-based Flow Data has announced PadPro, a new well site control system providing access to real time data via an Android-based GUI. PadPro is said to minimize IT resources at the wellhead and to empower less experienced workers. Flow Data’s Eddie Mechelay said, ‘PadPro lets users configure the system to their needs and offers the simplicity of Android technology, enabling access to the system from on-site or remote locations.’
PadPro’s built-in local WiFi access point allows the controller to be accessed locally or remotely from a smartphone, tablet or other device. The system was designed for use in harsh environments. Configurable modules include WellMgr for liquid and gas wells and TankMgr for monitoring inflow and outflow from tank-level devices. Modules include alarm, shutdown and email notifications and a SQL database for data trending.
PadPro contains a 32-bit ARM7 CPU, 16 MB of flash memory, a lithium battery that maintains memory contents for two years with no power, USB A and B ports, a 10/100BaseT Ethernet port, and a two-year warranty. It is cased in corrosion resistant zinc-plated steel.
Siemens has acquired 3D virtual reality specialist VRcontext, developer of the Walkinside line of 3D visualization and training software for complex engineering data. The purchase price was not disclosed. Walkinside is currently used in plant operation, maintenance and servicing in more than 200 companies in over 30 countries. The software accesses plant information to display the current status of a plant in visually appealing, realistic 3D graphics. VRcontext’s focus to date has been offshore oil and gas installations, where the company has created a ‘globally accepted standard’ for 3D visualization. Total Angola reported successful use of Walkinside in commissioning its Pazflor FPSO last year (Oil ITJ July 2011).
Eckard Eberle, CEO of Siemens industrial automation unit said, ‘Integrating Walkinside into our industrial software portfolio will make plant engineering and operation safer and more efficient in many industry sectors.’ Walkinside will be integrated with Siemens Comos plant management tool, linking the VR model to up to date plant information—with Comos acting as a ‘global data center.’ More from Siemens.
According to Iron Mountain, ‘companies commissioning business software from developers are putting their investment and valuable intellectual property at risk if they fail to protect the source code.’ Iron Mountain has teamed with ERP vendor SAP to offer such companies a ‘software escrow service’ to address the challenge. SAP’s Business One unit is to refer its software solution partners to Iron Mountain to secure their add-on solutions’ code.
Iron Mountain’s Par Keddy explained ‘Software escrow is a legal agreement between the developer, the licensee and a third party, who holds the software source code. It reduces the licensee’s exposure to risk and helps protect what is often a significant investment.’ Once the source code is deposited at a secure facility, users are protected against the risk of the developer going out of business or failing to support or maintain the application. Escrow assures that source code will be accessible in the future for updates. The offering is driven from SAP’s Business One unit which has some 350 add-on solutions developed by SAP partners.
Conventional wisdom has it that reservoir flow simulation is hard to parallelize: most simulators are limited to a few tens of cores. Rock Flow Dynamics claims to have broken the core count barrier in a test of its tNavigator simulator running on Moscow State University’s ‘Lomonosov’ cluster. The trial data set was a real field black oil model with 43 million active grid blocks and 14,000 wells. RFD’s approach centers on the use of compression to speed data access. Input data files are compressed 10x with the public domain Gzip library. To demonstrate the use of potentially sensitive data in the cloud, data was also encrypted. The model ran on an increasing number of nodes of dual four-core Intel Xeon 5570 Nehalem processors—up to 512 nodes (4096 cores). Speed-up maxed-out at 350x on 2048 cores.
In another test, on a 22 million cell three-phase model, calculation time was reduced from 2.5 weeks down to 19 minutes after numerical experiments and tuning. This represents a 1300x speed-up over a single core which is probably, according to RFD, a world record. The test showed linear speed-up on up to 2048 cores. From then on, core count doubling to 4096 showed a slight tailing off—but still produced a 1.4x boost.
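The headline figure checks out: 2.5 weeks is 25,200 minutes, and 25,200 divided by 19 gives roughly 1,326, in line with the quoted 1300x. Our back-of-envelope verification:

    awk 'BEGIN {
      before = 2.5 * 7 * 24 * 60          # 2.5 weeks = 25,200 minutes
      printf "speed-up = %.0fx\n", before / 19   # prints 1326x
    }'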
RFD has also performed tests comparing Nvidia Tesla and ATI FirePro GPUs against the Xeon and has found that re-coding to CUDA and OpenCL is ‘far from trivial.’ Moreover, the tests show that ‘benefits from the CPU/GPU combo remain unclear—and it may take generations of GPU architecture to become competitive with the Xeon.’ Read the full test results in SPE paper 163090. More from tom.robinson@rfdyn.com.
GSE Systems has just released a ‘Virtual flow loop trainer,’ an education aid that combines GSE’s expertise in high fidelity simulation modeling with ‘serious gaming’ 3D technology. With the cost of a ‘brick and mortar’ flow loop training facility running at around $1 million, the VFLT offers a cost-effective way of training operators using a company’s existing IT infrastructure.
Serious gaming, according to GSE, is a proven technology that is ‘widely used by the US military.’ The VFLT uses it to provide ‘cost-effective’ VR that combines simulation expertise, system design and visualization technology. The system is applicable to any organization that needs to train staff on the fundamentals and practical operations of a wide variety of pumps, valves, controllers and instrumentation.
The VFLT feeds real-time, high fidelity simulation data into the scenario for a deeper understanding of component and system functions. Instructors can confront trainees with phenomena that are difficult for students to visualize and understand—without risking any safety or environmental issues. Virtual environments are said to be well suited to the next generation of plant workers, which is well-versed in the digital world and expects a hands-on learning experience.
National Instruments is making a concerted push into the oil and gas vertical with a suite of ‘ready-to-use’ equipment and software bundles leveraging NI’s LabView FPGA technology to support oil country control and monitoring applications.
The bundles target hydraulic fracturing in unconventional plays with a pump monitoring and data collection system, a coiled tubing operations supervisor and an emissions monitoring system.
The frac pump bundle includes a tractor-mounted rig with a diesel engine and pump combo. An electronic interface monitors critical functions and provides diagnostic information over an SAE J1939 bus. The NEMA 4X and Zone approved packages operate in temperatures from -40 to 70 °C. The FPGA-based ‘open platform’ interfaces with multiple sensors and software protocols.
An NI LabView system underpins Pemex’s Sistema de monitoreo de variables operativas (Simvo), which monitors and controls around 1.5 million bopd. Simvo replaced Pemex’s legacy systems that relied on phone and e-mail. NI’s field programmable gate array (FPGA) based systems provide high performance compute horsepower in a ruggedized unit. More from National Instruments.
Speaking at the Petroleum Industry Data Exchange (PIDX) fall conference in Houston this month, Pervasive Software CTO Jim Falgout spoke of the ‘big data challenge’ to oil and gas. Falgout outlined different approaches to big data. Analytic databases from Netezza and Teradata are performant but ‘incredibly expensive to scale.’ ‘NoSQL’ data stores such as Cassandra, HBase and the open source Hadoop/MapReduce system offer ‘very cost effective scalability.’
Hadoop’s MapReduce programming model is powerful but hard to learn. Enter Pervasive’s ‘Dataflow’ programming paradigm, which is scalable, supports iteration and is easy to learn. Dataflow offers a graphical programming environment for big data that can run locally or on a cluster. Dataflow programming is explained in a Dr. Dobbs article where Falgout traces its ancestry back to a seminal 1974 paper on parallel programming from Inria researcher Gilles Kahn. More from Pervasive.
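For a feel of what the MapReduce model does, the canonical word-count example maps neatly onto a Unix pipeline (our illustration, not Pervasive’s or Hadoop’s code): ‘map’ emits one key per line, sort plays the role of the shuffle, and ‘reduce’ counts each group:

    # map: emit one word (key) per line
    tr -s '[:space:]' '\n' < input.txt |
    # shuffle: bring identical keys together
    sort |
    # reduce: count each group, largest first
    uniq -c | sort -rn | head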