December 2012


Chevron virtualizes data

Chevron deploys Composite Software’s ‘data virtualization’ to extract business intelligence from complex operations on a giant oilfield. Real-time and ERP data leveraged to automate work orders.

Like BP last year (Oil IT Journal November 2011), Chevron has joined the ranks of companies deploying Composite Software’s ‘data virtualization’ approach to data federation and business intelligence.

Chevron uses data virtualization to combine real time and historical data from around 10,000 producing oil wells on its Kern River operations. Composite data virtualization produces daily work orders that optimize deployment of repair crews and workover equipment.

Operations involve federating real-time crew, equipment and status data from the wells and from SAP’s maintenance management system. Blending historical surface, subsurface and business data from the enterprise data warehouse leads to faster repairs, better uptime and more production.
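As a rough illustration of the kind of federation involved (not Chevron’s or Composite’s actual implementation), the following Python sketch joins hypothetical real-time well status readings with SAP-style maintenance notifications to rank candidate work orders. All names and figures are invented.

```python
# Illustrative sketch only -- hypothetical data and column names,
# not Chevron's or Composite Software's actual implementation.
import pandas as pd

# Real-time well status, e.g. from a historian or Scada feed
status = pd.DataFrame({
    "well_id": ["KR-0001", "KR-0002", "KR-0003"],
    "status": ["down", "producing", "down"],
    "daily_loss_bbl": [45.0, 0.0, 120.0],
})

# Open maintenance notifications, e.g. extracted from an SAP PM system
notifications = pd.DataFrame({
    "well_id": ["KR-0001", "KR-0003"],
    "notification": ["N-1001", "N-1002"],
    "crew": ["Crew A", "Crew B"],
})

# Federate the two sources and rank candidate work orders by lost production
work_orders = (
    status[status["status"] == "down"]
    .merge(notifications, on="well_id", how="left")
    .sort_values("daily_loss_bbl", ascending=False)
)
print(work_orders)
```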

Critical legacy systems were successfully migrated to the new platform with minimum disruption by ‘virtually masking’ the migrated systems. The project was completed ahead of schedule and with ‘lower risk.’ What was forecast as a 30-month project was completed in 18 months, saving ‘significant’ project costs.

Chevron’s Anestacio Rios, project lead for business and engineering application support, received the leadership award for innovative solutions at Composite Software’s data virtualization day in New York earlier this year.

Composite’s Information Server is described as connecting to existing data ‘non-invasively,’ federating disparate sources and delivering information as data services. The server includes a graphical development environment for the design and development of database-centric objects such as relational views and XML web services.

In BP’s 2011 deployment, the performance hit of the virtualization layer was mitigated by running the virtualization server on a hardware appliance from Netezza. Composite now claims that its ‘300 man years’ of query optimization research and optimized adapters provide significant performance optimization. A data source evaluation capability assures optimal federated query performance.

Composite’s approach of comprehensive enterprise-level data federation comes with a bewildering array of tools and technologies for configuring a virtualized infrastructure.

Data can be kept in original source data stores—or cached in a variety of databases and appliances including Netezza, Teradata (we have it on reasonably good authority that Chevron has one) and, most recently, SAP’s own HANA. More from Composite.


DI bags Wellstorm

Wellstorm’s WitsML toolset to form data bridge from Drilling Info’s solutions to data inside client companies’ firewalls.

Austin, TX-based Drilling Info, a supplier of hosted oil and gas business intelligence and decision support technology, has acquired the assets of Wellstorm Development Inc., an information management and development company based in Dripping Springs, Texas. Wellstorm’s technology will create a bridge between Drilling Info solutions and the data behind client companies’ firewalls.

Drilling Info CEO Allen Gilmer said, ‘Wellstorm’s technology aligns with our strategy and R&D objectives.’ The deal sees Wellstorm CEO, Hugh Winkler, taking the position of Drilling Info’s director of research. Wellstorm has developed a suite of well data communication tools based on the Energistics WitsML protocol. Its Subsurfr (Oil ITJ Sept 2012) displays WitsML data hosted on Wellstorm’s servers using open source mapping technology from MapQuest.

Drilling Info is to dedicate additional manpower to support ongoing research and development initiatives related to enhancing Wellstorm’s technology. In 2013, Drilling Info plans to add approximately 60 people to its current team of more than 400 employees. More from Drilling Info.


Zen and the art of standards maintenance

Neil McNaughton celebrates this ‘standards special’ issue with a ‘rushed and poorly thought-out’ guide to standards. How far-off is a standard’s ‘event horizon?’ Are you near the community’s barycenter? Is the technology fit for purpose? Can you understand your technologists?

In this issue we have played catch-up with the conference season. While we attend quite a few ourselves, there is no way we can be everywhere. Fortunately, many conference organizers and standards bodies have the good grace, and I think good judgment, to put their proceedings online, allowing us to capture what I hope is at least the gist of what took place. This month we provide virtual reports from PPDM, OPC-UA, GITA/PODS, POSC/Caesar, a ping from the W3C Semantic web oil, gas and chemicals business group and some new ‘P-formats’ in our ‘Standards stuff’ feature.

Many have tried to describe the essence of what a standard is. I don’t propose to go there exactly, just to look at some of the things that characterize the different initiatives above. But before that, let me say what standards are not. Standards are not a ‘motherhood and apple pie’ phenomenon, an unequivocally ‘good thing.’ Neither, as has been suggested, are they a sure-fire way of ‘avoiding major disasters’ or, necessarily, a way of enhancing operations in the environmentally tricky ‘high north.’

Standards generally involve a time-consuming, possibly costly effort of uncertain outcome. How costly? Well the latest ‘semantic’ EU project, ‘Optique’ will consume a cool €14 million of (mostly) taxpayers’ cash. Standards may fail completely or become commercial ‘footballs,’ kicked around by vendors, used and abused. Thankfully, they do sometimes work and may even be gratified with the ultimate accolade—take-up!

What follows is a rushed and poorly thought out guide to how to spot a ‘good’ standard that will hopefully bring you a degree of success with a minimum of headache.

First reflect on where the standard’s home community is in relation to you. If you work in the upstream, some of the kit that is deployed on, say, a production platform may be the same as that used in a refinery. This gives an apparent degree of comfort in that you are thus a member of a community that extends to, say, the refinery. But it may have a downside if the kit is so widely used that its center of gravity is out in the discrete process (manufacturing) business. Then you will have a smaller voice in how the standard develops. A related issue is the standard’s ‘event horizon.’ The larger the scope, the more ‘foreign’ problems come into view and the more complex and intractable their solution becomes. Even apple pies may be too big to digest!

Next check out the underlying technology. Today, standards are delivered as Word documents, XML schemas and SQL data models—all of which are by now reasonably familiar technologies. They have the advantage that you should not find it too hard to hire people who understand the basic mechanics of the protocol. If the standard uses more esoteric technology like the semantic web (as PCA and its US cousin Fiatech have), then you are talking to a much smaller community of practitioners and expose yourself to the risk of being bamboozled by the technologists.

Another consideration is what exactly is being standardized and how ‘mission critical’ does it have to be. In telecommunications and finance, phone calls and cash circulate around the world thanks to prescriptive, tightly drafted standards. In the upstream, there is in general some wriggle room for personalization and preference. An SEG tape standard or a PPDM database will often require some inspection before use. Why don’t we have tight, prescriptive standards in oil and gas? That is a good question...

You might think that all of this is just an amusing part of technology’s life and that it doesn’t really matter that standards progress slowly—they are after all great forums for getting together and discussing stuff—not at all a bad thing. But what would Neil McNaughton do if he were in charge of a standards project?

Taking it from the top I would suggest the following rules. Use a technology that already has traction (XML or SQL). But also use a technology—don’t deliver in a Word document or a ‘spec’ for an 80 column Ascii card. The consensus today is that XML is the bee’s knees. When you are through developing your standard, provide a validation mechanism. This is part and parcel of the XML package, less obvious in the SQL world (see the sketch below). Use familiar terms that your community really understands and (preferably) agrees on. If you use a standard vocabulary or ‘list of terms’ do not pretend it is an ‘ontology.’ Do not seek to, or (even worse) claim that you will, ‘boil the ocean.’
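To make the validation point concrete, here is a minimal sketch of what ‘part and parcel of the XML package’ means in practice, using the Python lxml library. The schema and instance file names are placeholders, not any particular standard’s deliverables.

```python
# Minimal XSD validation sketch -- 'standard.xsd' and 'submission.xml'
# are placeholder file names, not any particular standard's deliverables.
from lxml import etree

schema = etree.XMLSchema(etree.parse("standard.xsd"))
doc = etree.parse("submission.xml")

if schema.validate(doc):
    print("Document conforms to the schema.")
else:
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
```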

The unfortunately rather common ‘boil the ocean’ phenomenon works at two levels. First, any project that includes the word ‘enterprise’ implies ocean boiling. Second, as soon as IT is involved, there is a tendency to want to abstract a problem set into generalizations. These tend to extend the standard’s event horizon with stuff like ‘data agnostic’ transport mechanisms. This leads to amusing situations such as slideware showing POSC/Caesar’s ISO 15926 as carrying OPC-UA traffic and, elsewhere, OPC-UA as carrying ISO 15926.

You may or may not be impressed by all of this. You would be in good company. You only have to look at how BP (OITJ Nov 2011) and Chevron (this month’s lead) approach enterprise data integration. Both companies, while supporting plethoric standards efforts, are more circumspect when deploying their ‘real world’ solutions and have deployed Composite Software’s data virtualization toolkit—buying or building adapters to native data sources. This may be because at the enterprise level, there are so many ‘standard’ protocols around that there may as well be none!

To sum up. May your standards be focused—small is beautiful. Use your domain specialists to fix domain problems. Don’t fret about stuff beyond the event horizon—there are many off the shelf solutions for data integration at the enterprise level now. Use tried and tested technologies deployed by knowledgeable and straightforward technologists. Talk to them! Ask for their ‘elevator pitch.’ If it’s all Greek to you that is a bad sign. If they say ‘enterprise’ or ‘ontology,’ you could even show them the door. Happy holidays.

Follow @neilmcn


Letter to the Editor

Chief executive Malcolm Fleming defends Common Data Access’ ‘training the data managers’ initiative and argues that subsurface data management is ‘not a sub-division of IT.’

The article captioned “Training the data managers” that appeared in the October edition of the OIT Journal included several important inaccuracies and misunderstandings which I would like to correct.

CDA is working closely with ECIM, PPDM and representatives from vendors and service contractors (all of whom are very actively engaged in subsurface data management) on a broad program to ‘professionalize data management’. Mapping domain competencies is one dimension of this program. Another is the transformation of these competencies into a curriculum which will be made available for the development of consistent education courses and training classes. The ‘portal’ and the self-assessment and third-party skills verification (which you indirectly and disingenuously equate to the questionable LinkedIn endorsement scheme) are intended only as an interim measure on the road towards the third important dimension of our program—an independent, professional accreditation scheme.

It’s clear from the tone of the article that you and we hold significantly different understandings of the term ‘subsurface data management’. We define it as ‘the development, execution and supervision of plans, policies, programs and practices that control, protect, store and deliver subsurface E&P information (comprising geological, geophysical, petrophysical, production and reservoir engineering and associated cultural information and metadata) in all its physical and digital forms and formats.’ No doubt, elements of this definition can be challenged but we think it adequately describes the scope and purpose of subsurface data management.

We are very clear that subsurface data management is not a sub-division of IT (a worthy but separate profession, with its own pioneers and ‘canon’) and in our view subsurface data management does not include such laudable achievements as ‘programming the [Unix/Linux] Shell’ as you suggest. Although the subsurface data management function is frequently located within the IT function, in our view it embodies very many important distinguishing features which are independent of information technology, both proprietary and general. We of course recognize that subsurface data management exists within and depends upon a complicated contextual framework which includes IT, geoscience, applications management and other activities and disciplines but we feel strongly that it has its own, distinct identity.

We see two important reasons for describing and professionalizing subsurface data management. The first pertains to people. It is well documented that industry faces a challenge to attract young people, both in general and in particular to help deal with the so-called ‘crew change’. It is doubly hard then for us to attract graduates (and others) into work that is perceived by many (and here I quote from your article) to be ‘a mixture of some very dull stuff and some very complicated stuff, all embodied in a rather disorderly way of working.’ It is precisely this view that we must change if data managers and the work they do are to be fully recognized and appreciated. It is also crucial that both new entrants and incumbents see a tangible and rewarding career path within this profession. The second argument draws from the crucial contribution that data management makes to understanding the subsurface, and by extension, to the quality and timeliness of E&P business decisions. In a study* we conducted last year, we showed that more than one quarter of business value depends on data and its professional management, yet today we have no consistent way of measuring or assuring the competence of those working in this area.

There are still many detailed aspects of our program that remain unclear but we firmly believe that there is a compelling business need to professionalize workers in this area. It is easy to criticize and condemn, especially something that is immature and aspirational as in this case, but I (and more importantly, those data managers who have put their time and effort into the program so far) would really appreciate a more balanced and more charitable account.

With kind regards

Malcolm Fleming

* The Business Value Case for Data Management, CDA, February 2011.


Flare and the ‘narrow search paradox’

Paul Cleverley compares Google-style search with query expansion on major’s global library.

Flare Consultants’ Paul Cleverley observes that the emergence of tools such as Google’s Knowledge Graph, Apple’s Siri and Wolfram Alpha demonstrates that enterprise search is ripe for a shift from the ‘keyword’ paradigm to more sophisticated techniques. In a recent paper, ‘Improving Enterprise Search in the Upstream Oil and Gas Industry,’ Cleverley discusses how statistical and semantic knowledge representations are set to improve an area where oil and gas ‘has so far lagged behind.’ Organizations face a disconnect between the terminology used in search and the inherent ambiguity of information sources. The mismatch leads to critical information being missed.

Cleverley’s ‘narrow search paradox’ occurs when, as the number of words used in a search increases, the proportion of relevant results actually decreases. He believes that internet-style addition of more terms to ‘refine’ queries may be a problem today.

Enter automatic query expansion (QE) leveraging semantic web technologies and public domain glossaries such as ISO 15926, Schlumberger’s oilfield glossary, and word lists from IHS, the USGS and others. Tests performed on the global document collection of ‘one of the largest corporations in the world*’ showed that QE retrieved, on average, an additional 43% of ‘relevant’ results.
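A minimal sketch of what automatic query expansion can look like, using an invented synonym table rather than any of the glossaries cited above:

```python
# Toy query expansion -- the synonym table is invented for illustration,
# not taken from ISO 15926 or the Schlumberger oilfield glossary.
SYNONYMS = {
    "frac": ["hydraulic fracturing", "fracking", "stimulation"],
    "wellhead": ["xmas tree", "christmas tree"],
}

def expand(query: str) -> str:
    """Rewrite each term as an OR-group of the term and its synonyms."""
    groups = []
    for term in query.lower().split():
        variants = [term] + SYNONYMS.get(term, [])
        groups.append("(" + " OR ".join(f'"{v}"' for v in variants) + ")")
    return " AND ".join(groups)

print(expand("frac wellhead"))
# ("frac" OR "hydraulic fracturing" OR ...) AND ("wellhead" OR ...)
```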

In Google-like search, what counts is what’s on the first page. Corporate search may have different expectations, such as returning an exact list of results. In general there is a trade-off between information recall (completeness) and precision (accuracy). Cleverley shows how semantic technology (Protégé), along with publicly available taxonomies, can be used to improve classification and obtain a better balance between recall and precision.
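For reference, the standard definitions behind those two terms (a reminder, not taken from Cleverley’s paper) are:

\[
\text{precision} = \frac{|\,\text{relevant} \cap \text{retrieved}\,|}{|\,\text{retrieved}\,|},
\qquad
\text{recall} = \frac{|\,\text{relevant} \cap \text{retrieved}\,|}{|\,\text{relevant}\,|}.
\]

Query expansion typically raises recall at some cost to precision, which is where the taxonomies come in to keep expansions on-topic.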

* Shell is mentioned in the text.


BP’s new supercomputer—now at ‘up to’ 2 petaflops

67,000-CPU system with 536 terabytes of RAM shows 267-fold speed-up over 2003 machine.

BP is to up capacity at its high performance computing center in Houston to what it claims is now ‘the largest supercomputing complex for commercial research in the world.’ The new HPC center, scheduled to open mid-2013 at BP’s Westlake Campus, will be a hub for processing and managing geologic and seismic data from BP’s worldwide operations. BP’s current HPC center, with a petaflop of capacity, has maxed out the power and cooling available at its present location. A new three-story, 110,000 square foot facility is being built to house the new machine.

We reported on BP’s supercomputer back in our April 2003 issue and it is interesting to see how things have evolved in the past ten years. The 2003 system, built by HP at a cost of $15 million, comprised 259 cluster nodes each with four processors and 32 GB of memory, making for a total of just over 1,000 CPUs, 8 terabytes of RAM and a 7.5 teraflop bandwidth. By 2007, BP’s system had grown to 14,000 cores and 100 teraflops (OITJ May 2007). The 2013 system’s figures are 67,000 CPUs, 536 terabytes of RAM and a bandwidth of ‘up to’ two petaflops. The result, according to BP, is that an imaging job that would have taken 4 years in 2003 will now run in just one day.
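The 267-fold figure in the headline presumably reflects the ratio of quoted peak compute between the two machines:

\[
\frac{2\ \text{petaflops}}{7.5\ \text{teraflops}} = \frac{2{,}000}{7.5} \approx 267.
\]

The four-years-to-one-day claim for a single imaging job implies a still larger gain, presumably from algorithmic as well as hardware improvements.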

In an interview with HPCWire, BP head of supercomputing Keith Gray confirmed that BP still eschews the GPU as a computational accelerator. The new machine is built around a variety of Intel Xeon multi-core CPU nodes from HP and Dell. The network is Arista’s Ethernet and storage system providers include Panasas, IBM, and DataDirect. More from BP.


BGS leverages GSI3D mapper in 3D UK geology

New geological fence diagrams offer non conventional, geothermal exploration insights.

The British Geological Survey (BGS) has released GB3D, a new 3D geological map of Great Britain, a set of geological fence diagrams that crisscross the country, illustrating the UK’s solid geology. GB3D targets users in natural resources exploration, educators and the public.

GB3D, a.k.a. the National bedrock fence diagram can be downloaded from the BGS website in a variety of formats.

Cross sections were created using the GSI3D geological modeling package (0701) along with surface mapping, boreholes and geophysical data. BGS executive director John Ludden said, ‘The new model clearly shows the structure of our key aquifers. Understanding their 3D geometry will help safeguard these resources and provide a foundation for the development of new resources such as shale gas and geothermal heat sources.’ Download GB3D here and read the ‘making of’ here.


2012 PPDM Calgary data management symposium

ERCB’s GeoDiscover portal. Noah on ‘Well_Component.’ Neuralog on raster data, ETL’s ‘practical data management.’ EnergyIQ on business rules. Safe’s data ‘harmony.’ geoLOGIC on PPDM usage.

Dwayne Popowich (Energy Resources Conservation Board Alberta, which aims to be the ‘best non conventional regulator in the world’) is working on an ‘open data’ concept to provide public access to electronic data in accessible and machine readable formats with minimal restrictions. Enter the GeoDiscover and Oil sands information portals. ERCB is working on geodetics, reserves classification and metadata standards for text and spatial data.

For Noah’s Paul Haines and John Ruddy, the PPDM Well_Component table (new in V3.8) is the key to enterprise data federation. Noah uses the table to tie different PPDM sub-models together in an ‘architecturally consistent way.’

Neuralog’s Rob Best provided guidance on raster log management—showing how to migrate an inefficient combination of files on disk and data in project data stores to a managed environment in NeuraDB. NeuraDB loaders offer a range of tools to match raster logs to wells, extract header data from SIF files and crawl network drives for images. Metadata can be built from file names and folder structures or extracted from project data stores.
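The sort of file name and folder harvesting described might look like the following in outline. The WELLNAME_LOGTYPE.tif naming convention and the share path are invented for illustration and are not NeuraDB’s actual loader logic.

```python
# Hypothetical sketch of building raster-log metadata from file names --
# the WELLNAME_LOGTYPE.tif convention and share path are invented.
import csv
import re
from pathlib import Path

PATTERN = re.compile(r"(?P<well>[A-Z0-9-]+)_(?P<log>[A-Z]+)\.tif$", re.I)

def harvest(root: str, out_csv: str) -> None:
    """Crawl a directory tree for raster images and record well/log metadata."""
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["file", "well", "log_type"])
        for path in Path(root).rglob("*.tif"):
            match = PATTERN.match(path.name)
            if match:
                writer.writerow([str(path), match["well"], match["log"]])

harvest(r"\\fileserver\raster_logs", "raster_metadata.csv")
```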

Richard Cook (ETL Solutions) is an advocate of ‘practical data migration’ supported by documented templates. These allow for systematic data discovery, cleansing, testing, migration, documentation and reporting. A key issue is how to manage the legacy and new systems in parallel, which one is active, and how and when to switch off the old system. Good tools for data migration include Altova MapForce and ETL’s own Transformation Manager suite. Cook concluded by observing that migrating to PPDM 3.8 was challenging and required standard mapping rules, test data and a good understanding of the data model.

Steve Cooper (EnergyIQ) argued that the PPDM business rules initiative should include ‘a consistent process’ for their application and more quantitative assessment of data quality. The key is to group related information into manageable data objects such as well location, directional surveys and tests. Rules are described in standard SQL and stored in the PPDM data model.
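By way of illustration only, a location rule of the kind described might be expressed in plain SQL and run against the repository. The table and column names below loosely follow PPDM naming conventions but are not guaranteed to match the 3.8 schema.

```python
# Illustrative only: the WELL table and column names loosely follow PPDM
# naming conventions but may not match the 3.8 schema exactly.
RULE_NAME = "well_surface_location_valid"
RULE_SQL = """
    SELECT uwi
    FROM   well
    WHERE  surface_latitude IS NULL
       OR  surface_longitude IS NULL
       OR  surface_latitude  NOT BETWEEN -90  AND 90
       OR  surface_longitude NOT BETWEEN -180 AND 180
"""

def run_rule(connection) -> list:
    """Return the UWIs of wells failing the rule (any DB-API connection)."""
    cursor = connection.cursor()
    cursor.execute(RULE_SQL)
    return [row[0] for row in cursor.fetchall()]
```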

Kris Majury, FME ‘ambassador’ with Safe Software, enumerated some of the many SQL and Python tools and applications in the data manager’s arsenal. Majury showed how Safe’s FME Workbench and Server can be deployed to perform data harmonization tasks on-demand or as scheduled tasks.

Wes Baird gave some practical advice on PPDM usage based on his experience with geoLOGIC Systems. Baird showed that a good knowledge of data and the model is required to realize the vision of an ‘enterprise data repository.’ You need to ‘understand chaos,’ ‘un-layer’ data and stick to a schedule. It is a lot of work! In the context of Baird’s advice, it is interesting to reflect on the fate of the ‘PPDM in a box’ initiative. This attempt to ‘shrink-wrap’ a PPDM database appears to have stalled.

The PPDM presentations are now online.


Software, hardware short takes

Blueback Reservoir, Dynamic Graphics, ClearEdge3D, Energy Graphics, Energy Solutions, Kongsberg, Yokogawa, Schlumberger, Oncam Grandeye, OSIsoft, Pegasus Vertex.

Blueback Reservoir’s Toolbox V13 includes a project history overview, spectral blueing and a well testing workflow including pressure deconvolution. A new version of the Geodata Investigator supports data analysis in Petrel combining well data, seismic, 3D models and surface data in cross plots and histograms.

The latest 6.0 release of Dynamic Graphics’ CoViz 4D includes ‘Pem2Seis,’ a petroelastic model to synthetic seismic generation tool, a new integrated well designer, viewer-integrated time-to-depth conversions and 3D stereoscopic display.

Release 4.0 of ClearEdge3D’s EdgeWise Plant extracts ‘up to’ 90% of the pipes in a scene providing dimensionally accurate valve, flange and other component placement. A virtual demolition tool removes pipes and associated points from billion-plus point clouds.

Energy Graphics has touch-enabled its Intellex suite offering interactive wall maps and data reports on the iPad, managed and served from the new Intellex GIS server.

Energy Solutions International’s Synthesis 11.0 pipeline manager includes enhancements for multiple tariffs on nomination measurement and operations and configurable data fields for ticketing.

Kongsberg’s LedaFlow 1.2 enhances hydrodynamic slug prediction and helps mitigate flow disruption. A new buried pipe feature evaluates corrosion risk and helps material selection for flowlines, risers and pipelines.

Yokogawa’s ProSafe-RS safety instrumented system now operates reliably in high-temperature environments such as desert locations. Modbus/TCP communications improves compatibility with third party control systems.

Schlumberger has announced a shale field development plug-in for Petrel. The package combines reservoir and completion quality indicators into a map of ‘sweet spots.’

Oncam Grandeye has announced Evolution ExD, a 360°, explosion-proof surveillance camera for harsh and hazardous environments such as on an oil and gas production facility.

The 2012 release of OSIsoft’s PI Server promises improved performance, reliability and security. PI Server scales to capture ‘burst’ data rates from millions of data streams and lets users keep all their historical data online. New ‘backfilling’ features capture data from multiple systems into a single data store which can be viewed with a new PI Coresight iPad app.

Pegasus Vertex has announced CEMLab, a cement lab data management system that assigns tasks, manages test results and generates reports, replacing ‘tedious’ manual bookkeeping. Users can log in to the tool from a web browser.


National Data Repositories 11, Kuala Lumpur

Petronas-hosted event provides snapshot of NDR deployments around the world.

The 11th edition of the Energistics-sponsored National Data Repositories (NDR) gathering was hosted by Petronas in Kuala Lumpur earlier this year. In this brief report we have grouped the submissions (to the best of our ability) by the main supplier involved.

Oldar, the oil data library of the Azerbaijan Republic was delivered by Fugro Data Solutions earlier this year and is currently under further development. Oldar provides data to entitled users within Socar, the state oil company.

Colombia’s E&P information service (EPIS) is run by local service providers Colosoft and Geoconsult using software (Whereoil) from Kadme. The 72 terabyte geoscience NDR is owned by the regulator, the Agencia Nacional de Hidrocarburos. A second ‘Production Monitor’ database is under development by Schlumberger.

Norway’s Diskos, the granddaddy of all NDRs, is currently operated by Landmark using its PetroBank software (originally developed for Diskos). The very mature deployment (Diskos started out in 1995) now holds some 320 terabytes of data. Suriname’s Staatsolie operates an OpenWorks repository alongside the iGlass seismic data management system from Kelman (now Katalyst). Pakistan’s Petroleum E&P data repository (PPEPDR) is likewise a PetroBank-based solution, extended with LMKR’s Exploration Management System, a multi-user desktop GIS and data management application. LMKR also manages the PPEPDR, which started in 2001 and is now a 22 terabyte data set.

In the Schlumberger camp, we have Algeria whose Alnaft NDR is under development leveraging ProSource and the Seabed data model. The Nova Scotia offshore petroleum board’s data management centre is run by Schlumberger under a service contract. The ‘small’ data center nonetheless holds over 3 terabytes of log curves, images and seismic data. A hardware and software refresh is scheduled for March 2013. South Africa’s Petroleum Agency also deploys Schlumberger’s technology—Finder, LogDB and AssetDB—along with the LaserFiche document management system. The UK’s 20 terabyte Common Data Access NDR for offshore data is currently under Schlumberger management while the onshore UKOGL database is run by Lynx.

Other presentations either did not name a particular software provider or at least implied that theirs were in-house developments. Several of these (like Uganda’s Crane E&P database) are home-brew systems developed on Microsoft Access, SharePoint and other off-the-shelf software. Some (Ghana, Kenya) have support for an upgrade path from Norway’s Petrad.

2013 looks to be an interesting year for providers of NDR products and services as Brazil’s ‘BDEP’ NDR is due for a major technology refresh (the tender is out). Meanwhile, Mexico’s National hydrocarbons commission (CNH) appears to be wresting control of state data from Pemex with a legal mandate for a new E&P database—currently a Microsoft SQL Server pilot with plans for a business intelligence platform.

Holland’s ‘Dino’ national subsurface data centre is likewise subject to new legislation that promises open access from a Dino 2.0 next generation register of subsurface data integrating with the country’s eGovernment systems. Visit the NDR home page here.


SEG 2012—Las Vegas

Plenary ‘social responsibility’ session. M-OSRP and seismic ‘group therapy.’ Cable-less acquisition surveyed. Automated and touch-enabled interpretation. Micro seismic monitoring of frack jobs. SpiceRack autonomous marine sensors. Fractal methods. Nvidia’s Index. Petrel on SQL Server.

President Bob Hardage gave his state of the society address at the SEG council meeting. Hardage opined that the SEG’s focus should be technology and the advancement of applied geophysics. Geology and engineering should be the bailiwick of the AAPG and SPE respectively. The SEG is working to implement new governance and improve communications between its executive council and the 33,000-strong rank and file. A new ‘Interpretation’ publication is to hit the newsstands mid 2013. Corporate sponsorship is at a historic high and overall revenues are nearly $20 million. A new unconventional resources technology conference, ‘URTeC,’ will kick off, in partnership with AAPG and SPE, in Denver next summer.

Five years ago, the SEG ‘almost came to extinction’ following a constitutional crisis. This was solved with a ‘great compromise’ as the board replaced the council as the SEG’s governing body. Accordingly, Hardage wound up by ‘retiring’ the old SEG gavel and handing a brand new one to incoming ‘first ever’ council chair, Mike Graul.

Peter Pangman reported from the SEG’s first ‘corporation,’ SEAM, the SEG Advanced Modeling Corp. SEAM filled the need for industrial strength earth models to test and collaborate on novel technologies. SEAM Phase I is now in its 6th year, funded by the US Govt and 24 corporate sponsors. Phase I has produced a 220 terabyte dataset of a deepwater target. 22 members have already signed up for Phase II, which addresses onshore exploration including non conventional targets.

Perhaps the new organization explains why the SEG missed a trick or two in how the annual conference proceeded. The Honors and Awards session was well attended, but the opportunity for ‘communication’ was passed over as there was nothing in the way of discourse or address. Another disappointment came in the Monday plenary session, the worst-attended event we have ever seen at a major exhibition, with under a hundred present in the cavernous ballroom for the 10am kick-off. The reason? Well, the subject could hardly have been further from Hardage’s exhortation to focus on geophysics. The BP-sponsored event elected to investigate ‘corporate and academic social responsibility; engagement or estrangement?’

Jonathan Nyquist (Temple University) observed that the ‘e-word’ (environment) is being replaced by the ‘s-word,’ sustainability (not shale!). The ‘Geoscientists without borders’ (GWB) program is facing a funding problem. Nyquist encourages corporations to get involved to enhance their reputation and ‘build the workforce’ in the face of a geoscience demographic deficit—forecast to be around 150,000 by 2010.

CGGVeritas was the only corporation that responded to the SEG’s invitation, although the event was sponsored by BP. While Isabelle Lombert joked that she might get the Pinocchio ‘greenwashing’ prize, she made a good case for CGG’s efforts to limit the impact of seismics with narrow, meandering lines and minimal clearance. Elsewhere, in streamer design and high performance computing, ‘green’ equates with efficiency. CGG claims the largest oil immersion HPC data center in the world—with ‘90% energy savings.’ Employees have a corporate citizenship program and get ‘solidarity leave’ to work with NGOs. CGG also supports GWB and is a member of the UN Global Compact reporting initiative. This involves 90 key performance indicators and is a ‘daunting task.’

Mike Oxman, from consultants Business for Social Responsibility, offered a new industry paradigm for transparency, social and legal impact and human rights. All of which requires navigation through a complex landscape of stakeholders (UN, OECD, ISO 14001, GRI, FCPA and SEC) and national laws. ‘Studies show’ that CSR has a positive return on investment. In the debate, one speaker mentioned the irony of a social responsibility discussion taking place in a ‘totally unsustainable city dedicated to conspicuous consumption.’

Art Weglein (M-OSRP) presented the first field data examples of direct depth imaging without a velocity model. Weglein believes that research is prejudiced in favor of the ‘velocity field’ and that if you claim, as he does, to have developed a method to process sans velocity, ‘everyone breaks out in hives!’ All processing objectives, multiple removal, attenuation, depth imaging etc., can be achieved without subsurface information. Weglein’s ‘group therapy,’ a.k.a. the inverse scattering series (ISS) has layers ‘chatting amongst themselves’ until they output correct depths and flattened gathers. The latest M-OSRP tests, Weglein claims, demonstrate the viability of ISS. The overburden reflections avoided in conventional imaging are precisely what ISS imaging leverages. More from M-OSRP.

Total’s Henri Houllevigue reviewed the state of play in ‘cableless’ acquisition, observing that there is probably only one true cable-less system in use today. Power and communications requirements mean that most are a combination of cable and wireless systems. But there is a general recognition that cableless is enabling better resolution with point receivers.

As indeed was borne out by presentations from Saudi Aramco (covered in last month’s editorial) with a 100,000 trace land trial that heralds a revolution in seismic quality and a corresponding boom in data volumes. The quality enhancement is driving new automated workflows for both processing and interpretation. As Brian Wallick observed, horizon autotracking on the old data was a ‘70% solution.’ Point source has brought this to ‘nearly 100%.’

Several presentations and vendors presented work in this space. Paradigm has announced new ‘constrained autopicking’ in Skua—with the ability to track ‘hundreds’ of horizons in a depositional context. Eliis’ Paleoscan adds a ‘seismic stratigraphic’ element to a semi-automated interpretation workflow.

XinMing Wu (Colorado School of Mines) presented a three step process to go from seismics to the seismic stratigraphic Wheeler diagram, using instantaneous phase and a cost minimization algorithm to transform the image to relative geological time. The results were displayed as a Wheeler volume movie.

More compelling was the presentation by Saverio Damiani of Schlumberger’s ‘Extrema/seismic geology.’ Extrema promises automated seismic stratigraphic interpretation by identifying events such as horizon terminations and delimiting sequences. Output is again the Wheeler diagram along with seismic facies. Today’s interpreters are confronted with numerous subtle reflector terminations, unclear chronostratigraphy and problematic surface extraction. Extrema’s objective is to standardize interpretations. Curiously, the ‘automated’ technique is currently only available from Schlumberger’s petrotechnical service unit using an ‘internal’ Petrel plug-in. The results were shown using offshore Angola data and are said to be useful in jump correlation from one basin to the next and to create a detailed static model for input to Schlumberger’s Visage geomechanical modeller. The tool is claimed to result in a ‘huge increase in interpreter productivity.’

Terraspark’s ‘Turbo AFE’ accelerated fault extraction leverages GPU technology for automatic fault extraction. It works as a background task on an undecimated 3D volume. The tool runs on a ‘desktop,’ actually a rather chunky four unit box on the floor!

Halliburton/Landmark is blurring the processing/interpretation boundary with SeisSpace/ProMAX, now part of a ‘greater DecisionSpace and OpenWorks framework.’ SeisSpace now offers high volume parallel processing on distributed memory machines. The SeisSpace API has been used to good effect by poster children Crestone Seismic and Canonical Geoscience. The DecisionSpace interpretation tool was being shown with a new touch-enabled interface using a hardware overlay from Perceptive Pixel (now bought by Microsoft). Touch-enablement is seen as key in the development of new interpretation workflows and for the mitigation of repetitive strain injury—said to be ‘a huge challenge for clients.’ In the same vein, Halliburton now offers its pore pressure prediction app on the iPad—including access to the cloud for historical data. Halliburton also uses iPads in the field with OpenWells mobile and for data entry to EDM.

The geophysical profession is throwing all it has into the non conventional boom, with a variety of specialist seismic acquisition and monitoring technologies on offer. Peter Duncan, founder and CEO of MicroSeismic, gave a spirited account of how surface monitoring is key to understanding what is happening during hydraulic fracturing. Currently, ‘only 4%’ of fracked wells are monitored. Additionally, monitoring ‘confirms containment,’ i.e. can be used to demonstrate to environmentalists and others that fracs do not affect the water table. Monitoring may also be used to provide alerts when surface motion exceeds a threshold, as is now mandatory in the UK. MicroSeismic has opened an online ‘Reservoir intelligence’ portal of monitoring information and offers real time communication of frac data to Houston and an engineer’s iPhone, ‘so he can be golfing as he fracs!’

CGGVeritas CEO Jean-Georges Malcor and Baker Hughes VP Andy O’Donnel announced a collaboration on shale gas operations. Again, seismic monitoring is seen as key to ‘show environmentalists that we don’t affect the water table or activate faults.’

In another teaming, CGG hooked up with Saudi Aramco on ‘SpiceRack,’ a very high end autonomous, cable-less node for full component sea-bed acquisition. SpiceRack gets a check-out while on the mother vessel before sliding down the launcher to the ocean bottom. After the mission completes, it swims back under its own steam to a ‘dream catcher’ on the support vessel. Despite the sexy technology, it is hard to equate the deployment of a very limited number of high end nodes with the push for throw-away geophones for million-trace deployment. Whatever, vive la différence!

There was considerable buzz around Chris Green’s (Getech) poster presentation on Cryosat-2, a new high resolution geodetic altimetry satellite that promises a ‘renaissance’ of satellite gravity processing and ‘imaging.’ Some potential field ‘forgotten truths, myths and sacred cows’ were visited by Alan Reid (University of Leeds) including the likelihood that magnetic data is self similar i.e. ‘fractal.’ Self similarity ‘pervades geology’ and is ‘where the real profit is to be made in the next 20 years.’ Reid says ‘put your best graduate students onto it.’ The fractal concept should be recognized as a paradigm shift although ‘few of us [oldies?] can make it, we can start.’

On the exhibition floor, Trimble was showing ‘GateWing,’ a small photogrammetry drone with pinpoint GPS. The exchangeable polystyrene wings and body are good for ‘3 normal landings or one bad one.’ The $70k device was used to provide spectacular imagery of Easter Island in a recent documentary.

Fraunhofer’s global address space programming interface, GPI-Space, provides a virtual memory layer across a cluster and a generic workflow engine. GPI-Space is used by Statoil to parallelize Seismic Unix and legacy Fortran/C codes. Fraunhofer may consider open sourcing GPI-Space ‘if we can find the right business model.’

On the Paradigm booth, Nvidia showed work done for a major oil using its ‘interactive and scalable subsurface data visualization framework,’ Index. Index is a production of Nvidia’s advanced rendering center in Berlin, whose ‘Mental Ray’ is used in film computer graphics (The Hobbit, Superman). Index is a specialization of the technology for seismic feature extraction and imaging. The toolkit includes a distributed computing library for parallelizing across heterogeneous CPU/GPU architectures, and a ‘geospatial’ library for ray tracing and visualization. The ‘simple’ API hides the details of the cluster, allowing interaction with huge datasets from a ‘web browser or iPad.’ Paradigm is working to leverage the technology to provide compute intensive capability to e.g. Barnett Shale field workers.

Finally, word on the exhibition floor was that Schlumberger will be rolling out a version of Petrel running on a Microsoft SQL Server database ‘real soon now.’ We would like to tell you more but Big Blue is too successful at ‘navigating the media maze’ for us!


Folks, facts, orgs ...

API, Altor, Belsim, BP, CGGVeritas, Clariant, CSC, DataMarket, Dril-Quip, Energistics, ForgeRock, GeoNorth, Hercules Offshore, HP, Huntington Bank, IEA, Iocom, ION Geo, Iron Mountain, Mahindra Satyam, DNV, National Oilwell Varco, Deloitte, Phusion, Picarro, Sempra Energy, Fugro.

Marathon Oil’s Clarence Cazalot has been elected chairman of the API board.

Tor Krusell is now head of communications for Swedish fund Altor. He was formerly executive VP with Skanska.

Luc Rossion has joined Belsim as project manager of the Satorp. Suryaprakash Digavalli and Nikolaos Kalampalikas have joined the downstream engineering group. Belsim has also hired Karl Wilvers, Julie Couturier and Stephane Le Roux as developers.

BP has named Lamar McKay chief executive, upstream.

Daniel Valot is to sit on the CGGVeritas board representing the French government’s FSI shareholding.

Clariant has appointed John Dunne as head of its oil and mining business unit, replacing Christopher Oversby.

CSC has named Doug Tracy as CIO. He was previously with Rolls-Royce.

DataMarket has launched an energy data market portal, a subscription service for time series and survey data from official US and international energy information sources.

COO of CASA Exploration Terry Jupp has been appointed to Dril-Quip’s board.

Eric Toogood of the Norwegian Petroleum Directorate and Weatherford’s Espen Johansen have been elected to the Energistics board. New Energistics members are Beijing Rigsite IT, Energysys, the Iraqi Ministry of Oil, Key Energy Services, Macrosoft and Sekal AS.

ForgeRock has named Mike Ellis as CEO. He hails from i2 Technologies and SAP.

GeoNorth has recruited Jason Kettell, Kevin Traver and Shaun Wilson to its development team.

Hercules Offshore has appointed Jim Noe as Executive VP, Cecil Bowden as VP engineering and capital projects, and John Crabtree as VP technical support, compliance and maintenance.

HP has appointed Mike Nefkens as executive VP enterprise services. He was formerly with EDS.

Stephen Hoffman has been appointed MD, energy banking for Huntington Bank.

Didier Houssin has returned to the IEA as director of sustainable energy policy and technology.

Dan Marchetto has joined Iocom as VP business development.

Ken Williamson has been promoted to Executive VP and COO of the Geo-Ventures division of ION Geophysical. Chris Usher has joined as executive VP and COO of ION’s new geoscience division.

Iron Mountain has appointed William Meaney as president and CEO.

Mahindra Satyam has appointed Rebecca Blalock as strategic advisor for its energy and utilities business. She was previously Senior VP and CIO of Southern Company.

Stefan Nerpin has been named group VP communications and external relations for DNV. He hails from Vattenfall.

Clay Williams has been named president and COO of National Oilwell Varco. Jeremy Thigpen takes his place as senior VP and CFO.

Deloitte has named John England as leader of its US oil and gas sector. He replaces Gary Adams, now Deloitte Consulting’s lead rep at a major oil and gas client.

Engineering IM specialist, Pearson-Harper has rebranded as ‘Phusion.’

Subra Sankar is now VP engineering with Picarro. He was formerly co-founder of Artium Technologies.

CEO Debra Reed has been elected chairman of Sempra Energy, succeeding Donald Felsinger, who is retiring.

Arnold Steenbakker is stepping down as chairman of Fugro’s board following a ‘difference of opinion’ over the company’s future direction. He is replaced by vice-chairman Paul van Riel.


Done deals

Aker Solutions, Canrig, McJunkin, PetroSkills, Platte River Equity, Weatherford, Kongsberg Oil & Gas.

Aker Solutions has terminated its agreement with the seller of NPS Energy. Aker is to buy Canadian asset integrity management specialist Thrum Energy.

As a component of a strategic partnership, Canrig has acquired the rights to a portfolio of Oiltech’s rig products and technologies.

MRC Global’s US operating subsidiary, McJunkin Red Man, has signed an agreement to acquire the operating assets of Midland, Texas-based Production Specialty Services, Inc.

PetroSkills has acquired the training and consulting businesses of John M. Campbell & Company of Norman, Oklahoma.

Platte River Equity has acquired Oklahoma-based The WellMark Company, manufacturer of liquid and pneumatic flow controls and valves serving the oil and gas and petrochemical industries.

The SIX Swiss Exchange has granted Weatherford’s request for an extension of the due date for publishing its half-year interim report for the first half of 2012 until December 17, 2012. Weatherford is to make amended filings reflecting adjustments to prior periods resulting from the company’s thorough review of its historical accounting for income taxes.

Kongsberg Oil & Gas Technologies is to acquire 100% of the shares in software developer Advali. Advali’s 120-strong workforce is based in Bangalore. Kongsberg Oil & Gas is also to acquire Apply Nemo AS, a supplier of engineering services, products and solutions for subsea oil and gas applications. Apply Nemo is headquartered in Oslo, Norway and has 172 employees. Turnover for 2012 is expected to be approximately NOK 275 million.


OPC-UA Inaugural tech summit Orlando

Schlumberger ‘OPC-UA a lifesaver.’ Siemens’ transition in full swing. Indusoft, Matrikon on board.

According to Tom Burke, OPC Foundation chairman, the new ‘universal architecture’ (UA) is a ‘multi-platform, secure interoperability protocol for moving data from embedded devices to the enterprise.’ The original OPC was based on Microsoft’s ‘object linking and embedding’ COM protocol. The inaugural OPC technology summit, held earlier this year in Orlando, focused on UA as the new ‘glue that binds innovative devices together.’

Mbaga Ahorukomeye (Schlumberger) sees OPC-UA as a route to Scada systems, one which can be extended to industry-specific protocols like Wits/WitsML. Schlumberger leverages KepWare’s OPC-UA technology at the heart of a ‘generic’ rig interface that runs the auto driller, top drive and mud pump and controls real time data flows and rotary steerable systems. Domain specific protocols are converted to OPC-UA with RigAdapter, a software library that implements a control state machine leveraging OPC Foundation C# libraries. OPC-UA is seen as a way to interface with multiple legacy tools and services. Ahorukomeye concluded, ‘Adopting OPC UA in our drilling automation architecture was one of the best things we did.’
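For readers unfamiliar with UA, a minimal client read might look like the sketch below, written with the open-source Python ‘opcua’ (freeopcua) package. The endpoint URL and node id are invented placeholders; Schlumberger’s RigAdapter itself is built on the OPC Foundation C# libraries.

```python
# Minimal OPC-UA read sketch using the open-source 'opcua' (freeopcua)
# package; the endpoint URL and node id below are invented placeholders.
from opcua import Client

client = Client("opc.tcp://rig-gateway.example.com:4840")
client.connect()
try:
    # e.g. hook load published by a (hypothetical) rig interface server
    node = client.get_node("ns=2;s=Rig.Drawworks.HookLoad")
    print("Hook load:", node.get_value())
finally:
    client.disconnect()
```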

Siemens’ head of software Thomas Hahn outlined how OPC-UA offers a standard interface across all levels of automation. It is OS independent and provides better security than vanilla OPC. Siemens sees OPC-UA certification as a route to interoperability, ‘the transition to UA is in full swing.’

Dave Emerson (sic) presented Yokogawa’s UA Server for its Centum VP platform which translates the Centum protocol to OPC-UA. Yokogawa’s Fast/Tools is claimed to be the first UA application certified for Scada. Key benefits include ‘industrial’ web services, a multi-protocol information model and cross platform capability. UA will ‘increasingly be a component of Yokogawa’s products.’

Chris Coen described legacy OPC as one of the most popular (of 450) interfaces to OSIsoft’s PI System. A UA interface is under development—the hope is that this will ‘free us from DCOM and a Windows-centric world.’ Moreover the new web services architecture is ‘probably a lot more secure than aging legacy protocols with well documented deficiencies’.

Indusoft’s Scott Kortier sees OPC UA as slowly being embraced by oil and gas drilling companies. OPC addresses the limitations of WitsML, which is ‘drilling only,’ provides limited opportunities for better well planning, optimized drilling and completion, and generally lacks ‘enterprise level’ knowledge management and optimization. Drivers for change include disasters like Macondo, HSE and risk issues and better project management. The future will see UA-based integration from office to rig instrumentation. The digital oilfield is at a critical point in its development; real time data structures have evolved with little consideration of overall design and are too closely linked to proprietary protocols. The future is UA.

Jeff Gould revealed that 30% of MatrikonOPC’s business is now UA. This includes managing Linux data from a wellhead data logger and flow and level meter data. At a higher level, OPC-UA provides plant status and ‘enterprise level’ communications over http sans DCOM. ‘It’s the first time VPs have had real time visibility.’ OPC-UA also promises a longer software life cycle. On the security front, NERC-CIP and API compliance was cited. Chevron, Shell and Honeywell are users. Presentations available from the OPC Foundation.


GITA Oil and gas pipeline conference, Houston

PODS and IPLOCA team on new worldwide standard for pipeline construction.

Cheryl Bradley (Chevron Pipeline) set the scene at the Geospatial Information Technology Association’s (Gita) annual meet in Houston, describing how the regulatory environment dictates an integrated approach to the maintenance of up-to-date, geospatially-referenced pipeline materials and integrity management data in its environmental, population and cultural context.

Earlier this year the Swiss-headquartered International pipe line and offshore contractors association (Iploca) met with the US-based Pipeline open data standards (Pods) to discuss data standards for new pipeline construction. The idea is to extend the current model, which concentrates on operations and maintenance, to include structures for new construction along with the transfer of critical information and metadata. Pods director Janet Sinclair stated that a joint workgroup is now gathering information pending formal project approval from the Pods and Iploca boards.

Questar’s Ted Peay told how accurate ‘as-built’ field notes can be captured with GPS devices such as Trimble’s GeoXH and a laser rangefinder. A methodology was presented from data dictionary creation through data capture and review. This allowed for the creation of lines, polygons and points from GPS shots with ‘sub foot’ accuracy.

Noble Denton’s Jeff Puuri teamed with Mike Greene (South Carolina Electric & Gas) to describe integrating field data collection with GIS for compliance with the US Pipeline safety, regulatory certainty and job creation act, which was signed into law earlier this year. Section 23 of the Act governs maximum allowable operating pressure (Maop) determination, verification and reporting. The proposed solution blends the APDM (ArcGIS) data model with PODS to visualize Maop status, corridor analysis and network modeling.

Willbros Engineering’s Peter Veenstra analyzed the potential of the cloud for pipeline data management. The cloud lessens the relevance of data models, with a shift from storage and structure to processing and the ‘agility’ of noSQL and ‘discoverable’ data. Note that the move to the cloud is not a GIS, IT or legal decision. It is a ‘C-level decision’ about whether businesses ‘want to build another GIS or operate a pipeline.’ Delegating such decisions to the IT department isn’t necessarily the right idea. Concerns over security are red herrings. Presentations on the GITA website.


Sales, contracts, partnerships and deployments

TNO, Allegro, Aveva, Cameron, Schlumberger, Coupa Software, OFS Portal, Energy Software, Center Technologies, EqHub, FEI, Society of Economic Geologists, Elsevier, Intergraph, KBR, Petrofac, Rolta, SAP, IFS, Paradigm, GE/Wayne.

TNO has successfully completed a pilot assessment of its production optimizer for Kuwait Oil Company covering two sector models of the Minagish field. The study showed a potential reduction of 15% in water injection and scope for increased production. TNO has also signed a four year frame agreement with Statoil for consultancy and R&D services.

Addison, Texas-based Murex has chosen the Allegro 8 platform to manage its petroleum trading and distribution operations.

Suncor Energy has implemented Aveva Net to support data management at a new bitumen processing project in Alberta. The tool provides tag data validation and the identification of maintenance-significant tags and key attributes for upload to Suncor’s SAP production and maintenance system.

Cameron and Schlumberger have created OneSubsea, a 60/40 joint venture to manufacture and develop products, systems and services for the subsea oil and gas market. Cameron contributes its existing subsea division and receives $600 million from Schlumberger. Schlumberger will contribute various of its businesses. The JV will be managed by Cameron.

Cloud-based spend optimization solutions provider, Coupa Software, has joined the OFS Portal community.

Energy Software has announced a new partnership with Center Technologies, a Houston-based IT company specializing in enterprise consulting and cloud computing services.

Teledyne Cormon, Glamox, Harris Norge, Den Norske Høytalerfabrikk, Wärtsilä Oil & Gas Systems, and Faroe Petroleum Norge have signed with Norway’s EqHub.

FEI has delivered a QEMSCAN WellSite analysis system to Kirk Petrophysics, providing onsite analysis of drill cores.

The Society of Economic Geologists has contributed some 15,000 maps to Elsevier’s Geofacets service.

Samsung Heavy Industries has selected Intergraph SmartMarine 3D for its offshore division.

KBR has been selected as BP’s preferred engineering and project management service provider for select stage activities on its ‘Project 20K™’ program.

Petrofac has been awarded two engineering, procurement and construction packages for Saudi Aramco’s Jazan refinery and terminal project, with a combined value of $1.4 billion.

Lundin Norway is to implement a range of Aveva’s tools on its first operated oilfields in the North Sea.

Rolta has entered into a strategic partnership with SAP to deliver its business intelligence technology as a component of Rolta OneView enterprise suite.

IFS has signed a contract with ‘a major North American offshore oil and gas drilling company’ to implement IFS Applications as its enterprise asset management system. The suite will be delivered by IFS North America as a hosted service.

Cougar Drilling Solutions has adopted Paradigm’s Sysdrill software as its preferred solution for well planning and directional drilling.

GE unit Wayne has announced the first installation of its Wayne Vista CNG ‘pay-at-the-pump’ dispenser.


Standards stuff

OGP P-Formats, IPAC-CO2 carbon capture, Energistics, OSGeo MapServer, OWL 2.0 for energy.

The OGP geomatics committee has updated its ‘P formats’ used for marine seismic position data. Changes include a common header containing summary and configuration data and improved definition of coordinate reference systems. The OGP P1/11 and OGP P2/11 formats have been formally sanctioned by the committee and are available from the OGP website.

The CSA Group and IPAC-CO2 Research have announced a ‘bi-national’ (Canada and USA) standard for the capture and storage of CO2. The ‘CSA Z741’ standard targets saline aquifers and depleted hydrocarbon reservoirs and provides guidelines for regulators and industry and will be the basis of a future ISO standard.

Energistics has started work on a new WitsML API for high-frequency, low latency real time data, the first step toward a ‘next-generation’ API based on new technology. The ProdML team has drafted a three year roadmap which will focus on surveillance and optimization, industry reporting and a shared asset model. Development of a new ‘completion’ data object is nearing, err..., completion. The ResqML team expects to deliver V2.0 by mid year 2013 with a focus on defining the relationships between the earth model components of V1.1. More from Energistics.
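As a reminder of what a WitsML data object looks like on the wire, the short sketch below lists the curve mnemonics in a log document. Namespace handling is kept deliberately generic since it differs between WitsML versions, and ‘log.xml’ is a placeholder file name.

```python
# Minimal sketch: list curve mnemonics from a WitsML log document.
# Namespace handling is kept generic because it differs between versions.
import xml.etree.ElementTree as ET

def curve_mnemonics(path: str) -> list:
    mnemonics = []
    for element in ET.parse(path).iter():
        # strip any namespace prefix from the tag before comparing
        tag = element.tag.rsplit("}", 1)[-1]
        if tag == "mnemonic":
            mnemonics.append((element.text or "").strip())
    return mnemonics

print(curve_mnemonics("log.xml"))  # 'log.xml' is a placeholder file name
```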

The Open Source Geospatial Foundation, OSGeo, has announced the release of MapServer 6.2.0, the first joint release of MapServer 6.2, TinyOWS 1.1, and MapCache 1.0. The release is said to be the first step towards a ‘fully-fledged MapServer suite.’

The W3C’s OWL working group has published the second edition of the OWL 2 ontology language. OWL 2 is a component of the W3C’s semantic web toolkit and supports the capture of knowledge relating to a particular domain such as energy information management.
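
By way of illustration, the sketch below expresses a few OWL 2 constructs as RDF using the open source rdflib package. The ‘energy’ namespace and class names are invented for the example and do not come from any published ontology.

```python
# A minimal, illustrative OWL 2 fragment built with rdflib. The example namespace
# and names (Well, ProductionFacility, operatedBy) are invented for this sketch.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/energy#")
g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

# Two classes and a subclass relationship.
g.add((EX.Well, RDF.type, OWL.Class))
g.add((EX.ProductionFacility, RDF.type, OWL.Class))
g.add((EX.Well, RDFS.subClassOf, EX.ProductionFacility))

# An object property with a domain and a human-readable label.
g.add((EX.operatedBy, RDF.type, OWL.ObjectProperty))
g.add((EX.operatedBy, RDFS.domain, EX.ProductionFacility))
g.add((EX.operatedBy, RDFS.label, Literal("operated by")))

print(g.serialize(format="turtle"))
```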


Shell deploys inventory optimizer from Terra Technology

Demand-sensing software informs Shell lubricants supply chain management.

Shell has acquired a worldwide license to Terra Technology’s demand sensing and multi-enterprise inventory optimizer for use in its lubricants business. TerraTech’s Demand Sensing monitors consumer behavior and market influences to predict future demand. The multi-enterprise inventory optimizer generates daily inventory targets aligned with current conditions, even in volatile markets. Shell will use the technology to improve supply chain planning and assure product availability while limiting the risk of excess inventory.
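
Terra Technology’s algorithms are proprietary. As a generic illustration of what a daily inventory target involves, the sketch below applies a textbook safety stock calculation to recent demand history; neither the formula nor the figures reflect TerraTech’s actual method.

```python
# Generic illustration only: a textbook safety-stock calculation, not Terra
# Technology's proprietary demand-sensing or optimization method.
from math import sqrt
from statistics import mean, stdev

def daily_inventory_target(daily_demand, lead_time_days, service_z=1.65):
    """Target stock = expected demand over the lead time + safety stock.

    service_z = 1.65 corresponds to roughly a 95% service level under a
    normal-demand assumption.
    """
    mu, sigma = mean(daily_demand), stdev(daily_demand)
    safety_stock = service_z * sigma * sqrt(lead_time_days)
    return mu * lead_time_days + safety_stock

# Hypothetical recent daily demand for one lubricant SKU (units).
recent_demand = [120, 135, 110, 150, 128, 142, 119, 131, 125, 138]
print(round(daily_inventory_target(recent_demand, lead_time_days=7)))
```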

The global deal was signed following a successful pilot in Western Europe and South East Asia. TerraTech’s demand sensing and math-based optimizers were developed for the retail segment. Clients include Procter & Gamble, Unilever, Kraft Foods, Kimberly-Clark and Kellogg. More from TerraTech.


Laser-scan everywhere in Aveva’s ‘lean’ construction paradigm

EPC sector eschews ‘lean’ approach. Aveva thinks it knows how to fix the situation.

A recent edition of Aveva Perspectives advocates the ‘wholesale incorporation of a lean philosophy’ by the engineering, procurement and construction (EPC) sector. ‘Lean’ manufacturing techniques, which include ‘just in time’ and ‘six sigma,’ are widely used in discrete manufacturing, but not so much in EPC.

Aveva attributes the relatively low take-up of lean in construction to a complex contracting environment and high levels of scrutiny and compliance requirements. But the ‘overriding constraint’ is time. Projects must be delivered in shorter and shorter timescales, offering little opportunity for continuous improvement, prototyping or the application of engineering best practices as in the automotive industry. Stakeholder fragmentation, incentives for quick completion of sub-components and the lack of feedback are also problematic.

Aveva sees positive signs, notably the advent of ‘new contracting styles’ and the emerging technologies of affordable laser scanning and mobile computing. These are core to Aveva’s goal of a lean approach that is applicable to ‘one-off’ projects. The trick is to use routine in-project laser scanning to provide all stakeholders with access to a model that is kept up to date with constant feedback on the real state of the construction process.
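
Aveva does not describe an implementation, but the kind of feedback loop envisaged can be illustrated with a simple as-built versus as-designed deviation check: for each design point, find the nearest laser-scanned point and flag anything outside tolerance. The sketch below, with invented point sets and tolerance, uses scipy’s KD-tree for the nearest-neighbor search.

```python
# Illustrative only: comparing an as-designed point set against a laser-scan
# point cloud and flagging locations that deviate beyond a tolerance.
# Point sets and the 50 mm tolerance are invented for the example.
import numpy as np
from scipy.spatial import cKDTree

design_points = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])   # metres
scan_points = np.array([[0.01, 0.0, 0.0], [1.02, 0.01, 0.0], [2.3, 0.0, 0.0]])

tree = cKDTree(scan_points)
distances, _ = tree.query(design_points)        # nearest scanned point per design point

TOLERANCE_M = 0.05
for point, d in zip(design_points, distances):
    status = "OK" if d <= TOLERANCE_M else "OUT OF TOLERANCE"
    print(point, f"deviation {d:.3f} m -> {status}")
```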


POSC/Caesar southern hemisphere meets in Brisbane

EU’s €14 million ‘Optique’ IM. Sharecat pitches southern EqHub. Origin on CSG ops. ISS’ well test.

Speaking at the recent POSC/Caesar association’s (PCA) southern hemisphere meet in Brisbane, Australia, PCA general manager Nils Sandsmark introduced a new project, ‘Optique,’ for intelligent information management. Optique is funded by the EU FP7-ICT program with a €14 million budget. The project kicked off in November 2012 and sets out to blend data from rotating equipment and E&P data stores such as Petrobank and OpenWorks, and feed it into the ‘Optique ecosystem.’ More from Optique.

Sharecat MD Dag Pettersen made a pitch for an ‘EqHub’ for the southern hemisphere—a mirror of Norway’s EPIM-managed equipment e-commerce platform. EPIM promises prequalified equipment information delivered ‘once and for all’ using Sharecat technology, ISO15926 and ‘other standards as required.’

Garth McDonald described how integrated operations are transforming data into actionable information in Origin’s rapidly scaling coal seam gas (CSG) operations. Origin’s CSG business transformation has been enabled by the use of data and multidisciplinary process orchestration. The component data consolidation and reporting project is built around ‘Intuition,’ a production data ontology based on industry standards such as ISO, OPC and ProdML.

Grant Eggleton showed how ISS Group’s BabelFish tools were used to automate the capture and validation of well test records in ProdML. The BabelFish Sentinel monitor detects ‘start’ and ‘stop’ events from valve position indicators or from detection of steady state in wellhead pressure. Test data can be grouped according to choke setting and tubing head pressure, and viewed individually or as trend data. Presentations are available from POSC/Caesar.
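
ISS does not publish Sentinel’s detection logic. A common, generic approach to steady-state detection in a pressure signal is to watch a rolling standard deviation fall below a threshold, as in the sketch below; the window length, threshold and sample data are invented.

```python
# Illustrative sketch of steady-state detection in a wellhead pressure signal:
# declare steady state when the rolling standard deviation falls below a
# threshold. Window and threshold values are invented; this is not ISS's
# Sentinel algorithm.
import pandas as pd

def steady_state_flags(pressure: pd.Series, window=30, threshold=0.5) -> pd.Series:
    """Return a boolean series, True where the signal is judged steady."""
    rolling_std = pressure.rolling(window, min_periods=window).std()
    return rolling_std < threshold

# Hypothetical 1 Hz wellhead pressure samples (bar): ramp-up followed by a plateau.
samples = pd.Series([100 + i for i in range(40)] + [140.0] * 60)
flags = steady_state_flags(samples)
print("steady state first reached at sample", int(flags.idxmax()))
```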


Semantic ISO 15926 ‘based on a misunderstanding’

University of Oslo researcher takes issue with standard—advocating a linked data approach.

Rurik Thomas Greenall, previously with Statoil and now at the University of Oslo, posted an informative contribution to the W3C Oil and Gas and Chemicals business group which he has kindly allowed us to summarize.

I thought I’d share a few thoughts with you about why I struggle with ‘semantics’ in oil and gas. There is a tendency to view the kind of activity done in ‘semantic data modeling’ as a pertinent semantic web activity. From the standpoint of a semantic web programmer like myself ‘semantic data modeling’ as used in ISO 15926, ISA, MIMOSA and OpenO&M is not a semantic web activity as set out by the W3C where, I quote, ‘The semantic web provides a common framework that allows data to be shared and reused [...] across application, enterprise, and community boundaries [using] the resource description framework, RDF.’

The current W3C view is that RDF is based on linked data and HTTP-URIs. Linked data is the semantic web. Attempts at providing an ontological view such as ISO15926 are based on a misunderstanding. My friendly challenge to the group then is to move towards the core of the semantic web and away from the niche we’re working in.

Read Greenall’s full post here.
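
To make the distinction concrete, the sketch below shows the linked data style Greenall advocates: facts expressed as RDF triples whose subjects and predicates are HTTP URIs, built here with the open source rdflib package. The example.org URIs and identifiers are invented.

```python
# Minimal linked-data sketch: RDF triples keyed on HTTP URIs, serialized as Turtle.
# The example.org URIs are invented; real linked data would use dereferenceable
# URIs served over HTTP.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/id/")
g = Graph()
g.bind("ex", EX)

well = URIRef("http://example.org/id/well/A-1")
g.add((well, RDF.type, EX.Well))
g.add((well, RDFS.label, Literal("Well A-1")))
g.add((well, EX.locatedIn, URIRef("http://example.org/id/field/Example")))

print(g.serialize(format="turtle"))
```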


CyrusOne, Dell, R Systems offer E&P HPC Cloud

Service providers team on on-demand high performance computing for seismic imaging.

Global data center services provider CyrusOne has partnered with Dell and R Systems to deliver a cloud-based solution tailored to oil and gas. The initial offering will be delivered from CyrusOne’s Houston West colocation facility, from which Dell and R Systems will offer custom high performance computing solutions to clients.

CyrusOne’s Sky for the Cloud is a ‘peering and interconnection’ platform for cloud-based applications served from a customized ‘data hall.’ The facility includes a ‘2N architecture’ for redundant power and high availability. Single location peering ‘pulls content from the edge of the internet to the heart of the data center.’ The HPC Cloud enables companies to align compute requirements with projects and technology refresh cycles, taking advantage of the fastest technology available.

Dell VP Nnamdi Orakwue said, ‘The HPC Cloud solution frees companies from having to manage HPC environments and resources so that they can focus on running their businesses, not data centers.’

The offering targets high volume seismic processing. Kevin Timmons, CyrusOne CTO, added, ‘Sky for the Cloud creates an ecosystem to facilitate analysis and sharing of all geophysical data locally and statewide.’ CyrusOne is to connect its facilities in Austin, Dallas, Houston, and San Antonio, allowing clients to build out capacity choosing either CyrusOne’s bandwidth marketplace or a ‘cross-connect’ to the Cloud service. More from CyrusOne.


INCAS3 motes travel through reservoir

Pumpable micro devices to track injection pathways, characterize ‘wormholes’ and thief zones.

A field trial has demonstrated the feasibility of using ‘motes’ to identify ‘wormholes’ in heavy oil reservoirs. The test was conducted by the PI Innovation Centre, a joint venture of the Canadian Petroleum Technology Research Centre and its Netherlands-based partner Incas3, on a heavy oil field operated by Canadian Natural Resources Limited.

The CNRL trial was performed on a heavy oil reservoir that uses a ‘Chops’ (cold heavy oil production with sand) technique whereby a sand and oil mixture is produced, leaving a network of ‘wormholes’ in the reservoir. The trial with ‘dumb’ motes (millimeter-sized pellets) has shown that sub-7mm motes can transit the reservoir. Ultimately the idea is to provide a picture of wormholes and ‘thief zones’ to help production engineers optimize sweep. A next generation mote equipped with an RFID tag is under development, but fulfilling the goal of mapping the wormholes will likely require motes with a minute inertial navigation system and recorder. More from PI Innovation and Incas3.


AEC renews TNO ‘quantum dot’ reservoir monitoring program

Nanoparticles’ optical behavior can be tuned to address specific reservoir sensing requirements.

The Advanced Energy Consortium of the Bureau of Economic Geology at the University of Texas has renewed its contract with Netherlands-based R&D unit TNO for research into ‘quantum dot’ sensor systems. Quantum dots, originally developed for use in medical imaging, are nanoparticles with tunable optical behavior that have potential application in reservoir sensing. The nanoparticles are small enough to pass through a reservoir, and their well-defined and distinct optical behavior allows for their detection at very low concentrations. Specific coatings provide the quantum dots with stability under reservoir conditions and determine their specific response to parameters such as temperature or chemical environment.

The trials were conducted under realistic reservoir conditions of high salinity brine, crude oil and high temperature and pressure. The sensors’ optical response demonstrated the desired multi-parameter dependence and a scalable process for quantum dot production was developed. The new contract will focus on the long term behavior of the particles in the reservoir. AEC members are Shell, Halliburton, Petrobras, Schlumberger, BP, ConocoPhillips, Total and BG. More from the AEC.


Cartasite CellSafe keeps truckers off phone, on road!

Encana driving safety program head backs cell phone blocking technology.

Pittsburgh, PA-headquartered Cartasite unveiled a new safe driving system at the OSHA oil and gas safety conference in Dallas this month. The new ‘CellSafe’ feature is an addition to Cartasite’s ‘Rovr’ system, which informs drivers about behavior patterns that are likely to result in an accident.

Cartasite CEO David Armitage explained, ‘CellSafe adds an extra safety measure by addressing the issue of distraction caused by cellular phone use when driving.’ CellSafe monitors and can even block cell phone use while a vehicle is in motion.

The Rovr device plugs into the vehicle’s OBD port and acts as a ‘black box,’ measuring subtle movements of the vehicle that have been shown to be indicative of distracted or aggressive driving. Cartasite’s software translates this data into a scorecard which is emailed to each driver every week. The scorecard suggests changes to driving behavior that will save fuel, reduce emissions and cut crash risk.
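
Cartasite does not disclose its scoring model. As a purely illustrative sketch, a weekly scorecard of the kind described might count harsh braking and acceleration events in the accelerometer trace and convert them to a score, as below; the thresholds and weights are invented.

```python
# Purely illustrative: counting harsh events in longitudinal acceleration samples
# and turning them into a weekly driver score. Thresholds and weights are
# invented and do not reflect Cartasite's Rovr scoring model.
HARSH_BRAKE_MS2 = -3.5       # hypothetical threshold, m/s^2
HARSH_ACCEL_MS2 = 3.0

def weekly_score(accel_samples, miles_driven):
    """Start from 100 and deduct points per harsh event per 100 miles driven."""
    brakes = sum(1 for a in accel_samples if a <= HARSH_BRAKE_MS2)
    accels = sum(1 for a in accel_samples if a >= HARSH_ACCEL_MS2)
    events_per_100mi = (brakes + accels) / max(miles_driven / 100.0, 1e-9)
    return max(0.0, 100.0 - 5.0 * events_per_100mi)

# One week of (downsampled) longitudinal acceleration readings, m/s^2 (hypothetical).
samples = [0.2, -0.5, 1.1, -4.0, 0.3, 3.2, -0.1, -3.8, 0.0]
print(round(weekly_score(samples, miles_driven=420), 1))
```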

Mark Trostel, head of Encana’s driving safety program said, ‘The Rovr scorecards have proven effective in reducing accident rates at some of the largest energy companies in the world. Drivers actually compete for the highest score and earn rewards for safe driving.’ Cartasite collaborated with Cellcontrol to develop the new feature. More from Cartasite.

