December 2005


Pipeline software moves

Activity is hotting up in the transmission sector as Baker Hughes Petrolite acquires Baseline, GeoFields spins off a new international unit and Invensys allies with Energy Solutions.

A lot has happened in the pipeline software marketplace this month, as operators in the US ramp up their pipeline inspection activity, driven by the US DOT’s ‘Proposed Rule for Pipeline Integrity Management in High Consequence Areas (Gas Transmission Pipelines).’ These rules stipulate that operators must assess at least 50% of their line pipe before December 2006.

GIS

Defining HCAs involves sophisticated GIS-based methods that combine geocoded population data with pipeline route maps. Compliance management of pipeline inspection programs is forcing operators to revamp their data management strategies across the board.
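
For the non-GIS reader, the screening logic is simple enough to sketch. The following fragment (a minimal sketch in Python using the shapely geometry library; the ‘potential impact radius’ formula is the one commonly cited from ASME B31.8S, and all coordinates and parameters are invented) counts geocoded structures falling inside a buffer around a pipeline centerline:

    # Minimal HCA screen: count geocoded structures inside a pipeline
    # segment's potential impact zone. Real implementations use projected
    # coordinates and the full DOT/B31.8S class-location logic.
    from math import sqrt
    from shapely.geometry import LineString, Point

    def potential_impact_radius(diameter_in, pressure_psi):
        # Commonly cited B31.8S form; result in feet (assumption)
        return 0.69 * diameter_in * sqrt(pressure_psi)

    centerline = LineString([(0, 0), (5000, 200), (10000, 0)])  # feet
    dwellings = [Point(4800, 450), Point(9000, 2500), Point(1200, 90)]

    pir = potential_impact_radius(diameter_in=30, pressure_psi=1000)
    impact_zone = centerline.buffer(pir)
    affected = [p for p in dwellings if p.within(impact_zone)]
    print(f"PIR {pir:.0f} ft, structures in zone: {len(affected)}")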

Baker Petrolite

Our first ‘pipeline move’ involves Baker Hughes’ acquisition of Baseline Technologies of Calgary, which will be rolled into its Petrolite Pipeline Management Group (PMG). Baseline provides data management products and services to help with operational and regulatory requirements. The suite includes a Pipeline Information Control System (PICS), for the implementation of pipeline integrity management programs.

Dupuis

Baseline president Bruce Dupuis said, ‘By joining forces with Baker Hughes, we are creating a ‘best in class’ pipeline integrity company, combining PMG’s inspection, engineering and cleaning resources with our vendor-neutral data management expertise. Our patented data-centric methodology and change management processes assure life cycle integrity management.’

Geospatial

Baker Petrolite President Andy O’Donnell added, ‘Geospatial integration of data from our in-line inspection and engineering data bases with operators’ existing data stores will greatly enhance the value of this data to our customer base.’

GeoFields

More pipeline activity this month came from Atlanta-based oil and gas pipeline data management specialist GeoFields, which has spun off a new wholly-owned unit to target the international market. The move coincides with the launch of a business partner program targeting the Middle East, Latin America, Asia and Europe.

Irvani

GeoFields chairman Bahman Irvani said, ‘We’ve already begun to offer data management services outside North America, but we are now ready to dedicate much larger resources to establish our presence and gain market leadership overseas.’ GeoFields International (GFI) will offer pipeline data and application hosting, data modeling, database development and integration services to its international clients. GFI president Brad Schaaf said, ‘Demand is driving our international expansion. We have built our successful business on reliable, cutting edge technology coupled with great customer service, and we’re now ready to take that international.’

Energy Solutions

Signs of a buoyant pipeline software market came from Energy Solutions International (ESI) which reported its ‘best year ever’ for sales of its PipelineStudio flagship package for simulation and design of transmission pipelines. ESI now claims over 500 PipelineStudio installations around the world.

Jacob

ESI CEO Al Jacob said, ‘Many companies find PipelineStudio valuable for designing and operating their pipelines for optimum efficiency and value. In the past few months we have added significant functionality to PipelineStudio and this increase in functionality is attracting many new customers.’

Invensys

ESI has also signed a memorandum of understanding with Invensys Process Systems (IPS) on a strategic alliance and reseller relationship. The companies are to offer customers ‘intelligent solutions’ for pipeline automation and optimization.

Chmilewski

Mike Chmilewski, IPS VP SCADA and Pipeline said, ‘Our companies have worked together informally for years. The alliance will allow us to grow our partnership, combining ESI’s technology with our own portfolio of technologies and services.’

Integration

The deal envisages integration of both companies’ technologies, joint marketing and evaluation of other potential synergies. Invensys personnel will be trained on ESI’s PipelineStudio tools and ESI personnel will likewise be trained on the relevant Foxboro, SimSci-Esscor and Wonderware products from Invensys.

Borgli

Felipe Borgli, control room coordinator with operator Transportadora Brasileira Gasoduto Bolívia-Brasil (TBG), said of the deal, ‘We’ve been using both ESI and Invensys products together successfully for six years. The products are a true complement to each other and we anticipate this will make working with these companies even easier in the future.’

Globalization

As the Brazilian operator’s testimony demonstrates, the pipeline industry is a truly global business. But the latest moves reflect a ramping up of the IT component of transmission—and show how the tightening of safety regulations is having an effect way beyond the United States’ borders.


CDA for Schlumberger

LogDB and AssetDB/eSearch are to displace PetroBank/Recall in the UK’s Common Data Access well log and hard copy data repository.

Schlumberger is about to win the UK’s prestigious Common Data Access (CDA) well data management contract from incumbent Landmark Graphics. CDA, a wholly owned unit of the UK Offshore Operators Association (UKOOA), issued its tender last August.

Petrobank

Schlumberger’s technology will displace Landmark’s PetroBank/Recall data management solution and related services. The tender involves the management of digital well log curves, scanned images of well reports and logs and, initially, seismic navigation data—although the latter was subsequently excluded from the scope of the tender.

SOA

The tender also calls for a new ‘services-oriented architecture’ (SOA), decoupling the UK’s well data architecture into a data store and ‘internet-facing’ applications. This month CDA approved Schlumberger Information Solutions (SIS) as the sole ‘preferred bidder’ for the Well Data Store contract and is to execute an agreement with SIS in January 2006.

FPAL

The CDA process was managed confidentially by a private UK concern, First Point Assessment (FPAL). Schlumberger will be moving in on CDA when Landmark’s contract expires in July 2006.


SOX, master data, standards and ‘Yaca-Focon!’

Oil IT Journal editor Neil McNaughton’s triptych covers the true scope of Sarbanes-Oxley compliance, technology lead, lag and master data management, and one of the oldest IT ‘gotchas’ in the book.

I’d like to offer some insight, gleaned from our attendance at two UK based IT/data management events in London this month. I submit that this is the sort of information that a) would cost you a fortune if it came from the likes of Gartner etc. and b) is of such simplicity and immediacy that the consultants would have to protect it with a bodyguard of hot air.

Sarbanes-Oxley

Readers of Oil IT Journal will have heard many dire warnings of the impact on IT systems of Sarbanes-Oxley (SOX), the post-Enron US transparency legislation. Paras MD Hamish Wilson, speaking at the PESGB’s Data Management conference this month (more of which in next month’s issue), described the increasing reporting burden that SOX has brought, with the requirement to demonstrate how reserves figures are obtained. Companies need to get a much better handle on their complex workflows, which may involve seismics, stochastics and simulation. All of which is a potential field day for the IT consultants.

SPE

Following the PESGB do in Piccadilly, we strolled across St. James Park to Westminster for the SPE’s conference on Digital Security in Oil and Gas (again, more in next month’s Journal). Here we heard KPMG’s Chris Wright talk on what SOX really means for oil companies and their IT departments. Wright stated that the SOX provisions for control over corporate financial reporting are just that. The intent of the legislator was not that such controls should spread throughout the enterprise into technical analysis. There is a boatload of implications for IT managers but they are not about technical computing. ‘A lot of time has been wasted on systems that don’t refer to SOX.’

Archaic

The dividing line between technical and finance may be hard to draw, but the intent of the legislator is clear and has echoes in the SEC’s rather archaic definitions of ‘reserves’. When you hear technologists push for SEC’s acceptance of seismics and stochastics in reserves evaluation, you may want to reflect on the increased reporting burden that this would bring. I have an image of the petroleum engineering community’s collective finger on the trigger of a gun pointed at its foot!

~~~

Another piece of information triangulation this month links the talk from Shell’s exploration director Malcolm Brinded (opposite page) and the Kalido webinar (page 12 of this issue) to recent presentations in Calgary (PPDM—page 6) and London (IQPC—page 12). Brinded’s plea for better tools for data integration begs the question, ‘is the upstream a technological leader or a laggard?’ If you believe Adam Dodson, Shell itself is a leader in terms of taming software ‘beasts’ like ESRI, LiveLink, Autonomy and MetaCarta, integrating them into an apparently coherent IM solution. OTOH, judging from Alan Bays’ PPDM talk, Shell’s own groundbreaking taxonomies could be improved. I listened to the Kalido webinar with all this in mind. Does the commercial data mining community have something to offer us in the context of taxonomies and the emerging ‘master data store’ technology? Watch this space.

~~~

When folks take a bird’s eye view of the standards arena things look fairly straightforward. In French they have a saying, ‘Yaca-Focon*’, which means ‘all you have to do is such and such and everything will be OK.’ I suggest this is a good starting point for what I would call the standards ‘disillusionment cycle.’

CRLF

You may think that ASCII is a standard, but even here there is no agreement on the most basic aspect of a text file, how a line of text is terminated. The story begins in the dark and distant past of computing, or rather before computing began, with the mechanical workings of a line printer. These devices used to be, and of course still are, proprietary, with complex control codes to make them print text, ring bells, feed the fan-folded paper and so on. In those days, there were separate ASCII codes for a carriage return and a linefeed.

iBook

My problematic workflow, which I have already described in these columns, involves a great deal of cutting and pasting. From websites, my own notes taken on an Apple iBook, Adobe pdf documents, Microsoft Office and so on. As anyone who has cut and pasted anything knows, this simple act is fraught with problems. In Microsoft Word for instance, a cut and paste can transform the target document’s formatting, numbering and even the spell check language. A trip through the good old Windows notepad as an editing buffer fixes a lot of these issues, except for one—the oldest IT gotcha in the book, the new line symbol, which is variously a linefeed or a combination of a carriage return plus linefeed. If you ever wonder what your ‘knowledge workers’ are doing, there is a good chance that they are still occasionally futzing around with dumb stuff like this and I can assure you that there is nothing like it to stop the creative juices flowing.
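
For the record, the whole gotcha fits in a few lines. A minimal sketch (in Python, purely illustrative) of the normalization that ‘knowledge workers’ end up doing by hand:

    # The three line-ending conventions behind the gotcha: LF ('\n', Unix),
    # CRLF ('\r\n', DOS/Windows) and bare CR ('\r', classic Mac OS, as on
    # the iBook of the day). ASCII codes: CR = 13, LF = 10.
    def normalize_newlines(text: str) -> str:
        # Order matters: collapse CRLF first, then any stray CRs
        return text.replace("\r\n", "\n").replace("\r", "\n")

    sample = "line one\r\nline two\rline three\n"
    print(normalize_newlines(sample).splitlines())
    # ['line one', 'line two', 'line three']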

Sand in ointment

Of course it would be easy to fix this. But we are years beyond the ‘Yaca-Focon’ phase on the linefeed problem. In fact we are beyond even the disillusionment part of the cycle. The problem is almost hidden as applications perform transformations on the fly. Sometimes they even get it right, although as cutters and pasters know, the clever format guessing Word performs is not necessarily what was intended. Perhaps a bit of sand in the ointment is what it takes to make a company finally give up on ‘best of breed’ and go for a single vendor solution in despair.

Four seismic XMLs

Please do not get me wrong. I am a great believer in standards. I am just warning the newcomers, the folks who think ‘Yaca-Focon,’ and pointing out that if the simple stuff is so hard, we need to be very circumspect about the hard stuff. Talking of which, I came across no less than four mooted seismic XML initiatives in the last month (from POSC/WITSML, OpenSpirit/Resolve, PPDM and the SEG), which suggests that there is still a lot of Yaca-Focon-ism around. Happy new year.

* Il n’y a qu’à, faut qu’on.


Brinded, ‘We need tools for data integration’

Shell’s head of exploration believes that the world needs better integration of technology and data.

Speaking at the inaugural International Petroleum Technology Conference in Doha, Qatar last month, Shell’s head of E&P, Malcolm Brinded, said that, ‘Meeting the challenge of the increasing global demand for energy, whilst tackling the impact on our climate, will depend on our ability to develop new technologies and deploy them together effectively—integration is the key.’

Recovery

Raising the conventional global oil recovery rate from its present average of 35% to 45% could add 20 years to current production. To achieve this, subsurface integration is essential and this ‘doesn’t just happen by telling people to talk to each other.’ New plays will involve harsher conditions and more complex geology and will require a range of technologies beyond conventional seismics, including satellite imaging, airborne sniffing, electromagnetic methods such as seabed logging and enhanced seismic imaging.
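
The arithmetic behind the claim is worth a quick check (a back-of-envelope sketch; the consumption figure is an assumption, taken from the CERA article elsewhere in this issue):

    # Implied conventional oil in place from the '20 years' claim.
    consumption_gb_per_year = 30        # assumed world consumption, Gb/year
    extra_recovery = 0.45 - 0.35        # ten percentage points of recovery
    years_added = 20
    implied_ooip_gb = years_added * consumption_gb_per_year / extra_recovery
    print(f"implied OOIP: {implied_ooip_gb:,.0f} billion barrels")
    # ~6,000 billion barrels, in line with commonly cited estimates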

Data integration

Brinded stressed that new tools are required to integrate such diverse data quickly. The industry also needs people with ‘wide geological and geophysical understanding who are able to apply global knowledge locally.’


Hatton on forensic software engineering

Following last month’s editorial, we are pleased to print Les Hatton’s analysis of ‘software woes.’ Hatton, of Oakwood Computing Associates, studies ‘defects’ in seismic and other code.

Seismic processing IT guru Les Hatton (Oakwood Computing Associates) addressed the Birkbeck School of Crystallography in London last month on his specialty, ‘Forensic Software Engineering,’ or how to evaluate software reliability, security, cost and other ‘woes.’ Hatton, who is Professor of Forensic Software Engineering at Kingston University, offered guidance on survival strategies for the avoidance of software failure. Hatton’s first observation is that many failures of software-controlled systems could have been avoided by applying techniques that are already known.

Fail gracefully

Such well established engineering techniques include designing systems to fail gracefully, minimizing effects on users. When failures do occur, it should always be possible to trace them back to a root cause. The aerospace industry came in for particular criticism, with the uncontrolled dive of an Airbus attributed to ‘a fault in the automatic pilot.’ Test pilots of the F/A-22 (Raptor) fighter used to spend an average of 14 minutes per flight rebooting critical systems—this is now down to ‘only’ 36 seconds per flight.

Problem prone computers

Similar problems caused recalls in the US motor industry to top 19 million vehicles in 2003, this despite engineering ‘never being better.’ Experts cited ‘problem-prone’ computers as a significant factor. The NIST (US National Institute of Standards and Technology) estimated the cost of software failure in the US at $60 billion per year in 2002, while a report from the Royal Academy of Engineering (UK) estimated that around £17 billion would be ‘wasted’ on software projects in 2004.

Forensic analysis

Hatton’s specialty, forensic software analysis, involves investigating defects in executable programs—both applications and operating systems. Code quality is measured in faults per thousand executable lines of code (KXLOC). The ‘state of the art’ is represented by the NASA Shuttle software—estimated as having a fault rate of 0.1 per KXLOC. Windows 2000 is thought to have a fault rate in the 2-4 per KXLOC range, while Linux fares much better at 0.5 per KXLOC. It is thought that 5-10% of such faults will have a significant effect on the results or behavior of a system.
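
A back-of-envelope calculation shows what these rates mean in practice (the one million line code base below is an illustrative assumption, not a figure from Hatton):

    # Expected 'significant' faults from a per-KXLOC fault rate, using the
    # 5% lower bound quoted above. Code sizes are illustrative assumptions.
    def significant_faults(kxloc, faults_per_kxloc, significant_fraction=0.05):
        return kxloc * faults_per_kxloc * significant_fraction

    # A hypothetical one million executable-line package:
    print(significant_faults(1_000, 2.0))   # 100.0 at a Windows-like rate
    print(significant_faults(1_000, 0.1))   # 5.0 at the Shuttle rate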

Compiler quality

Forensic Software Engineering involves the analysis of software processes, products and project failures—along with the study of operating system reliability, security and compiler quality. Hatton’s investigations to date suggest some guidelines for software project planners: no sub-task should last longer than one week; projects should be tracked weekly, with progress published; and programmers underestimate the time taken to do things by about 50%.

Parsing engines

Hatton has developed a range of code parsing engines for a variety of domains from embedded systems to geophysical processing. The latter involved the study of nine independently developed FORTRAN packages for seismic processing—the tests produced nine ‘subtly different’ views of the geology. The types of defect detected by the analysis proved to have had exceptionally long lives and ‘cost a fortune.’ Hatton advocates more static testing of code, to establish fault rates per KXLOC, rather than the usual dynamic, run time testing.

Compiler complexity

But even the best programmers are going to have a hard time producing quality code because of the complexity and unpredictability of modern programming languages. These are ‘riddled with poorly defined behavior’ and ‘numerical computations are often wrong, however they are done.’ Forensic analysis can be applied to OS reliability, security, the arithmetic environment and compiler quality.

OS of choice

Hatton has little time for Windows as an operating system, citing a mean time between failure (MTBF) for Windows 2000/XP in the 100-300 hour range against over 50,000 hours for Linux. Most modern compilers fail the standard validation suites. In fact, NIST, now partly privatized, stopped validating compilers in 2000 because it was no longer cost-effective. Hatton’s recommendations: if you want a reliable and secure OS, don’t use Windows. Furthermore, computer arithmetic should be checked on mission critical code and compiler quality ‘should not be taken for granted.’

More on software forensics, quality and Hatton’s other passion, javelin throwing, from www.leshatton.org.


TNK-BP to deploy Petrel Workflow Tools

BP’s Russian joint venture is to equip its multifunctional teams with Schlumberger’s Petrel.

TNK-BP, Russia’s second largest oil company, has selected Schlumberger Information Solutions’ (SIS) Petrel workflow tools for its newly created ‘multifunctional’ subsurface teams. TNK-BP’s selection process involved a long-term technical evaluation in which Petrel and other commercially available solutions were tested by TNK-BP subsidiaries across Russia, including divisions in the Volga/Urals, Western Siberia, Tyumen and Izhevsk technical centers and at the company’s Moscow headquarters.

Complex

TNK-BP projects include complex environments such as dual porosity, heavy oil and fields with over 20,000 wells. Petrel was chosen for its seismic-to-simulation capability. TNK-BP personnel found Petrel easy to learn and reported improved productivity and collaboration.

Le Peuch

SIS president Olivier Le Peuch said, ‘TNK-BP will use Petrel across its multi domain asset teams to develop a more thorough understanding of its reservoirs, to assess risks and uncertainties, and ultimately, to optimize production and enhance recovery.’

100,000

TNK-BP has a staff of around 100,000 and operates in nearly all of Russia’s major hydrocarbon regions. The company is 50% owned by BP and 50% by a group of Russian investors—Alfa Group, Access Industries and Renova (AAR).


Ikon gets £4 million Oilexco contract

RokDoc slated for extensive use in North Sea exploration, appraisal and development program.

Calgary-based Oilexco has awarded Ikon Science a two-year follow-on contract worth £4 million for geoscience modelling, reservoir characterization and 4D seismic monitoring. Ikon is to leverage its RokDoc flagship package to underpin Oilexco’s aggressive North Sea drilling program, with the Transocean Sedco 712 rig on contract until March 2008. Oilexco credits Ikon’s technology as having contributed to its 2004 Brenda and Nicol discoveries.

Millholland

Oilexco president Arthur Millholland said, ‘Ikon Science will be helping us to continue the momentum of our aggressive program of exploration, appraisal and development drilling in the UK North Sea by keeping us at the cutting edge of North Sea subsurface practice.’

Millwood Hargrave

Ikon Science MD Martyn Millwood Hargrave told Oil IT Journal, ‘Our studies are both strategic and operationally focussed and make heavy use of RokDoc for predictive evaluations of prospects and well targets. Our activity spans petrophysics, rock physics and well matching. We will help Oilexco optimize its interpretation strategy, leveraging attributes and inversion parameters to build the pre-drill reservoir model to keep it updated during subsequent operations.’

Speed

‘Oilexco likes the speed we offer working with these tricky stratigraphic prospects. We can help them evaluate a lot of higher risk deals and prospects prior to making choices they have to live with. So far we have been involved with all the Brenda and Nicol wells and the successful Black Horse appraisals.’ Ikon also plans to offer its newly acquired Chronoseis 3D/4D seismic reservoir characterization module to the US and Far East markets.


OpenSpirit 2005 user meet, Houston

OpenSpirit appears to be gathering momentum even if it is some way from Dan Piette’s vision of a copy on every geoscientist’s workstation. The ‘next generation framework’ heralds a new ‘extensibility’ paradigm that will expose third party data models. The current version received enthusiastic support from users including SMT, Transform Software and B2T specialist PointCross.

Around 40 attended the OpenSpirit (OS) user group meeting held just after the SEG Convention in Houston last month. OS CEO Dan Piette opened the proceedings commenting that ‘the boom times of the early 1980s are back.’ OS is doing well too—with more clients, more licenses, more data stores and more software companies writing to the OS API. Piette described this as a ‘virtuous cycle’ or ‘tipping point’ in OS’ fortunes. The company now has 31 employees and is working on new technologies, with new partners. Moreover, the major companies are planning to use OS more extensively, edging toward Piette’s dream of ubiquity—‘I want to be on every geoscientist’s desktop.’ The potential fly in the ointment for Piette’s dream was the absence of a Landmark rep at the conference, which we take as a sign of a certain distancing of Landmark from the Schlumberger-sponsored initiative.

Harter

OS CTO Clay Harter traced new developments in 2005 and outlined plans for the upcoming 2.9 release. In 2005 OS gained six new dev kit customers and hired five staff developers. The openspirit.com website has been revamped and now contains details of commercial data store connections and table maps. A discussion forum for developers has also been added. SGI’s Irix operating system was described as ‘in decline’ and support may not last long. Since last year, OS has added new (beta) connectors for Petra, PPDM & SDE Culture. Upcoming data objects include a stratigraphic grid, platform, trajectory, drill string, BHA and tubulars. Currently supported datastores are OpenWorks, GeoFrame, Finder, SMT Kingdom Suite, OS’ own ‘Managed SEGY’ and GoCad. Data synchronization now works across applications, such that a change in OpenWorks can update GeoFrame. Users can, for instance, update geostatistics following a repick of tops in another application. Georegistered TIFF/GRID files can now be generated from any seismic horizon or grid and output to tools such as ESRI’s Spatial Analyst. ‘Point layers’ (irregular grids of well tops) can also be generated.

Demographics

OS sells two thirds of its run time licenses to 3rd party vendors so it is unclear whether OS is primarily used by end users or by data managers. Harter observed that some companies believe that geoscientists shouldn’t touch data. Others think they work faster if they do. Harter guessed that licenses were split 50/50 between users and data managers. This was not borne out in a straw poll that found all present to be data managers!

Next generation

The ‘next generation’ V3.0 OS will expand data coverage in the subsurface to include reservoir engineering, production engineering and drilling. V3 design goals will initially address the infrastructure (not the desktop) with performance enhancements, easier installation and improved ease of use for developers. The next generation framework (NGF) will offer services for events, units of measure, coordinate reference systems, data access and reference values. A ‘meta model’ will provide information on data models in use and will allow for customization of the OS framework. OS recognizes that no single ‘common’ model will ever fit all use cases. So the new technology will allow OS users to see, for instance, the full OpenWorks data model and connect to specific data elements therein.

Transform

Murray Roth described how startup Transform Software has leveraged OS to provide its seismic interpretation technology with access to multi-vendor databases, managing multiple data types, units of measure and CRSs. Transform offers users access to many data types including culture, seismic, image files etc. This involves managing time and depth domain data on a variety of datums. Users can drag time seismic data over a depth model and it is converted on the fly. Transform is pleased with OS functionality. Data is stored wherever the customer has it already. The API is ‘cleaner than most.’ Transform was less keen on OS’ fee structure and would have preferred a run time fee, shifting the cost to the customer. Transform ‘whined and whined, but ended up writing a check.’

SMT

Bob Tucker (SMT) was equally enthusiastic regarding OS. The OS tools enabled a fast-track data inventory of the client’s terabytes of data. SMT reported ‘huge productivity gains’ from OS—along with a high end PC configuration and improved IT resources. Visual data QC leveraged OS GIS and 3D data viewers. Data was exported with the Excel adaptor for macro processing and further QC.

PointCross

Suresh Madhavan presented PointCross’ Orchestra and its Integrated E&P Solutions framework as an ‘ontology engine’ and ‘knowledge infrastructure.’ The purple prose of PointCross’ marketing material does much to hide the content of this MIS integrator’s offerings. The promise from the ‘Oil Company in a box’ is ‘business transformation’ and ‘agility and competitiveness in the face of ever-tightening access to untapped hydrocarbon resources’ etc. PointCross’ web-based business to technical (B2T) offering uses XML-tagged OS data to link across business processes. A ‘business process modeling tool’ allows for interaction with G&G applications via event tracking. Shell is PointCross’ flagship client.

ResolveGeo

Don Robinson (Resolve Geosciences) is working to extend OS’ ‘managed XML’ seismic data format with tags to build a ‘full strength’ XML format for data transfer. Robinson wants processing contractors to deliver data in this format for quality assured data archiving. Robinson expressed the opinion that SEG-Y Rev 1 ‘isn’t an answer to anything!’ Rev 1 requires software to be modified and ‘this won’t happen.’ Also there is ‘no more chance of fields being filled-in than with previous header info.’ But the biggest issue with SEG-Y Rev 1 is that it invalidates millions of Canadian datasets which already use several of the overloaded fields. EnCana has prepared a detailed analysis comparing SEG-Y Rev 1 with Canadian practice.
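
To see why header bytes are so easily ‘overloaded,’ consider how little structure a SEG-Y reader actually has to go on. The sketch below (illustrative Python; byte offsets as commonly documented in the SEG-Y standard) reads a few binary file header fields, including the Rev 1 revision field, whose bytes legacy datasets may already be using for something else:

    # Peek at the SEG-Y binary file header. The 400-byte binary header
    # follows the 3200-byte EBCDIC textual header; fields are big-endian
    # two-byte integers at fixed offsets.
    import struct

    def read_binary_header(path):
        with open(path, "rb") as f:
            f.seek(3200)              # skip the EBCDIC textual header
            hdr = f.read(400)
        sample_interval, = struct.unpack(">H", hdr[16:18])  # microseconds
        samples_per_trace, = struct.unpack(">H", hdr[20:22])
        format_code, = struct.unpack(">H", hdr[24:26])  # 1=IBM, 5=IEEE float
        revision, = struct.unpack(">H", hdr[300:302])   # Rev 1 addition
        return sample_interval, samples_per_trace, format_code, revision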


PPDM AGM and Fall Meeting, Calgary

IHS Energy presented the results of a survey of clients’ IM strategies that showed rapid growth in unstructured data. Other presentations focused on taxonomies, metadata and quality issues. David Hood called for more ‘data openness.’ PPDM’s relational data model now underpins much existing industry software and first time modelers like BJ Services increasingly turn to the standard.

The PPDM 2005 fall member meeting heard nothing of Herb Yuan’s call for a merger with POSC (see our report on the facing page), although POSC’s Paul Maton gave a presentation of WITSML to the PPDM crowd. According to PPDM chairman Art Boykiw, PPDM intends to ‘collaborate with other standards organizations.’ But the Association’s main aim remains the ‘universal industry adoption of PPDM standards.’ The new PPDM board includes representatives from ExxonMobil, IHS Energy, PetroCanada, OMV and Schlumberger.

IHS Energy

Steve Cooper presented the results of a survey of IHS Energy’s (IHS) clients’ information management (IM) challenges. The results are shaping IHS’ own IM Framework in a five year IM Roadmap. IHS sees the future of upstream IT as a combination of portals and master data stores of GIS and structured data. There ‘appears to be’ consolidation on PPDM for well data management with a move away from proprietary solutions to PPDM 3.7. IHS’ clients want their data hosted and served up over the web. Clients will stay on Oracle, ESRI continues to dominate GIS and MetaCarta gets a plug as the preferred link from GIS to DMS.

Linux on server

Most companies will continue with Unix or Linux on the server and Windows on the desktop, with Citrix or web-based clients preferred to installing software. SAP is the financial and ERP solution of choice although attempts to link technical and business systems have met with limited success, resulting in ‘an incomplete linkage of costs and revenues.’ Astonishingly, real-time drilling and production data volumes ‘now exceed seismics.’ Production data is the feed for ‘numerous operational activities and reporting events.’ 10% of technical data is structured and is growing at 10% per year, while 90% is unstructured and is doubling every 6 to 18 months.
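
Compounding those two growth rates makes the point starkly. A simple projection (taking the survey figures at face value and assuming a twelve-month doubling time, the midpoint of the quoted range):

    # Structured data grows at 10%/year; unstructured doubles yearly.
    structured, unstructured = 10.0, 90.0   # today's split, arbitrary units
    for year in range(1, 6):
        structured *= 1.10
        unstructured *= 2.0
        share = 100 * unstructured / (structured + unstructured)
        print(f"year {year}: unstructured share {share:.1f}%")
    # By year five, unstructured data is ~99.4% of the total.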

Curtis

PPDM CEO Trudy Curtis presented the 3.7.1 ‘correction’ model. This contains 45 modules, 1,200 tables and 24,000 columns. A Meta Model is delivered, pre-populated with information about the PPDM database. Along with the previous ‘gold’ and ‘silver’ levels, a new ‘content compliance’ level is under discussion. Looking forward to the next 3.8 release, PPDM is investigating how to leverage metadata standards from Dublin Core and, for GIS, from the FGDC and ISO, although here there is concern over ‘IP issues,’ somewhat ironic in the context of standards.

Bays

Alan Bays (Flare Consultants) described the new support for ontologies, taxonomies and metadata in PPDM, leveraging W3C standards and embedding Flare’s Catalogue. A high level, starter ontology will cover all PPDM 3.7 modules. This will be extended into detailed disciplines using UNSPSC, NASA codes and the Shell Discovery Project—although this is ‘very incomplete’ (only G&G), and terminology is ‘confusing,’ a bad idea for a taxonomy standard!

Intervera

Paul Gregory addressed the problem of populating a PPDM database. Data quality is a ‘universal problem’ which can impact migration projects, consuming up to 90% of the effort. Worse, data issues can be exacerbated by conversion, making validation difficult. Implementing PPDM is a non-trivial undertaking. Once migration is complete, other problems arise, such as data being reloaded from multiple public sources. The ‘mad rush’ to get at data during mergers and acquisitions is another problem.

Data loading

PPDM’s modular design helps the data loader, but PPDM’s well and seismic modules share some 2,200 columns—‘Without repeatable tools and processes, a project has to start from scratch every time.’ Intervera’s data loading tools offer drag and drop entity mapping between source and target databases and issue warnings of data mapping problems. Reusable rules can be applied to transform data during migration.
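
In outline, the pattern Intervera describes (reusable transform rules plus mapping warnings) looks something like the following generic sketch (illustrative Python; the column names and rules are invented and are not actual PPDM or Intervera artifacts):

    # Source-to-target column mapping with reusable transform rules and
    # warnings for mapping problems.
    rules = {
        "uwi": lambda v: v.replace("-", "").upper(),  # normalize well IDs
        "spud_date": lambda v: v[:10],                # trim to ISO date
    }
    mapping = {"WELL_ID": "uwi", "SPUD_DT": "spud_date"}  # source -> target

    def migrate_row(source_row):
        target_row, warnings = {}, []
        for src_col, tgt_col in mapping.items():
            value = source_row.get(src_col)
            if value is None:
                warnings.append(f"missing {src_col}")
                continue
            target_row[tgt_col] = rules.get(tgt_col, lambda v: v)(value)
        return target_row, warnings

    print(migrate_row({"WELL_ID": "100-06-04-075-05w5"}))
    # ({'uwi': '100060407505W5'}, ['missing SPUD_DT'])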

BJ Services

Tim Leshchyshyn described how BJ Services Canada has extended PPDM 3.8 into the well operations domain in its ‘PayZone’ product for management of well service proposals, reports and customized engineering R&D. The database has been extended to include events (starting and stopping pumps), equipment (e.g. fatigue on coiled tubing) and materials consumed. The database is used to tune a frac job to clients’ requirements and resources. A case study (SPE 97249) showed how data mining of frac jobs in the Dunvegan formation, Alberta revealed that the most successful parameters were not the most common choices. ‘The old optimization rules of thumb and industry understanding need to be understood again.’

GeoLogic

David Hood (GeoLogic) advocates extending PPDM’s ‘openness’ to data vending. A truly open system should allow users of any software to interact with any data set, giving users full interoperability by employing a uniform set of standards. This involves removing the artificial business barriers around the data. For Hood, the big question is, ‘Why shouldn’t data be made accessible to everyone, through any software that a client wants to use?’ While this is technically possible, commercial issues in the past have argued against such an approach. Hood advocates a compromise business model with an infrastructure cost to balance economics. Hood believes that Canada has the opportunity to ‘show the world how to extend open systems into business practices’ and that local industry can benefit by extending products and expertise into the international market. ‘Until we remove the barriers we have created for our customers in accessing data, we cannot have a truly open system—no matter how much technical progress we make.’

Content standards

David Thomas (PPDM) traced the history of content standardization within PPDM. A pilot project (with POSC) which ran from 1999 to 2000 looked into standard sets of country names, various seismic metadata-related terms and well elevation reference types. Since the pilot, standardization has extended to geodetic information and units of measure. Current work centers on more seismic related standards which will no doubt embed PPDM’s SEISML format.


POSC AGM and Conference, Houston

The 2005 POSC annual meeting brought members a couple of surprises with the departure of POSC CEO David Archer and the announcement by POSC chairman Herb Yuan that POSC, PPDM and PIDX are in ‘merger discussions.’ BP and Pioneer presented major in-house IM projects. WITSML is spawning a production reporting standard, ‘PRODML,’ and POSC is planning to extend its XML activity to seismic, geological and facilities data, to offer a ‘single integrated standards family.’ The Data Store SIG is reborn as the ‘Epicentre Data Model SIG’ and the POSC/CAESAR Association returns to the POSC fold.

Chairman Herb Yuan said that POSC is ‘struggling with its purpose.’ Current priorities are to ‘harness WITSML,’ to kick off the production data exchange standard and to ‘ensure that POSC is relevant to its members.’ A recent POSC-commissioned Gartner Group study investigated ‘the case for standards in the oil industry.’ The study recommended ‘driving standards bodies together,’ considering a ‘clean start’ and a new push for ‘standards reform.’

2005 Scorecard

Yuan presented a scorecard of POSC’s 2005 activity, revealing that ‘discussions on a potential merger of POSC, PIDX and PPDM have been ongoing for the last few months.’ Agreement has been reached with PIDX to ‘take things further.’ The position with respect to PPDM is ‘pending.’ POSC is to become more project based with a focus on special interest groups (SIG). With the departure of David Archer, POSC CTO Alan Doniger is to be interim CEO.

Doniger

Alan Doniger gave an update on POSC’s 2005 activity. POSC has aligned WITSML geodetics and coordinate reference system standards with the EPSG. A globally unique well identifier (GUWI, a.k.a. well identity) service has been proposed and letters of intent have been received from five companies. POSC is testing a WITSML-style production reporting standard with Aera (a Mobil/Shell spin-off) in California. The data storage solutions SIG appears to have been re-branded as the Epicentre Data Model SIG. WellLogML is to be retired in favor of WITSML. A request is ‘pending’ to establish a WITSML-based geophysical data standard. WITSML coverage will thus span the whole POSC data ‘triangle.’ The intent is to have a single ‘integrated standards family with a great deal of reuse and consistency.’

Pioneer

Tom Halbouty (CIO, Pioneer Natural Resources) offered a web services ‘survivor’s guide.’ Halbouty advocates a ‘hands-on’ approach from the CIO, ‘if you turn loose a bunch of consultants you are going to land on the high side of complexity.’ It is important to ‘assure simple services that work before chasing elephant projects.’ Pioneer’s engineers can control and manage fields via the Portal, logging on from home to fix a problem. Pioneer has linked 25 data stores into its system which leverages the BEA/Plumtree portal, business intelligence from ArcPlan and Schlumberger’s Decision Point. Other components include Spotfire, Map Objects and ArcSDE. OpenSpirit and Schlumberger’s ‘Coral’ Data Access provide connectivity with E&P data sources. Halbouty was less sanguine regarding standards, which are ‘under funded and there are too many cooks in the kitchen.’

BP

Rusty Foreman presented BP’s services-oriented real time architecture project, covered in our recent report from the 2005 SPE (OITJ Vol. 10 N° 10). Foreman reported increased take-up of the system, in particular from BP’s oil traders. For Foreman, SOA is ‘one of the next big things,’ on a par with the move from the mainframe to distributed computing. But in the Q&A, Chevron’s Roger Cutler remarked that though SOA was an interesting technology, it is only a part of the solution. The other problem is the plethora of nomenclatures and taxonomies that are different for each application. Foreman also presented the new PRODML initiative (also covered in our report from the SPE). WITSML and PRODML are the first components in what POSC sees as a single ‘ML.’ This may involve a ‘re-brand’ as ‘PETROML’ with subsets of WITSML, PRODML, EXPLML, PROJML and so on. Cutler opined that, ‘Web services have not penetrated the upstream because primary datasets can’t be expressed in XML. We need a method like the SOAP Message Transmission Optimization Mechanism (MTOM) to incorporate binary data.’
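
Cutler’s point is easily quantified. Embedding binary data in plain XML means base64 encoding, with a built-in one-third size penalty that MTOM avoids by shipping the raw bytes as a separate MIME part (a sketch of the overhead only, in Python; not a PRODML example):

    # base64 inflates binary payloads by one third and forces an
    # encode/decode pass; MTOM instead references the raw bytes as a
    # MIME attachment from an xop:Include element in the XML.
    import base64, os

    trace_block = os.urandom(1_000_000)   # stand-in for binary trace data
    encoded = base64.b64encode(trace_block)
    print(len(trace_block), len(encoded))                # 1000000 1333336
    print(f"overhead: {len(encoded) / len(trace_block) - 1:.1%}")  # 33.3%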

Seabed standards

Jay Hollingsworth (Schlumberger) spoke on ‘standards for Seabed data modeling.’ Seabed is Schlumberger’s ‘next generation’ E&P data model, designed to take advantage of new technologies such as Oracle 9i and 10g, and to respond to customer push for Linux, ESRI, LDAP and SQL Server. Following its acquisitions (notably Petrel), Schlumberger needs a broader information management (IM) solution. One solution would have been to use PPDM. But PPDM, while popular, ‘has no logical model.’ An alternative was OpenSpirit, ‘a great solution but one that only covers a narrow spectrum of data types.’

Shell and POSC

Philip Lesslar and Nancy Tso described Shell-sponsored activities during 2005, including the distributed temperature sensor (DTS) standards and the Shell standard lithology legend. Shell has also contributed to WITSML, well header data (GUWI), PWLS and PRODML.

Archer

Departing CEO David Archer traced his career with Amoco, INT and as POSC CEO. Archer recalled POSC’s lean years—‘Five years ago things were not so good. $10 oil did little for the standards initiatives.’ POSC survived, reinventing itself and slimming down. More recently, POSC has been ‘ahead of the crowd’ with its web services work and is now helping with WITSML and PRODML take-up. Archer regretted that POSC ‘was not more of a software organization—we don’t eat our own dog food!’ This contrasted with OpenSpirit which, unfortunately, ‘took some of the energy out of POSC.’ Another regret was not doing a better job of aligning POSC with the POSC/CAESAR Association initiative. Today, the POSC vision is to fill the E&P standards space with POSC XML standards. On the proposed merger, Archer said, ‘Many present know that we have already looked at a merger with PPDM. The organizations are not aligned and a ‘win-win’ is unclear. We would need a good ‘pre-nup!’’ Archer regretted that once Epicentre was finished, the industry ‘moved on.’ But with WITSML, ‘We are lucky that people are still involved. It is great to see such good participation from Shell, Chevron and others.’

This article is an edited version of a six page report produced for The Data Room’s Technology watch subscribers. For more information on this please email tw@oilit.com.


Folks, facts and orgs…

Hiring hots up chez Knowledge Reservoir, AspenTech, Badley’s, OpenSpirit, Pisys, Energy Solutions, ExxonMobil, GeoFields, Geotrace, Halliburton, Ødegaard, RPS, Ryder-Scott, Ulterra and Petrolink.

Knowledge Reservoir has made several recent appointments: Steve Knabe (Director of Quality Management), Corinne Danielli (Staff Geoscientist), Patrick Wong (Geoscientist) and Ahmed Ali (Reservoir Engineer).

Aspen Technology has appointed John Bell Senior VP EMEA Operations. Bell was previously head of EAME sales with Aspen and before that worked for MatrixOne and SAP.

Badley Geoscience has hired Cathal Dillon to its programming team. Cathal is a PhD-qualified geologist with expertise in image processing.

Wil Faubel has been appointed president of Baker Atlas. Faubel was previously head of Baker Hughes’ Centrilift division.

Bart Stafford has joined Petris Technology as VP Product Marketing. Stafford was previously with OpenSpirit.

OpenSpirit has appointed Larry White VP Sales and Marketing. White’s career began with Schlumberger and continued with Landmark and Bell Geospace. Most recently he was global account manager with Input/Output.

Ben Trewin has left Iron Mountain to join UK-based software house Pisys as Sales Director. Pisys provides software services to the E&P sector—in particular a training simulator.

Pipeline software specialist Energy Solutions International has bolstered its sales and marketing teams with three new appointments: Robert Young (Product Manager), Jodi Bash (Director of Marketing) and Mark Riggs (VP Sales, North America). Young was previously with El Paso Natural Gas and GulfTerra Energy. Bash previously held marketing roles with BMC Software. Riggs was with Baker Hughes, Scientific Software Intercomp and AspenTech.

Lee Raymond’s tenure as CEO of ExxonMobil has ended. He is replaced by Rex Tillerson.

Brad Schaaf has been named president of GeoFields’ new International unit. The company has also appointed Gary Waters as Executive VP. Waters was previously with ESRI and NovaLIS.

David Bannister has been promoted to Manager of Geotrace’s London processing center. Bannister was with Seismograph Services before joining Geotrace in 1990. The company has also appointed Gary Yu as Director of Innovations.

Ødegaard has opened new offices in Calgary, Jakarta, and Perth.

Halliburton has named Craig Nunez VP and Treasurer. Nunez was previously with Phillips and Colonial Pipeline Company.

Burton Smith has joined Microsoft as technical fellow, high-performance computing. Smith was previously chief scientist with Cray Inc.

UK-based RPS Group has announced the appointment of Phil Williams as an Executive Director. Williams, MD of Hydrosearch prior to its 2003 acquisition, is CEO of RPS’ Energy Division.

Consultants Ryder Scott have added three new professionals to their 66-strong team: Keven Fry, Eric Nelson (both petroleum engineers) and Mat Tremblay (geologist).

Ulterra’s Measurement While Drilling (MWD) unit has named Mike Meadows as VP. Meadows was previously with Baker Hughes.

UK-based Petrolink has adopted a carbon neutral travel policy. Petrolink calculates the carbon equivalent of its business travel program and plants a number of trees that exceeds its ‘flight carbon debt.’ Since the project was started, over 4,000 trees have been planted.


Energy Scitech EnAble users meet

RWE-DEA, Star Energy and Pioneer use Bayesian statistics to optimize reservoir simulations.

Energy Scitech’s EnAble applies Bayes linear estimation to reduce bias in the conventional ‘history matching’ approach to reservoir simulation. EnAble is a helper application that manages simulation run parameters for most commercial and in-house simulators. This month’s EnAble meeting in London welcomed delegates from Shell, RWE-DEA, Eni-AGIP, Star Energy, Maersk and BG Group.
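
For readers unfamiliar with the technique, the Bayes linear adjustment underlying this approach updates prior estimates of reservoir parameters x from observed production history y. In its standard textbook form (EnAble’s exact formulation is not published here):

    E_y[x]   = E[x] + Cov(x,y) Var(y)^{-1} (y - E[y])
    Var_y[x] = Var(x) - Cov(x,y) Var(y)^{-1} Cov(y,x)

The adjusted expectation shifts the prior in proportion to the mismatch between observed and predicted production, weighted by the covariances, hence the claim of reduced bias relative to manual history matching.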

RWE

Heinrich Junker and Klaus Gaertner of RWE showed the results of a study on the Voelkersen gas field in Germany. The presentation showed how EnAble enhanced petroleum engineers’ productivity and allowed meaningful uncertainty estimates. The results are used in RWE’s decision making process and contribute to reserves assessment for annual reporting.

Star Energy

Stuart Pegg (Star Energy) showed how EnAble-based history matching of an extended reach onshore well helped resolve complex reservoir flow mechanisms in the presence of significant water encroachment.

Goodwin

Energy Scitech Director Nigel Goodwin outlined the new features of the latest EnAble release, many of which result from users’ suggestions. EnAble 1.9 is shipping now and extends design optimization to include well scheduling. This allows for a comprehensive field development plan that integrates subsurface uncertainty. Grid-based support is also included to spread computations across networks with multiple reservoir simulator licenses, such as Linux clusters. EnAble runs on large data sets on 32 bit and 64 bit Windows or Linux. Support now extends to Roxar’s Tempest/More, Saudi Aramco’s POWERS, Shell’s MoReS and CMG’s STARS.

Pioneer

Speaking at the 2005 SPE last month, Diana Bustamante (Pioneer) described use of EnAble’s Bayesian statistics to perform rapid analysis and integration of production data from the Gulf of Mexico Harrier and Raptor Fields. These offered ‘quick answers to reservoir analysis and reservoir management questions and [provided support for] well-intervention and deepwater rig availability decisions.’ More from SPE paper 95401.


Software short takes, R&D, sales…

News from Ryder Scott, CartoData, Quorum, Foster Findlay, Rapid, Badleys, ElectroBusiness, Energy Solutions, Geovariances, IHS, GX Tech., Linux Networx, LogTech, Scandpower, Maraco and Geocon.

A coalbed methane volumetrics package, rscCBM, has been released as freeware by petroleum engineering consultants Ryder Scott. The package embeds standard parameters from laboratory analysis. Alternatively, users can enter their own parameters manually.

CartoData’s ‘eCarto’ provides multi-user digitizing capabilities over the internet. Users can digitize cultural information from orthophotos served from ER Mapper’s Image Web Server and store this information in a database. The application targets high volume terabyte aerial photo and satellite datasets.

Canadian midstream operator Keyera Facilities has selected Quorum TIPS software for its production accounting and contract management.

Foster Findlay Associates is releasing a new version of its SVI Pro seismic image processing application. The new release adds color and opacity labeling of fault trends, simultaneous work in time and depth and better well path integration. SVI Pro R&D is supported by Norsk Hydro which uses the tool across its organization.

Rapid Solutions has launched Centrix 2.0, a web-based E&P life cycle management solution. Centrix integrates data across heterogeneous systems, leverages automated workflows and provides standardized reporting.

Badley Geoscience has ported its TrapTester fault seal analysis package to Windows. The beta release copies data files from GeoFrame and Landmark projects. Direct data links are under development.

The UK DTI-sponsored Seismic History Matching (SHM) Project has completed its first phase. A quantitative approach to seismic history matching has been developed to compare forward modeled synthetic data with the 4D time lapse seismics.

Calgary-based ElectroBusiness has enhanced its e-business flagship EB-Securedesk with secure messaging and attachment features promising an email ‘look and feel’ for new e-business users.

Energy Solutions has deployed its PipelineManager and PipelineTransporter packages to Petrobras’ Transpetro unit. PipelineManager’s predictive modeling simulates the impact of nominations to ensure that demand can be met. Approved nominations are routed back from the simulator to PipelineTransporter for movement scheduling and invoicing.

Geovariances has just announced Version 6.0 of its flagship Isatis geostatistics package. Isatis now sports a 3D viewer (built on Open Inventor). 64-bit versions of Isatis are now available for Solaris and Irix.

IHS Energy has released a new ‘Desktop’ version of its Enerdeq data viewer. Enerdeq Desktop, a.k.a. the ‘next generation’ AccuMap integrates Canadian data with a new GIS and query-based front end.

Input/Output unit GX Technology has announced the commercial release of its Reverse Time Migration (RTM) technology, a compute-intensive algorithm for imaging complex salt tectonics.

HPC specialist Linux Networx is to offer IBM’s General Parallel File System (GPFS) with its Linux-based clusters. GPFS is a scalable parallel file system that supports hundreds of terabytes of storage within a single file system.

Calgary-based LogTech has released a new version of its LAS tools and LAS tools pro products.

Scandpower Petroleum Technology reports sales of its Drillbench Presmod package to Total E&P Norway and of OLGA 2000 to Houston based consultants Intec Engineering.

Maraco is to offer its GasPal facility modeling package over the PetrisWinds application hosting infrastructure.

The most prolific contributor to the latest release of the Colorado School of Mines open source Seismic Unix project is Garry Perratt of The Geocon Group.


Workstation-ready synthetics from TGS

TGS is using synthetic seismogram technology from Loren & Associates to add value to A2D’s log database.

TGS is offering ‘Workstation Ready’ synthetic seismogram packages that tie well curve data from its A2D LogLine database to multi-client 3D and 2D seismic surveys in the US Gulf of Mexico. The synthetics are processed with Loren & Associates’ technology, which corrects for borehole conditions.

Loren & Associates

Loren’s ‘Geophysical Workstation Ready Processing’ (GWSR) leverages multi-dimensional relationships between a range of rock properties, formation pressure and depth. Original data is used where possible, with correction of invalid log curve data and intelligent interpolation of missing intervals. Intervals with different lithologies, such as clastics and carbonates, or wet and hydrocarbon-bearing zones, are regressed separately. Additional variables, such as anisotropic effects from deviated boreholes, can be added.
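
At its core, any synthetic seismogram is the textbook reflectivity-plus-convolution recipe sketched below (a generic illustration in Python/numpy with invented log values; Loren’s GWSR adds the borehole corrections and lithology-by-lithology regressions described above):

    # Textbook synthetic seismogram: impedance log -> reflection
    # coefficients -> convolution with a wavelet.
    import numpy as np

    def reflectivity(impedance):
        z = np.asarray(impedance, dtype=float)
        return (z[1:] - z[:-1]) / (z[1:] + z[:-1])

    def ricker(freq_hz, dt_s, length_s=0.128):
        t = np.arange(-length_s / 2, length_s / 2, dt_s)
        a = (np.pi * freq_hz * t) ** 2
        return (1 - 2 * a) * np.exp(-a)

    rho = np.array([2.20, 2.25, 2.40, 2.35, 2.50])  # density, g/cc
    vp = np.array([2500, 2600, 3200, 3000, 3600])   # velocity, m/s
    rc = reflectivity(rho * vp)                     # impedance contrasts
    synthetic = np.convolve(rc, ricker(30, 0.002), mode="same")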

Package

Synthetic seismogram packages can be created for groups of wells within a TGS seismic survey. Deliverables include the synthetic generated from a well-based wavelet, an acoustic impedance curve and a time/depth table. Loren also offers shear-wave sonic for both wet and hydrocarbon-bearing intervals. Additionally, fluid substitution modeling can be performed to compute AVO trend volumes to delineate prospective areas within seismic volumes.


GeoLogic deploys Labrador’s eTriever

New ‘generic’ web-based data access technology for data centers and client in-house datasets.

Labrador Technologies Inc. (LTI) has moved quickly following its return to the oil and gas market announced last month (OITJ Vol. 10 N° 11) with the launch of its new ‘eTriever’ secure, web-based, generic data retriever. eTriever is built on LTI’s dynamic, web-based Labrador query engine. eTriever requires ‘no installation, no CD updates, and no maintenance.’ eTriever also promises ‘comprehensive query, reporting, and exporting capabilities.’

Data Center

eTriever is to be deployed in GeoLogic’s Calgary data center, opened earlier this year (OITJ Vol. 10 N° 6), along with GeoLogic’s GeoScout client.

Modeller

Upon request, LTI also plans to use its Labrador Modeller software to incorporate clients’ proprietary data stores into eTriever’s ‘web of accessibility.’ Additional Data Centers will become accessible as a function of customer demand.


Petris announces Recall enhancements

Log data management tool now shares PetrisWinds infrastructure, adds GIS and ‘Archive’ edition.

Houston-based Petris Technology has been busy integrating the Recall petrophysical data management system into its PetrisWINDS Enterprise (PWE) infrastructure. Petris acquired Recall from Baker Hughes last July. The company claims that the enhancements are the foundation of a strategy to make Recall ‘the most open well bore data management system in the market.’

Pferd

Petris CTO Jeff Pferd said, ‘We now have a product road map that amplifies the value of both Recall and PWE, adding value by integration. The extensible and scalable architecture used by Petris supports the growth of Recall and will make integrating new capabilities more efficient, giving customers an open upgrade path.’

GIS

New capabilities include web-based access to Recall data, a GIS interface for search and ‘full scale’ integration of Recall into PWE—allowing for bi-directional data exchange between Recall and other applications. Another novelty is relational database access to Recall, providing a relational view of the data with full SQL support.

Archive

Petris has also announced a slimmed-down ‘Archive’ version of PWE designed for use with a single data source such as Recall. Archive users can upgrade to a fully-featured PWE as needs expand.

Hamley

Petris VP Chris Hamley added, ‘PWE is proving to be the perfect vehicle to expand the interoperability of RECALL, as it has for other applications.’


ADNOC chooses SGI 64 Itanium 2 Altix

128 GB of shared memory to handle ADNOC’s reservoir simulation with Schlumberger’s Eclipse.

While SGI’s Irix architecture is getting a bit long in the tooth, its high-end Linux solutions are selling well into the upstream, as witnessed by Abu Dhabi National Oil Company’s (ADNOC) acquisition of a 64 processor Altix Bx2. Unlike conventional Linux clusters, the Altix offers a single image of the Linux operating system, a window onto its massive 128GB of SGI’s NumaFlex shared memory.

Eclipse

This high-end configuration is specially suited to reservoir simulation applications such as Schlumberger’s Eclipse, the target application for ADNOC’s new toy. Shared memory enhances memory read/write bandwidth and eases the programming task of writing to multi-CPU nodes. The Altix is capable of scaling to 512 Itanium 2 nodes and 6 Terabytes of shared memory.


P2ES QByte opens Calgary Data Center

Yet another data center opens in Calgary—offering aggregated public/hosted data storage.

P2 Energy Solutions’ recently acquired Qbyte unit has just opened a new Petroleum Data Center (PDC) in Calgary. The PDC is both a library of commonly required public petroleum data sets and a secure hosting facility for company specific proprietary data.

Danielewicz

QByte VP Michael Danielewicz said, ‘Qbyte provides a reliable, secure and cost effective solution to your petroleum data needs. Our open systems approach delivers current and accurate data, public or proprietary. Qbyte’s petroleum data offering is all about choice: choice of data, choice of supplier and choice of software.’

Data aggregation

QByte acts as a data aggregator for multiple vendor data sets, but clients sign and manage a single contract with Qbyte. Flexibility is claimed to avoid locking users into long term agreements. Qbyte eliminates the need to load CDs to update internally maintained public data sets from multiple vendors.

Petro-LAB

Client access is either by users’ own data access tools or with Qbyte’s Petro-LAB geotechnical mapping, query and reporting tool. P2ES has installed more than 235 Qbyte information systems at more than 145 oil and gas companies and claims the largest client base of any vendor in Canada with over 80% market penetration of Canada’s top 50 oil and gas companies.


CERA—‘Not peak, undulating plateau’

CERA’s Robert Esser reassures the House of Representatives on ‘Peak Oil’ and world reserves.

Cambridge Energy Research Associates’ (CERA) Robert Esser testified before a US House of Representatives subcommittee hearing on the peak oil theory this month and suggested that the Peak Oil concept is ‘not very helpful’ and lacking in ‘descriptive power.’ CERA’s research suggests that, ‘rather than an imminent peak, we envision an undulating plateau two to four decades away.’

IHS Energy

CERA’s analysis combines its own research with parent group IHS Energy’s oil field data. Esser recognizes that ‘the planet has a finite resource, and the world is consuming 30 billion barrels a year.’ But the oil supply situation ‘needs some clarification’ to consider technology, economics, timing, fiscal and regulatory terms, and ‘a comprehensive understanding of current and future productive capacity.’ CERA believes that ‘the model for peak oil has been and continues to be flawed. The resource base is still poorly understood and it appears to continue to expand.’

Polar

CERA’s views are diametrically opposed to Matt Simmons’ analysis (OITJ Vol. 10 N° 10). CERA believes that ‘the world is not running out of oil in the near or medium term.’ In fact there is ‘a substantial build-up of liquid capacity over the next several years.’ Furthermore an increasing share of supplies will come from ‘non-traditional oils’—from the ultra-deep waters, oil sands, natural gas liquids, gas-to-liquids, coal-to-liquids, etc. On the issue of Saudi Arabia’s reserves, CERA is bullish and sees no justification for claims that production is about to ‘fall off a cliff.’

What could go wrong?

CERA believes the risks to capacity expansion are mostly above ground: people, rigs, yard space, and raw materials are in very short supply; costs have been driven up; and the situation shows no sign of easing. This will limit the expansion of the exploration effort and slow the rate at which new projects will be sanctioned. Other risks include ‘creeping nationalization,’ ‘resurgent nationalism,’ and ‘tightening fiscal terms.’

SEC

CERA believes that current reserve estimates, particularly those filed under the United States Securities and Exchange Commission (SEC) rules, are overly conservative. A 2005 CERA report ‘In Search of Reasonable Certainty’ advocated revising the SEC rules to align them with the Society of Petroleum Engineers’ guidelines to create ‘a globally consistent data set that covered the vast majority of the world’s oil and gas reserves.’

Digital Oilfield?

Curiously, Esser omitted to tell the House about CERA’s own great ‘discovery’ of recent years—the promise of the ‘digital technologies’ that were supposed to add some 125 billion barrels to world reserves (more than Iraq’s current reserve base!). These were revealed in the 2003 CERA study of the ‘Digital Oilfield of the Future’ (OITJ Vol. 8 N° 2).


Geodynamic Solutions’ web maps

New web-based ‘unified geographical interface’ and ‘Layer Wizard’ for enterprise GIS.

Upstream GIS specialist Geodynamic Solutions has just announced WebMaps, a web-enabled interactive navigation and search technology which serves as a ‘unified geographical interface’ for map-based data. WebMaps leverages ESRI’s ArcGIS Server architecture and democratizes user access to GIS resources.

Layer Wizard

A related product, the ‘Layer Wizard,’ is an ArcGIS extension that simplifies identifying, loading, and symbolizing spatial data. The wizard is compatible with enterprise Geodatabases (ArcSDE), personal Geodatabases and shapefiles, and uses the ArcGIS metadata engine and database. A standalone application, the Layer Wizard Administrator, controls the configuration of the main user interface.
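
The Layer Wizard itself is an ArcGIS extension whose internals are not public. As a rough open-source analogue of the load-and-symbolize workflow it automates, the following Python sketch reads a shapefile and applies attribute-driven symbology with geopandas; the file and field names are hypothetical:

```python
# Rough analogue of a load-and-symbolize workflow using the
# open-source geopandas/matplotlib stack (not the Layer Wizard API).
# "pipelines.shp" and "diameter_in" are invented for illustration.
import geopandas as gpd
import matplotlib.pyplot as plt

layer = gpd.read_file("pipelines.shp")        # load the spatial data
ax = layer.plot(column="diameter_in",         # symbolize on an attribute
                cmap="viridis", legend=True, linewidth=1.2)
ax.set_title("Pipelines by diameter (in)")
plt.show()
```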


GeoFields goes for Inline NAS

ArcGIS-in-a-box solution with 12.8TB disk store to host GeoFields’ clients’ pipeline data.

Pipeline data management specialist GeoFields is to deploy Inline Corp.’s Network Attached Storage (NAS) to host its clients’ data in its Atlanta data center. The installation comprises Inline’s FileStorm F1E Engine and a MorStor S30H loaded with 12.8TB of SATA disk. The hardware will provide a ‘robust’ data storage and backup solution for business critical data.

Schaaf

GeoFields president Brad Schaaf said, ‘The new storage solution will allow us to expand our hosted solution offerings to oil and gas pipeline clients, while better serving existing customers with improved performance and reliability. Inline’s cost performance ratio, expansion capability, and track record in the GIS industry were key factors in our purchase decision.’

ESRI

Inline was selected by ESRI to manufacture the recently announced ‘ArcGIS-in-a-box,’ a hardware and software bundle incorporating ESRI software and Inline storage and servers.


Shell’s smart decisions leverage MetaCarta

Shell is ‘finding new plays’ with its IM architecture spanning structured data, documents and GIS.

Speaking at IQPC’s 5th Annual Oil and Gas Exchange in London last month, Shell Global Explorationist Adam Dodson unveiled Shell’s smart decision tools, which provide end users with ‘what you want, when you want it, the way you want it!’

Autonomy

Shell Global E&P’s enhanced content delivery embeds ‘intelligent’ search across structured (database) data, unstructured data (documents) and geographic information systems (GIS). Shell’s multi-million document store runs on OpenText’s Livelink, with high-end indexing and search leveraging Autonomy’s natural language information retrieval technology for automated document tagging.

MetaCarta

Structured data stored in internal libraries and in vendor data stores is manually tagged with Flare Solutions’ taxonomy. The cherry on the IM cake is MetaCarta’s text-based geographical indexing—this recognizes spatial location information in natural language and tags such information with geographical location. Roll in Shell’s SDE-based spatial data stores (OITJ Vol. 10 N° 3) and you have the complete picture. It seems to be working for Shell—as Dodson said, ‘Seriously, we are finding new plays with this stuff.’
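
MetaCarta’s engine is proprietary, but the principle of geographic text indexing can be sketched in a few lines of Python: spot gazetteer place names in free text and tag the document with their coordinates. The gazetteer entries and sample document below are invented:

```python
# Toy geoparser illustrating geographic text indexing: match place
# names from a gazetteer and tag them with lat/lon. MetaCarta's
# real engine handles disambiguation, fuzzy names, context, etc.
GAZETTEER = {                      # tiny hypothetical gazetteer
    "kuala lumpur": (3.14, 101.69),
    "abu dhabi": (24.45, 54.38),
    "calgary": (51.05, -114.07),
}

def geotag(text):
    """Return {place: (lat, lon)} for every gazetteer hit in the text."""
    lowered = text.lower()
    return {place: coords for place, coords in GAZETTEER.items()
            if place in lowered}

doc = "The well report covers acreage offshore Abu Dhabi."
print(geotag(doc))                 # {'abu dhabi': (24.45, 54.38)}
```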


Halliburton opens KL app hosting center

HP and EMC team to build Malaysian data center around Landmark’s Team Workspace.

Landmark Graphics has opened an application hosting center in Kuala Lumpur to provide infrastructure, applications and services to oil and gas companies in Malaysia. The hosting center is located in Landmark’s new office in the Petronas Twin Towers. The center is built around HP’s AMD Opteron Linux Blade servers, Landmark’s Team Workspace Portal and an EMC network attached storage (NAS) NS702 system. Connectivity is assured by a gigabit network and fiber-optic backbone to companies in the vicinity.

Bernard

Landmark boss Peter Bernard said, ‘The hosting center removes the burden of managing and maintaining the technical computing infrastructure from E&P companies. Outsourcing data and applications lets companies focus on their primary goals.’

EMC

Peter Ferris, EMC’s oil vertical manager added, ‘Supporting the demanding infrastructure requirements of the upstream is becoming challenging and costly. EMC’s enterprise NAS solutions assure the availability, performance and scalability needed to meet the service-level requirements demanded by providers like Landmark.’


Kalido master data management webinar

Shell spin-off, IBM, Knightsbridge and Ventana to tame today’s IT ‘unwieldy organisms.’

Shell spin-off Kalido, Ventana Research, IBM and Knightsbridge Solutions co-hosted a webinar this month on the topical subject of master data management (MDM). According to Ventana Research CEO Mark Smith, information architectures have evolved into ‘unwieldy living organisms’ with data stored in applications, systems and external sources. For many organizations, the lack of reference data on customers, products and employees is an obstacle to a coherent IM strategy. Mergers and acquisitions complicate the picture and data quality issues abound. Data is now generally recognized as a core asset, but there is a lack of quality tools and processes for its management.

Silos

Commercial companies (Kalido originated in Shell’s Oil Products unit) are faced with data governance challenges. In particular, when managing data across the ‘silos’ of CRM, ERP, Finance and HR, companies are faced with multiple versions of data, much of it wrong. This is where MDM comes in, aiding ‘data governance’ across the company by centralizing reference content across applications such as Siebel, SAP, Oracle, Cognos and PeopleSoft. Kalido’s MDM layer is described as a data backplane or blueprint for the enterprise.
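
Kalido’s actual data model is not described here, but the MDM principle of a single ‘golden record’ with per-application aliases can be sketched in a few lines of Python. All identifiers and records below are invented:

```python
# Minimal sketch of the MDM idea: one authoritative master record
# per entity, with each silo's local key mapped back to it.
# Names, keys and records are invented for illustration.
MASTER = {
    "CUST-001": {"name": "Acme Energy Ltd", "country": "UK"},
}
ALIASES = {                        # how each silo refers to the entity
    ("SAP", "0000871"): "CUST-001",
    ("Siebel", "ACME-UK"): "CUST-001",
    ("PeopleSoft", "ACM01"): "CUST-001",
}

def resolve(system, local_id):
    """Map an application-local key to the single master record."""
    key = ALIASES.get((system, local_id))
    return MASTER.get(key)

print(resolve("Siebel", "ACME-UK"))  # same golden record as SAP's 0000871
```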

Gaines

Dan Gaines (Knightsbridge Solutions) believes that managing enterprise master data is too important to be left to IT or outsourced. ‘Managing the data assets needs to be done in house and by the business. Standards are key—for data resources and architectures. We need to identify the key processes for business owners and data stewards. We also need to measure data quality before we can improve it.’
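
Gaines’ point that quality must be measured before it can be improved can be made concrete with a couple of simple metrics, completeness and duplication, computed here in Python over an invented customer table:

```python
# Two toy data quality metrics over an invented table: the share of
# records with no missing fields, and the share that are duplicates.
records = [
    {"id": 1, "name": "Acme Energy", "country": "UK"},
    {"id": 2, "name": "Acme Energy", "country": "UK"},   # duplicate
    {"id": 3, "name": "Petrofoo",    "country": None},   # incomplete
]

complete = sum(all(v is not None for v in r.values()) for r in records)
distinct = len({(r["name"], r["country"]) for r in records})

print(f"completeness: {complete / len(records):.0%}")   # 67%
print(f"duplication:  {1 - distinct / len(records):.0%}")  # 33%
```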

Data governance

Paula Wiles Sigmon (IBM) described IBM’s data governance council, which was established earlier this year. The council is to identify data challenges and their solutions in the fields of ETL, EAI and EII. Again, the target is the customer-product-supplier space. David Waddington (Ventana Research) emphasized the strategic importance of data governance. High-level support is necessary to ensure that tough problems (like agreeing on data definitions) and resistance from those who think ‘it’ll delay my project work’ can be overcome.

Hot topic

Winston Chen (Kalido) outlined how Kalido’s MDM tools were helping Unilever, Shell and BP’s business people ‘collaboratively control and manage master data.’ A Kalido poll found MDM one of the hottest IT topics today. Companies are contemplating appointing a Chief Data Officer to head up the new ‘data governance’ discipline, which now encompasses data management, data quality and integration. One challenge for would-be users of B2B is to ‘get your own data house in order!’

