Generally speaking, when you are out shopping (or in shopping if you are an eBay freak) the name of the object of your desire is not an obstacle to the incipient transaction. Buying a breakfast cereal does not require a degree in semiotics. “I'll have some Kellogg's Cornflakes please”. “Certainly sir, shall I wrap that for you?” Money changes hands and you can rush back to the car, tear the pack open, pour in some milk and get munching.
The naming convention here is maker: ‘Kellogg's’, and product: ‘Cornflakes’. There may be other considerations in your purchase—size of the pack, flavor differences, whether animals were hurt while filming the TV ads etc.—but these are not reflected in the product’s name. It is as if most marketing folks have at least an intuitive grasp of relational theory—you do not have to put all your data in the same field!
Four names already
Straightforward nomenclature is definitely not the case for exploration software companies. I’m afraid I have to get ‘personal’ here. Take Schlumberger for instance. Sorry, that should be Schlumberger-Sema. No it shouldn’t—it should be Schlumberger Information Systems! Or should that be GeoQuest? Hey, we have four names already—and we still haven’t drilled down to a product yet!
Who to call?
Other corporations have similar complex naming patterns. Before you even begin talking of products, you have to build a mental map of corporate structure and recent acquisitions just to figure out who to call. Who you end up calling will likely determine what you end up buying.
In some cases we still have a layer of corporate structure to drill down through before we get to the ‘cornflakes’. These are the ‘foreigners’ - as in Halliburton’s Geographix or Magic Earth units and Schlumberger’s Merak or Petrel (actually that should be Petrel Workflow Solutions I believe!). These semi-autonomous product lines reflect acquisition history and a desire on the part of the acquirer to preserve the independence and dynamism of the acquired company.
Let’s suppose that you now know who to talk to and want to get inside the product lines. This is where things get really interesting. A bewildering variety of product topologies and nomenclatures are now on display. These reflect a confusing interplay between corporate acquisitions, embedded technology, delivery mechanism and ‘forward-looking statements’ from the marketing department.
It might seem churlish to take any of these great companies to task for what is probably just a degree of exuberance. But try these nomenclatures for size. Following the latest deal with Rose & Associates, Rose’s Multi-Mode Risk and Reserve Analysis (MMRRA) software is now to be rolled into Merak’s Value & Risk 2002 suite—itself a part of Schlumberger Information Solutions’ Living Business Plan. Or for instance, from the inside out, a product such as Landmark’s TrackPlanner is a part of the DecisionSpace Technical-to-Business workflow integration, as well as being a component of the new(ish) ‘Drill-to-the-Earth model’.
Others get sucked into the same nomenclature nightmares. If you are interested in finding out about Baker Hughes’s flagship well data management system ‘Recall’ you might like to visit the Baker Atlas website where you will be invited to log on to ‘BakerHughesDirect’, then select ‘TotalRecall’ from the ‘BakerAtlas or INTEQ’ areas. Paradigm too is ‘brand bulimic’ and also likes to jumble products and ‘workflows’. Thus Paradigm’s flagship Geolog log management, correlation and petrophysics application embeds the Epos data management and interoperability integration framework—all components of the ‘Trace-to-Target’ workflow.
So what? you may ask. I suppose that since this is a complex business, it is fair enough that this is reflected in product names. But beneath all this there lies a certain tension between technologists and the marketing department. Technologists work to a 10 year cycle. Marketing? Maybe 6 months to a year. The two can get out of sync. I suppose that the moniker mayhem is inevitable—although there is another way. Take the re-branding of Total Fina Elf as simply ‘Total’. That’s what I call radical!
Talking of acquisitions (as I was earlier on, at least), how do corporations manage to achieve cost savings on acquisition? Given that most companies were so big before mergers, further economies of scale are unlikely. New savings can only come from eliminating inefficiencies and excess costs. Nothing new here—but the corollary is amusing.
Inefficient is good
Midsized independents should pretty themselves up for the merger beauty contest by making themselves as inefficient as possible. This will let the bean counters boast of huge post-merger cost savings. The corollary of the corollary is of course that if you merge two well run shops, true savings are likely to be impossible without driving your retained employees to drink.
International Datashare Corporation (IDC) and Divestco are proposing to merge in what is described as “an arms-length share exchange transaction”. If approved, the deal will give IDC shareholders 1/3 of the new merged company and Divestco shareholders the remaining 2/3. Shareholder approvals will be sought at meetings of both companies next month.
IDC president Norm Stein said, “This is an excellent opportunity for IDC shareholders. Divestco has made an impressive number of strategic acquisitions to position itself as a premier software, data, and technical services company.”
Both companies have a recent track record of aggressively pursuing growth by acquisition. Divestco was founded in 2000 when Petromap and CDPubCo were amalgamated. Last year Divestco acquired SeisView, Rocky Mountain Data, the brokerage division of Pulse Data, Kernel Technologies, Dynamic Solutions, and Digi-Rule. Earlier this year Divestco also acquired Excalibur-Gemini.
In 2001, IDC acquired MSI Capture, AnGIS and Nickle Map Service Ltd. Last year IDC acquired Riley Electric Log in a $4 million transaction (see Oil ITJ Vol. 7 N° 3). At the AAPG this month, Divestco COO Matthew Puzey told Oil IT Journal that the combined companies would have the largest collection of North American well logs—around 4½ million.
Divestco CEO Steve Popadynetz added “The merger will improve data delivery from our software—adding industry leading libraries of raster and digital well logs, drill stem tests and land data. IDC’s software will complement Divestco’s oil and gas applications, and its storage and tape archival and copying facilities will complement our seismic services business.”
The amalgamation is conditional on receiving a 2/3 majority vote from both IDC and Divestco shareholders. Alberta-based Divestco provides oil and gas software, data, seismic brokerage, and technical services. The working title for the newly merged company is ‘Amalco’ but this will likely change once the deal is done. More from www.divestco.com and www.datashare.net.
Jason Geosystems, part of Fugro’s Geoscience division, has purchased Volumetrix—the reservoir modeling software house. UK-based Volumetrix’ flagship product ‘FastTracker’ is described as a ‘next-generation’ reservoir modeling tool which produces a fast, interactive reservoir model. FastTracker supports the ongoing change management requirements of lifecycle reservoir development through its ‘UpdateAbility’ concept. This allows users to harness the constant flow of new data and insight that result from drilling and production operations.
Jason plans to integrate FastTracker with its 3DiQ Reservoir Characterization software suite. Volumetrix will become an integrated part of Jason Geosystems. All current key employees have agreed to continue their employment. Volumetrix’ origins go back to Sperry-Sun and Dresser and an old BP Alaska project - ‘Decision Driven Reservoir Modeling.’ Last year Landmark’s GeoGraphix unit acquired ‘worldwide distribution and marketing rights’ over FastTracker (see Oil ITJ Vol. 7 N° 2).
Oil IT Journal managed to catch Microsoft’s Marise Mikulis at the Salt Lake City AAPG this month. Mikulis was appointed as Microsoft’s energy industry manager and senior strategist for energy industry enterprise solutions last year.
Oil ITJ—What is Microsoft’s strategy with regards to the energy industry?
Mikulis—First I want to make it clear that Microsoft has no intention of creating oil and gas specific software. The plan is rather to make sure that Microsoft’s horizontal offerings get optimal use in oil and gas.
Oil ITJ—Does that mean a focus on Microsoft’s application software or operating systems?
Mikulis—Of course Microsoft Office has a great role to play in oil and gas enterprise computing. But we also believe that the new Windows Server 2003 should be of particular interest to oil and gas companies because of its designed-in emphasis on security.
Oil ITJ—Talking of servers, what is your strategy towards Linux in the high performance computing environment?
Mikulis—You know, Microsoft spends $6 billion per year on R&D and is directing a considerable amount of this towards high performance computing (HPC). Microsoft will be presenting its HPC activity at a half day event in Houston next month.
Oil ITJ—You have personally come a long way since your days at the Petroleum Open Software Corporation. How do you square working with Microsoft with your previous ‘open’ work?
Mikulis—Microsoft has an interesting viewpoint on standards. Microsoft wants to compete by having the best commercial implementations of industry standards. For companies seeking to deploy web services, Microsoft will offer the most compelling solution. Microsoft’s Jim Clark sits on the API/PIDX board and is also on the UN EDIFACT committee.
Oil ITJ—What of your ERP activity?
Mikulis—Last year, Microsoft acquired Danish ERP developer Navision whose solution has been rolled into our Business Solutions unit. Navision delivers integrated functionality for financial management, supply chain collaboration, CRM and e-commerce. We believe that these solutions can provide support for remote operations and plants which are often run like small autonomous businesses.
Seismic Micro-Technology has opened a European subsidiary, SMT Europe, based in Croydon, UK, to serve clients in the EAME region. SMT claims clients in 22 countries in the region.
Magic Earth’s Houston-based visionarium now sports a 48 processor SGI system with 100 GB of memory. At the AAPG, Bill Matthews explained that the hardware is used to present prospects to potential investors – “for Wall Street, Magic Earth has become a capital attracting tool”.
Open Spirit Corp. has named Dan Piette as CEO. Piette was previously with Input/Output, Inc.
Dallas-based independent geologist Pat Gratton has been voted president-elect by the membership of the American Association of Petroleum Geologists. Gratton will serve as AAPG president in 2004-05.
A2D Technologies has appointed Rich Herrmann (previously with Petroleum Place) as VP of data access and integration.
Houston-based Seitel Inc. has opted not to continue a ‘standstill agreement’ with its lenders following the failure of protracted negotiations. This month an undisclosed buyer has obtained 37.6% of Seitel’s senior, unsecured debt. Seitel has retained Jefferies & Company to act as its financial advisor in connection with a possible restructuring.
The 2003 Offshore Technology Conference (OTC) reported strong attendance with over 50,000 registered – the highest attendance since 1985. The boost came in part from a rush of interest from service companies hoping for a part of the Iraq reconstruction program.
TGS-NOPEC CEO Hank Hamilton reported first quarter 2003 results as in line with expectations but warned that “Despite excellent commodity prices, market signals about near term demand for seismic and well log data remain mixed.”
TGS-NOPEC unit A2D Technologies announced that its eastern Canadian offshore well log collection is now available through its LOG-LINE Plus online database.
The Petroleum Open Software Corp. (POSC) has re-branded itself as the Petroleum Open Standards Consortium (still POSC!).
The deliberations of the POSC data storage special interest group are now available in the form of a draft recommended practice. The document is available on posc.org.
Landmark is to introduce OpenWells next summer—new software for well information management. OpenWells leverages Landmark’s Engineering Data Model (EDM), and integrates with third-party tools and Landmark’s other drilling and well services software. OpenWells includes basic daily operations, cost and AFE reporting, drilling and completion operations and well services.
Royal Dutch Shell has signed with IBM as its latest customer for Grid computing services. IBM worked with Shell to build a reusable software toolkit ‘wrapper’ around legacy applications to create a Grid-enabled infrastructure for seismic processing. The solution, based on IBM xSeries running the Globus Toolkit and Linux, cut processing times ‘while improving the quality of the data’. Shell research physicist Jacob Buur said, “Grid computing is important to Shell because it offers the potential to create a truly unlimited resource, with a uniform interface to a variety of services. This will allow Shell’s independent companies to engage in closer cooperation.”
IBM also announced that 35 companies, including Cisco Systems, are to form the foundation of a Grid ‘ecosystem’ designed to foster Grid computing for businesses. IBM and Cisco are working to enable enhanced Grid services for Storage Area Networks (SAN). Cisco’s intelligent multilayer storage networking architecture helps lay the foundation for building globally scalable access to Grid data. The integration of intelligent services into the network helps simplify data access, resource sharing and management across the Grid.
The Sarbanes-Oxley Act is the cornerstone of federal regulators’ financial accounting reform and attempts to protect investors by holding executives of public companies accountable for misrepresentation of financial information. HandySoft Corp. and portal vendor Plumtree have just announced the ‘Sarbanes-Oxley Accelerator’ to help publicly traded companies comply with the 2002 Act.
HandySoft COO Stuart Claggett explained, “With the SEC’s unanimous vote detailing new Sarbanes-Oxley requirements, it’s imperative that companies start addressing these complex reforms, using the timeline extension to ensure that the audit process is comprehensive, timely, and accurate. To address these challenges we have developed rapidly deployable, process-based solutions that reduce time-to-compliance, improve audit quality, and provide public companies with the flexibility to deal with future regulations.”
The Accelerator helps establish internal controls and reporting procedures, while providing a platform for collaborating with auditors and board members. The Accelerator leverages HandySoft’s ‘BizFlow’ platform and is built on Plumtree’s Enterprise Web Suite.
Speaking at KPMG LLP’s Global Energy Conference in Houston, Halliburton Energy president John Gibson expressed a contrarian view of the Act, “Sarbanes-Oxley is the most ridiculous thing I’ve seen. We are spending more time doing certification processes that don’t improve internal control or the quality of anything. It is a ‘letter-signing’ activity that appears to be ‘blame assignment’ as opposed to change in corporate commitment to integrity. If shareholders could see how much money is being spent to say ‘I’m honest’, they would be appalled.”
The US Minerals Management Service (MMS) is starting out on a five-year transformation designed to streamline business operations. The MMS OCS Connect project will reform the way the agency operates. The MMS intends to offer customers in industry, the public and other government agencies a better service by moving MMS services on-line.
MMS Director Johnnie Burton said, “OCS Connect will maximize customer involvement by delivering essential information and allowing input via the Internet. It will streamline delivery by automating major business transactions, resulting in more timely decisions, and will simplify and unify government by minimizing redundant reporting. By using common oil and gas industry standards and proven solutions, we will leverage existing market-based practices.”
Booz-Allen-Hamilton (BAH) has been engaged as prime contractor and will assist the MMS in reengineering its core business processes, while developing a robust, secure enterprise architecture to replace the ‘antiquated legacy systems currently in place’. The OCS Connect program sets out to reduce time and cost-intensive processes and procedures ‘that are inherent in traditional paper-based organizations’.
The MMS originally engaged BAH back in April 2001 (Oil ITJ Vol. 6 N° 4) to ‘transform its Offshore Minerals Management Program into a web-enabled environment for offshore leasing and regulatory reporting’.
Unocal has just claimed an e-commerce ‘first’: an e-business transaction using the new API PIDX XML electronic invoice standard. Unocal received and processed an invoice from Schlumberger through Digital Oilfield’s OpenInvoice hosted application. The transaction process is said to comply with API’s Recommended Practice 3901. Schlumberger Oilfield Services created the invoice in PIDX format directly from its ERP system.
The invoice and corresponding service delivery ticket were then transmitted utilizing the RosettaNet Implementation Framework (RNIF) transport and routing protocol directly into Digital Oilfield’s system. Unocal then used OpenInvoice’s functionality to route, code and approve the invoice. The invoice was then automatically uploaded to Unocal’s financial system for payment.
Mike Comeau, e-Procurement manager with Unocal said, “We have now demonstrated that the PIDX RNIF standard works in a production environment. Unocal’s strategy is to drive down costs through the use of technology that helps us change our processes. This first PIDX XML invoice is a significant step forward for our industry and reduces transaction costs on both sides of the equation.”
For data managers and petrophysicists struggling to understand older well logs, the Denver Well Logging Society (DWLS) has made its Historical Collection of Wireline Charts available on CD-ROM.
The DWLS has compiled a collection of most of the logging company charts published between 1947 and 1999. The collection is now available as a seven CD set from the Society of Professional Well Log Analysts (SPWLA). The CD set is said to be an invaluable reference source for anyone analyzing logging data from the past 50 years. More from www.spwla.org.
The Open GIS Consortium (OGC) has published a Web Map Service (WMS) Cookbook - the first in a planned series of books detailing the implementation and use of OpenGIS specifications. WMS defines interfaces for web-based software to learn about, retrieve, merge and query maps. The Cookbook provides the basic understanding and steps needed for implementing and exploiting the WMS interface and related technologies. Cookbook contributors include software vendors, universities, and local government users of the WMS interface from around the world. The variety of contributions highlights the different software being used and ensures widespread applicability.
The book covers WMS client and server development technologies (XML, XSL/XSLT, ASP/JSP, etc.) before addressing the design of systems that implement the WMS interface along with illustrations of DTD/XML documents and XSL/XSLT style sheet usage. A final section explores implementations of WMS in existing software on both the server and client side. Detailed recipes for implementing WMS in popular commercial, open source and freeware products are provided. Download the Cookbook from www.ogcnetwork.org.
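By way of illustration (and not taken from the Cookbook itself), a WMS GetMap request is just a URL carrying a well-defined set of parameters. The sketch below assembles one in Python using the WMS 1.1.1 parameter names; the server address and layer name are invented for the example.

```python
from urllib.parse import urlencode

def getmap_url(base_url, layers, bbox, width=800, height=600,
               fmt="image/png", srs="EPSG:4326"):
    # Build a WMS 1.1.1 GetMap request URL - the core interface
    # the OGC Cookbook covers.
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "STYLES": "",                            # server default styles
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and layer - substitute a real WMS endpoint.
url = getmap_url("http://example.com/wms", ["topography"], (-10, 40, 5, 55))
```

Pointed at a real WMS server and layer, the resulting URL should return a rendered map image in the requested format.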
UK-based AnTech has announced a new ‘intrinsically safe’ data transmitter for use in hazardous environments. AnTech’s RED-I Data Logging System uses an infra-red ‘eye’ to transmit data in hazardous zones without external wiring for power or data transmission.
The autonomous battery powered unit is ATEX certified for Zone 1 & 2 Hazardous areas. The RED-I data logging system allows operators to collect stored data using a handheld computer without the need to physically connect to the RED-I unit. This is achieved by accessing and transmitting data via an infrared communications port housed in an intrinsically safe handheld computer. Data can be reviewed and then transferred to a spreadsheet using either an infra-red link or a serial cable.
Shell’s John Darley’s thesis is that technology is key for developing the 40-50 million barrels per day production which will be required around 2020. Renewables will only ‘get serious’ around 2050. Shell believes it gains competitive advantage from its in-house developed software packages. Darley also reports ‘dramatic take-up’ for mono-bore drilling.
Abd Allah Al-Saif described Saudi Aramco’s in-house developed software which includes a new seismic trace sort algorithm, fuzzy logic for deep gas well placement, DETECT (Aramco’s coherency package) and fractal deconvolution. Aramco has built a 27 million cell model over the Ghawar field which is simulated using Aramco’s ‘massively parallel’ POWERS simulator—a run takes 16 hours on a 4-CPU PC cluster (10 million cells, 61 year history and 3400 wells). For Al-Saif, R&D collaboration between oils, service companies and academia is the way to go.
Kurt Rudolf reported that ExxonMobil has 100TB of online data – half is ‘non seismic’ and represents the most rapidly growing segment. Rudolf advocates ‘collective enquiry’ by bringing people together in a 3D visualization and real time decision support environment. A movie of the geological evolution of the South Atlantic showed ExxonMobil’s technology in action with animation of 100 million years of geologic history. Other focal points for ExxonMobil’s technology include ‘hidden play’ challenges—such as those obscured by shallow gas or salt overhang and lowside fault plays. For Rudolf, 4D seismic reality has finally ‘caught up with the hype’! In the right settings, 4D can provide insights to the reservoir manager, which are not accessible from other direct forms of surveillance. Exploitation geochemistry and other techniques provide a ‘holistic understanding’ of the reservoir and its geological context. The future will see continuous monitoring of the reservoir with micro sensors, borehole instrumentation, micro gravity, passive and 4D seismics.
ChevronTexaco’s Bob Laing observes that oil and gas technology has some awkward aspects—it involves the management of a long and complex value chain. Moreover petroleum R&D offers little to other industries. ‘Horizontal wells are not movies or medical’. Customers are just not prepared to overpay oil R&D—it is ‘cost plus’ rather than ‘added value’—unlike, for instance the IT business. The last 15 years have seen a decline in proprietary R&D but this has been replaced with ‘leveraged R&D’. CTx seeks to build ‘fundamental platforms’ – such as GoCad. Laing believes ‘next generation’ technology will likely come out of large scale integration and the new simulator under development with Schlumberger.
VizEverywhere is an ‘entry level’ visionarium described as the ‘affordable large screen visualization solution for the oil and gas industry’. VizEverywhere throws an SXGA (1280x1024) image onto a wide screen. Twin ‘matched’ projectors offer stereo with passive polarized specs. A complete system comes in at ‘under $100,000’.
Tricon is planning to port its ‘Tsunami Suite’ pre-stack depth migration (PSDM) package to Starbridge’s field programmable gate array (FPGA)-based supercomputer. Starbridge anticipates 100 fold speed up over conventional microprocessor-based machines.
Not geological and not very new but Teleportec’s teleconferencing system was an eye catcher on the AAPG’s stand. Teleportec offers a fairly lifelike image with a lot of ‘presence’. The remote speaker can locate and engage people in the local environment with eye contact. The eyes really do ‘follow you around the room!’
SMT has introduced digital ‘Post-It!’ notes into its Kingdom Suite interpretation package. PakNotes allow for comments and Windows documents to be attached to an interpretation object (seismic horizon, fault, well etc.). The PakNotes are stored along with the business object – and are accessible from other components of Kingdom Suite. PakNotes is the brainchild of SMT president Tom Smith. Smith believes that Windows-based document formats are likely to outlive most other ‘standards’ and that saving documents in their native Windows formats is ‘most likely to preserve information assets over time’.
Gravitas bundles HRH’s geological log drafting and visualization software—Winlog 5 for drafting, RepGen for daily reporting and the Windart real time data link. Gravitas development was sponsored by Total and was redesigned to collect together the company’s best practices. Windart leverages the WITS acquisition standard to collect real time data feeds.
Schlumberger’s new Fault Surface tool is due for release at the Stavanger EAGE. Part of GeoFrame IV, Fault Surface automates fault picking in 3D seismic – à la Coherence Cube. It does a good job!
Trivision has released PowerCore—a new component of its PowerSuite. PowerCore lets geologists capture all information relating to the coring process and present it in a WYSIWYG printable format. PowerCore includes multi-track plotting for grain size, sedimentary structures, trace fossils and more.
Decision management software (DMS), a new component of Landmark’s Decision Space suite, promises an integrated framework for risk-based decision support. A change in one parameter—such as the depth of an oil water contact—ripples through the system to ensure updated information in all modules. Different risk constituents can be visualized with tornado plots. DMS originated as one of BP’s ‘nuggets’—key in-house developed software components.
A2D’s LOG-LINE now offers web services-based remote calls from Landmark’s OpenWorks. The system offers GIS spatial data selection and data publishing to an internal PetroWeb intranet. Both Dot Net and Java client APIs are available and support business objects including well header, log attributes and e-commerce. Landmark showed how A2D data can be accessed from its Power Explorer desktop (developed from PetroBank Surf & Connect web edition and Landmark’s Open Explorer).
Corelab’s Reservoir Information Browser (RIB) offers clients a secure, password-protected website with real-time access to data through an ASP browser application. RIB offers management of rock samples, fluids and thin sections, and clients can add their own stuff including office documents, jpegs and ‘clickable LAS’ logs for core description pop-ups.
AGM has used SGI’s Volumizer (à la Magic Earth) to develop new 3D seismic display functionality for its Recon interpretation environment. Recon now integrates 3-D views of well log and seismic data with basemap and 2-D cross-section interpretation views and runs on SGI Onyx 2. AGM is working towards 200GB data volumes (next summer) allowing for ‘interpretation of the entire basin’.
Terrasciences’ new dipmeter module supports all dipmeter and imaging tools and loads LIS, DLIS and other formats from 4 and 6 arm tools. Output can be made to CGM, Postscript and other graphical formats. An OpenSpirit link is under development and the software is now available through the Petris Winds ASP hosting service. A new sonic waveform module loads LIS and DLIS data and displays it as wiggle or variable density. A variety of views of receiver data and computed slowness displays are available, and synthetic seismograms and rock strength calculations can be performed.
Open Spirit Release 2.5 will be out in July. A pre-release was on demonstration and is said to be ‘100 times faster’ thanks to a new ‘query by attribute’ function. The new release includes a data server for GoCad Voxets and 2D/3D SEG-Y data management. A data selector with a project copy/sync utility, Excel adaptor and Arcview based GIS search will also be introduced. The new SEG-Y module includes a novel file-based storage data structure. A line is stored as a set of files – header, trace metadata, stack, wavelet and data etc.—all kept neatly in the same folder. Open Spirit is showing credible take-up with software vendors.
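The file-per-component idea is easy to picture. The sketch below is our own illustration of such a layout, not Open Spirit's actual file naming: one folder per line, with header, trace metadata and trace data stored as separate files.

```python
import pathlib
import struct
import tempfile

def write_line(folder, name, traces):
    # One folder per seismic line; component file names here are
    # illustrative, not Open Spirit's actual layout.
    line_dir = pathlib.Path(folder) / name
    line_dir.mkdir(parents=True, exist_ok=True)
    (line_dir / "header.txt").write_text(f"line={name}\ntraces={len(traces)}\n")
    with open(line_dir / "trace_meta.csv", "w") as f:
        f.write("trace,nsamples\n")
        for i, trace in enumerate(traces):
            f.write(f"{i},{len(trace)}\n")
    with open(line_dir / "data.bin", "wb") as f:
        for trace in traces:
            # little-endian 4-byte floats, one run of samples per trace
            f.write(struct.pack(f"<{len(trace)}f", *trace))
    return line_dir

# Two toy traces of three samples each, written to a temp folder.
line_dir = write_line(tempfile.mkdtemp(), "line_001",
                      [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]])
```

The appeal of the scheme is that each component can be read, replaced or synchronized independently without rewriting a monolithic SEG-Y file.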
IES’s new PetroRisk provides a risk management framework around IES products. PetroRisk assigns Bayesian probability distributions using Monte Carlo or ‘Latin Hypercube’. Risk is evaluated at all stages in the workflow from generation, maturation, migration, fill and spill.
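For readers unfamiliar with the jargon: Latin Hypercube sampling stratifies each input distribution into equal-probability bins and draws exactly one sample per bin, giving better coverage than plain Monte Carlo for the same number of runs. A minimal Python sketch of the idea (not IES's implementation):

```python
import random

def latin_hypercube(n, dims=1, rng=random):
    # One sample per equal-probability stratum [s/n, (s+1)/n) in each
    # dimension; strata are shuffled independently per dimension.
    cols = []
    for _ in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)
        cols.append([(s + rng.random()) / n for s in strata])
    return [tuple(col[i] for col in cols) for i in range(n)]

pts = latin_hypercube(10, dims=2)  # ten 2-D points on the unit square
```

With n = 10, each decile of each input is hit exactly once, whereas plain Monte Carlo might cluster several samples in one decile and miss others entirely.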
NDS Pro bundles all Neuralog products ‘from scan to capture’. Neuralog supports World TIF format (WTF), Petra, Petris, ArcView—and is testing with GeoGraphix and SMT. Neuralog now also displays raster images along deviated well paths.
InfoPipe’s OwnerImage Builder takes input from sources including broker spreadsheets, Bureau of Land Management data and other land record systems. Land information is consolidated and can be output in Geoplus’ Petra, MapInfo, ESRI, AutoCAD and Geographix formats. OwnerImage Builder makes land management data accessible to workers in other divisions—notably exploration.
Rose Associates’ Performance Tracking Data Base is designed to ‘eliminate bias’ in portfolio analysis by allowing the comparison of current opportunities with corporate ‘memory’. Data is captured from Rose’s multi-mode risk analysis spreadsheets into a SQL Server database. Various plots of actual results vs. predicted estimates can be made. Rose’s software leverages the Crystal Ball Excel add-in from Decisioneering, Inc.
EP-Tech is a Chinese-US joint venture specialized in extracting information about reservoir fracturing from seismic data. EP-Tech spectral imaging highlights channels and other sedimentary features using wavelet-based image processing.
This article has been abstracted from a 20 page illustrated report produced as part of The Data Room’s Technology Watch Reporting Service. For more information on The Data Room’s reports and to request a sample copy please email firstname.lastname@example.org.
What would you expect of a $20 booklet on content management that sets out to be a “guide for your journey to knowledge management best practices”? As content managers ourselves (writing a newsletter that is produced on paper and on the web is a fairly mainstream content management application) we know what we would expect to see: a discussion of available tools and technologies—in particular XML-based—that allow documents to be captured in a logical way and reused in a variety of contexts and media.
The American Productivity & Quality Center’s new booklet on Content Management (CM) starts out with a guide to justifying a CM system in your organization. Subsequent chapters include planning and system implementation, content management lifecycles, IT and some case histories. In the planning phase, the recommendation to “establish a steering committee or council of senior executives to guide and fund the initiative” suggests that CM is addressing large highly structured organizations. A couple of pages on, the booklet reveals that the APQC has identified no fewer than 18 different roles—from database administrator to strategic knowledge manager, passing through information steward and librarian. No skunk works here!
The booklet is peppered with lists, bullet points and questions that the CM implementer may have a hard time answering. Many of these, like ‘what do your competitors know’, are undoubtedly stimulating. Others, like ‘how do you want to play the game’, less so. But such considerations surely put the cart before the horse. In an ideal world, CM should set up the organization so that it can answer ad-hoc questions on pretty well any subject.
Taxonomy is a key component of CM. The APQC has found that while some software sets out to create taxonomies automatically from content, organizations rarely rely on this alone. ‘User involvement which may involve focus groups and surveys during development is critical for accuracy and acceptance of the taxonomy.’
One interesting case history is that of Schlumberger which has ‘established a work flow for submission, creation, editing and validation of content’. But the account of Schlumberger’s InTouch Knowledge Hub reveals little that assiduous readers of Oil IT Journal don’t know already.
What is CM?
This booklet falls short of satisfying our requirements on two counts. First, the definition of what content management is about is never really addressed. Content management is at times used interchangeably with knowledge management or with the ‘Portal’, and there is considerable overlap with document and records management. This ‘confusion’ is natural given the plethora of tools and issues involved. But it is a shame that the authors didn’t make a better stab at positioning CM in the quality-best practices-KM-Portal continuum. The booklet offers a multitude of bullet points for consideration, but fails to make a case for why CM should be the focus of your IM initiative as opposed to a DMS, a Portal, KM or RIM.
The other problem is that you will not get any insight into how to solve the nitty-gritty problems that you face when starting out on content management. Should you aggregate and index native-format documents? Or should you go to XML, HTML, PDF or a DMS? These and other serious issues around content management unfortunately get short shrift from the APQC booklet.
Content Management: A Guide for Your Journey to KM Best Practices. ISBN 1-928593-92-8. Hasanali and Leavitt 2003, www.apqc.org/pubs.
Trade Ranger (TR) President & CEO John Wilson has written to members at large in what amounts to a position paper on upstream e-business. Wilson recognizes that these are challenging times for our industry and that ‘high profits from upstream activity have not been rewarded through stock price performance’. Such uncertainty is ‘driving a case for greater productivity and bottom-line results, even under the worst possible financial conditions’.
For Wilson though, today’s situation represents ‘an unprecedented opportunity for e-business to realize real efficiencies through process cost reductions, and leveraging information and knowledge captured by electronic commerce’. TR itself is transforming – notably with a shift in focus from technology to service. TR ‘consistently delivers over 98% uptime’ and is ‘constantly seeking to integrate new documents and capabilities’ into its solution.
TR is expanding its business to serve its global membership – with the opening of a new European office in Brussels. Other centers are planned. TR is also changing its corporate culture. The organization used to be staffed by ‘consultants and secondees’. These are to be replaced with ‘full-time incentivized employees’. Wilson intends to complete TR’s transformation with an increased customer focus. TR plans to ‘participate in the internal business, political, hierarchical and social networks of our members’ and thereby to ‘better assist them in the adoption of effective e-procurement strategies’. TR’s vertical markets are searching for innovative, yet inexpensive, ways to improve business performance. TR expects to see ‘increased demand for access to our exchange’ through the ‘community effect’, which lowers the cost to TR members well below that of an individual private exchange. Finally, TR helps organizations ‘adopt standards that offer real value and benefit to everyone in the community’.
The upstream IT world is not all about Sun, Linux and PC clusters, as a recent survey* by World Oil reveals. The study of 80 oil and gas companies’ visualization centers found that ‘88% of surveyed companies use SGI graphics products and 86% use SGI High Performance Computing (HPC) products’.
SGI’s Global Energy Solutions director Bill Bartling said, “These striking results reflect SGI’s deep penetration into the oil and gas industry, particularly with those companies that are committed to high return on investment through the use of top graphics and high-performance computing equipment for such tasks as exploration and reservoir simulation. SGI works closely with companies that already have SGI Reality Center infrastructures to maximize their investments through enhanced uses in collaborative decision making, distributed graphics and the emerging field of digital oil fields.”
In the graphics and HPC servers/workstations category, SGI’s InfiniteReality3 graphics subsystem emerged as the most popular, with 55% of companies surveyed identifying it as their graphics subsystem of choice. 42% of companies surveyed use SGI Onyx2 graphics servers, and at least one SGI workstation or server is in use at 86% of the companies surveyed.
According to SGI, its high-performance computing, storage and visualization solutions are well suited to the petroleum industry. Seismic imaging applications make use of the SGI Reality Center’s 3D and 4D processing and visualization capabilities while SGI Origin 3000 and the recently announced SGI Altix 3000 servers and superclusters are used in reservoir simulation. More from sgi.com/industries/energy/.
*Supplement to World Oil Magazine May 2003. More from www.worldoil.com.
New Zealand’s Institute of Geological and Nuclear Sciences Limited (GNS) has won a government contract to help in the search for new oil and gas reserves. The contract is valued at NZ$3.6 million a year for six years – an aggregate of about $12 million US – provided by the Foundation for Research, Science & Technology. The aim of the program is to identify regions with petroleum potential to help attract new exploration companies to New Zealand. The new program sets out to develop a four-dimensional computer model of the prospective parts of New Zealand’s sedimentary basins.
Program leader Pete King said, “As well as geographic location, the model will show thickness and depth of burial of rock strata likely to contain oil and gas, and how factors affecting the evolution of petroleum accumulations have changed through geological time. We’ll work closely with oil companies to identify special areas where our research efforts can be focused.” The new research program contains a mix of fundamental geological knowledge, ‘big-picture’ regional evaluations, computer-based modeling, and solutions-based research attuned to industry needs. The research will add value to industry data and will be of a type not normally undertaken by exploration companies.
MJ Systems is to offer its ‘LogSleuth’ raster image log library to users of GeoLogic’s GeoScout application suite. GeoScout users now have on-line access to a library of over 1.3 million raster images of original well logs from over 360,000 wells drilled in Canada. GeoLogic CEO David Hood said, “This agreement gives our users timely access to a high quality set of historic data with the ability to apply their own unique interpretation and store the result in their GeoScout databases. LogSleuth also offers users the ability to modify and correct log image parameters such as depth registration - a great time-saver.”
Alberta-based GeoLogic was founded in 1983. The GeoScout exploration information system caters to all disciplines in oil and gas and provides access to well, reserves and land information with presentation mapping, cross section, engineering and data management tools that enable the user to manage both public and proprietary data. MJ Systems introduced the concept of well logs on fiche back in 1971; the library was converted to rasters in the late 1990s. MJ has since delivered over 200 million microfiches to some 1,500 customers.
Halliburton unit Sperry-Sun has teamed with Larson CGM Software to announce HalLog Viewer, a new service ‘linking remote logging operations to decision makers’ that provides customers with logs in the computer graphics metafile (CGM) format. The lightweight, web-enabled well log viewing tool allows for the secure, quick and easy exchange of complex graphical information.
Well activity information can be captured from Halliburton’s InSite wireline or logging-while-drilling data logging systems. The technology enables Halliburton to deliver log data and related information over the internet and extranets. Halliburton’s petrophysics manager Jeff Grable said, “The use of TIFF and CGM/PIP image formats produces a rich environment and functionality for easy viewing, printing, and annotating logs.”
Landmark Graphics Corp. has just released (actually it was released last year but no-one noticed!) its ‘Drill-to-the-Earth Model’ (DTTEM) software bundle, made up of a range of Landmark’s applications. The new package is designed to enable asset team members to design wells and visualize real-time drilling operations in a 3D earth model. Ivar Haarstad, project leader with Statoil, retraced the project’s history: “Back in 1999 Statoil developed a prototype in-house for data integration and real-time decision-making. This system was tested on the Heidrun field and has resulted in new work processes, reduced operational costs and increased oil recovery. This promising technology led Statoil to implement the new WITSML standard as embedded in Landmark’s OpenWire product.”
DTTEM was designed to support real-time decision-making. By integrating DepthTeam Express and Presgraf, the system performs pore pressure determination prior to and during drilling. OpenWire enables the real-time decision-making process by loading real-time MWD/LWD and trajectory data into OpenWorks. DecisionSpace AssetView enables asset team members to visualize the results of real-time earth modeling while DecisionSpace TrackPlanner provides real-time, visual drag-and-drop well path planning.
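To give a flavor of the WITSML data flow described above: a real-time feed arrives as XML and is parsed into curve mnemonics and samples before loading into the project database. The sketch below uses a simplified, illustrative WITSML-style fragment — the element names follow WITSML conventions, but the document is not schema-valid and this is not Landmark’s OpenWire API.

```python
# Illustrative sketch: parsing a minimal WITSML-style log fragment.
# The fragment is simplified for illustration only.
import xml.etree.ElementTree as ET

SAMPLE = """
<logs>
  <log uidWell="W-1" uidWellbore="WB-1">
    <nameWell>Demo Well</nameWell>
    <logCurveInfo><mnemonic>DEPT</mnemonic><unit>m</unit></logCurveInfo>
    <logCurveInfo><mnemonic>ROP</mnemonic><unit>m/h</unit></logCurveInfo>
    <logData>
      <data>1500.0,24.3</data>
      <data>1500.5,22.8</data>
    </logData>
  </log>
</logs>
"""

def parse_log(xml_text):
    """Return (curve mnemonics, numeric sample rows) from a log fragment."""
    root = ET.fromstring(xml_text)
    log = root.find("log")
    mnemonics = [c.findtext("mnemonic") for c in log.findall("logCurveInfo")]
    rows = [
        [float(v) for v in d.text.split(",")]
        for d in log.find("logData").findall("data")
    ]
    return mnemonics, rows

mnemonics, rows = parse_log(SAMPLE)
```

In a real deployment the parsed samples would be streamed into the project database rather than held in memory.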
PeopleSoft is to acquire JD Edwards in a deal which will create the world’s second largest enterprise resource planning (ERP) software company after SAP. The all-paper deal is valued at approximately $1.7 billion. The combined companies will have around $2.8 billion in annual revenues, 13,000 employees and more than 11,000 customers in 150 countries.
JD Edwards will operate as a wholly owned subsidiary of PeopleSoft. The acquisition expands PeopleSoft’s presence in more than 20 industries including a broad range of services, manufacturing, distribution and asset-intensive industries – particularly in the oil and gas sector. JD Edwards provides verticalized ERP software to the upstream through an association with Petroleum Place. JD Edwards upstream clients include BP Exploration and EnCana Corp.
Petróleos Mexicanos (Pemex) has awarded Fugro unit Jason Geoscience a contract to deploy Jason’s 3D integrated Quantitative (3DiQ) reservoir characterization technology. The two-year contract covers software, support, training and consultancy services. Jason claims a ‘string of successes’ in exploration and development in Mexico using its methodologies. Following an ‘intensive evaluation’ of available products in the market, Pemex identified Jason’s 3DiQ software as strategically important in reducing the uncertainties, risks, costs and cycle time associated with oil and gas discovery and production.
Halliburton unit Landmark Graphics Corp. has signed a three-year reseller agreement with United Devices to provide grid computing to the upstream oil industry. Grid computing creates a distributed computing and workload management environment from an organization’s existing clusters, desktops, laptops, workstations, and server nodes. The grid optimizes oil companies’ resources to accelerate compute-intensive projects. Landmark’s DecisionSpace DMS system and the VIP reservoir simulation suite will leverage the grid technology to simulate multiple reservoir development scenarios.
Landmark president Andy Lane said, “Affordable access to massive compute power is a critical ingredient in the successful prediction of reservoir performance. By combining Landmark applications with grid computing, field development planning and performance prediction may be compressed from months or years, to days or weeks.”
Ed Hubbard, United CEO added, “By leveraging computing assets they already own, oil and gas companies can significantly reduce the costs associated with high-performance computing. Companies that use compute-intensive applications such as seismic analysis, reservoir modeling and horizontal drilling will significantly reduce their research time and see a dramatic ROI increase by deploying grid-enabled applications.”
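The workload-distribution pattern behind the grid can be sketched in a few lines: fan a set of reservoir development scenarios out across available compute resources and gather the results. In the sketch below a local thread pool stands in for the grid’s remote nodes, and the ‘simulation’ is a toy calculation — this is not the United Devices or Landmark API.

```python
# Conceptual sketch of grid-style workload distribution: run several
# reservoir development scenarios concurrently and pick the best outcome.
from concurrent.futures import ThreadPoolExecutor

def simulate(scenario):
    """Stand-in for a compute-intensive reservoir simulation run."""
    # Toy 'recovery' figure in place of a real simulator's output.
    return scenario["name"], scenario["wells"] * scenario["rate"] * 0.8

scenarios = [
    {"name": "base",       "wells": 4, "rate": 1000.0},
    {"name": "infill",     "wells": 6, "rate": 900.0},
    {"name": "waterflood", "wells": 4, "rate": 1400.0},
]

# On a real grid, this map would be fanned out across cluster nodes,
# desktops and servers rather than local worker threads.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(simulate, scenarios))

best = max(results, key=results.get)  # scenario with the highest toy recovery
```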
Buy-side e-procurement grouping Trade Ranger has selected Contivo’s software to automate data transformation between its network of trading partners. Trade Ranger will be able to automate the integration of its members’ systems and data.
Contivo helps enterprises integrate data more effectively than traditional hand-coded methods. Comprising the Contivo Analyst and the Contivo EIM Server, the software automates data transformation for integration across multiple platforms.
Trade Ranger uses Contivo’s solution in conjunction with its webMethods integration platform to connect trading partners. Contivo provides Trade Ranger with scalable, reusable data models, based on standards such as the API Petroleum Industry Data Exchange (PIDX).
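The idea of map-driven (rather than hand-coded) transformation can be illustrated with a toy example: a declarative field map renames one trading partner’s record layout into another’s. The field names below are invented for illustration — real PIDX documents are XML and considerably richer, and this is not Contivo’s product.

```python
# Illustrative sketch of map-driven data transformation between two trading
# partners' record layouts. Field names are hypothetical.
FIELD_MAP = {
    "PartNo":   "item_number",
    "Qty":      "quantity",
    "UnitCost": "unit_price",
}

def transform(record, field_map):
    """Rename a supplier record's fields to the buyer's layout."""
    return {dst: record[src] for src, dst in field_map.items()}

po_line = {"PartNo": "VLV-100", "Qty": 12, "UnitCost": 45.50}
mapped = transform(po_line, FIELD_MAP)
```

The point of the map-driven approach is that adding a trading partner means writing a new map, not new code.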
Trade Ranger CTO Cornelius Morley said, “Contivo is a vital component of Trade Ranger’s integration strategy, providing us with an automated and scalable data integration solution to what was an error-prone and time-consuming task. Using Contivo’s technology, we are able to perform data mapping in a fraction of the time compared to a hand-coding approach. Contivo’s approach to integration is key in building and maintaining the links between our trading partners’ systems.”
Following trials of Wellogix Navigator last year (see Oil ITJ Vol. 7 N°s 6-7), Marathon now plans to extend the deployment of Wellogix’ e-FieldTicket (eFT) software across its North American operations. A three-year contract has been awarded to Accenture and Wellogix for the migration of Marathon’s manual systems to the web.
Accenture will integrate the eFT solution with Marathon’s back-office system, enabling Marathon to electronically reconcile field tickets with the corresponding contracts. Accenture is also developing communications, training and deployment plans and will help Marathon develop a supplier-enrollment strategy.
Marathon’s eProcurement business manager Mike Rawles said, “From our pilot it was evident that eFT could eliminate errors in billing and improve the efficiency of verifying invoices. Ultimately, eFT may even remove the need for an invoice. We have found the technology easy to use, and suppliers involved in the process found it effective for them as well, especially since it can shorten the purchase-to-pay time frame.”
Wellogix CEO Ike Epley added, “Marathon understands the value of the eFT solution for both operating and capital expenditures, and this expanded, enterprise-wide implementation, will provide savings in many areas of Marathon’s purchasing.”
Schlumberger Information Solutions (SIS) and Rose & Associates (R&A) are to collaborate on the provision of risk analysis software for prospect and play analysis. The alliance sets out to link SIS’ Merak value and risk management tools with R&A’s risk estimation toolkit.
R&A partner Gary Citron said, “Better capture of project uncertainty with probabilistic methods minimizes bias and enhances market value through informed decision making.” SIS portfolio manager Doug Elrod added, “This alliance will bring the power of our tools to explorationists and new ventures teams. Our goal is to deliver integrated decision and risk management solutions that span the enterprise, from the engineer’s desktop to the CEO. Sound value and risk management gives our clients insight into their economic drivers and results in performance improvements.”
R&A was founded in 2000 by Pete Rose as an E&P risk analysis consultancy. Rose has developed software for risk analysis and portfolio management along with a seismic amplitude analysis package. R&A focuses on integrated teaching, consulting and software solutions that help client companies ‘implement consistent evaluation and performance tracking procedures’.
The new alliance is said to ‘reinforce Schlumberger’s Living Business Plan’, which sets out to ‘deliver value and risk management at every level of the organization’. There would however appear to be some overlap between R&A’s offering and Merak’s software, which was previously touted as a portfolio management solution – notably following the integration of PDI’s software back in 2001 (see Oil ITJ Vol 6 N° 1).
Unocal has signed with Oilfield Service (OFS) Portal for the exchange of digital content covering many key aspects of e-commerce between the two parties. OFS Portal is the e-business front-end to a group of upstream oil and gas suppliers and service organizations whose members include ABB, Baker Hughes, Halliburton, Kvaerner and Schlumberger.
Unocal e-Solutions Manager Mark Bruno said, “We now have a single source for digital content from OFS Portal members. We can now electronically purchase products and services ranging from drilling equipment to completion. This streamlined process enables us to quickly engage OFS Portal members in our eProcurement initiatives.”
Bill Le Sage, CEO of OFS Portal added, “This agreement provides an avenue for improved productivity and further cost savings in the sourcing, ordering and fulfillment of products and services between Unocal and our members. Bringing together key buyer and supplier organizations in the upstream oil and gas sector is an important step in advancing global e-procurement initiatives.”
Unocal has been at the forefront of online business, particularly through its EnergyOpps acquisition and divestment portal and its use of e-business technology from DigitalOilfield. Unocal is also a member of the buy-side grouping Trade Ranger and has contributed to the API PIDX/ComProServ standards for upstream e-commerce.
New software from Exprodat promises to solve many issues associated with the use and deployment of products from multiple software vendors. Typically, such tools have different environmental requirements and time is wasted installing, configuring and upgrading multiple vendor application suites.
Exprodat’s Common Application Environment (CAE) provides an extensible framework for multi-vendor application installation and configuration, a configurable application launcher and data management. A multi-system, multi-OS application execution harness means that any UNIX application can be run from any system using any data.
CAE’s generic application launching framework isolates vendor off-the-shelf products with a customizable, middle-tier ‘wrapper’ so that individual vendor applications can be upgraded independently. The product is written in a cross-platform language (Tcl/Tk – pronounced ‘tickle’) which gives portability across Solaris, Linux and Irix. Source code is provided with the product license. This ‘open source’ approach is claimed to mitigate fears of vendor lock-in.
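The ‘wrapper’ idea can be sketched in outline: a launcher reads per-application configuration (executable path, environment) and starts each vendor tool in its own isolated environment, so tools can be upgraded independently of one another. The application names, paths and environment variables below are invented for illustration — CAE itself is written in Tcl/Tk, not Python.

```python
# Minimal sketch of a configuration-driven application launcher 'wrapper'.
# All entries in APPS are hypothetical.
import os
import subprocess

APPS = {
    # app name -> (executable, extra environment)
    "seisview": ("/opt/vendorA/2.1/bin/seisview", {"VENDORA_HOME": "/opt/vendorA/2.1"}),
    "logedit":  ("/opt/vendorB/5.0/bin/logedit",  {"VENDORB_LICENSE": "port@server"}),
}

def build_command(app, args=()):
    """Return (argv, env) for the requested application without running it."""
    exe, extra = APPS[app]
    env = dict(os.environ, **extra)  # isolate vendor settings per launch
    return [exe, *args], env

def launch(app, args=()):
    """Start the vendor tool in its own environment."""
    argv, env = build_command(app, args)
    return subprocess.Popen(argv, env=env)

argv, env = build_command("logedit", ("well_42.las",))
```

Upgrading a vendor tool then means editing one configuration entry rather than every user’s shell setup.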
Exprodat CTO Bruce Rodney believes that CAE will simplify user workflows with transparent access to data and applications and reduce application support costs. The CAE, originally developed for Marathon Oil, avoids ‘home-grown’ proprietary solutions and limits single vendor dominance and lock-in. More from email@example.com.