Action this month by US consumer advocacy group, ‘Food & Water Watch’ (FWW) highlights what might prove to be the oil industry’s data ‘Achilles heel,’ with the charge that BP’s ‘Atlantis’ Gulf of Mexico production facility ‘lacks the documentation needed for safe operations and maintenance.’ FWW has asked the regulator, the Department of the Interior’s Minerals Management Service (MMS) to ‘immediately suspend production’ on the facility, one of the largest in the world, pending ‘further investigation of documents critical to the project’s safe operation and maintenance.’
BP Atlantis has been operating for over a year but still, according to FWW, lacks ‘a large percentage of engineer-approved and up-to-date documents for its subsea equipment.’ FWW has also written to Secretary Ken Salazar and the MMS Director Liz Birnbaum calling for ‘a complete investigation [to] prevent a catastrophic failure.’
FWW executive director Wenonah Hauter said, ‘We are concerned that the lack of final, engineer-approved documentation may mean that the platform has serious design problems [that could] increase the risk of operator errors and harm to workers, the environment and local fishing communities.’
FWW staff attorney Zach Corrigan told Oil IT Journal ‘A whistleblower inside BP provided us with a database that we have had validated by an independent engineer. We are working with the whistleblower’s attorney to see how much of this data we can make public.’ FWW’s analysis of the leaked data suggests that Atlantis is lacking approved ‘issued for design,’ ‘issued for construction,’ and ‘as built’ subsea piping and instrumentation diagrams (P&IDs). Such ‘incomplete documentation’ could lead to ‘a catastrophic error’ on the platform, located 240km off the Louisiana coast and vulnerable to hurricanes.
Hauter commented, ‘BP prides itself on being a progressive, green company and it should do everything in its power to ensure its largest facility is fail-safe. Given the risks of operating without this documentation, we urge MMS to launch an immediate investigation.’
BP spokesman Ronnie Chappell told Oil IT Journal, ‘We have reviewed the allegations made by FWW and have found no evidence to substantiate the organization’s claims with respect to Atlantis project documentation. BP has complied with MMS regulations requiring compiling and retaining ‘as built’ drawings for the project, and has provided documentation as requested by the MMS.’
‘Atlantis was designed and built to meet BP and global industry engineering standards, including review and approval of project design and construction procedures by professional engineers. The engineering documents for Atlantis have the appropriate approvals and platform personnel have access to the information they need for the safe operation of the facility. The Atlantis field has been in service since October 2007 and has safely produced more than 50 million barrels of oil. The platform was successfully maintained through the course of two major hurricanes in 2008. Its safety, operations and performance record is excellent.’
Comment—Speakers at document and data conferences across the upstream, from geoscience to construction, have bemoaned the parlous state of their data and the difficulty of getting adequate resources for its management. Some have forecast exposure to regulatory risks and non-compliance ‘issues.’
Many companies are struggling to address the problem of maintaining up-to-date engineering documentation across the complex design, build and commission life cycle of the modern offshore facility. Whatever the outcome, the FWW case will be music to the ears of engineering document management software vendors!
LMKR, a Dubai-headquartered consultancy, has rolled out XpoSim, a computer-based training tool that was originally developed in-house for internal use. The company notes that today’s university output falls short of industry requirements—hence the need for a new training toolset tuned to ‘Gen-Y’ new hires with a predilection for cell phones and gaming.
Simulators are already well established in aviation, healthcare and for preparing workers for the hazards of offshore work. Now simulator use is extending to workflows such as capturing knowledge from a retiring workforce. LMKR has leveraged gaming technology to build an ‘extensive repository’ of upstream workflows and solutions. XpoSim, an exploration simulation game, is used to familiarize trainees with upstream workflows and concepts. Trainees start with an exploration block, acquire seismic, interpret, propose drilling locations and perform economic evaluations. Companies can load their own data. The system also provides metrics of user activity in what is described as a ‘continuous improvement process.’ More from www.lmkr.com.
Imagine for a second that you are about to present a paper on the subject of special relativity. OK, I know that is a bit like suddenly finding yourself on the train wearing your pajamas, but bear with me. You would, I am sure, whether you were an expert or a pajama-wearing tyro, feel some obligation to include a reference or two—perhaps to ‘prior art’ by one A. Einstein?
Citing references is second nature in scientific articles. If you don’t supply references, your work will lack authority and may even be considered plagiarism. It is unlikely that you alone were responsible for the whole information value chain that led up to your oeuvre. We all stand on the shoulders of giants, although as astronomer Carl Sagan once said, ‘If you wish to make an apple pie from scratch, you must first invent the universe,’ but I digress...
Now, for another second, imagine that you are an engineer working for a major oil company. You had a neat, although not exactly rocket sciency idea of how to model ‘uncertainty’ or hook together a few dynamic simulators to solve an everyday problem in oilfield development. You will likely check the literature to see if there are any previous presentations on similar usage. These will be duly cited and off to print or presentation we go.
It is curious though that while such usage acknowledges ‘prior art’ in the form of other oil company work on a related theme, it actually overlooks the majority of the intellectual property that contributed to the solution. This is because, increasingly, what makes the oil and gas production world go round is software. And software is manufactured by vendors. And vendors are ‘commercial’ so we do not like (or may not be allowed) to mention them in case we are accused of the heinous crime of commerciality.
Yea, the ‘scientific’ world, and here I include all of our learned societies and conference organizers, has an abhorrence of ‘commerciality’ that verges on Marxism-Leninism.
At the 2009 EAGE (see our report on page 6) I listened to an oil company presentation which went pretty much along the above lines. Lots of interesting trivia about plugging this simulator into that and observing the results. True to form, none of the software components were actually named. In the Q&A I asked what tools had been used. The presenter duly gave chapter and verse with a caveat that such information went counter to the EAGE’s policy on commerciality.
This got my hackles up. I felt an editorial coming on. I was going to do some lambasting! Fortunately, I met the EAGE president Phil Christie shortly afterwards and was able to do a trial lambast on the spot. Phil was a bit taken aback by the idea that mentioning a product name was deprecated as being commercial. He pointed out that the latest evaluation scheme used for the extended abstracts allows for a moderate degree of commerciality. He also pointed me in the direction of a First Break article* he authored in May 2005.
Phil sent me a copy of the new guidelines. These offer a series of ratings for various aspects of a proposed paper. Thus, a paper offering ‘fresh insights from a case study’ can expect a contribution of 4 points, while a submission that shows ‘no overt or excessive commercialism’ will get a measly point. However, should your contribution include what is deemed to be commercialism ‘intended for marketing purposes rather than technical enlightenment’ you get null points, a fatal error and your paper is out.
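The rating scheme reads almost like pseudocode, so here it is as such—a toy sketch using only the point values quoted above (criteria names and the rest of the scoring grid are illustrative, not the actual EAGE form):

```python
def score_submission(fresh_insights=False, overt_commercialism=False,
                     marketing_commercialism=False):
    """Toy model of the abstract rating scheme described in the text.

    Commercialism 'intended for marketing purposes' is a fatal error:
    the paper is rejected outright, modeled here as returning None.
    """
    if marketing_commercialism:
        return None  # 'null points' -- paper out
    points = 0
    if fresh_insights:
        points += 4  # 'fresh insights from a case study'
    if not overt_commercialism:
        points += 1  # 'no overt or excessive commercialism'
    return points

# A clean case study scores 5; a marketing pitch scores nothing at all.
good_paper = score_submission(fresh_insights=True)
sales_pitch = score_submission(fresh_insights=True, marketing_commercialism=True)
```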
It is clear that authors probably don’t even get down to the bottom of the list. If they do, they are unlikely to go for the single point that a mention of a trademark might bring, when this means they run the risk of seeing their paper rejected for ‘commerciality.’ Vendor co-authors just have to ‘grin and bear it.’ Such is the lot of the humble supplier. For there is another factor at work here, the tendency towards arrogance of those lucky enough to be writing the checks—I know, I used to be an arrogant check writer myself!
The EAGE instructions to authors are a distinct improvement on a blanket ban on ‘commerciality’ but they don’t go far enough. In fact they are not even coming from the right direction! We should be much more concerned about giving credit to the software authors who enable our studies. There should be a systematic obligation to say what tools are used. In fact if we are concerned about the accuracy of modeling outcomes, we should really be citing software version numbers, in case a few years down the road, a bug is found that invalidates the whole study.
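What would a ‘software citation’ look like in practice? A minimal sketch: bundle tool names and version numbers with a study’s results so they can later be checked against known bugs in a specific release (the tool names and versions below are hypothetical):

```python
import json
import platform

def software_provenance(tools):
    """Record tool/version pairs alongside the runtime environment.

    'tools' maps tool name -> version string. The returned record can be
    archived with a study's results as a machine-readable citation.
    """
    return {
        "platform": platform.platform(),
        "python": platform.python_version(),
        "tools": dict(tools),
    }

# Hypothetical example: the components of a coupled-modeling study.
record = software_provenance({"geomodeler": "9.0.2", "flow_simulator": "5.1"})
print(json.dumps(record["tools"], sort_keys=True))
```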
Adding software references is a simple enough idea, but you may like to know why it’s not going to happen. The real problem is that the learned societies get so much of their revenue from ‘commerciality’ themselves that any ‘free’ mention of a product name is seen as a threat to their revenue stream! Societies are not against commerciality at all—they just want to make sure they are getting a piece of the action!
SAS’s Whitepaper ‘Geology, Geophysics and Reservoir Engineering (GGRE) Analytics for the Oil and Gas Industry’ is midway between a manual for SAS’ geostatistical package and an introductory text. Author Keith Holdaway, a geophysicist, now works with SAS’ R&D unit. Holdaway’s starting point is the need for proper integration of disciplines, data fusion, risk reduction and uncertainty management. Here ‘soft computing’ methods allow information from various sources with varying degrees of uncertainty to be integrated and mined for relationships between measurements and reservoir properties.
SAS Geostatistics, a component of SAS Analytics, provides answers to questions on risk and uncertainty, ‘endorses’ reserves information and ensures that exploitation plans are in line with targets. SAS Analytics offers predictive and descriptive modeling, forecasting and spatial analysis that incorporates variograms, kriging and simulation to better understand the reservoir.
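For readers unfamiliar with the jargon, the experimental semivariogram at the heart of kriging is simple to compute. A minimal 1D sketch (this is generic illustration, not SAS code):

```python
import numpy as np

def semivariogram(z, max_lag):
    """Experimental semivariogram of a regularly sampled 1D series:
    gamma(h) = (1 / 2N(h)) * sum of (z[i+h] - z[i])**2 over all pairs
    separated by lag h. Kriging fits a model to these values to weight
    neighboring samples when interpolating."""
    z = np.asarray(z, dtype=float)
    gammas = []
    for h in range(1, max_lag + 1):
        d = z[h:] - z[:-h]
        gammas.append(0.5 * np.mean(d * d))
    return np.array(gammas)

# A random walk has no sill: its semivariogram keeps growing with lag,
# roughly 0.5 * h for unit-variance increments.
rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(10_000))
gamma = semivariogram(walk, max_lag=5)
```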
Industry use cases include Total’s use of nonlinear regression to condition production data and Stingray Geophysical’s ‘GODS’ permanent reservoir monitoring system.
Perhaps Holdaway’s most contentious suggestion is that making access to such complex tools easy turns time-constrained geoscientists into geostatistical experts. Can an ‘intuitive interface’ make up for the lack of a specific skill set and knowledge of the ‘nuances of statistical analysis’? Whatever the answer, SAS Geostatistics offers a point-and-click interface that exposes data mining, text mining and visualization tools, surfacing GGRE data across disparate geoscience disciplines. Our main regret though is that, as a geophysicist, Holdaway did not make more of this opportunity to explain the relationship between the broadband approach of geostatistics and band-limited Fourier analysis. Geostatistics is more than a spatial equivalent of time series analysis. Download the SAS Whitepaper from www.oilit.com/links/0907_3 (registration required).
Repsol-YPF’s Argentina-based YPF Gas unit migrated from ‘proprietary’ UNIX operating systems to Red Hat Enterprise Linux (RHEL). As Adriana Marisa Vazquez, responsible for the UNIX administration group at YPF explained, ‘At YPF, decisions are made only after thorough testing and research, and the IT team had to prove the migration from the old proprietary servers to the new platform would pose no risk to the reliability, availability and performance of the systems. We chose RHEL for a number of reasons, especially its lower costs, simplified management and compatibility with our SAP and Oracle solutions.’
Today, over 80% of YPF Gas’ Oracle databases and 90% of its SAP applications run on RHEL. The new deployment also leverages RHEL’s virtualization capabilities. YPF Gas is now able to virtualize servers for testing and development and can rapidly deploy a virtualized server live into production. YPF Gas also utilizes the Red Hat Network systems management solution. Red Hat’s consultants helped out with the implementation. More from www.customers.redhat.com.
UK-based Flare Solutions has added a new component to its exploration and production catalog. EPCat-Wellfile provides what is described as an ‘intuitive’ mechanism for publishing and retrieving well-related documents, data and knowledge. EPCat-Wellfile provides web-based access to digital and physical content. A flexible storage, search and retrieval mechanism operates across data and documents. EPCat-Wellfile integrates with document management systems including Livelink and Documentum and GIS systems. The system also indexes data stored in commercial subsurface data management systems.
Flare has also announced that parts of its E&P Catalog and E&P Taxonomy & Ontology products are now available on Microsoft Office SharePoint Server (MOSS). MOSS integration complements existing E&P Catalog applications with Microsoft-based search, publishing, graphics and metrics. Flare is to release a suite of discipline solutions for geoscience, petrophysics, reservoir engineering and production operations. More from www.flare-solutions.com.
Sydney, Australia-based FaultSeal has announced ‘FaultRisk,’ a Software as a Service (SaaS) solution for evaluating seal quality. FaultRisk 1.0 calculates the risk of lateral seal for fault bound prospects and defines the distributions of fluid contacts in un-drilled fault blocks and compartments. FaultSeal’s consultants have been using the tool in-house for the past year. Users can test drive FaultRisk by logging onto faultseal.com and creating an account, and ‘downloading the software’ (oops, thought this was SaaS?).
Registered users can also access FaultSeal’s online knowledge base and can simulate faults using an online ‘Allan map’ server. An online tutorial is available using data from Norway’s StatoilHydro-operated Gullfaks field. More from www.faultseal.com.
The Geospatial Information & Technology Association (GITA) has just released its 2009 Geospatial Technology Report providing a snapshot of the current state of the art. The 144 page report emanated from a study of over 500 participating GITA member companies. Most GITA members are classed as utilities—although these include many oils and pipeline companies. The introduction notes that ‘a multitude of technologies have reached the point of viability for enterprise-wide implementation. Interoperable solutions based on open standards, web access to spatial data and open relational databases are fueling a move to a geodata-based enterprise. Mobile applications and wireless/GPS use is rising.’ Commercial-off-the-shelf (COTS) landbase and imagery datasets and lower cost hardware providers have lowered the cost of entry for many users. Intriguingly, Google is now categorized as a ‘leading GIS vendor’ with 5% of full-use seats.
The survey contains a wealth of cost and implementation information that should help organizations budget and benchmark their GIS deployments. GITA’s constituency means that it provides a vision of a far more diverse supplier environment than the ESRI-dominated upstream. Copious information on engineering, data conversion and outsourcing strategy should be of relevance to oil and gas users. The full report costs $449 to non-members. More from www.gita.org.
Seismic MicroTechnology (SMT) announced the new 64-bit version of its Kingdom suite during its Houston User Group meet last month. The new version includes tools for ‘project mobility and management’ and also marks a transition from Microsoft Access to SQL Server Express. The move to 64-bit means that Kingdom now supports projects of up to 4GB. The new version also offers multi-user support (though not concurrent access) and scales to a full-blown SQL Server edition for users interested in features such as scheduled backups.
Internal benchmarking shows respectably fast project copy times in SQL Server Express. The latency and overhead of the new database seem manageable in a typical PC environment. Some customers with larger SMT installs were concerned as to how the new storage format would work with file-based backup systems like Legato, as backups now require synchronizing both flat files and the database. One suggestion was to use Microsoft SQL Server Management Studio Express to manage the SQL project data.
Of course it is not all roses. SMT is now selling Kingdom in two versions, a regular and ‘Advanced’ edition. This means that users who require some of the interesting new functionality will have to pony-up for the high-end edition. But the main thing is that SMT is getting serious about data management—an interesting development given that for competitor Schlumberger, ‘Petrel data management’ is considered by some as a bit of an oxymoron!
The Professional Petroleum Data Management (PPDM) Association held its first Oklahoma user group meet last month. The Chesapeake-hosted gathering attracted over 100 attendees. PPDM CEO Trudy Curtis focused on the ‘doing business’ facets of the 3.8 flagship data model. While not all users will be deploying these modules, these extensions offer support for business associate, financials, work orders and other data management and support functions. Other business related modules now cover produced volume reporting regimes and conversions and reporting hierarchies and roll up/aggregation. Curtis concluded saying ‘PPDM 3.8 now has over 50 subject areas and is robust and complete. You don’t have to use all the modules, just those that add value today.’
E&P CIO Jonathan Smith described Chesapeake’s extensive PPDM implementation. PPDM helps fulfill Chesapeake’s requirement for ‘robust solutions to satisfy ever increasing reporting requirements.’ Early work focused on ‘flash’ reports on production feeding into Excel and Spotfire for analysis. Subsequently, Chesapeake really has drunk the PPDM Kool-Aid and subject matter areas embrace financials, lease operations, rig scheduling and more—a total of 13 business areas are being implemented. Chesapeake is now planning to add Hyperion business intelligence to its PPDM master, to extend to document management and GIS and finally to retire its legacy databases.
PPDM’s Steve Cooper’s presentation on data management introduced PPDM’s ‘Data Value’ metrics that allow subjective data quality tags to be attached to data items. PPDM 3.8 is now also offering support for data governance and master data management to allow foreign data sources such as technical data, operations data and finances to be linked to a PPDM database. MDM is seen as a way of linking data across distributed master data stores. Data Quality is increasingly a focus area for PPDM members. Cooper believes that more focus is needed on targeting quality initiatives on data problems that have the most potential impact on the business through an analysis of ‘data value.’ PPDM supports data value analytics and improvement initiatives with quality metrics and audit trails.
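The ‘data value’ idea—subjective quality tags plus an audit trail attached to data items—can be sketched in a few lines. Names and quality levels below are illustrative, not the PPDM 3.8 schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataItem:
    """A data item carrying a subjective quality tag and an audit trail,
    in the spirit of the 'data value' metrics described above."""
    name: str
    value: object
    quality: str = "unverified"            # e.g. unverified/checked/approved
    audit: list = field(default_factory=list)

    def tag(self, quality, who, note=""):
        """Re-tag the item, appending a timestamped audit record."""
        self.audit.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "who": who, "from": self.quality, "to": quality, "note": note,
        })
        self.quality = quality

# Hypothetical usage: a geologist signs off on a set of well tops.
item = DataItem("well_tops/WELL-1", value=[2103.5, 2187.0])
item.tag("approved", who="geologist", note="checked against logs")
```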
FUSE Information Management’s XStreamline solution offers repeatable, collaborative E&P workflows to capture, integrate and share company knowledge. The solution embeds the open source Lucene search engine.
DPTS’ ‘Exploration Archives’ service provides web access to clients’ digital and hard copy data hosted and managed at DPTS’ UK facility. EA’s original data footprint was seismic survey ancillary data but the service is being extended to other data types. Flagship clients are PetroCanada UK and Venture Production Plc.
Invensys SimSci’s PRO/II 8.3 chemical process simulator has added localized language configurability for its user interface. According to VP Tobias Scheele, ‘Research shows that local language deployment improves engineers’ understanding and productivity.’ The release also enhances links with Pipephase.
Venture Information Management has extended its consultant’s ‘V-Kit’ data quality tool with more quality rules, QC functions for seismic navigation data, LAS file spidering and ‘V-CATS,’ a core and well test database.
Kadme has announced Petrel and Kingdom ‘Data for Whereoil’ tools for enhanced data management and project search across Schlumberger and SMT’s flagship applications.
V2.0 of Interactive Net Mapping’s OilElefant now links to ZebraDat’s VuVault, embedding data management technology developed for the ‘EzDataRoom’ service.
The V8.0 release of VSG’s (formerly Mercury) Open Inventor has added many new features notably a computing framework for multi-threaded apps on multicore CPU and CUDA architectures.
RokDoc 5.4 from Ikon Science sports an improved Petrel plug-in, Bayesian seismic to lithology transforms and a 3D stochastic and spectral inversion module. Users can now run their own algorithms within RokDoc.
Paradigm has announced GeoDepth Tomography, a ‘next-generation’ grid-based model builder for large, complex seismic velocity models.
Tecplot’s RS 2009 reservoir engineering pre-post processor has added an interface to Coates Engineering’s ‘Sensor’ fluid flow modeler. The interface was developed by consultants from The Strickland Group.
HP has announced a six-core AMD Opteron 2400-based workstation, the HP xw9400. A 12 core dual processor option is claimed to offer a 34% power/performance hike over previous quad-core units.
ffA’s SEA 3D Pro 2009 seismic stratigraphic analysis package is available under Linux.
dGB’s OpendTect is now supplied in a three tier license structure: a GNU/GPL base system license, a commercial edition for plug-in developers and an academic license. dGB describes the 3.4 OpendTect release, due next month, as ‘a complete seismic interpretation system that can compete with any commercial system.’
Calgary-based Neotec is to embed Calsep’s PVTsim ‘open structure’ flash module in its PipeFlo, WellFlo and ForGas simulators. The toolset targets multi and single phase pipelines and oil and gas wells and gas field deliverability forecasting.
The 2009 refresh of Entero’s ‘Mosaic’ oil and gas evaluations and reserve management system speeds calculation of large datasets ‘5 to tenfold.’ Other enhancements have been made to sensitivity analysis, user-defined reporting, royalty calculation and decline analysis.
AspenTech’s AspenOne V7 manufacturing and supply chain (MSC) extends the integrated refinery planning and scheduling solution to the supply chain. Token-based licensing and support for virtualization technologies from Microsoft, VMware and Citrix have reduced deployment time. Marathon Oil’s Phil Koenig said, ‘Each organization, refining, supply and marketing, now feels that supply chain decisions are fairly made. MSC’s high-fidelity organizational models allow us to accurately determine incentives, costs and impact of supply chain decisions.’
Kongsberg Oil & Gas Technologies has launched K-Spice, a ‘next generation’ dynamic process simulation tool. K-Spice combines functionality from Kongsberg’s D-SPICE and ASSETT lifecycle simulation solutions.
The New Zealand Crown Minerals department has just amalgamated its regional core stores into a new national storage facility in Featherston, Wairarapa. The new 2,300 square meter National Core Store is equipped with a tailored shelving system suitable for the storage of both petroleum and minerals core as well as ‘fit for purpose’ viewing facilities.
The National Core Store’s indexing system is being upgraded to more tightly integrate it with Crown Minerals’ existing exploration data management system. This system is built on Landmark’s PetroBank Master Data Store, PowerHub and Team WorkSpace data management. Crown Minerals is to index cores at individual sample level. A program of core photography, re-labeling and barcoding is also underway. The upgraded information will be integrated into PetroBank and be made available through the Crown Minerals website. Explorers will be able to search a validated database of samples and gain an impression of sample completeness/quality before making their decision to access samples. Crown Minerals is looking for ‘pragmatic’ solutions to facilitate access including temporarily releasing cores for analysis. More from www.crownminerals.govt.nz/cms/petroleum.
On the other side of the globe, Common Data Access, the UK’s industry-supported data store, now holds some 10,750 wells from the UK Continental Shelf. Speaking on the Schlumberger booth at the Amsterdam EAGE last month, CEO Malcolm Fleming enthused over CDA’s new Schlumberger management and technology. This includes a DecisionPoint front end, a Seabed database and ProSource for data management. The whole CDA database and software is hosted by Schlumberger and members get access to both data and the front end for worldwide use. The new Schlumberger system tripled the number of users over the previous Landmark Petrobank solution. Fleming remarked ‘Users are not interested in sophistication if it comes with complexity.’ The new seismic data store will be online in August 2009. Members benefit from the elimination of physical storage and re-mastering. More from www.cdal.co.uk.
EAGE attendance was low this year, particularly at the plenary sessions, with about 40-60 present for the mature basins session and only 20 at the start of the HSE session. Very poor for what are billed as ‘executive’ plenary sessions. The ‘Mature Basins’ session kicked off with a presentation by Mike Ames, Cirrus Energy. Ames believes that mature basins align well with small companies’ objectives. Cirrus is ‘following’ the creaming curve out to identify low risk prospects with good politics, hydrocarbons proven, infrastructure and markets. The right sort of government helps. In NW EU, governments, in general, have ‘got it.’ In the Netherlands, the ‘small fields’ policy creates incentives and drives ‘faster and cheaper’ developments. But this is not the case in many other countries. If governments want entrepreneurial small caps, they need to relax ‘minimum requirements’ and focus on people and practices rather than ‘five years of operating experience.’ Access to data is also important. Information, ‘data with purpose,’ should be made freely available on the web!
Nick Maden described how Petro-Canada has carved up its asset portfolio into risk categories from T1 (production) to T4 (frontier) plays. Until recently, Petro-Canada was ‘bleeding to death’ on its T4 spend. In 2004, the company changed tack and aimed to find 100 million barrels per year in a ‘sustainable’ way. This was done by leveraging existing geographical focus, building acreage positions and losing the ‘dross.’ The company was to ‘learn from its mistakes and move on.’ No more ‘just another go,’ no more 10% chance prospects. The strategy of moving away from high risk and building the portfolio has been a success with finding costs down to $3/barrel.
David Parkinson presented the results of a WoodMac survey of the period 1999 to 2008 showing that mature basins represent 53% of exploration spend. The big issue here is the question of ‘materiality’: are the reserves enough to satisfy corporate goals? Mature basins offer a decent ROI of 17%, thanks to shorter lead times. They offer a higher chance of success and lower government take. But returns have declined in recent years and spend is suffering with the downturn. It is harder to raise capital—the UK AIM has ‘shut down’ and the delta between US T-Bond and corporate paper is now 5%. Parkinson anticipates that large caps will focus on ‘core and hub-based’ exploration, noting the increasing interest from utilities in exploration. Mature basins have produced value—but the credit crunch and downturn will impact this activity.
Andy Spencer described GDF Suez’ progress from ‘ugly duckling’ to a ‘Cygnus’ thanks to an astute combination of farm-in and acquisition based on ‘deep technical understanding’ and ‘intensive’ reprocessing of seismic and well data.
GDF uses Rose Associates’ SAAM Direct Hydrocarbon Indicator (DHI) risk tool. Risking parameters were established by consensus amongst GDF decision makers—allowing for a ‘disciplined split’ of exploration budget categories. The Cygnus discovery, an earlier missed opportunity, is now one of the top three post 1980 Southern North Sea fields.
Portfolio management is not rocket science, but it is not generally applied. For Spencer, ‘You need to stick to the rules. Technical staff need to treat data like a dog with a bone—then you will find stuff that people have not seen.’
Chris Flavell ascribed Tullow Oil’s Southern North Sea success to an astute acquisition (from BP) and successful application of computer technology. Tullow uses Schlumberger’s Petrel to evaluate regional potential with a variety of isopachs and subcrop maps. This allowed the company to extend a UK SNS play with low materiality into the adjacent Netherlands. Comparison of UK and NL creaming curves suggests that the grass is greener on the NL side of the fence where structural traps are still in play.
Russ Bellis (ExxonMobil) stated that mature basin E&P depends on the effective application of technology. It also depends on our ability to develop subtle plays and produce from smaller targets than standalone development would require. Materiality and commerciality are keys. Mature basins are good for commerciality thanks to the infrastructure. The creaming curve data from West Africa deepwater is ‘incredibly steep.’ Here a robust petroleum system is under attack from geoscience and engineering and work is proceeding at a furious pace. Key technologies include DHI and deepwater drilling knowhow. A range of geoscience technologies is applied from plate tectonics through petroleum systems to pore analysis, a.k.a. ‘from plates to pores.’ Exxon addresses the ‘materiality’ question by building ‘global perspectives’ and testing opportunities in a consistent way. If a play fails, it is re-evaluated when new data or ideas come along.
Glyn Edwards showed how multiple software applications were harnessed to provide Monte Carlo-based simulation of BP Angola’s multi field development. The deepwater development includes three over pressured oil reservoirs tied in to a single FPSO. BP’s ‘Top Down Reservoir Modeling’ (TDRM) application was used to drive Roxar’s RMS geomodeler and Landmark’s Nexus fluid flow simulator in an automated loop. This allowed BP’s asset team to investigate a range of geological parameters and to fine tune the development in the face of constraints such as gas re-injection. 700 multi-reservoir simulations were run in 2 months on a PC cluster.
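Such an automated loop amounts to sampling uncertain geological parameters and farming each realization out to the modeling chain. A highly simplified sketch, with stub functions standing in for the geomodeler and flow simulator (all parameter names, ranges and formulas are illustrative, not BP’s):

```python
import random

def geomodel(params):
    """Stub for the geomodeling step (in the study, Roxar's RMS)."""
    return {"stoiip": 1000.0 * params["net_to_gross"] * params["porosity"]}

def flow_sim(model, params):
    """Stub for the fluid-flow simulation step (in the study, Nexus)."""
    return model["stoiip"] * params["recovery_factor"]

def monte_carlo(n, seed=42):
    """Run n realizations, each with freshly sampled parameters."""
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        params = {
            "net_to_gross":    rng.uniform(0.5, 0.9),
            "porosity":        rng.uniform(0.15, 0.30),
            "recovery_factor": rng.uniform(0.3, 0.5),
        }
        results.append(flow_sim(geomodel(params), params))
    return results

# 700 realizations, echoing the number of runs quoted in the study.
recoveries = monte_carlo(700)
```

In practice each realization would be a cluster job rather than a function call, but the control flow is the same.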
A quite exceptional, if not unbelievable presentation was made by a Scottish startup, ‘Adrock.’ The company claims to have invented a low power infrared laser sounding technique that provides a ‘99.9% correlation with lithology’ at a depth of several kilometers. The device is claimed to generate standing waves to get returns far into the ground. Needless to say this met with some skepticism and not a little outrage at the lack of technical backup for the paper!
A less contentious system was presented by Paul Hatchell, Shell. The autonomous seafloor system for monitoring reservoir deformation seeks to provide a service similar to that provided by satellite InSAR surveys—but in 1,000 meters of water. The study was initiated to identify un-drained compartments on Ormen Lange, and to ‘de-risk’ the timing of 4D seismics in the face of uncertainty as to reservoir compressibility. There is no point doing a repeat survey until there is something to see. Shell turned to underwater acoustic ranging specialist Sonardyne which came up with an autonomous solution capable of running for three years. Acoustic sensors on 3 meter high stands on the sea bed ‘wake up’ every hour and chat to each other with coded pulses. Ormen Lange lies beneath the famous Storegga landslide boulder field, creating line of sight issues. Accuracy of around 1cm/km was obtained and the system even detects ‘weather’ in the form of mini storms at the sea bed. There is clear evidence of centimetric movement down to the center of the field although the current network is considered too small to properly evaluate the subsidence.
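The ranging principle is straightforward: distance between nodes is acoustic travel time times sound speed, so centimetric precision over a kilometer baseline demands microsecond-scale timing. A minimal sketch (the sound speed is a nominal seawater value, an assumption, not a figure from the paper):

```python
SOUND_SPEED = 1500.0  # m/s, nominal seawater value (assumption)

def baseline_length(travel_time_s, sound_speed=SOUND_SPEED):
    """One-way acoustic range between two seabed nodes."""
    return sound_speed * travel_time_s

# A 1 km baseline takes ~0.667 s one way. Resolving a 1 cm change
# requires timing to roughly 0.01 m / 1500 m/s ~ 7 microseconds.
t0 = 1000.0 / SOUND_SPEED            # travel time before deformation
t1 = t0 + 0.01 / SOUND_SPEED         # ~7 us later after 1 cm of movement
d0 = baseline_length(t0)
d1 = baseline_length(t1)
change = d1 - d0                     # ~0.01 m apparent deformation
```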
Those with access to serious compute resources will be interested in Mark Noble's (Ecole des Mines de Paris) presentation (with Philippe Thierry, Intel) on the use of millions of shots to derive a near-surface velocity model for seismic static corrections. These techniques have been known about for years but are extremely compute-intensive. Algorithms need to be aligned with current computer technology but with an eye on the future. The tomographic techniques are being evaluated on a 2,048-core 'fat tree' cluster of 2.8GHz Nehalem EP X5560 processors with 24GB of memory per node. The idea is to be ready for the advent of 'multi-petaflop' machines and to be able to spec out interconnect and memory requirements when the technology is ready for prime time. Noble envisages a 100-million-shot workflow running on a 100,000-plus core system—real soon now!
Consultant Claudio Turrini presented an entertaining multi-media traverse of the Alps from the Po Valley to the Rhine Graben—all built with Google SketchUp. This included Google Earth movies across the Alps and geological story-telling with maps, cross sections and 3D seismics. Petroleum geology was checked out en passant with well log information from a Po Valley reservoir. One minute we were looking at results from Midland Valley's 2D/3D Move in Google Earth, the next we were in Google Earth's flight simulator for a Mont Blanc flyover! Google SketchUp allows any geo-referenced information to be visualized.
Another enthusiastic Google Earth aficionado is TNO/VU's Garry Sonke, whose 'GeoMAX' package offers inexpensive 3D mapping. The free GeoMAX package leverages the open COLLADA 3D format, the Google Earth API and any 3D gridded geology. The results are pretty neat—an arbitrary map cross section is pulled out of the earth to reveal the underlying geology. On the downside, both of these last two Google Earth presenters mentioned that establishing a dialog with Google was just about impossible.
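The Google Earth route to 3D geology rests on KML's Model element, which places a COLLADA (.dae) mesh at a geo-referenced point. A minimal generator along those lines—the function name, arguments and file path are illustrative, not taken from GeoMAX:

```python
from xml.sax.saxutils import escape

def cross_section_kml(name, lon, lat, dae_href, heading=0.0):
    # Build a minimal KML document whose Placemark embeds a
    # COLLADA (.dae) model of a geological cross section at a
    # given longitude/latitude, rotated by 'heading' degrees.
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{escape(name)}</name>
    <Model>
      <Location><longitude>{lon}</longitude><latitude>{lat}</latitude></Location>
      <Orientation><heading>{heading}</heading></Orientation>
      <Link><href>{escape(dae_href)}</href></Link>
    </Model>
  </Placemark>
</kml>"""

doc = cross_section_kml("Alps traverse", 6.86, 45.83,
                        "models/section.dae", heading=30.0)
```

Loading the resulting .kml file in Google Earth would display the mesh in place; the heavy lifting—gridded geology to COLLADA mesh—is what a package like GeoMAX automates.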
This article is an abstract from The Data Room’s Technology Watch from the 2009 EAGE. More information and samples from www.oilit.com/tech.
The fourth annual Semantic Days conference was held earlier this year in Stavanger, Norway, hosted by the Norwegian OLF trade body. Semantic Days is a meeting place for the industrial use of semantic technology, with contributions from oil and gas and other industries, research institutes and universities. The idea behind the semantic approach is a web of 'linked data,' available to people and computers, that spans all oil and gas activity and exposes a 'new class of intelligent services and applications.'
Norway has been at the forefront of oil and gas data standardization—with the ISO 15926 suite. This work is increasingly being leveraged in a semantic context and is underpinning the next generation of integrated operations as Norway's oil and gas activity moves into the high north.
Richard Sagli, project manager of StatoilHydro's Integrated Operations in the High North (IOHN) program, outlined the potential and opportunities for the technology in oil and gas. A lot of investment has been sunk in IO R&D and Norway is a leader in first generation IO implementations. These have provided improved communication between onshore and offshore operations, faster and improved decision cycles and increased production. The second generation of integrated operations will bring service companies more tightly into the loop, through standardized data exchange and reduced communication barriers. The timeline for the IO Gen 2 nirvana is circa 2015, by which time we will see integrated operator/vendor centers, 24/7 ops and fully automated processes.
Sagli offered the following definition of IO: 'the integration of people, process, and technology to make and execute better decisions quicker.' IO is enabled by the use of real-time data, collaborative technologies and multidiscipline workflows. Other facets of IO include 'increased opening hours and availability,' an increased requirement for remote monitoring, diagnostics and assistance without the need to send people offshore, new requirements for IT, comms and collaboration facilities and increased use of standard data formats such as PRODML.
Where do semantics come in? Sagli sees the value-add of semantics applying across the board from better HSE to increased production. The IOHN project includes the development of a 'reliable digital platform for integrated operations.' This will be leveraged from drilling through production to operations and maintenance activities in remote and hazardous conditions, where 'zero footprint' solutions with limited operational personnel will be achieved by 'interoperable XML, RDF and OWL schemas supporting data flows from highly instrumented fields.' IOHN's success will depend on a close link between the pilots and operations—with prototyping and testing on real-world cases and data, 'challenging existing silos.' IOHN sub-projects include workflow automation, detailed production optimization, field performance analysis, Avocet production surveillance, integrated asset modeling and advanced control of wells and reservoirs.
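The 'linked data' idea behind RDF and OWL boils down to triples that machines can pattern-match across data sources. A minimal illustration in plain Python—the well and sensor identifiers are invented, and a real IOHN deployment would use an RDF store and SPARQL rather than list comprehensions:

```python
def match(triples, s=None, p=None, o=None):
    # Return triples matching a (subject, predicate, object)
    # pattern; None acts as a wildcard -- the essence of a
    # SPARQL basic graph pattern.
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Hypothetical linked data about an instrumented well:
triples = [
    ("well:A-1", "rdf:type", "io:ProductionWell"),
    ("well:A-1", "io:hasSensor", "sensor:pt-101"),
    ("sensor:pt-101", "io:measures", "io:Pressure"),
]

# 'What do we know about well A-1?'
facts = match(triples, s="well:A-1")
```

Because every statement uses shared vocabulary terms (rdf:type, io:hasSensor), data published independently by operator and vendor silos can be merged and queried as one graph—the interoperability IOHN is after.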
On the industrial front, presentations from SAP and IBM set out to address the problems posed by the Norwegian oil and gas community. Daniel Oberle, Senior Researcher at SAP’s ‘Campus-based Engineering Center’ in Karlsruhe, Germany outlined some best practices in collaborative ontology engineering that SAP has contributed to the German THESEUS/TEXO project. Theseus involves the creation of an ‘internet of services,’ leveraging a modular approach to ontology construction by domain experts, each responsible for their own piece of the ontology pie. Oberle suggests that IOHN should adopt a similar approach, combining domain-specific ontologies from HSE through reservoir to transport and have vendors base their oil and gas software on the industry ontology. End users will see familiar views of data through windows on P&ID diagrams, reservoir, data sheets and CAD drawings. More on the web of services from www.texo-project.info. More on IBM’s Chemical and Petroleum offering in our Redbook review on page 9.
Patrich Simpkins has joined Allegro’s board. He was previously Senior VP of Duke Energy.
Arthur D. Little has appointed Joseph Coote as new Global Energy & Chemicals Practice Leader.
Oliver Taylor has been appointed Seismic Data Manager with Common Data Access.
Former FERC Director William Hederman has joined Concept Capital’s Washington Research Group as Senior Energy Analyst.
James Smith, Chairman of Shell UK, has been elected president of the Energy Institute.
Energy Solutions has appointed Clare Spottiswoode and Pascal Colombani to its board. Chairman Lance Hirt is stepping down.
FMC Technologies has named Alf Melin as Treasurer.
Emmanuel Kerrand heads-up GE Energy’s new Technology Center in Belfort, France. Cristiano Tortelli has been named sales leader for GE Oil & Gas’ turbomachinery business.
GeoEye has appointed Joseph Greeves as Executive VP and CFO. Greeves hails from Managed Object Solutions.
Richard Tyson has joined Getech as a Senior Geochemist and Peter Kovac as Structural Geologist.
Gray Wireline Service has appointed Mark Harris as president and CEO.
Nance Dicciani has been named to Halliburton’s board of directors.
Mark Stevens has been appointed as VP of Sales (EMEA) for Isilon and Tim Goodwin is VP sales, Japan.
Liz Birnbaum is the new Director of the US Minerals Management Service. Birnbaum was previously director for the Committee on House Administration.
The Well Abandonment Working group of UK Oil and Gas has launched an abandonment performance review and is creating a North Sea well abandonment benchmarking database.
Larry Heidt is chairman and CEO of Nabors Well Services.
UK-based Oilennium has joined forces with Oilfield Training Center (OTC) to develop interactive training programs for Ghana’s emerging oil and gas industry.
Richard Cooper is now CEO of Offshore Hydrocarbon Mapping. Cooper joined OHM when the company acquired Rock Solid Images in 2007.
Optelligent Solutions has joined the ‘PetroComputing IT Professional Outsourced Development and Sales’ (PODS) initiative, sharing a single sales team amongst multiple vendor companies.
Paul Groves is now MD of Petrofac Training Services succeeding Leigh Howarth. Groves was previously with Shell.
Missy Edwards has joined Red Hen Systems as marketing analyst.
At the International Supercomputing Conference last month, an SGI Altix ICE 8200 cluster chez Total was ranked the TOP500 list's top commercial deployment.
Tom Blades has joined Siemens as President and CEO of its oil & gas division.
Peter Nürnberg has joined the Texas Digital Library as Chief Technology Officer.
Nanci Caldwell, former executive VP and chief marketing officer for PeopleSoft, has been appointed to Tibco’s board of directors.
Nigel Pollard has been appointed new Director of Engineering Excellence by John Wood Group. Pollard hails from BP.
Carrie Manion, Senior VP Sales, is leaving WellPoint Systems.
The French Petroleum Institute (IFP) reports that in 2008, its wholly-owned RSI affiliate, specialized in oil and gas process simulation, acquired the assets of the Autodynamics division of Trident Corp. The deal adds an international dimension to RSI’s activity and raises its technical and marketing game.
Norwegian AGR Group ASA reports the outcome of discussions with its lenders following its earlier ‘technical’ breach of covenants. The outcome is an amended loan agreement with reset covenants. AGR is now back in compliance and is to issue 175 MNOK worth of new shares in September 2009. The company also reported the sale of its NETool software to Halliburton.
Oil country B2B solution provider Amalto Technologies has closed an ‘A’ share sale to Succès Europe. Amalto was advised on the deal by Aelios Finance.
AspenTech reports the closure of the US Federal Trade Commission's investigation into its sale of process simulation products to Honeywell. AspenTech and Honeywell have also agreed to settle related litigation. The outcome of the FTC investigation is that, inter alia, AspenTech will continue to provide HySys case data in a standard, portable format.
Computer Sciences Corp. (CSC) has acquired BearingPoint's Brazilian operation (BPB). Financial terms were not disclosed. Many of BPB's 550 employees have SAP and oil & gas specializations.
Fusion Petroleum Technologies and Vanguard Engineering and Oilfield Services of Oman have teamed on the provision of geophysical and geological technology, services and software in The Sultanate of Oman.
Geoservices has acquired Petrospec Technologies, specialists in seismic and real-time pore pressure and rock property analysis. The deal follows on the heels of recent Geoservices acquisitions including Production Wireline Services and Wireline Services and Manufacturing.
Ventyx has acquired nMarket’s software business from The Structure Group.
Troubled Satyam Computer Services has rebranded as ‘Mahindra Satyam’ reflecting its new anchor stakeholder Mahindra Group.
IBM has just published a 150-page Redbook, 'Discovering the business value patterns of an integrated information framework' (IBM IIF), the first of two Redbooks covering IBM's Chemicals and Petroleum (C&P) industry solution. IBM IIF is an environment where a whole lexicon of oil and process industry standards, from OPC-UA through ISO 15926 to PRODML, has been 'integrated' via WebSphere and its Rational Software Architect graphical modeler. The IIF supports a changing IT environment and portfolio—'flexible business requires flexible information technology.' The IIF builds on components and a service-oriented architecture. Target activities include real-time oil and gas exploration and production. The Redbook ploughs through the usual marketing benefits of SOA and enumerates a whole slew of benefits that IBM's infrastructure provides, including asset monitoring, security, dashboards and automation, to name just a few. The thesis is that C&P, like other industries, suffers from poor information access and visibility. Other challenges include productivity, an aging workforce and data and information management.
Differing requirements for application efficiency and IT simplicity have led to the operations/IT 'dilemma' that is hampering progress in the intelligent oilfield. An engineering leadership/IT partnership is needed to overcome it.
The Redbook proposes 'solution patterns' based on the IIF. Here, 'metadata is the key.' All this is driven by the Solution Studio, which adapts the IIF to different business problems. Document management also gets a plug—leveraging IBM's Lotus text and workflow environment. The Redbook suffers from being far too wordy, stringing out rather thin technical content with a vast amount of marketing mumbo-jumbo. The section on standards is fairly informative, as is the general background on upstream IT. Whether anyone interested in either of these topics will rush to read this obscurely titled oeuvre is less clear. Download the Redbook from www.oilit.com/links/0907_2.
StatoilHydro has awarded GE Oil & Gas a $70 million contract for subsea control systems on its Tordis Vigdis Controls Modification (TVCM) revamp. At the heart of GE's offering is the new VetcoGray SemStar5 subsea electronics module (SEM). SemStar5 houses a new, high-reliability computer system that was developed from GE's experience in avionics and space research. Designing a system that will spend decades on the sea bed is a similar problem to computerizing a space probe. Here the watchwords are reliability, redundancy and the avoidance of obsolescence.
GE's Manuel Terranova told Oil IT Journal, 'The SEM houses a computer plus communications to topside—think of it as a node in a subsea LAN. SEMs sit on a production well or manifold, collect data and transmit it to the topside control room. They need to be ultra reliable as they may run for a decade and are very expensive to replace. Obsolescence is very real and problematical. We spotted an opportunity to modernize subsea equipment dominated by proprietary technology that was based on 1970s avionics technology. Our experience of embedded systems in wind turbines led us to deploy an off-the-shelf chipset from Freescale to control the SS5's computer-driven valves. We also selected an open, standards-based operating system—QNX Software Systems' 'Neutrino' real-time system. This gives us protection against obsolescence, better I/O and cost of ownership and an easier upgrade path. We know that we will find engineers in 10 years' time who can program QNX. This is not the case for pods that are coming out of the water now after only 10 years of operations.'
The SS5 is destined to play a role in StatoilHydro's digital oilfield initiatives—although so far this is limited to remote health checks and diagnostics of subsea hardware. As bandwidth between the topside and subsea controls improves, the SS5 is expected to play a more active role in production optimization. GE is currently looking at a control room interface for the SS5, although this is at a 'very early stage.'
The TVCM order for 60 units in one go is the first global SS5 deployment. The contract will be executed under GE's 2007 subsea production systems frame agreement with StatoilHydro. More from www.ge.com/oilandgas.
Writing in the latest issue of Kuwait Oil Co.'s 'Digest' magazine, Abdulaziz Al-Dhuwaihi describes the North Kuwait Integrated Multi-Simulation Asset Model (IMSAM) project as one of the most advanced reservoir simulation exercises ever undertaken. IMSAM sets out to improve production forecasts by integrating full-physics models of all producing reservoirs with surface networks and facilities. Before IMSAM, fields were modeled independently without network constraints.
14 Nexus simulation models were converted from ECLIPSE. Seven major full field models (five in Eclipse and two in VIP) were converted and tested for conformity with the original models. Another team worked on the surface network of pipelines and separators. The final model allowed wells to be switched from one separator to the next as water cut dictated—or from gas lift to ESP operation as required. The exercise provided insights into network constraints on water disposal under different scenarios.
EDS reports on the completion of the harmonization of Total’s UK pension scheme administration. When Total acquired PetroFina and Elf in 1999, it inherited a ‘complex web’ of pension administration systems. EDS’ ExcellerateHRO unit performed the work. The HRO business unit was formed when EDS acquired Towers Perrin’s benefit outsourcing business.
NuStar Energy has hung its IT hat on Microsoft technology including Windows Server, Microsoft System Center and Microsoft Forefront security and access control to manage its 8,491 miles of pipeline, 82 terminals and two asphalt refineries.
Plains All American Pipeline has implemented P2 Energy Solutions' Enterprise Upstream Oil and Gas package to manage its joint venture accounting. The EU JVA module works alongside Plains' Oracle E-Business Suite.
Jinan, China-based Pansoft is to develop a finance and accounting system for Sinopec. The custom system will be based on the SAP Netweaver platform. Pansoft developed a similar system for PetroChina in 2006, now used by some 2,000 PetroChina subsidiaries!
Abu Dhabi-based National Petroleum Construction Company has chosen AVEVA Plant for use on the Integrated Gas Development Project, Habshan Platform Offshore Facilities for ADMA OPCO.
Subsequent to its 2007 deployment of CartoPac’s eponymous software, Chevron Pipe Line has now implemented CartoPac Field Server to collect field data over the web.
Shell has 'standardized' globally on MatrikonOPC's OPC connectivity products. Shell will deploy Matrikon's products to 'solidify' key parts of the real-time layer of its Enterprise Production Architecture. The deal gives Shell unlimited use of Matrikon products and paves the way for a migration to the 'upcoming' OPC UA architecture.
Shell signed a similar global agreement with Industrial Defender for the supply of a global control cyber security monitoring solution. ID’s intrusion detection system will be deployed at all of Shell’s refineries to protect against cyber security threats to the production process.
GDF Suez E&P Norge has selected SPT Group to supply the online flow assurance system and associated flow assurance services on its Gjøa field development.
BP has entered into a three year agreement with Dresser Wayne, now the ‘primary supplier’ of fuel dispensers to BP’s EU and North American retail operations.
A ConocoPhillips/CNOOC joint venture operating the Bohai Bay Phase II project has deployed Emerson Process Management’s PlantWeb digital architecture across six platforms and the Peng Bo floating production storage and off-loading vessel a.k.a. Hai Yang Shi You 117. PlantWeb will provide an automation infrastructure for process control and asset management and will be integrated with shutdown and fire and gas systems. Emerson’s DeltaV digital automation system is on each of the seven phase II facilities and forms the core of the Bohai Bay digital oilfield.
Sunoco Logistics Partners has selected Energy Solutions International’s (ESI) leak detection software for its Western Pipeline system. Some 8,000 miles of Sunoco pipelines are now under ESI leak detection licensing. ESI also reports that service provider OSD Pipelines has acquired a network license for its PipelineStudio design and simulation package.
International Data Services (IDS) has signed with OMV for the provision of a web-based daily reporting service, IDS' WITSML-enabled DrillNet. IDS also signed a similar deal with Addax Petroleum this month.
Invensys’ Operations Management unit is to use PAS’ ‘Integrity Automation Genome Mapping’ software to facilitate automation system deliveries worldwide. The package creates and verifies engineering documents and tracks configuration changes. The solution, to be used internally by Invensys, targets the delivery of automation systems from initial build to site acceptance testing.
Process standards body the Fieldbus Foundation has announced a 'copyright agreement' with Prolist International that allows Prolist to publish Fieldbus parameter names and definitions in its own process control device and system specifications and databases. Prolist's lists of properties in process control engineering provide standardized descriptions of process control and instrumentation equipment. More from www.prolist.org.
Energistics has issued a call for participation for a PRODML special interest group (SIG) to work on the definition of production component types. Members and interested subject matter experts are invited to help formulate the classification. The work is to facilitate a 'Shared Asset Model' service provision concept providing a means to query asset models for installed components. More from www.energistics.org.
Yokogawa Electric Corp. has hung its hat on the recently approved ISA100.11a wireless communication standard for process industries. Yokogawa is developing a new field digital network platform that sets out to solve wireless network compatibility issues and ease integration of wired and wireless technologies. ISA100.11a benefits include high reliability, an IPv6 foundation and compatibility with existing non-wireless protocols including Foundation Fieldbus, Hart, Profibus and Modbus. More from www.yokogawa.com and www.isa.org.
Presentations in the oil and gas industry track at SAP's SAPPHIRE user conference, held earlier this year in Orlando, included Scorpion Offshore, Tesoro and Holly Corp. Pam Thompson described Scorpion Offshore's symbiotic relationship with National Oilwell Varco (NOV), which has enabled Scorpion to leverage its larger partner's supply chain management infrastructure and concentrate on drilling wells. Scorpion has deployed SAP NetWeaver process integration across finance, control, project planning and asset management. The startup partnered with NOV on the development of a supply chain solution for rig procure-to-pay processes, ADP payroll, banking and HR. The alliance allows Scorpion to leverage NOV's expertise, relationships and purchasing power. Scorpion gets outsourced inventory management and procurement and benefits from NOV's 'highly maintained' SAP implementation and SAP support personnel.
Refiner Holly Corp. used to have 'fragmented' compliance solutions. Nelson Burns described how these have been consolidated into SAP's EHS Management module. This has led to standardization of compliance tasks across the company and reduced the risk of environmental compliance tasks 'slipping through the cracks.' Better visibility has improved external reporting. SAP EHS deployment has paved the way for integration of environmental compliance with plant maintenance. The system spans air emissions, waste management, safety and water. Today the SAP EHS database is the centerpiece of Holly's EHS management. The project saw around a dozen consultants working for approximately nine months. A second phase will see extension to Holly's pipeline and distribution company for Title V and DOT compliance.
Eugene Nel outlined the deployment of SAP's BusinessObjects Planning and Consolidation application at Tesoro. The work was carried out by consultants from the Aster Group. Tesoro's SAP software line-up includes SAP ERP, IS-OIL and various NetWeaver components. Aster Group's specialty is the Planning and Consolidation application. The project saw a migration from a spreadsheet-based solution driven by 'Excel experts' to a structured environment combining the flexibility of Excel with a central database supporting budgets and forecasts and offering capital management, 'what-if' scenarios and 10-year projections—all based on an authoritative live data store. Other Aster Group oil and gas clients include Forest Oil, ARC Energy Trust and Newfield. More from www.oilit.com/links/0907_3.
Dave Wallis, OFS-Portal's EAME representative, gave a keynote address to IQPC's 3rd Annual Oil and Gas Procurement Summit in Abu Dhabi last month on the subject of master data management (MDM) in upstream e-business. Wallis described MDM as 'the elephant in the room that no one wants to talk about!' But for companies that want to reap the full benefits of e-commerce, MDM is an essential ingredient of 'data hungry, rich catalogues.' The need for quality master data is increasingly felt as data volumes rise at between 30 and 60% per year.
MDM is the key to successful inventory search and management. This translates into tangible benefits such as being able to place timely orders for out-of-stock items and an efficient supply chain. A balance needs to be found between the effort required to generate and maintain quality master data and the costs and risks. Wallis offered the following quantification of the cost of poor data. For every $1 million sold, $35,000 is lost through supply chain inefficiencies. One in four items in a catalogue has an error that costs between $60 and $80 to fix. A man-year is wasted on duplicates for every 400,000 lines of data.
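Wallis's figures can be turned into a back-of-envelope calculator. A sketch under those stated rates—the function name and the example inputs are our own illustration, and the $70 error-fix cost is simply the midpoint of the quoted $60-80 range:

```python
def mdm_cost_estimate(annual_sales_usd, catalogue_lines,
                      error_fix_cost=70.0):
    # $35K lost per $1M sold through supply chain inefficiencies.
    supply_chain_loss = annual_sales_usd / 1_000_000 * 35_000
    # One in four catalogue items has an error costing ~$70 to fix.
    error_fix_total = catalogue_lines * 0.25 * error_fix_cost
    # One man-year wasted on duplicates per 400,000 lines of data.
    man_years_on_duplicates = catalogue_lines / 400_000
    return supply_chain_loss, error_fix_total, man_years_on_duplicates

# Illustrative company: $10M annual sales, 800,000 catalogue lines.
loss, fixes, man_years = mdm_cost_estimate(10_000_000, 800_000)
```

Even modest inputs produce striking numbers—which is presumably Wallis's point about the elephant in the room.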
Effort needs to be focused on quality issues such as consistency, 'de-duplication,' standardizing formats and using a 'robust' taxonomy. This is where the American Petroleum Institute (API) PIDX standards come into their own (Wallis is VP of EU API PIDX), offering a consistent nomenclature and leveraging the international UNSPSC code set. The crux of 'e-MDM' is to understand which data is relevant, what to use and what to discard. In this context, one size does not fit all. Commodity, specialty and 'single source' items each need a slightly different approach.
Wallis wound up with a brief outline of Kuwait Oil Co.’s successful e-MDM initiative. KOC selected IBM’s Maximo to manage its material master, and commenced implementation this year. Data cleansing was outsourced and a quality data set was input to the new tool. Following a review of industry classification standards, KOC went for—you’ve guessed it, API PIDX. More from firstname.lastname@example.org.
DuPont has awarded Computer Sciences Corp. (CSC) a contract for the implementation of a compliance program to satisfy the new European 'Registration, Evaluation, Authorization and Restriction of Chemicals' (REACH) regulations. Introduced in 2006, REACH calls for the registration and tracking of chemicals that are manufactured, used or transported in Europe. Non-compliance with REACH could incur penalties such as plant shutdowns or a transport ban.
CSC has developed REACH methodology for complex portfolios of projects that involve IT, data management, process and organizational change.
Russ Owen, president of CSC's Americas Commercial Group said, 'This solution is part of a new family of compliance offerings from our Chemical, Energy and Natural Resources vertical. It will leverage our understanding of REACH legislation and will help establish DuPont as an early leader in this effort.' CSC has provided infrastructure, applications and business consulting services to DuPont since 1997, notably via a 10-year, $4.3 billion IT outsourcing deal, extended in 2005 by a further $1.9 billion through to 2014.
Total, in presentations to Energistics and at the 2009 PNEC Data Integration conference, has proposed to create a standards-based modeling environment for its aging North Sea Alwyn oilfield. The intent is to leverage the emerging RESQML reservoir model description protocol, along with WITSML and PRODML, to share real-time temperature, pressure and production data. Discovered in 1974, Alwyn has seen a 'pioneering' initial phase, followed by a couple of decades of 'industrial' exploitation. The field is now distinctly 'brown' and is entering a new 'craftsman' phase, a fight against terminal production decline. Standards-based life-of-field management will ease data QC and allow different vintages of data and interpretations to be merged. The Alwyn area comprises six fields plus infrastructure. Compression introduced in 2006 is expected to add 10 million barrels of production. A huge dataset has accumulated over time, including over 100 reservoir modeling studies, history matching studies and multiple seismic surveys. RESQML aims to handle such evolving datasets and expose updated models to engineers and economists. Total's enthusiasm for data integration stems from an early multi-discipline study of different oil-water contacts that enabled the discovery of a satellite field. RESQML 'could turn such extraordinary events into more routine occurrences.' As part of the initiative, Total is offering an Alwyn dataset to the public domain. More from www.oilit.com/links/0907_1.
Dallas-based ENRG has announced a web-based real-time noise monitoring service for oil and gas drillers. The patent-pending service enables operators to react ‘proactively’ to potential sound issues created by drilling or completion operations, by providing round the clock access to well noise data from personal computers or cell-phones. Noise monitoring uses high end Brüel and Kjær equipment and solar powered wireless telemetry. Data is recorded in WAV format.
ENRG flagship clients XTO Energy and Range Resources are using the new service to monitor their drilling and frac operations in the Barnett and Haynesville shale areas. Both involve urban and high-impact operations and are now subject to stringent city ordinances governing the impact of drilling noise on their communities. ENRG's monitoring technology provides alerts when noise levels exceed compliance levels, allowing operators to change drilling parameters and prepare responses to complaints. Continuous tracking allows operators to fully utilize allowable intermittent noise thresholds and can avoid the compulsory installation of sound proofing. More from www.enrgconsultants.com/nmt.html.
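A compliance alert of the kind ENRG describes can be sketched in a few lines. The hysteresis band below is our own assumption, added to avoid re-alerting on readings that flicker around the limit; ENRG's actual alerting logic is not documented here:

```python
def noise_alerts(readings_db, limit_db, hysteresis_db=2.0):
    # Flag sample indices where noise first crosses above the
    # compliance limit; the alarm re-arms only after the level
    # drops a hysteresis band below the limit.
    alerts, alarmed = [], False
    for i, level in enumerate(readings_db):
        if not alarmed and level > limit_db:
            alerts.append(i)
            alarmed = True
        elif alarmed and level < limit_db - hysteresis_db:
            alarmed = False
    return alerts

# Hourly sound level samples (dB) against a hypothetical 65 dB limit:
alerts = noise_alerts([60, 66, 67, 64, 62, 68], limit_db=65)
```

With intermittent-noise ordinances, knowing exactly when and how often the limit was crossed is what lets an operator adjust drilling parameters before a complaint arrives.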
Invensys has teamed with SPT Group on interoperability of dynamic simulation for subsea applications. A new interface links SPT Group’s ‘OLGA’ well and pipeline multi phase flow simulator with Invensys’ DYNSIM package.
Invensys VP Tobias Scheele explained, 'As part of our dynamic simulation engineering and operator training simulator platform, the new interface includes improved numerical integration, drag-and-drop configuration and the ability to view dynamic profiles of key OLGA parameters from inside DYNSIM.' The solution addresses flow assurance modeling and integrated operations of subsea production systems and topside processing facilities, such as platforms and FPSOs. The system can be used to 'virtually' test and model a plant's process control system to reduce commissioning time. It also supports high-fidelity training of control room operators. Plans are afoot to integrate DYNSIM into SPT Group's EPPM solution for online monitoring of multiphase production and processing facilities. More from www.invensys.com and www.sptgroup.com.
SK Energy, one of the largest oil and gas companies in the Asia-Pacific region, has built a 'Visualized Operations Intelligence System' (VOIS) for its principal refinery and production complex at Ulsan. VOIS leverages Siemens' SIMATIC IT XHQ operations intelligence infrastructure to bring together data from SKE's tanker jetty, processing, storage and offloading facilities. VOIS aggregates operational data across the vast Ulsan complex, replacing SKE's legacy IT, business and process monitoring systems, which were point solutions operating in data silos. VOIS implementation took four months and included localization of some documentation from Korean into English, the plant's working language. The system monitors 70 process units and supports around 1,000 users.
SKE is now looking to install a second plant-wide XHQ system at its Incheon refinery complex and throughout its Seoul operational HQ. Other flagship XHQ users include Exxon and Saudi Aramco, which deploys the toolset across the whole scope of its operations from well head to loading terminal. XHQ forms the backbone of Aramco's high-end control room that featured in a CBS video last year (www.oilit.com/links/0812_3). More from www.oilit.com/links/0907_5.