June 2004


Shell downsizes IT

Shell is to reduce its worldwide IT headcount by up to 2,700 over two years. Some cuts result from globalization and software rationalization, but others reflect offshoring to India and Malaysia.

Shell is undertaking a major reduction in its information technology (IT) head count. Between 1,900 and 2,700 IT positions are to go worldwide by year end 2006.

Zaayman

Shell spokesperson Johan Zaayman told Oil IT Journal, ‘From the outset, allow me to explain the rationale for this decision. Firstly, as Shell businesses standardize and globalize their business processes, it allows us the opportunity to rationalize the number of applications. This in turn gives the opportunity for a standardized infrastructure.’

Offshored

‘In simple terms it means that we will need fewer people to set up and operate our IT environment. Secondly, the IT market has matured over recent years and Malaysia and India now offer excellent alternatives in comparison to high cost locations such as the US, UK and The Netherlands,’ Zaayman continued.

All units

‘All Shell business units will be impacted, thus the affected IT positions will be across the board. We expect to reduce 600 to 800 IT positions in the US, 400 to 550 IT positions in the UK, and 450 to 650 IT positions in The Netherlands by the end of 2006. We further expect from 450 to 700 reductions in all other countries we operate in. These positions include Shell staff and contractors within IT.’

Attrition

‘The reduction in the IT community will come about by a combination of reductions in contractor positions, natural attrition, re-skilling and placement of individuals within other parts of Shell, and severances. Thus, the reduction in positions does not equate to job losses among Shell IT staff.’

Rationalization

‘We are rationalizing our applications portfolio with a view to standardizing and cutting out applications. We are focusing on fewer, more strategic IT projects. We are standardizing and rationalizing our IT infrastructure and leveraging our procurement better. Whilst this will bring impacts across all areas of IT, we expect the biggest immediate impact to be on applications development and support and in infrastructure operations.’

Wipro

A report in the Houston Chronicle, which broke the original story, indicated that Shell has awarded outsourcing contracts worth over $1 billion to IBM and Wipro for India-based IT services. Shell would not comment on the commercial aspects of these deals.


Land meets GIS

LandWorks is to acquire Geodynamic Solutions in a share exchange, bringing added GIS functionality to its land management software.

Houston-based LandWorks Inc. is to acquire oil and gas geographical information systems (GIS) specialists Geodynamic Solutions in a share exchange transaction. LandWorks’ core business, technology solutions for land management, is said to dovetail well with Geodynamic’s mapping solutions including the Petrolynx.com ASP mapping and production data analysis portal and the Spatial Search Engine.

Bramwell

LandWorks president Jerry Bramwell said, ‘This acquisition extends our client base to petroleum explorationists and engineers and will expand our consulting services into a diverse group of oil companies.’

Barrell

Kirk Barrell, Geodynamic president added, ‘LandWorks can now offer an expanded product suite and key new spatial search technology. GIS usage is growing rapidly in numerous industries. This acquisition will enable us to offer our products and services beyond the petroleum industry through LandWorks’ diverse client base.’ Geodynamic Solutions will operate as a wholly owned subsidiary of LandWorks.


Downsizing, offshoring and IT projects

Oil IT Journal editor Neil McNaughton thinks that Shell’s downsizing reflects a natural tension between buyers and sellers of IT services. While downsizing reflects the need for less IT, early experience of offshoring upstream software development seems to have failed to generate significant savings.

Working from home—actually from the garage, in the time-honored tradition of the IT start-up—I observe, with quiet satisfaction, the folks heading off to work on the public transport system. Not having to ‘go to work’ saves me two hours of commuting time—and an untold amount of hassle. Of course, working from home brings other benefits, like not having to shave, get a haircut or wear a suit (rest assured, I do still wash occasionally).

Amstrad

Reflecting on haircuts and going downtown to work reminded me of one of the most successful IT developments I have ever seen. This was probably about twenty years ago, just after the IBM PC had established itself as the ‘standard,’ but before it had driven off all the competition. My downtown hairdresser used to have an Amstrad something or other and I would josh him about this, asking when he was going to get a PC with one of these newfangled hard disk thingummies.

Bespoke

He really wasn’t very interested. Like many folks with a ‘life,’ he didn’t want his business dominated by computers. The bespoke tool that he used (and which probably cost ten times what he’d paid for the computer) had a couple of compelling virtues: it did the job it was designed for and it was quite easy to use. Moreover, because his business did not change much from year to year, he really didn’t expect new versions, upgrades, bells, whistles and so on.

IT nirvana

Functionally, this was IT nirvana. Let’s deconstruct the process a bit. The hairdresser has a need for some basic record keeping, check printing and saving data on a floppy for the monthly visit from the accountant. The IT folks come in, write the software, it works and they get paid and then go away. Simple isn’t it? Except that to anyone in IT there are some glaring holes in this story: where is the maintenance contract? Where are the upgrades? Who fixes the bugs? In other words, where are our jobs?

Labor market

In terms of the labor market, this development project can be analyzed thus. Before project: zero jobs. Duration of project: one or more jobs. After project: zero jobs. This is what I call the plumber’s business model. You call them, they come along and fix what’s wrong (I know it is ridiculous to imagine that a plumber will even answer the phone—but that is another issue). Then they go away.

Clipper?

Of course my ex-hairdresser no longer has an Amstrad. By the time the machine fell apart physically, Amstrad was no longer building them. He was forced to get in line and now has a PC (although a Clipper-based machine would have been more appropriate), an off-the-shelf Hairdresser’s Business Management Suite, along with annual maintenance, upgrades, new bugs and old bug fixes. Why? Because this is the way of the IT world. IT prides itself on ‘rationalizing’ and ‘downsizing’ other people’s jobs. But when it comes to downsizing its own activity, the IT world actually works in reverse, generating activity for itself, oftentimes with little benefit to the user.

Revolution

IT’s early days (say the past 20 years or so) have seen a successful fight against nature to preserve or create new jobs. While good IT design should expose increasingly simple interfaces to users (and I am really talking about programmers here), the ‘success’ of IT in job preservation has been to offer staggering complexity and changing paradigms—ensuring an ongoing cultural revolution, and employment for all. Instead of developing a tool and then quietly retiring, developers manage to milk clients for 20% maintenance fees, fixing their own bugs and offering meaningless enhancements such as re-engineered tools fitting the latest IT paradigm, even though the end user couldn’t give a damn whether the thing runs on Linux or Windows XP, is object oriented, ‘engineered for re-use’ (hilarious, that one), or standards compliant.

Downsizing

The downsizing of Shell’s IT department could be interpreted as reflecting a (slow) structural change from an industry where one program begets another, where a few months of development leads to years of maintenance. Standardized applications and infrastructure are part of the solution. But what of the other side of the coin—the offshoring exercise?

Offshoring

If outsourcing is in part a way of redressing the balance and ensuring that IT projects don’t turn into jobs for life, offshoring is a quite different dynamic. One major E&P software vendor told Oil IT Journal of its experience of an offshoring exercise which has been running for the last four years and involves the wholesale offshoring of a major upstream software tool. It has been a mixed success, in part because the offshoring partner, as befits an IT professional, is extremely compliance-oriented.

No quick fix

What’s wrong with that? Nothing, except that it is harder to get your ideas translated into code if you have to fill in reams of specifications rather than go down the hall and chat to the developer. It gets hard to implement a ‘quick fix’. Offshore software houses may also need an education in the strange ways of the upstream industry. All of which means that the hoped-for cost savings have not materialized and the offshoring exercise has generated mixed feelings. But what has changed is the ease with which a project can be abandoned without having an army of developers to re-deploy. Which is, in a way, a return to the plumber’s business model—but what a circuitous route!


Oil ITJ Interview—Murray Roth, Landmark

Oil IT Journal spoke to Murray Roth, Executive VP Marketing and Systems with Landmark, at the 2004 EAGE in Paris. Roth tells of the extension of OpenWorks to engineering, new Grid computing-based HPC solutions and automated velocity analysis—the ‘biggest thing since Magic Earth.’

Oil ITJ—The announcements of the Prospect Generation Engine and Field Development Engine sound great. What are these and how much is re-packaging of existing solutions?

Roth—OK, let’s cut to the chase! The big change is the expansion of OpenWorks’ scope from G&G to engineering with the OpenWorks Engineering Data Model (EDM). This is leveraged in products like Asset View and Well Planning—making new workflows possible. These are underpinned with synchronization and data management tools for working with both Unix/Linux-based OpenWorks databases and EDM on Windows. This has involved a new focus on IT/hardware components and we are working with Sun, SGI, Intel/IBM to offer complete ‘shrink-wrapped’ IT systems.

Oil ITJ—Shrink-wrapped?

Roth—We have been working with IBM and Intel on IT ‘templates’—addressing workflow bottlenecks such as seismic processing. Systems are tuned for load balancing and to assure data management across the workflow. These can include Myrianet switches from NetApp and United Devices’ Grid computing solutions. The Grid is very applicable to reservoir modeling—you don’t need big NUMA machines any more. The flexible IT model, as deployed in Abu Dhabi, is part of a global network of Landmark Asset Management Centers leveraging our on-demand agreement with IBM.

Oil ITJ—Where else are these available?

Roth—These need to be locally available, we don’t want to separate data from the CPU. IBM has centers in Abu Dhabi, Poughkeepsie and France.

Oil ITJ—These are Itanium-based?

Roth—No they use Xeons.

Oil ITJ—What’s the ‘best thing’ chez Landmark since Magic Earth?

Roth—Auto Imager is a good candidate for that categorization. This is new automatic seismic processing technology, automating velocity analysis. We partnered with Calgary-based Data Modeling Inc. to develop these image-driven techniques. One 80 million trace land 3D survey, which would have taken six weeks in a traditional workflow, took two days with no human intervention. The technique is also great for pressure prediction and AVO studies. The integration of processing with interpretation has now been accepted by our larger clients—particularly ProMagic’s use of GeoProbe in processing.

Oil ITJ—Is ProMagic a killer app?

Roth—ProMagic sells well to GeoProbe customers—big oils and NOCs. We have also seen interest in Well Seismic Fusion reflecting the changing role of AVO analysis, accessing pre-stack seismic data during the interpretation process. We are working closely with Statoil in this area.

Oil ITJ—What is Landmark doing in knowledge management these days?

Roth—We have had some success with our Team Workspace portal but we try to avoid the ‘portal for portal’s sake’ mentality. Clients get better results starting with a data management focus. Here, Open Explorer has been replaced by Power Explorer, along with WOW for QC which now offers thumbnail displays of data. But really, data management will never be a ‘shrink-wrap’ application.

Oil ITJ—Still using Java?

Roth—Yes, in the context of a heterogeneous platform. Java gives platform independence that matches the state of the industry today. DecisionSpace runs on both Windows and Linux. We use .NET in isolated engineering applications but get more flexibility and less risk with Linux.

Oil ITJ—What’s Dave Hale up to; how did the atomic mesh work pan out?

Roth—He is presenting some interesting work on atomic mesh-derived ‘tanks and tubes’ which are used to perform a simple simulation of subsets of the reservoir—to high-grade modeling options.

Oil ITJ—Will you be productizing atomic mesh?

Roth—Some European customers are looking to form a consortium around this. The tanks and tubes work may prove a quick win for the technology in the seismic-to-simulation workflow. The technology is leveraged in the new DecisionSpace Nexus*, a next-generation unstructured simulator. This uses a tetrahedral grid and models the reservoir along with surface facilities. Nexus was a joint development with BP.

* More on Nexus in next month’s Oil IT Journal.


Landmark opens Abu Dhabi Center

Landmark R&D Fellow John Killough is to head up the new Middle East Asset Management Center.

Landmark Graphics Corp. has just opened a new Asset Management Center in Abu Dhabi. The facility has been designated as a global center of excellence for applied research and training and is to spearhead production optimization initiatives in the Middle East.

Bernard

Landmark’s new president Peter Bernard said, ‘The center will let Landmark apply new technologies and consulting expertise to the challenge of enhancing recovery from Middle East reservoirs.’

KM

The center will support knowledge management and knowledge transfer to oil companies throughout the region. It will enable regional oil companies to test and evaluate Landmark technologies for reservoir management and decision management, and will help define requirements for future technology development.

Lewis

Landmark regional VP Jonathan Lewis added, ‘The center will speed reservoir simulation projects with parallel and grid computing and will improve reservoir understanding through multi-scenario modeling and analysis.’

Killough

Heading up the new center is Landmark R&D guru John Killough. On show at the center will be new modeling software including the Nexus simulator, developed in association with BP, and the SeisSpace seismic processing suite.


Badleys and Midland Valley forge data link

The two UK vendors are to team on a data link between their structural modeling software tools.

UK-based software houses Midland Valley Exploration (MVE) and Badley Geoscience Ltd. are to develop a link which will let users of MVE’s 3DMove and Badley’s TrapTester exchange data between their geology models.

Overlap

Both companies report a ‘strong overlap’ in their client base and believe that, together, these tools provide a ‘complete’ toolkit addressing structural analysis, fracture and stress prediction, and geodynamic basin modeling.

Gibbs

MVE MD Alan Gibbs said, ‘It makes sense that we should work alongside each other to the benefit of our customers and ourselves. Collaboration provides our clients with a unique experience and skill resource.’

Roberts

Alan Roberts, Badleys MD added, ‘We recognized MVE to be the leader in fault-restoration and recommend their products and services to our own customers. Formal cooperation between our companies is appropriate as the technical fit is so strong.’

New workflows

The data link between TrapTester and 3DMove was conceived to support new user-devised workflows leveraging both products. It will allow TrapTester models to be read into 3DMove for fault restoration. Likewise, restored models in 3DMove can be captured in TrapTester for seal and geometry analysis.


New data processing center for Brazil

PGS’ new seismic processing center is built around Linux cluster technology from IBM.

PGS is opening a new seismic data processing center in Rio de Janeiro. The center’s compute engine is an IBM eServer cluster of 512 dual-Xeon xSeries 335 nodes, provided by IBM Brazil. The system, with a peak performance of 6 teraflops, is claimed to be ‘one of the largest supercomputing clusters running the Linux operating system in South America’.

Wilkinson

PGS’ Mark Wilkinson said, ‘The Rio Center is a significant addition to our service portfolio for Brazil and will allow domestic and foreign operators to tap into the expertise of PGS geoscientists to maximize their producing assets and fully evaluate exploration potential. In-country service provision also allows processing projects to be managed locally and satisfies local content commitments.’

HoloSeis

PGS’ ‘HoloSeis,’ a virtual reality-based seismic display system used for survey planning, interpretation and well planning, will be deployed at the center, which will be linked to R&D and support resources at the Houston Hub. PGS claims that the center is the only in-country solution for large-scale, compute-intensive applications such as pre-stack time and depth migration.


Landmark and IBM ASP Grid processing

Landmark and IBM are to offer ‘on-demand’ computing for seismic processors.

Landmark Graphics Corp. and IBM are to sell ‘on demand’ access to computing technology for seismic processing services. Landmark’s seismic processing customers can now access IBM’s network of ‘on demand computing’ centers offering scalable access to information storage and computing resources.

Bernard

Landmark president Peter Bernard said, ‘This offering satisfies customers’ needs by supplying technology as required, without large, up front capital investments. Customers already benefit from our managed services of data and application software hosting. With IBM ‘On Demand’ technology, seismic processing customers will now be able to access and pay for computing power as needed.’

Subscription model

A subscription model caters for ‘overflow’ demand during peak processing workloads.


Updated software for Shell’s portfolio

Ikon Science is a new supplier to Shell’s global portfolio and Fugro-Jason’s Workbench is updated.

Shell has selected software from UK-based Ikon Science for integration into its Software Suite Portfolio (SSP) for E&P. RokDoc V 4.0 combines well logs and seismic information to make predictive models.

Jason

Another supplier to Shell’s SSP, Fugro-Jason, has just announced version 7.0 of its Jason Geoscience Workbench (JGW). The new release promises improved, global simultaneous AVO inversion and a more complete analysis of pre-stack seismic data. Jason is also introducing new consultancy services including a Markov-Chain Monte-Carlo (MCMC) probabilistic inversion engine and a new patented polar anisotropy-compensated simultaneous AVO inversion technique.


Recall 5.0 rolls-out at user group meet

The 2004 Recall user group demonstrated renewed support for the popular log management tool.

As Baker Atlas director Bill Befeld said in his introduction, ‘We’re back!’ Recall is back with a new version (V5.0), a port to Windows and many new features including an ODBC interface. This means that Recall can now be integrated into SQL-based systems and addressed as if it were a relational database.

New in V 5.0

Recall 5.0 is a native port to Microsoft Windows and runs ‘faster than some Unix systems,’ while offering integration with other Microsoft products. Recall 5.0 includes Logscape interactive graphics, support for data dictionaries and a web interface. But the big change is the addition of the ODBC driver. The ODBC interface means that Microsoft Access can become a front-end to Recall, leveraging Microsoft’s interface, Wizards and familiarity. ODBC exposes the tool to SQL queries; for example, you can query for ‘all instances’ of an object such as a field, well or depth zone. There is also an ODBC write function for insert and update, but no delete.
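
As an illustration of what the new interface makes possible, here is a minimal sketch of an ODBC session against Recall from Python. The data source name and the table and column names are assumptions made for the example; the actual schema exposed by the Baker Atlas driver is not documented here.

```python
# Minimal sketch of querying Recall through its ODBC driver.
# Assumes a configured ODBC data source named 'Recall' and illustrative
# table/column names (DEPTH_ZONE, WELL_NAME...) -- the real schema exposed
# by the Baker Atlas driver may differ.
import pyodbc

conn = pyodbc.connect("DSN=Recall;UID=guest;PWD=guest")
cursor = conn.cursor()

# 'All instances' style query: list every well with a given depth zone
cursor.execute(
    "SELECT WELL_NAME, TOP_DEPTH, BASE_DEPTH "
    "FROM DEPTH_ZONE WHERE ZONE_NAME = ?",
    ("Brent",),
)
for well_name, top, base in cursor.fetchall():
    print(f"{well_name}: {top}-{base} m")

# Insert and update are supported through ODBC; delete is not.
cursor.execute(
    "UPDATE DEPTH_ZONE SET BASE_DEPTH = ? WHERE WELL_NAME = ? AND ZONE_NAME = ?",
    (3120.5, "21/3-A1", "Brent"),
)
conn.commit()
conn.close()
```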

Logscape

Logscape, Recall’s data viewer and graphical spreadsheet, now boasts a ‘huge’ x-y plot library including Pickett plots, templates and configurable toolbars. Dip tracks, tadpoles, image data and waveforms are now included. A petrophysical workflow can be initiated by dragging and dropping a log onto Logscape.

PetroCanada

Following a major acquisition, PetroCanada’s UK unit’s data needed rationalization. With help from Venture Information Management, PetroCanada deployed a combination of Finder and two Recall databases, one for data staging and one for QC’d data. Legacy data lacked standard formats and sometimes, UWIs. PetroCanada went back to original well reports to see what was run, and assembled a set of useful curves for key wells which were digitized as required. The project involved 7,000 wells and 40,000 logs.

RecallML

Despite the move to Microsoft, Recall continues its standards-based work with a new XML schema, RecallML, defining operations that can be performed on a Recall database such as search, I/O and browse. RecallML supports read-only access and can run from a browser off a snapshot of Recall on a CD. A WITSML loader for Recall is also being developed.

R&D

On the R&D front, a geomechanics extension has been undertaken for Hydro, Agip and BG. The objective is to avoid well collapse by managing the mud pressure window, using Earth Imager waveform and dip data visualized through Logscape. The analysis can also be used to study frac jobs. ENI reported a ‘near real time’ use of Logscape to drill highly deviated wells in unstable formations. TVD management has been added to Logscape and near real time borehole stability analysis keeps mud weights in a safe window. ENI is anticipating a WITSML server feeding data into Petrel via OpenSpirit.

Reference

According to Baker Atlas, Recall is the well log data management system of reference for most major oil companies including ExxonMobil, Shell and BP. Recall is also the well log data management engine inside Landmark’s PetroBank. Anadarko and ENI also presented their Recall deployments which also involved staging databases.

This article has been abstracted from a six page report produced as part of The Data Room’s Technology Watch service. For information, email tw@oilit.com.


UK-DEAL firms Catalogue strategy

DEAL members try to figure out how best to deploy UK data catalogues for distributed vendor data.

At a recent meeting, DEAL members investigated data cataloguing strategies for the UK’s national data repository. As with earlier e-business initiatives, two strategies have emerged. Either the hub (DEAL) does the cataloguing (a lot of work, but likely to produce a good catalogue), or the vendors catalogue their own data and DEAL develops a mechanism to ‘expose’ catalogue information to researchers.

CCLRC

A possible DEAL analog, the CCLRC, provides a portal to UK scientific data. Access to disparate legacy data sources at the different R&D facilities is enabled by XML ‘wrappers’ around local metadata catalogues. The CCLRC also exposes its data through the SDSC Storage Resource Broker (SRB), developed in the 1990s at the San Diego Supercomputer Center. Some 200TB of data are shared through the SRB among 30 participating US universities.
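
By way of illustration only, the sketch below shows the ‘wrapper’ idea: a site-specific catalogue lookup is translated into a common XML format that a portal such as DEAL or the CCLRC could harvest. The field names and the local query function are invented for the example and do not reflect the CCLRC or SRB interfaces.

```python
# Sketch of an XML 'wrapper' that exposes a local metadata catalogue in a
# common format, so a hub can federate searches across vendor repositories.
# Field names and the local lookup are hypothetical.
import xml.etree.ElementTree as ET

def query_local_catalogue(keyword):
    # Stand-in for a site-specific catalogue lookup (database, flat file...)
    return [
        {"id": "DS-001", "title": "North Sea 3D survey", "owner": "Vendor A"},
        {"id": "DS-002", "title": "Well log archive", "owner": "Vendor B"},
    ]

def wrap_as_xml(records):
    """Translate local records into a shared catalogue schema."""
    root = ET.Element("catalogue")
    for rec in records:
        item = ET.SubElement(root, "dataset", id=rec["id"])
        ET.SubElement(item, "title").text = rec["title"]
        ET.SubElement(item, "owner").text = rec["owner"]
    return ET.tostring(root, encoding="unicode")

print(wrap_as_xml(query_local_catalogue("North Sea")))
```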

Oil Cos.

DEAL’s philosophy is to let data owners expose their data through their websites. But these offer DEAL users a wide variety of user ‘experiences’ which need standardizing. Portal technology will be available in DEAL ‘real soon now.’


PKI One digital signatures for UK DTI

The DTI has announced that its Digital Signature initiative will be ready to roll for 2005.

The DTI and LOGIC have rolled out the UK’s Oil and Gas Trust Scheme for digital certificates. A number of companies have been approved to issue digital certificates to the oil and gas industry following a DTI-led study of available solutions.

Digital Submissions

During 2005, the DTI will start to require that submissions to government via the UK Oil Portal be digitally signed. The Trust Scheme will be managed by a small team that will initially include the DTI.

PKI One

One participant is Aberdeen-based PKI One, which provides ‘complete, open standard, low cost’ public key solutions ‘tailored to the energy sector.’


PNEC Data Integration 2004, Houston

This was a well-attended PNEC (250 pre-registered), with a high proportion of company presentations reporting on real-world data management achievements. Catalogues remain popular, as witnessed by papers from Shell and Halliburton, and the W3C Semantic Web initiative is emerging as a potential solution to the taxonomy problem. Russian oil major Yukos presented a refreshing look at the merits of building vs. buying software. PNEC now welcomes both PPDM and POSC—resulting in a timid but significant joint POSC/PPDM presentation. Panel members agreed that data management was under-recognized and under-funded. ChevronTexaco, ExxonMobil and Shell all plugged our ‘star of the show’—Innerlogix’ Datalogix data clean-up tool. On the technology front, ExxonMobil presented a new XML standard for fluid reporting, ‘FluidReportML’.

Pioneer

Carol Tessier described how Pioneer now has 100 communities and around 1,000 portal users. Even superficial ‘lipstick on the pig’ solutions can be useful, as was moving Pioneer’s Artesia ERP ‘green screens’ to the browser. Pioneer’s portal leverages various components including Schlumberger’s Decision Point (ArcIMS-like GIS), Unify NXJ (workflow) and Citrix MetaFrame for Unix access. OpenSpirit is being integrated and will become a ‘critical piece of future developments’. Content management for Sarbanes-Oxley reporting is ‘just exploding for us’.

ConocoPhillips

According to Pat Meroney, ConocoPhillips (CP) has been ‘immersed in the merger’ for the last 18 months. Meroney’s group, which had to contend with shared servers and Oracle databases ‘everywhere,’ was tasked with improving data delivery and developing a data management strategy. CP has a home-grown data integration layer between databases and front ends such as the web browser, GIS and Excel. The solution involves a ‘metadata server’ and web services, based on a spatial data catalog. Business and spatial data are stored in separate databases with GIS browsing through ArcIMS.

ChevronTexaco

Guy Moore related another post-merger data management tale. The starting point was ‘a mess’ with Chevron, an IESX user, and Texaco on OpenWorks. The Gulf of Mexico (GOM) unit supported 120 earth scientists, over 1,000 projects and 16TB of seismic interpretation data. ChevronTexaco (CT) set up a change management unit of IT support staff and subject matter experts, located at the Landmark ‘super site’ in Houston. Documenting conversion responsibilities and decisions helped avoid ‘finger pointing’ and ensured projects ran smoothly. CT went from 311 to 24 OpenWorks projects; and data volumes declined to 5TB—a testament to interpreters’ ability to ‘chuck stuff away’ (into the archive). This entailed considerable savings—for every 10% of CT’s data volumes archived, a $250,000 saving in data server costs was achieved.

Semantic Web

Oil IT Journal editor Neil McNaughton first encountered the ‘semantic web’ when developing an RSS news feed for Oil IT Journal. One quick win for RSS was the ability to read the feed from a Java-based smart phone. The techniques which form the semantic web may impact upstream taxonomy development and sharing. These include widespread use of XML namespaces to expose local taxonomies and the use of the Resource Description Framework (RDF), a simple triple-based data modeling construct which allows metadata to be embedded in complex XML documents. RDF metadata is embedded in Adobe PDF where it leverages the Dublin Core metadata standard for bibliographic information. McNaughton’s paper is available on the oilit.com website.
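
For readers unfamiliar with RDF, the following is a minimal sketch of the triple idea using the Dublin Core vocabulary and the Python rdflib package. The resource URI and the property values are illustrative.

```python
# Sketch of RDF 'triples' using Dublin Core terms, as might describe a
# document in a catalogue or a metadata feed. The URI and values are
# illustrative only.
from rdflib import Graph, URIRef, Literal
from rdflib.namespace import DC

g = Graph()
doc = URIRef("http://www.oilit.com/journal/2004/06")

# Each statement is a (subject, predicate, object) triple
g.add((doc, DC.title, Literal("Oil IT Journal, June 2004")))
g.add((doc, DC.creator, Literal("The Data Room")))
g.add((doc, DC.subject, Literal("upstream data management")))

# Serialize as RDF/XML, the form typically embedded in PDF metadata
print(g.serialize(format="xml"))
```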

XML standards

The joint presentation by Trudy Curtis (PPDM) and Alan Doniger (POSC) was most significant for the simple fact that it took place. Doniger and Curtis promised agreement on schema design principles—especially on units of measure—and on ‘profileable’ schemas. In the context of catalogues and taxonomies, the Petroleum Industry Data Dictionary (PIDD) ‘is coming back to life’.

Strategy

Mike Underwood (ChevronTexaco) advocates a proactive approach to data management, developed by adapting best practices from ‘heritage’ Chevron and Texaco. Some 500 OpenWorks project databases were consolidated using Innerlogix’ Datalogix data QC and cleanup application. Datalogix usage was extended to project management with the adoption of proactive processes for project database management. ChevronTexaco is well pleased with the Innerlogix tools—Underwood said ‘Datalogix is the best thing our data managers have seen in the last five years.’

Shell

Shell’s data managers now deploy ‘pick lists’ of standard attribute names and quality processes according to John Kievit. Shell’s workflows pipe data from various databases and public data sources into interpretation systems (OpenWorks) and data browsers (PowerExplorer). Shell used to have data ‘hoarders,’ who claimed to be ‘too busy to archive’. This inevitably led to wasted time looking for inaccessible data. Today, Shell has a ‘golden bucket’ of QC’d, screened and compliant data. Shell established standard names for well logs and curves. ‘Amended’ data vendor contracts now stipulate the format and delivery mechanism. Data is loaded via an ‘advanced data transfer’ system using data ‘blending rules’ to establish which records to keep. Data history capture is automated using Innerlogix’ DataLogix QC workflow, a ‘health check’ for Shell’s corporate data. The data workflow is a resource intensive process, with thousands of vendor transactions per month to be QC’d before loading. Shell has now ‘stopped the bleeding’, reduced log deliverables and made it easier to use data.
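
Shell’s actual blending rules were not detailed in the presentation. The hypothetical sketch below merely illustrates the concept: when the same log curve arrives from several sources, a source precedence list and a completeness check decide which record is kept.

```python
# Hypothetical illustration of a data 'blending rule': when duplicate log
# curve records arrive from several vendors, keep the one from the most
# trusted source, falling back to the most complete record. Source names,
# fields and the precedence order are invented for the example.
SOURCE_PRECEDENCE = ["internal_qc", "vendor_a", "vendor_b"]

def blend(records):
    """Pick one record per (well, curve) according to the blending rule."""
    def rank(rec):
        try:
            precedence = SOURCE_PRECEDENCE.index(rec["source"])
        except ValueError:
            precedence = len(SOURCE_PRECEDENCE)  # unknown sources rank last
        completeness = -rec.get("samples", 0)    # more samples is better
        return (precedence, completeness)

    best = {}
    for rec in records:
        key = (rec["well"], rec["curve"])
        if key not in best or rank(rec) < rank(best[key]):
            best[key] = rec
    return list(best.values())

incoming = [
    {"well": "A-1", "curve": "GR", "source": "vendor_b", "samples": 4000},
    {"well": "A-1", "curve": "GR", "source": "vendor_a", "samples": 3800},
]
print(blend(incoming))  # keeps the vendor_a record (higher precedence)
```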

Panel discussion

Ellen Hoveland (Anadarko) bemoaned data management’s low profile: the subject has ‘little recognition’ and is generally regarded as ‘plumbing.’ Sarbanes-Oxley means that we all have to do a better job. Charles Fried (BP) believes we are ‘still in the stone age regarding data bases.’ Problems exist with data in Excel spreadsheets and shared drives are ‘all filling up.’ Alan Doniger (POSC) described structured and unstructured data as a continuum, noting that the same metadata constructs should be used across the board. This should leverage metadata standards like Dublin Core and semantic web techniques such as OWL and RDF. Trudy Curtis (PPDM) also believes that data management does not get the respect it deserves. Curtis recommended the technologies developed by the W3C as having application to data integration. Knut Bulow (Landmark Graphics) spoke of ‘trans-integration,’ which is ‘more than technology’ and implies a corporate philosophy and implementation. Fried stated that BP has done ‘a poor job’ of tagging metadata from BP’s heritage datasets. Madeline Bell (ExxonMobil) set out the interpreter’s everyday concerns, such as ‘is the base map complete?’, ‘have I used every log?’ and ‘what color should this horizon be?’ Fried concluded by opining that little has changed over the years, that all these disparate data types are ‘a pain in the butt’ to manage and that there is ‘still no money’ for this activity.

FluidReportML

Robert Aydelotte (ExxonMobil) described ExxonMobil’s ‘standardized’ XML-based fluid reporting protocol. The protocol integrates ExxonMobil’s Common Operating Environment (COE) and proprietary fluid characterization applications. Previously, fluid property data was orphaned and often in the ‘DODD store’ (data on disk in desk)! FluidReportML captures context in the form of a PDF document describing how things were done. This is stored on the file system while the XML goes to ExxonMobil’s corporate database. Information includes fluid transfer, reservoir conditions, fluid data, J-curve and separator tests and lots more. FluidReportML has been submitted to POSC as a candidate for a POSC standard.

E&P Taxonomies

Jeroen Kreijer described Shell’s previous efforts including the Shell Expro Discovery work and NAM’s own catalogue. The former revealed weaknesses in Shell UK’s IM practices. The NAM initiative shared the same objectives, but was largely disconnected from the Discovery work, ‘typical of Shell!’ Kreijer notes that taxonomies can be hierarchies or lists. For Kreijer, ‘unstructured data doesn’t exist, it is rather data of unknown structure’. Shell’s catalogue system (with reference to Flare Consultants) links content and context into knowledge collections to support asset integration, process compliance, organizational views and project delivery. The best ‘connection point’ for setting up a catalogue is the specific task—drilling a well, inspecting a pipeline—because of the ‘universality of the action.’ Shell’s ‘globally usable catalogue’ uses the concept of ‘document natures’. This strange term is used because ‘no one knows what it means – so you can spend time with them explaining what the system is all about’. Some 2,500 ‘natures’ uniquely identify each item.

Yukos

Vitaly Krasnov’s company, troubled Russian supermajor Yukos, operates several giant oilfields in Siberia, typically with thousands of wells per field. Operational decisions are supported by near real-time modeling and simulation. This is enabled by ‘tight integration of knowledge, IT and data’ at the local level, where ‘everyday decisions are based on simulator output’. But Krasnov’s most startling revelation was that all the software used was developed in-house. This is because, in Krasnov’s words, ‘We can’t use commercial software to add value to our assets. These tools are over-generalized and lack vital features. They are over-complicated and hard to use. They offer poor connectivity to the corporate database and no national language support. It is also impossible to introduce innovations in a reasonable time.’ Yukos adapts its software to its corporate knowledge, not vice versa. Little is left to chance: education in the use of the software is provided through a technical website, backed up by an ‘expertise team’. The operational workflow begins with the selection of a promising sector via the ‘web waterflood portal.’ Other tools guide users through well studies with PVT and SCAL ‘wizards,’ reducing analysis time ‘from months to hours’. The R-Viewer visualizes streamline simulation and history matching performed on the YUSIM simulator and ‘draws users towards useful information.’

Halliburton

David Lamar-Smith (Halliburton) described the ‘content problem’ of unmanaged, uncategorized information. Halliburton opted for a simple corporate taxonomy merging existing intranet product line hierarchies and SAP equipment and organizational codes. The University of Tulsa Abstracts and POSC’s catalogue also ran. All this was put in a giant spreadsheet for loading into the Interwoven content management system and Plumtree portal. Documents can be published in various ways: a ‘customer’ view, specific to a client, a business view (e.g. HR) or a product line view. Taxonomies are categorized by ‘facets’ including content type (technical document, sales document etc.), location, E&P lifecycle and business process. The system automates and simplifies document classification, and search is enhanced by a search engine that understands the taxonomy. Lamar-Smith strongly recommends combining enterprise hierarchies into a single taxonomy, removing redundancies and associating synonyms.
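
The facet names below (content type, location, lifecycle, business process) come from Lamar-Smith’s description; the data structure and filtering logic are an illustrative sketch, not Halliburton’s Interwoven or Plumtree implementation.

```python
# Illustrative faceted classification: each document carries one value per
# facet, and a 'view' (customer, business, product line) is just a filter
# over facets. Facet names follow the article; everything else is a sketch.
from dataclasses import dataclass, field

@dataclass
class Document:
    title: str
    facets: dict = field(default_factory=dict)

docs = [
    Document("Cementing best practice", {
        "content_type": "technical document",
        "location": "Gulf of Mexico",
        "lifecycle": "drilling",
        "business_process": "well construction",
    }),
    Document("Q2 sales brief", {
        "content_type": "sales document",
        "location": "North Sea",
        "lifecycle": "production",
        "business_process": "marketing",
    }),
]

def view(documents, **criteria):
    """Return documents whose facets match every supplied criterion."""
    return [d for d in documents
            if all(d.facets.get(k) == v for k, v in criteria.items())]

# A 'product line' style view: all technical documents about drilling
for d in view(docs, content_type="technical document", lifecycle="drilling"):
    print(d.title)
```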

This report is abstracted from a 15 page report produced as part of The Data Room’s Technology Watch Reporting Service. For more information on this subscription-based service please email tw@oilit.com.


Folks, facts, orgs, et cetera

Tid-bits and moves from Landmark, RPS, Geomodeling, Fugro, Labrador, Neuralog, Roxar etc…

Peter Bernard has been named president of Landmark Graphics, replacing Andy Lane. Bernard was previously with the Halliburton ‘mother ship’.

~

Environmental consultancy RPS plc has acquired Troy Ikoda. Troy Ikoda joins HydroSearch, acquired in 2003, in RPS’ Energy division.

~

Calgary-based Geomodeling Technology Corp. has signed up Second Summit as its UK marketing and support provider.

~

Gravity and magnetics specialist Fugro-LCT is to become an internal division of Fugro Robertson.

~

Oil vertical ERP vendor Sterne Stackhouse is to change its name to Labrador Technologies, reflecting ‘a renewed focus on the company’s core software business.’

~

Neuralog has signed Ruzica Kosir as software engineer. Kosir was previously with iDc and Divestco.

~

Roxar has opened new offices in Kazakhstan and Tyumen and is to treble its Moscow workforce. Roxar claims over 400 installations in the CIS in companies including TNK-BP, Lukoil, Bashneft, Sibneft, Tatneft and Zarubejneft.

~

Don Lanman has joined consulting and training house SCA as Chief Petroleum Engineer. Lanman was previously with Amoco and JM Huber Corp.

~

Dave Work has left Energy Virtual Partners to join the board of Veritas DGC Inc.

~

Seitel is to raise $190 million in a private offering as per its chapter 11 reorganization. Proceeds are earmarked for creditors.

~

Schlumberger Information Solutions has licensed Norwegian HueSpace’s volume visualization technology.

~

Trango Technologies has hired Pete Bratton and Vern Campbell. Bratton was previously with Landmark Graphics.

~

Knowledge Systems has promoted Eamonn Doyle to VP Operations for EAME and hired John McIntosh as account manager for the region.

~

Landmark has acquired IBM’s remaining shareholding in Norwegian Petrodata which is now a wholly-owned Landmark unit.


Gocad eastern hemisphere user group

Gocad has evolved into a full-blown interpretation package—but users still focus on R&D.

About 50 attended this, the second European Gocad user meeting. Earth Decision Sciences (EDS) recently raised €6.1 million and is expanding Gocad functionality with the intent of emulating, or surpassing, Petrel. While EDS demonstrates impressive ease-of-use and powerful functionality in its new ‘shrink-wrapped’ products, the users’ presentations still focus on the ‘old-style’ use of Gocad as a research tool, a plug-in to in-house developed modeling software, or an adjunct to other interpretation products.

Gocad V 2.1

Having focused on ‘stability’ last year, development focus now shifts to usability and automation. Knowledge management-oriented functionality has added audit trails and improved project suspend-and-resume. Gocad can handle increasingly large data volumes, with data stored on disk and paged into memory as required. New functions include 2D interpretation, logs, seismic correlation and reverse and rollover fault modeling. The Volume Explorer demo showed an impressive mélange of independent probes on the same data set – including coherency, amplitude, water flood data and fence displays of seismic. A single probe can also display a variety of properties – which can be tabbed through by the interpreter.

Shell

Gocad is embedded in Shell’s in-house developed G&G applications. Stand-alone Gocad is also used in geophysical modeling. Shell is working with EDS to extend Gocad as a joint industry project to determine future directions. Shell showed a fault surface connector which snapped faults to Gocad horizons. Gocad libraries have been used to create tri-mesh faults in 123DI. A showcase study centers on a problematical stacked ‘Y’ and ‘X’ fault pattern on the crest of a rollover anticline. A ‘stair-step’ approach minimizes cell ‘crushing’ near the faults.

Karst Modeling

A Total-funded project focused on karst modeling, a complex problem where extreme heterogeneities can make for very large oil reserves. A stochastic approach is used to model petrophysical distributions – using a ‘random walk’ approach. GoKarst uses Total’s Neptune karst simulator for upscaling and downscaling to and from Eclipse. The study has added new tools in Gocad for ‘unfolding models’ with deformation tensors.

Bool-X

Total’s in-house geostatistics package ‘G3’ supports ‘long object’ modeling – but it proved hard to condition models to data and include existing objects. Bool-X, a ‘birth and death’ process developed by the Fontainebleau School of Mines, ‘throws’ objects into the model until certain conditions are met. Objects include box, puck, channel, sinusoidal channel and a user-defined ‘pixmap’. A 3 million cell realization takes about an hour. A demo showed objects being splattered around until well data was honored.

HP in oil and gas

HP, hosting the Gocad meet, offered a view of oil industry IT as ‘moving to Linux.’ HP’s PA-RISC market has declined to around 30% today, but is still worth around $1 billion. All HP Xeon machines will be 64 bit this summer. Both Total and Shell use xw8000 workstations – these too are moving to Xeon and support remote thin clients with ThinAnywhere and Exceed 3D. HP is currently looking at cluster-based video rendering from ORAD for use with its Sepia ‘terabyte’ data browser, based on work done in the ASCI Views project. A Sepia PCI plug-in allows a workstation to become a member of a cluster of up to 1,000 workstations sharing resources. This has been tested with Shell’s 123DI interpretation system. EDS’s Jacta has also been implemented on an HP cluster and will be presented in a paper at the 2004 SPE.

This report is abstracted from a 4 page report produced as part of The Data Room’s Technology Watch Reporting Service. For more information on this subscription-based service please email tw@oilit.com.


PetroCanada’s secure document initiative

PetroCanada and ConocoPhillips use electroBusiness’ hub to sell finished products.

Petro-Canada is launching a cross company secure document exchange with help from solution provider electroBusiness (EB). EB will enable the exchange of confidential data between terminal operators in the downstream energy sector.

ConocoPhillips

Petro-Canada collaborated with ConocoPhillips Canada in a successful proof of concept. Other industry players are being invited to adopt the EB solution to standardize and simplify data exchange.

Ritchie

Brian Ritchie, VP, Energy Sector for EB said, ‘This application provides operators with the potential for industry-wide connectivity tailored to their particular B2B technology choices.’

Sale and distribution

e-Business Utility coordinates the sale and distribution of finished products through the exchange of fuel loading tickets. The solution is already commercially deployed in Canada to link oil and gas producers, marketers, refineries, transportation companies, wholesalers and retail outlets.


SDC Geologix releases WellExplorer

New software addresses well information management—leveraging the WITSML transfer standard.

As revealed in Oil ITJ (Vol. 9 N°5), UK-based SDC Geologix is launching a new well information management tool, WellExplorer (WEx). WEx leverages technologies developed in Geologix’ GEO Suite and lets users deploy a departmental-level intranet for access to well summaries, logs, and reports. WEx is built on Microsoft Internet Information Server and a SQL Server back-end. Other software components support log-in, document upload, and data access, integrating with GEO authoring applications.

Dynamic

A dynamic document format means that users can interact with a log to change its zoom aspect, scale and depth reference. Multiple log layouts can be selected from a single GEO document.

WITSML

Geologix has been instrumental in the development of WITSML and now supports the data-sharing standard in all of its products. Geologix recently demoed WITSML use in a ‘MudLog Object.’ Data was exchanged with GEO and WITSML-compliant systems such as Paradigm’s Geolog, Sense Technology’s SiteCom, and Landmark’s OpenWorks.
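
To give a flavor of what such an exchange looks like, here is a much-simplified sketch of reading a WITSML-style mudLog document in Python. The element names approximate the WITSML mudLog object but the snippet is hand-written and abbreviated; it is not the actual data exchanged in the Geologix demo.

```python
# Simplified sketch of reading a WITSML-style mudLog document. Element names
# approximate the WITSML mudLog object but the document below is hand-written
# and much reduced for illustration.
import xml.etree.ElementTree as ET

sample = """
<mudLogs version="1.2">
  <mudLog uidWell="W-001" uidWellbore="WB-001">
    <nameWell>Demo well 1</nameWell>
    <geologyInterval>
      <mdTop uom="m">1500</mdTop>
      <mdBottom uom="m">1510</mdBottom>
      <lithology><type>sandstone</type></lithology>
    </geologyInterval>
  </mudLog>
</mudLogs>
"""

root = ET.fromstring(sample)
for interval in root.iter("geologyInterval"):
    top = interval.findtext("mdTop")
    bottom = interval.findtext("mdBottom")
    litho = interval.findtext("lithology/type")
    print(f"{top}-{bottom} m: {litho}")
```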


GeoGraphix targets performance, connectivity

The latest release of GeoGraphix interpretation software adds SDE mapping and runs 30% faster.

A new release of Landmark’s GeoGraphix PC software claims improved performance, enhanced usability, and an expanded set of features.

Mapping

GeoAtlas can now display layers from an SDE data server and supports display of dynamic spatial data on the server without intermediate shapefile creation. Other enhancements include production analysis for oil, gas, and water streams (with posting of graphs to a map layer), improved perforation data posting, user-defined petrophysics and environmental corrections, and the addition of frequently used Baker Atlas charts to the Prizm definitions.

Performance

Significant performance improvements have been made to the process of loading and updating wells, and their associated data – formation tops, faults, velocity data, and deviation surveys. Performance metrics on a 5000 well project indicate initial loading of the data is now approximately 30% faster.

and also …

Other enhancements address password protection, connectivity with OpenWorks and grid data exchange with PowerModel and ZMap. Implementing DLL triggers has also been simplified.


PetroCom extends cell phone reach to offshore

PetroCom uses high-end cell phone infrastructure from Siemens to extend data/voice link offshore.

Houston based PetroCom has started building the first digital cellular network in the Gulf of Mexico. PetroCom’s coverage stretches from Brownsville, Texas to Mobile, Ala. and to 180 miles offshore.

Parro

PetroCom president Brad Parro said, ‘Our network will provide the high level of performance, reliability and security that the 30,000 potential users in the Gulf of Mexico require. We give companies mobile access to all types of information anywhere and anytime.’ The network will also be one of the first US deployments of EDGE*.

Siemens

PetroCom is deploying new ‘Mobile Enterprise’ technology from Siemens including customized base stations for indoor coverage, a solution for synchronizing e-mails while on the move and ‘mobile virtual PBXs’. The solution leverages open standards and supports smartphones, connected PDAs and other mobile devices using Symbian and Microsoft Mobile operating systems.

* A halfway house between today’s GPRS and future UMTS systems.


Delphi Group studies taxonomy vendors

The survey found dissatisfaction with search tools and no clear market leader for taxonomy.

A new survey by the Delphi Group, ‘Information Intelligence: content classification and the enterprise taxonomy practice,’ offers an interesting breakdown of commercial taxonomy solutions. The report (which was part-funded by major taxonomy suppliers) sets out to define the role of taxonomies in the organization and to examine future trends.

No single source

82% of respondents did not have a single point of search and management across information sources. Users were dissatisfied with their search tools and a majority had to classify their own material. The report found that 11% of information was not classified at all*.

Advice

The study offers advice on how to select a taxonomy supplier. In terms of market share there is no clear leader. Verity tops the list at 15%, closely followed by Autonomy (14%). Lotus, Stratify, Google and InXite also ran. Delphi anticipates consolidation of the taxonomy marketplace over the coming years.

* Surely a huge underestimate!


SAIC touts Field of Future and Content Analyst

Annual report shows solid energy vertical consulting business, but idiosyncratic software portfolio.

In its 2003 report, consulting behemoth SAIC lifts the veil on its oil and gas activity, headed up by vertical business leader Cheryl Louie. SAIC offers oil and gas clients performance benchmarking and strategy advice on ‘aligning IT capabilities with business goals.’

FOF

SAIC is working with oil majors to realize the ‘field of the future’ which will integrate systems and technologies with new work processes to increase production and reduce operating costs and capex.

ChevronTexaco

SAIC is helping ChevronTexaco to ‘achieve operational excellence and reduce costs’ and also supports majors’ efforts in the field of environmental responsibility and safety at refineries, retail and terminal facilities, and pipeline operations.

Content Analyst

In the software arena, SAIC has developed ‘Content Analyst’ to extract information from large volumes of documents and data. The software was originally developed for the intelligence community.

Conversion software

SAIC has entered the oil and gas software vertical market with a curious offering – a Units of Measure (UoM) Converter which ‘supports most standard oil industry units of measure’ and includes ‘approved’ American Petroleum Institute conversion specifications. The tool offers linear and non-linear conversion and includes an API for developers.

An API for the API? C’mon SAIC, how about releasing the UoM converter as Open Source?
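
SAIC’s converter API is not described in the report, so the sketch below simply illustrates the difference between the linear and non-linear conversions the tool claims to handle, using standard oilfield factors. The function names are invented for the example.

```python
# Sketch of linear vs. non-linear unit conversion, illustrating what a UoM
# converter with an API might expose. Conversion factors are standard; the
# function names are invented and do not reflect SAIC's actual product.
def ft_to_m(feet):
    """Linear conversion: metres = feet * 0.3048 (exact factor)."""
    return feet * 0.3048

def degf_to_degc(deg_f):
    """Affine conversion: an offset plus a scale, not a simple factor."""
    return (deg_f - 32.0) * 5.0 / 9.0

def api_to_specific_gravity(deg_api):
    """Genuinely non-linear oilfield conversion: SG = 141.5 / (131.5 + API)."""
    return 141.5 / (131.5 + deg_api)

if __name__ == "__main__":
    print(ft_to_m(10000))                         # 3048.0 m
    print(degf_to_degc(212))                      # 100.0 degC
    print(round(api_to_specific_gravity(35), 4))  # ~0.8498
```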


SAS upstream business intelligence for Total

Total has built a reservoir management decision support system with SAS’ Intelligence Platform.

SAS Institute is moving back into the oil and gas vertical, with Total an early adopter of its new SAS Upstream statistical package. The Upstream offering is built around the SAS Release 9 Intelligence Platform incorporating data quality, data mining and process intelligence solutions.

Valois

Jean-Paul Valois, reservoir evaluation manager with Total said, ‘We needed to implement quick, in-depth reservoir management studies to accelerate the evaluation of mature fields. We engineered our reservoir management application with SAS which helped us speed the decision making process.’

Logical

SAS lets companies build a logical model integrating data from simulations, process, and equipment at multiple sites. Users can monitor performance, identify key issues and navigate hierarchical views of data.

Orenstein

SAS energy boss Horia Orenstein added, ‘Oil and gas companies use SAS solutions to solve problem areas efficiently through their organizations.’ Last year SAS acquired energy consultants RiskAdvisory. Previously SAS was active in the oil vertical with its now discontinued SAS GEO data management offering.


Quorum Pipeline solution for Shell

Quorum Business Solutions has a third client for its Pipeline Transaction Management solution.

Shell is to deploy Quorum Business Solutions’ Pipeline Transaction Management solution to replace its legacy pipeline transaction management system. The software will be used to oversee Shell’s deepwater Gulf of Mexico pipeline network with a 9 billion cu. ft./day capacity.

Weidman

Quorum president Paul Weidman said, ‘Quorum Pipeline Transaction Management represents the first new commercial pipeline transaction management software initiative in the industry for a decade. Our growing customer base validates the broad applicability of the software and demonstrates the demand for next generation software to manage the pipeline business.’ Shell Gas Transmission is the third company to license Quorum’s new pipeline transaction management product since its launch in 2003.


Invensys controls Kashagan super giant

AGIP has awarded the instrumentation, control and safety contracts to Invensys Process Systems.

AGIP has selected Invensys Process Systems to equip its Kashagan super-giant oilfield in Kazakhstan with integrated control and safety systems.

Multi-million $$

In what is described as a ‘multi-million dollar contract,’ Invensys will supply a package of control and safety solutions including the Foxboro I/A Series Distributed Control System and the Triconex Emergency Shutdown System to both offshore and onshore facilities.

2009

The contract, which will run until 2009, was awarded by the field’s operator, the Kazakhstan Caspian North Operating Company. Kashagan is the largest oil discovery of the last 30 years. Recoverable reserves are estimated at up to 13 billion barrels, with a peak production rate of 1.2 million barrels per day. The investment for the full development of the field is nearly $30 billion over 15 years. Invensys Process Systems reported operating profits of £33 million on sales of £768 million last year.


i-Store unveils asset management software

The Information Store is offering a new asset management solution to the upstream market.

The Information Store (iStore) has just announced its Asset Management System-Production (AMS-P) solution, a web-based application that ‘transforms disparate data into quick-read charts, tables and graphs.’

Irani

iStore president Barry Irani said, ‘E&P companies spend hours assembling terabytes of data which ideally should be thoroughly analyzed. But with so much information in so many different places, this has been virtually impossible. Asset management team members must learn several specialized pieces of software to collect and analyze data and jump from tool to tool and database to database in search of crucial bits of information.’

AMS-P

AMS-P addresses these issues by pulling-in critical information from multiple databases and presenting it in a usable format. AMS-P leverages existing technology investments without moving or reformatting data. AMS-P organizes this information into familiar objects like wells, reservoirs and fields. Data includes production and reserve information, financial statements, lease information, geological and geophysical information, and links to associated legal and business documents. More from www.istore.com.


API PIDX 1.2 supports downstream

The Americal Petroleum Institute has released a new version of its PIDX e-business specification.

The American Petroleum Institute’s Recommended Practice (RP) 3901, Extensible Markup Language (XML) Transaction Standards version 1.2, has been released by the Petroleum Industry Data Exchange (PIDX) committee. The standard allows oil field services to be bought and sold over the Internet. The new version is backward compatible with version 1.0, adding schemas for transferring messages related to: Custody Ticket for Petroleum Products, Order Status Request, Receipt and Advanced Shipped Notice (ASN).

Services

The PIDX standards support business processes for services including cementing, coiled tubing, completion, logging (for cased and open holes), perforating, stimulation, oilfield transportation, well drilling and testing.

Edwards

Mark Edwards, operations manager with Transport 4 and chair of the PIDX Pipeline Information (PIPENET) Group said, ‘The expansion of the XML Transaction Schemas addresses all downstream petroleum product custody transactions. Petroleum shippers can now receive pipeline custody transfer ticket transactions directly into their ERP systems, resulting in an increase in the speed and efficiency of their logistics and accounting operations.’ The standard is available at pidx.org.


$3 billion annual spend

Digital Oilfield claims 5,000 users and 1,800 suppliers onboard.

A new version of Digital Oilfield’s OpenInvoice/OpenContract e-commerce solution adds enhancements for detailed purchase orders, better invoice-coding and streamlined transfer of documents directly from suppliers’ financial systems.

5,000 users

Digital Oilfield claims nearly $3 billion annual spend from over 5,000 users, ‘many’ large independent oil and gas companies and 1,800 supplier clients.


‘Baird was in love with technology,’ Gould

Schlumberger president tells investors of ‘favorable climate’ and relates Baird anecdote.

Speaking to the investment community, Schlumberger president Andrew Gould painted a rosy picture of ‘perhaps the most favorable business climate we have seen in the upstream industry since the early 1970s.’ Schlumberger’s earnings have grown faster than revenue, with a return on capital employed of 13.3% in the last quarter. The divestiture program of the non-Oilfield businesses is substantially complete and has reduced net debt to below $2 billion—‘a level consistent with Schlumberger’s long-term capital structure.’

Ill-fated

In the Q&A, Gould gave an amusing insight into his predecessor Euan Baird’s sortie into the IT consulting market with the ill-fated acquisition of Sema Group. Gould discussed these events with a group of Schlumberger top scientists and ventured that Baird might have ‘fallen in love with a technology,’ suggesting that ‘as scientists, your job was to stop it before it went too far.’ But as one of Schlumberger’s brains quipped, ‘If you’re in love, it’s way too late.’


Microsoft, TietoEnator target oil vertical

Two new offerings address oil and gas construction and pipeline project management.

Two announcements this month describe project management solutions targeting the oil and gas industry. TietoEnator and Projectplace are rolling out a solution for distributed projects, while Microsoft Corp. and The Project Group presented a case history of Microsoft Project’s use in pipeline construction. The first solution targets projects which cross organizational, functional and geographical boundaries, where formal project management methods and a central repository for project information are required. TietoEnator’s project management method ‘PPS Online’ is now available through the www.Projectplace.com portal.

Microsoft Project

Microsoft and The Project Group made their pitch in a webcast last month, showing how Microsoft’s Project Intelligence could be used to manage all aspects of a multi-year pipeline project. Craig Crawford, CEO of The Project Group, outlined how the software could be used to manage risk, enhance project deliverability and offer transparency to key stakeholders.


National Data Repositories re-invented

The loose grouping of government bodies is shifting emphasis from oil and gas to geoscience data.

The POSC-supported ‘National E&P Data Repositories’ group is shifting focus. While the first four international meetings had a largely oil and gas theme, the fifth meeting sees a re-baptizing of the loose governmental association as the ‘National Geoscience Data Repositories’ group.

AGI

The next meeting is to be held under the auspices of the American Geological Institute (AGI) at the US Geological Survey in Reston, VA, USA from the 21st - 23rd of September 2004. The meeting scope has broadened to offer representatives a chance to ‘share issues, concerns, and solutions related to the management and preservation of geoscience data.’

Preservation

This year’s meeting, NDR5, will address management and preservation issues, such as compliance with national laws and addressing the needs of energy and minerals industries. A recent report on geoscience data preservation from the US National Research Council has heightened awareness of geoscience data repositories within the United States. According to the organizers, this opens the way for further discussions between key decision makers from around the world. The first NDR meeting, held in London in 1996, was organized by the UK DTI.


IPoVSAT comms for Newfield Exploration

Schlumberger’s high tech satellite-based communications offer voice, email and data to the Gulf.

Houston-based Newfield Exploration has selected Schlumberger Information Solutions for the provision of a high bandwidth, dedicated satellite link to its Gulf of Mexico offshore operations. The link is built around high-speed Internet Protocol over Very Small Aperture Terminal (IPoVSAT) technology.

Spicer

Newfield IT Manager Mark Spicer said, ‘We have rationalized the number of network providers we use and now have advanced voice and data communications at reduced cost.’

Email

IPoVSAT technology provides two-way email and Internet-based communications over a satellite network using a cost-effective method of sharing network capacity.

Rouylou

Schlumberger’s Jean-Michel Rouylou said, ‘IPoVSAT provides oil and gas operators working in remote areas of the world with a reliable satellite connectivity solution beyond voice-over-satellite connections, allowing users to use email, the Internet and video across a more efficient remote communications platform.’

