The problem is a commonplace one: a few months before a bid deadline, your company acquires a package of data from a government, and virtually none of it conforms to corporate databasing standards. You may also have a few filing cabinets, or whole rooms, full of legacy data in the form of reports, paper sections and so on. Cleaning up and populating a conventional database with all this material would take months, yet your G&G people need access in a matter of weeks; in fact, yesterday would suit them fine. Charles Fried recounted how Amoco faced just such a problem in the Venezuelan bid round earlier this year. The data supplied by the Venezuelan government made for some impressive numbers: some 200 8mm tapes holding 200,000 files and totaling 25 gigabytes, excluding seismic trace data. Amoco also held mission-critical information of its own, gleaned from previous work in the country, and wanted to integrate this with the public package to gain a competitive advantage over users of the standard data set. To make matters more interesting, the data came in a multitude of formats: Excel, Word, UKOOA navigation data, Tiff imagery and so on. Getting all this right was of the utmost importance to Amoco and the other bidders in a round which ultimately netted $2 billion in bonuses.
Rather than taking the traditional approach of letting everyone climb over the data for themselves, Amoco set up a hybrid web-based GIS front end from which spatially indexed data could be located and viewed on screen. This was not just new technology but a new working paradigm. Populating the web-based database was a collaborative effort, with the "team web" approach offering self-service input and viewing of data. A Microsoft Access database, described as "PPDM-like", was built to hold header and index information, while reports and scanned images were placed on a file server. The timely capture and organization of this material was only possible because the Venezuelan data itself arrived scanned and organized. A Java applet from ESRI provided point-and-click access to the different datasets from maps. The server-side solution comprised an NT Server, a UNIX box running an NT Server emulator, MS Access, ESRI's ArcView Internet Server, MS Active Server and Index Server. On the client side, Windows 95 machines ran CPCView Tiff imaging software alongside Netscape and Internet Explorer. The whole system was up and running in "a few months" and the bid data was loaded in two weeks. Thirty users were online, generating around 500 accesses and 4,000 page views per month. Intriguingly, one of the main lessons learned was the difficulty of changing the way people worked: both the data-sharing paradigm, with its self-service data loading, and the on-screen data delivery stretched the traditionalists.
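The core idea behind such a front end can be sketched very simply: each document (a scanned report, a Tiff image, a navigation file) gets an index record holding its coordinates and a pointer to the file on the server, and a click or box drawn on the map becomes a bounding-box query against those records. The sketch below is a minimal illustration of that pattern, not Amoco's actual system; all names, coordinates and paths are invented for the example.

```python
# Minimal sketch (hypothetical, not the Amoco implementation) of a spatial
# index linking map locations to documents on a file server.
from dataclasses import dataclass
from typing import List

@dataclass
class IndexRecord:
    name: str     # document identifier (invented for this example)
    lat: float    # location of the well/report the document describes
    lon: float
    path: str     # where the scanned image or report sits on the server

def query_bbox(records: List[IndexRecord],
               lat_min: float, lat_max: float,
               lon_min: float, lon_max: float) -> List[IndexRecord]:
    """Return all indexed documents whose location falls inside the box."""
    return [r for r in records
            if lat_min <= r.lat <= lat_max and lon_min <= r.lon <= lon_max]

# Toy catalogue with made-up coordinates and paths.
catalogue = [
    IndexRecord("well-A1", 9.5, -64.2, "/server/scans/well_a1.tif"),
    IndexRecord("report-7", 10.1, -63.8, "/server/docs/report_7.doc"),
    IndexRecord("well-B2", 8.0, -66.0, "/server/scans/well_b2.tif"),
]

# A map selection translates into a bounding-box query.
hits = query_bbox(catalogue, 9.0, 11.0, -65.0, -63.0)
print([r.name for r in hits])  # -> ['well-A1', 'report-7']
```

In a production system the linear scan would be replaced by a proper spatial index (and the catalogue by the database), but the contract between map and document store is the same.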
© Oil IT Journal - all rights reserved.