First of all, a big thanks to the new sponsors of the oilIT.com website. 2005-2006 sponsors are:
· Biznet Solutions
· Foster Findlay Associates
· Kelman Technologies
· Landmark Graphics
· OFS Portal
· Petris Technology
At the 2005 AAPG Conference and Exhibition in Calgary, I was struck by the ever-growing computerization of geology. You may find this surprising. I mean, this is Oil IT Journal, I am a geologist (of sorts) and I should know about these things. Well, I do, but I am still surprised. Just as things seem to be settling down around a shared earth model paradigm—or around a single ‘platform’—the granularity of application software goes down by an order of magnitude. Let me explain.
Models, models everywhere
Recently, the 3D earth model has been promoted as the focus and ‘finality’ of the interpretation process. There is no doubt that the choice of a 3D modeling environment is a tough one—both technically and commercially. But my bird’s eye view of the technology on display at the AAPG—and of that behind many of the papers presented—makes me doubt that convergence on a model is either possible or desirable.
The models that caught my eye at the AAPG were, at first glance, at the periphery of the interpretation process. There were niche models for fault plane seal analysis, for plate tectonic reconstruction, for structural balancing and palinspastic restoration, for geochemical modeling of basin evolution and reservoir fractionation. I could go on. But what seems to be happening is that these peripheral ‘niche’ applications are getting more and more polished, the science behind them is getting better and their use is on the up. The periphery is moving in on the mainstream.
Standard model returns
Each of these models of course has its own, possibly very exotic, data requirements. The geochemical characteristics of a source rock are unlikely to be found in the average E&P data store—although paradoxically, they may relate to measurements made in the refinery! How could we ever have thought that one standard model would fit all? Well, we did once and, paradoxically, the corporate data model is back with a vengeance! But I’ll have to explore why that is so in another editorial. In the meantime, vive le paradoxe!
On the topic of data, another trend is for vendors to deprecate those ‘hard to maintain’ shell scripts and what have been described as ‘bubble gum and baling wire’ solutions to data management. The subtext here is of course, ‘Throw away your shell scripts and buy our solution.’ But wait a minute, who says scripting is unmaintainable and who says a ‘solution’ is necessarily better? Not Greg Wilson, author of the excellent book ‘Data Crunching*’. Wilson clearly has a lot of hands-on experience of the ‘unglamorous’ activity of turning inconsistent ASCII-based data files with the odd typo into clean data in XML or maybe a database.
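For readers who have never indulged in this ‘unglamorous’ activity, here is a minimal sketch of the kind of crunching Wilson describes: turning an inconsistent ASCII listing with the odd typo into clean XML. The well names, depth units and typos are invented for illustration; only the Python standard library is used.

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical input: a whitespace-delimited well listing with the odd
# typo (a stray underscore, a comma where a decimal point belongs,
# inconsistent case). The format and names are invented for this sketch.
raw = """\
WELL-01  2450.5  sandstone
well-02  2_510.0 Shale
WELL-03  2603,2  sandstone
"""

def clean_depth(token):
    """Normalize a depth value: drop underscores, treat ',' as '.'."""
    return float(token.replace("_", "").replace(",", "."))

def crunch(text):
    """Turn the messy listing into a clean XML tree."""
    root = ET.Element("wells")
    for line in text.splitlines():
        if not line.strip():
            continue
        name, depth, lith = re.split(r"\s+", line.strip(), maxsplit=2)
        well = ET.SubElement(root, "well", name=name.upper())
        ET.SubElement(well, "depth").text = f"{clean_depth(depth):.1f}"
        ET.SubElement(well, "lithology").text = lith.lower()
    return root

xml_out = ET.tostring(crunch(raw), encoding="unicode")
print(xml_out)
```

A dozen lines of script, easily adapted when next month’s file has a different quirk—which is rather Wilson’s point about the maintainability of home-grown crunching.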
No unifying theory
Wilson offers no ‘grand unifying theory,’ just a lot of pretty up-to-date advice on using modern tools like Java, Python, Perl, Ruby and XML to process and condition data. There is good treatment of the XML Document Object Model, of XPath and XSLT—and how these can be put to good effect crunching and managing your data. Use of older tools, the Unix shell and SQL, is also presented, with some good advice on when to give up on XML and use a database. If your SA still lets you see a shell prompt, and you have not yet bought the ‘solution’ that does it all, this book is for you.
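To give a flavor of the XPath-style querying discussed above, here is a short sketch using Python’s standard ElementTree module, which implements a limited XPath subset. The survey document and its element names are invented for the example.

```python
import xml.etree.ElementTree as ET

# A small, invented seismic survey document for illustration.
doc = ET.fromstring("""
<survey>
  <line id="A1"><trace cdp="100"/><trace cdp="101"/></line>
  <line id="B2"><trace cdp="200"/></line>
</survey>
""")

# XPath-style queries (ElementTree supports only a subset of XPath).
all_traces = doc.findall(".//trace")      # every trace, at any depth
line_b2 = doc.find("line[@id='B2']")      # predicate on an attribute
cdps = [t.get("cdp") for t in all_traces]

print(len(all_traces), line_b2.get("id"), cdps)
```

Once the queries start joining, aggregating or filtering across large files, this is exactly the point at which Wilson’s advice to give up on XML and move the data into a database starts to bite.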
Finally, if you are looking for something to do in September, you may like to consider the AAPG International Conference and Exhibition to be held in Paris from September 11th to 14th. I say this because for the first time Oil IT Journal will be exhibiting at a major trade show. Hope to see you there!
* Data Crunching: Solving Everyday Problems Using Java, Python and More. Greg Wilson, The Pragmatic Bookshelf, 2005. ISBN 0-9745140-7-1 (www.pragmaticprogrammer.com)
© Oil IT Journal - all rights reserved.