Last month I wound up asking, ‘Will IBM’s Watson beat the traditional weather forecasters? If it does, will we even know?’ Well, now we do, at least according to a study from ForecastWatch, which found that IBM’s Weather Company provided ‘the most accurate forecasts overall across diverse geographic regions and time periods covered.’ It would be nice to see here a triumph of data-driven forecasting over forward modeling, but things are not quite so clear cut (are they ever?). Alongside its own Weather Underground crowdsourced data, the company’s forecasts include results from the government-provided data and forecasts with which it is ‘competing.’ All of this is mashed up inside IBM’s proprietary Deep Thunder forecasting model. So while the big data vs. forward modeling question remains unanswered, Deep Thunder’s capability to ingest and process 400 terabytes daily is impressive.
A decade or so ago, maybe longer, folks who should have known better roamed the conferences railing against those old fools who used ‘legacy code’ in obviously outmoded languages, notably Fortran. The old farts were enjoined to get with it and program in C# or some other shiny thing. Such manifest nonsense (a.k.a. FUD) begat my minor research program and editorial, ‘Don’t mention the “F” word in marketing!’ This, in October 2002 (OK, a bit more than a decade ago), concluded that Fortran still had plenty to offer scientists and engineers.
China’s recently announced Sunway TaihuLight, with some 10 million cores and around 100 petaflops, is the world’s most powerful computer in 2017. What compilers does it offer? A choice of C/C++ and, you guessed it, Fortran. It would appear that the venerable language, invented in the 1950s, remains a tool of choice for scientists.
Our relentless search for meaningful progress from the semantic web community goes on. I recently attended a SemWeb Pro conference where the World Wide Web Consortium’s data specialist Phil Archer bemoaned the fact that the semantic web and linked data have gotten a bad press. Archer cited Neo4J’s effort in unpicking the Panama Papers as a success for graph databases and the semantic web, ‘although not for RDF.’ Archer wondered why Neo4J was better than a SPARQL endpoint, observing that this was ‘a good question for this community, maybe we are missing a trick.’ On the other hand, Neo4J ‘likes RDF and easily sucks it up into its proprietary system.’ Archer concluded, rather lamely, ‘Don’t let people tell you that the semantic web/linked open data is not a success. Success comes from the output. It is useful and does things that other technologies can’t do.’
EY, a provider of ‘innovation in financial and operational excellence,’ forecasts that for the oil and gas business, innovation in financial and operational excellence will be a main driver of value and competitiveness in 2017. Well, they would say that, wouldn’t they? EY’s Deborah Byers opined that ‘The industry’s hopes have been buoyed thanks to the OPEC output agreement and the Trump Administration’s positions on energy thus far.’
I know that I am on shaky ground here, but I can’t see what exactly was so bad about the previous administration’s governance (or perhaps lack thereof) of the US oil and gas industry. US production of both oil and natural gas has risen spectacularly. I wonder just what the new administration can do better. This could be one of those ‘be careful what you wish for’ things. Maybe in a decade or so, industry will look back nostalgically at the ‘Obama production peak’ and the heady days of $100-plus oil!
I’m not sure if it is the new administration that has opened the floodgates (not perhaps the best choice of words) regarding global warming but the latest issue of the excellent Ryder Scott newsletter has a banner headline, ‘Global warming is not man made’ above a summary of meteorologist and former KHOU-TV weatherman Neil Frank’s view on the ‘hoax.’ I was curious to see what the SPE’s position on GW was and searched for ‘global warming’ on OnePetro. The answer came back right away, ‘Humans are not responsible for global warming.’ To read George Chilingar’s paper will cost you $25 on OnePetro. Alternatively, you can read our report from his presentation at the 2007 SPE ATCE in Anaheim along with the exciting Q&A and a curious intervention from someone claiming to be from the EPA!
The oil and gas industry loves a good narrative. For shale, the narrative turns on the notion that horizontal drilling and fracking can unleash oil and gas from tight, almost impermeable, shales. The narrative has suffered somewhat in the last couple of years as it has become clear that this only works up to a point, and that fracking in some shale areas is uneconomic. Not so for the Texas Permian basin, where ‘Permania’ has taken hold with more drilling, M&A and increasing production.
A recent report, ‘Unravelling the US shale productivity gains’ from Petronerds and the Oxford Institute for Energy Studies, offers what might be an explanation for the Permian basin’s success. Petronerds observes that ‘[Permian] growth has, in large part, been spurred by the application of unconventional drilling technologies in reservoirs that had previously been treated as conventional formations.’ The italics are mine, as is the conclusion that success in the Permian is probably down to the fact that it is not shale!
© Oil IT Journal - all rights reserved.