Speaking at Offshore Europe in Aberdeen earlier this year, Bernard Looney, BP's COO for production, enthused about the potential for big data analytics in the upstream. A trip to Silicon Valley convinced him that big data will revolutionize how wells are drilled, how production is optimized and more. There may be ‘up to 50,000’ different routes through a typical hydrocarbon processing facility; finding the best path through can add up to 4% to throughput. In a North Sea trial, BP screened a 5,000-well dataset and 250,000 sq. km of 3D seismic data, looking for ‘me too’ analogs of its ‘previously overlooked’ Vorlich discovery. Using big data analytics, the 5,000 wells were analyzed in ‘just a few seconds.’
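The article gives no detail on BP's actual method, but analog screening of this kind is commonly framed as a nearest-neighbor search: each well is reduced to a feature vector and ranked by distance to a reference profile. The sketch below is a minimal, hypothetical illustration of that idea; the well names, features and the Euclidean metric are all illustrative assumptions, not BP's workflow.

```python
import math

def distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Illustrative per-well features: (net pay in m, porosity, depth in km).
# Real screening would use many more attributes, normalized per feature.
wells = {
    "W-001": (12.0, 0.18, 2.9),
    "W-002": (48.0, 0.22, 3.1),
    "W-003": (45.0, 0.21, 3.0),
}

# Stand-in for the profile of the discovery being matched.
reference = (46.0, 0.21, 3.05)

# Rank all wells by similarity to the reference, most similar first.
ranked = sorted(wells, key=lambda w: distance(reference, wells[w]))
print(ranked)  # → ['W-003', 'W-002', 'W-001']
```

A brute-force scan like this is linear in the number of wells, which is why even a 5,000-well screen can complete in seconds once the data is integrated into a common feature space.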
Meanwhile, in Silicon Valley, Industrial Internet specialist Bit Stew Systems reports that its MIx Core platform is ‘gaining traction’ in oil and gas, helping companies solve the data integration challenge at scale. The platform correlated an oil and gas dataset in less than five hours, compared to six man-months with a ‘traditional’ approach. Bit Stew has also kicked off a pilot program to integrate its technology with GE Oil & Gas’ software solution. Bit Stew founder Alex Clark said, ‘We are disrupting traditional approaches for integrating industrial data. Instead of relying on software technologies and data architectural models unsuited for the massive scale of data streaming from industrial systems, we have created an industrial data library platform that scales, rapidly dissolving data integration challenges.’ Bit Stew has financial backing from GE and Cisco.
© Oil IT Journal - all rights reserved.