Oil IT Journal likes to keep ahead of the curve. Our first report on the defence-derived ‘high level architecture’ (HLA) standard for synchronous operation of simulation packages from different vendors goes back to our January 2006 issue where we asked if the HLA could have application in the digital oilfield. Interest in oil and gas application of the HLA—or rather its latest manifestation, the Object management group’s (OMG) Data distribution service (DDS)—has been rekindled in a new OMG position paper.
According to OMG, little has been done since the adoption of the WITSML standard for representation of wireline data. In recent years, the upstream oil and gas industry has begun to realize—especially when analyzing the root causes of safety incidents—that the lack of integration and interchange methods and standards has potentially dire consequences. Not only would sharing information create better predictive models and beneficially impact real-time operational decisions (for example by integrating and exchanging models and data during drilling), but hoarding information can also damage a company’s image in the eyes of the public and legislators, ultimately threatening the right to operate.
Here, DDS is described as a successful, quality-of-service based, interoperable publish-and-subscribe mechanism for data acquisition and exchange. According to the OMG, proliferating sensors and growing data volumes in Scada systems present an opportunity for DDS. DDS is claimed to provide ‘more intelligent real-time data management than the widely used OPC-UA.’
DDS specialist UK-based PrismTech’s Peter Steele told Oil IT Journal, ‘DDS brings quality of service, military-strength high data volumes, visibility and internet connectivity that are missing from OPC-UA. One of our simulation customers, Teledyne Brown Engineering, publishes as much as 200k samples per second. To date, customers have been using the technology in training simulators but there is no reason why HLA/DDS could not be applied to any type of simulation that needs to exchange data in a distributed environment. We see DDS as ideal for interoperable real-time data sharing in oil and gas industry processes, such as real-time drilling control.’
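PrismTech’s pitch for DDS centers on topic-based publish-and-subscribe with per-topic quality-of-service policies. The following Python sketch is purely illustrative—it is not the DDS API, and all names are hypothetical. It mimics one DDS QoS concept, a ‘keep last N’ history depth that lets late-joining subscribers receive retained samples.

```python
from collections import deque

class Topic:
    """Illustrative stand-in for a DDS topic with a 'history depth' QoS policy."""
    def __init__(self, name, history_depth=1):
        self.name = name
        self.samples = deque(maxlen=history_depth)  # KEEP_LAST-style history QoS
        self.readers = []

class Participant:
    """Minimal in-process publish/subscribe bus (hypothetical, not a real DDS API)."""
    def __init__(self):
        self.topics = {}

    def create_topic(self, name, history_depth=1):
        self.topics[name] = Topic(name, history_depth)
        return self.topics[name]

    def publish(self, name, sample):
        topic = self.topics[name]
        topic.samples.append(sample)   # retained for late joiners
        for reader in topic.readers:
            reader(sample)             # pushed to live subscribers

    def subscribe(self, name, callback):
        topic = self.topics[name]
        topic.readers.append(callback)
        for sample in topic.samples:   # deliver retained history first
            callback(sample)

# Usage: a drilling sensor publishes before anyone is listening;
# a late-joining consumer still sees the retained history.
bus = Participant()
bus.create_topic('wits/depth', history_depth=2)
bus.publish('wits/depth', {'md_m': 1520.0})
bus.publish('wits/depth', {'md_m': 1521.5})
received = []
bus.subscribe('wits/depth', received.append)
```

Real DDS implementations add discovery, reliability, deadline and durability policies on top of this basic pattern—the history-depth retention shown here is what distinguishes QoS-based middleware from a bare callback bus.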
In our 2006 article on HLA we identified Norwegian Sintef as an oil and gas user of the technology. Today, Kongsberg is leveraging HLA/DDS in its marine simulators. DDS is also used to link simulations with Esri mapping technology, and National Oilwell Varco is rumoured to have leveraged the technology in its Autocon autonomous drilling test rig and in its ‘Novos’ real time operating system. More from OMG and PrismTech.
Over the last couple of decades, the Norwegian Diskos data service has flip-flopped between Halliburton and Schlumberger following regular contract reviews. Diskos’ operators, the Norwegian Petroleum Directorate (NPD) and the group of 56 oil companies that make up the Diskos collaboration, have broken the mould with the award of the next Diskos round to a group of three companies: CGG, Kadme and Norwegian IT company Evry.
The new PetroBank will combine Kadme’s Whereoil platform and CGG’s Trango seismic data management. CGG will provide tape QC and transcription software. Operations, support and administration will be provided by Kadme and the IT infrastructure will be operated by Evry using Smedvig’s Green Mountain data center. Kadme MD Kjell Arne Bjerkhaug should know the ropes as this is his third stint at the Diskos helm.
The new contract will come into force in 2015 for a five year period. Three Diskos modules (seismic, well and production) were tendered in a comprehensive public procurement process. More from CGG and Kadme.
It is curious that at the Society of Petroleum Engineers’ IT technical section (ITTS) the ‘all clear’ announcement was made—and made more than once. Seemingly the digital oilfield is a done deal and we can all rest on our laurels (I exaggerate, that is not actually what was said—see our report from the SPE ATCE and the drilling systems automation technical section (Dsats) on pages 6&7).
Having argued previously that the digital oilfield was a done deal before the SPE began taking an interest, I am now going to turn my editorial hat around and argue that it still has room for improvement. To back up this reasoning I offer a potted ‘bunk-style’ history of the digital oilfield in three chapters.
Chapter One—In the last century, ‘digital’ in the oilfield meant the application of horizontal digital technologies. Telecoms, process control—in fact just whatever was to hand. Throughout Chapter One, the goal of interoperability remained elusive.
Chapter Two—which roughly corresponds to the efforts of the SPE ITTS and others to inject a bit more enthusiasm into digital in the oilfield—covers the first decade or so of this century. This has seen a bewilderingly broad church of activities and projects masquerading as the digital oilfield, leveraging what was already established in Chapter One and adding in goodies such as Microsoft Excel, SharePoint and lots of file based data exchange. The interoperability question was addressed in a rather ugly fashion, by file transfers using more or less standard and more or less sophisticated data formats. A ‘string and sealing wax’ kind of solution.
Of course connectivity (as opposed to interoperability) was getting better and in fact you could exchange just about any data you wanted to. Especially if you stayed with the same service provider. Using a standard format à la Witsml is all very well. But established suppliers will invariably seek to expand their footprint and may have other constraints in achieving ‘industrial strength’ systems. All of which means that the interoperability movement still has a hard time.
This may not have been as important as all that in two very different contexts. At the low end of drilling and production, the sneakernet metaphor of moving data around on a USB stick or various similar approaches may be all that is required to make things work. At the other end of the spectrum, mega projects like high end Gulf of Mexico platforms or Australian LNG terminals, are developed on more of a ‘spare no expense’ basis where a single main automation contractor (MAC) will come in with boatloads of kit ready to be deployed.
Chapter Three—This is being written today by the non-conventional sector. Here a perfect storm of requirements is forcing a shift away from both the sneakernet metaphor and the MAC approach. Non-conventional development means doing things quickly—to stay ahead of steep production decline curves. It also means doing things cheaply—as the economics of drilling a large number of wells for a relatively small amount of production may be less than stellar. And it means doing things ‘at scale’ with ‘factory drilling’ of extensive shale plays.
The factory drilling paradigm hides a lot of complexity. While geosciences input may be downplayed, multi-stage fracs make for complex real-time interactions between multiple stakeholders. Key to the successful factory is automation capable of reacting to changing situations and assimilating increasing quantities of real time measures.
This is a really interesting situation where service contractors need to collaborate on fast moving, complex operations in a mobile setting. It is a great test-bed for novel control systems that may require new approaches like the one in this month’s lead. Drilling and completion is getting smarter—and it will get smarter still before we’re through.
In March 2014, Oil IT Journal will be publishing its 200th edition. Since starting out in July 1996 we have printed around two million words of reporting on upstream information technology. Most of this, that is to say all of it except for the current year, is available online at www.oilit.com as a free resource for operators, researchers and vendors. The current year’s headlines and editorials like this one are also available freely online. The full text of the Journal is restricted to our subscriber base which you can now join through our new online subscription system on our parent company website the-data-room.com.
We plan to invite our subscribers and others to chip in for what I hope will be a bumper special issue to celebrate the 200th birthday with comments about the impact (negative or positive) that Oil IT Journal has had on their businesses. And we will also be inviting comments on how Oil IT Journal could be improved to see us through the next 200 issues or so.
I guess it is about time we did some market research because we have to a large extent been flying blind since the publication started out, apart from occasional chats and email exchanges with readers, which are generally positive. The odd occasions when such exchanges have not been positive have even resulted in a brusque cancellation of a subscription, as Oil IT Journal was deemed to be ‘off message’ by the marketing department or politically incorrect in some editorial stance or other. Such happenings are thankfully few and far between and, as a reader recently remarked, are a sign that we ‘must be doing something right.’
The Norwegian Expert center for information management’s (ECIM) annual E&P data management conference was held earlier this year. To Oil IT Journal’s great regret we were unable to attend. But the organizers kindly provided access to the presentations from which this report is derived.
Henning Lillejord and Stein Sigbjørnsen (ConocoPhillips) described 4D as driving a paradigm shift in seismic data acquisition and processing. The giant Ekofisk field has had an extraordinary history and has been given a new lease of life since the topsides were physically jacked-up in 1989 and a massive secondary recovery program initiated. This has been tracked with four conventional seismic surveys from 1989 to 2008, when a value of information study was used to justify a 15 year long drilling and monitoring campaign including yearly ‘life-of-field’ 4D surveys. Four component LoF data is used to image the central part of the field beneath the gas cloud. A single survey over the 200 km of ocean bottom cable produces 35TB of data—streamed ashore over a dedicated 1 Gbit/s fiber link. Between surveys the array is used passively as an ‘Optowave’ seismic activity monitoring system, with a snapshot image taken every 10 seconds and transferred to the University of Bergen and Norsar for seismic event detection. Around 20GB of data is collected per day, which is creating interesting data management issues for ConocoPhillips, its contractors and the Norwegian authorities.
Sivert Kibsgaard (Statoil) offered another slant on the growing data volumes with the shift to managing a prestack databank. Statoil has around 1.5PB of data online and some 10PB on tape. The Snorre LoF project generates 300TB every six months. Storage costs are a critical issue, with, per Diskos rates, a 40x price advantage of offline over online storage. The costs of backing up and replicating a multi-petabyte Diskos are scary. The ‘big data’ issue is also forcing a second look at Norway’s ‘Yellow Book’ seismic reporting regulations.
Vitaly Voblikov presented Lukoil Overseas’ data environment that is to support its expanding worldwide interpretation, drilling and production activity. Lukoil has around 150TB of online data to manage and is working on a data infrastructure that combines data in public sources (Iris 21/Enerdeq), application data stores (Petrel, Kingdom, Geolog and Roxar RMS) and an in-house developed BasPro corporate database. Kadme Whereoil is used as a front end and search portal to all of the above. Lukoil is now working to capture interpretation results in an automated process that identifies projects created in a specific application, tags them with contextual information relating to contracts and licenses and performs data QC. Project synchronization across Lukoil’s international operations is achieved with Quantum’s StorNext technology.
Eldar Bjørge described how Statoil is refocusing its data strategy to achieve the elusive goal of ‘quality data and active ownership of the data asset.’ Despite hundreds of best practices and how-to guidelines per data type, it remains hard to connect the data management function to geosciences, and onboarding new personnel is difficult. Data governance is harder than it looks as data owners tend to delegate complex tasks down the hierarchy to local resources, with uncertain results. Statoil is working to fix these issues with better linkage between data management and geoscience, a new IM/IT architecture and a clarification of data ownership roles and responsibilities. The new push includes an extension into document management and a life-cycle approach to data quality and retention—including automatic deletion. A key component of the new order is the allocation of sufficient time for data capture and QC—now a part of every project’s deliverables.
Elisabeth Hegle (Cairn) provided an update on the state of play in data management training in Norway. Back in 2008, ECIM sponsored the first year of what was to be a three year IM program at the University of Stavanger. Unfortunately the university did not continue with the second module of the program. ECIM has now turned to Aberdeen University, whose online MSc in information management appears to fit the bill.
A joint presentation from Shell’s Dan Petcu and Enthought CEO Eric Jones outlined ‘data management friendly’ workflows. Shell has leveraged Enthought’s Python programming toolset in ‘Geosearch,’ a multi application, multi database front end for Shell’s in-house developed NDi Explorer. Geosearch goes beyond data monitoring and query with the ability to push data back to applications.
Gary Murphy (Schlumberger/InnerLogix) provided an elegant exposé/commercial on the merits of the manufacturing model for controlling upstream data quality. The current InnerLogix was inspired by the QC methods used in manufacturing. In an effort to improve the product, Schlumberger has taken a second look at how quality is controlled in manufacturing and how these concepts could be applied to E&P data. The study homed in on statistical process control (SPC). This involves flowcharting the production process, measuring different parameters and using ‘Pareto glitches’ to backtrack the process and pinpoint root causes of defects. Murphy suggests that QC in the future will see ‘more aggressive’ sampling of E&P data at appropriate points over time and the use of SPC to reduce variability by identifying and tracking glitch root causes. He also suggested that companies should work with their suppliers to produce quality-enabled applications and exchange formats.
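The SPC approach Murphy describes can be sketched in a few lines: establish control limits from an in-control baseline sample of some data quality metric, then flag samples that breach the limits as glitches whose root causes can be backtracked. The metric and figures below are hypothetical, offered only to illustrate the mechanics.

```python
import statistics

def control_limits(baseline, sigmas=3):
    """Shewhart-style control limits from an in-control baseline sample."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return mean - sigmas * sd, mean + sigmas * sd

def out_of_control(values, lcl, ucl):
    """Indices of samples breaching the control limits—candidates for
    root-cause backtracking ('Pareto glitches')."""
    return [i for i, v in enumerate(values) if not (lcl <= v <= ucl)]

# Hypothetical metric: daily count of well-header validation failures.
baseline = [4, 5, 3, 6, 4, 5, 4, 6, 5, 4]
lcl, ucl = control_limits(baseline)
new_days = [5, 4, 19, 6]   # the third day shows a process excursion
flagged = out_of_control(new_days, lcl, ucl)
```

The point of SPC is that only the flagged excursions—not everyday variability—trigger investigation, which is what makes the ‘more aggressive’ sampling Murphy advocates affordable.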
Simone Andre described how Halliburton helped Pemex build a Petrobank-based seismic data repository, the ‘Acervo Geofísico Nacional,’ now a cleansed and reviewed 50 terabyte upstream dataset.
DNV’s Christian Hansen lifted the lid on the EU ‘Optique’ project, running under the auspices of the EU’s 7th Framework program. One use case from Statoil is to leverage semantic technology to answer the following type of geosciences questions—’Within this area of interest, return the wellbores that penetrate chronostrat unit C1 and return information about the lithostratigraphy and the hydrocarbon content.’ The Optique project is headed up by semantic specialist Fluid Operations of Walldorf, Germany. Others in the €14 million semantic junket include Siemens, Statoil, Schlumberger, Oracle, IBM, Halliburton, Microsoft and EMC. More from Optique and from Fluid. Visit ECIM.
There were three new oil and gas entries in the Top500.org list of the world’s fastest computers this month, from Italy’s ENI and two unnamed oil companies—one in the US and one in China. ENI’s HPCC1 machine (N° 69) is an IBM iDataPlex with a 0.5 petaflop peak rating. The US machine is an HP/Intel cluster with a 1.1PF rating. Both are some way behind the oil and gas list leader, which remains Total’s Pangea (OITJ June 2013) with its 2.2PF peak performance.
Of course not all machines are actually in the Top500, notably BP’s new 2.2PF Houston Cluster, part of what the EnterpriseTech portal described as a five-year, $100 million cluster investment program. BP’s new machine is based on ‘vanity-free’ HP ProLiant SL230s Gen8 nodes each with 128 GB of memory.
Just in case you get carried away by all the oil and gas supercomputing prowess it is worth remembering that the top slot in the list is held by China’s Tianhe-2 with a 34PF rating. Number 2 is the Department of Energy’s Cray XK7 system at 18PF.
It behoves us to record Microsoft’s long term slide down the Top500 list. The Shanghai Supercomputer Center Dawning/Magic Cube (0.15PF), in at N° 11 on list 32, is now at 237, while China’s Faenov machine, in at 165 on list 40, is now at 309. Lots more Top500 lists.
A new case study from Schlumberger’s InnerLogix unit outlines how Seneca Resources has leveraged the InnerLogix data QC and management toolset. Prior to the InnerLogix rollout, Seneca’s exploration team was spending days futzing with Excel spreadsheets to clean and import well data into its project data stores.
InnerLogix is used to validate well header, casing and deviation survey data from Seneca’s Marcellus shale operations. Automated synchronization procedures leverage a standard information hierarchy and rule-based data validation. The software checks incoming data against data already in corporate databases and pinpoints incorrect values and inconsistencies. Validated data is then propagated across the infrastructure.
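A rule-based validation and reconciliation pass of the kind described might look like the following sketch. The field names, rules and thresholds are all hypothetical—real deployments encode hierarchies of vendor- and company-specific rules.

```python
# Hypothetical rules in the spirit of rule-based well-header QC:
# each rule returns an error string, or None when the record passes.
RULES = [
    lambda w: None if w.get('uwi') else 'missing UWI',
    lambda w: None if -90 <= w.get('latitude', 999) <= 90
              else 'latitude out of range',
    lambda w: None if w.get('td_m', 0) >= w.get('casing_depth_m', 0)
              else 'casing set deeper than total depth',
]

def validate(well):
    """Run every rule; return the list of violations for this record."""
    return [err for rule in RULES if (err := rule(well))]

def reconcile(incoming, corporate):
    """Check incoming records against the rules and flag values that
    disagree with what is already in the corporate store."""
    issues = {}
    for uwi, record in incoming.items():
        errs = validate(record)
        master = corporate.get(uwi)
        if master and abs(master['latitude'] - record['latitude']) > 1e-4:
            errs.append('latitude differs from corporate database')
        if errs:
            issues[uwi] = errs
    return issues

# Usage: one incoming well header disagrees with the corporate location.
incoming = {'100/01-A': {'uwi': '100/01-A', 'latitude': 31.99,
                         'td_m': 2500, 'casing_depth_m': 1200}}
corporate = {'100/01-A': {'latitude': 32.10}}
issues = reconcile(incoming, corporate)
```

Only records with an empty issue list would then be propagated across the infrastructure, mirroring the validate-then-synchronize flow described above.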
Seneca data manager Carolyn Schuchert said, ‘InnerLogix has streamlined and simplified our data management. Data consistency and accuracy is significantly improved.’ Overall data load time per well has been cut from three days to 45 minutes.
Speaking at the Landmark Innovation Forum in Houston earlier this year, Dallas Dunlap and Sean Murphy of the Bureau of Economic Geology (BEG) at the University of Texas at Austin gave a strong endorsement to Landmark’s DecisionSpace technology. The BEG is working on a 3D geologic model and geo-referenced database of the state of Texas. The model will be used to expose the BEG’s extensive core and cuttings data sets to the public via a ‘compelling, web-based/GIS front end.’
The BEG has completed a pilot—the 3DT project—blending 2D and 3D data sources from different geologic systems, verbal, quantitative and physical information to educate the public and policy makers on Texas’ natural resources. The pilot included an evaluation of ‘several software suites’ to determine their suitability for the task and determined that ‘DecisionSpace Desktop, more than any other, fitted all our requirements.’ Landmark’s Dynamic Frameworks to Fill, OpenWorks and Recall were also key to developing the State-scale model.
Janet Hicks explained how Landmark is to integrate its Petris acquisition. Petris’ Recall borehole data management and DataVera (now the DecisionSpace data quality solution (DQS)) rule-based data QC technologies will dovetail with other applications within the Landmark master data management solution. PetrisWinds Enterprise (PWE) will extend the DecisionSpace data services and IM Portal.
Troy Kapiczowski showed how DQS was used to support Calgary-based Sinopec Daylight’s master data management and data governance. Sinopec’s ‘myWellFinder’ (MWF) is a fit-for-purpose well master that captures the main characteristics and description of a well. MWF leverages Microsoft SharePoint functionality to expose the well list to users and provide spreadsheet access including pivot tables from a ‘live,’ read-only connection to the underlying database. A custom search function was built with the Microsoft FAST search engine. MWF integrates with third party data such as the Daily Driller and Rig Records portal. Sinopec is now working on a rig scheduling tool.
Stan Cullick (Berry Petroleum) offered an independent’s view of the digital oilfield. Berry is using tiltmeters to monitor steam injection into its N. Midway-Sunset diatomite fields in the San Joaquin Valley, California. Tiltmeter data is gathered twice daily, correlated with injection and production and used to track fracture-related events. Data is aggregated into a diatomite dilation portal exposing dynamic QC’d data to all stakeholders. The same data drives alarms, virtual meters and predictive modeling with neural nets. Extensive use of Lowis data visualization was made in the project.
Olivier Germain presented the new Landmark Production data model (PDM) as the future of the intelligent digital oilfield. Landmark’s new ‘open’ data store is a PPDM-based model with ‘precise, consistent definitions’ for entities. PDM offers integration with business intelligence systems and ‘big data’ solutions. More from Landmark.
GSE Systems has released a gas oil separation plant simulator and tutorial under its EnVision brand.
IFS has released IFS Applications as tested and certified for deployment on Microsoft’s Windows Azure cloud.
Aveva’s E3D Insight is a Windows 8.1 application that lets project managers view and approve engineering designs from a mobile tablet device.
Blue Marble Geographics has released a .NET version of its GeoCalc 6.6 SDK.
Cortona3D has released a Java SDK for its viewer of VRML97-formatted 3D virtual reality models.
Husky Corp. has unveiled a compact nozzle for use in drive-through fueling systems. The nozzle was developed in collaboration with Sweden’s Fuelmatics Systems.
Exprodat’s Team-GIS Exploration Analyst is an ArcGIS extension for petroleum play chance mapping, resource assessment and acreage analysis for play-based exploration workflows.
Hitachi has announced a dedicated storage adapter for Schlumberger’s Petrel, a plug-in from the Ocean store.
Pegasus Vertex has announced Mudpro V3.0 with an enhanced interface, improved data loading and a SQL Server database.
Midland Valley has released FieldMove Clino for iOS and Android, combining a digital compass, clinometer, notebook and camera.
Elsevier’s Geofacets now connects to the Geological Society of America’s 80,000 map database and will be available as a Petrel plug-in early next year.
dGB Earth Sciences’ new SynthRock plugin adds forward modeling, rock physics and inversion to dGB’s OpendTect flagship.
A new 64 bit edition of Petrosys V17.4 brings ‘instant’ mapping, 3D point display and simplified depth conversion. Maps can be created by dragging required files onto the map canvas.
The 11.1 release of Caesar Systems’ PetroVR adds modeling of thermal recovery and unconventional workflows.
Francois Lepage (previously with Schlumberger) has launched ‘RockSoft,’ a new startup that is to develop proprietary subsurface modeling software and provide consultancy services in the field of geomodel software.
Tracero has announced what is claimed to be the world’s first subsea computer tomography (CT) scanner for the inspection of ‘unpiggable’ subsea coated pipelines.
The 2014 edition of Visage Information Solutions’ eponymous visual analytics software adds ‘V-Broadcast,’ to distribute ‘the right information and analyses to the right people.’
The US National Institute of Standards and Technology (NIST) has opened its preliminary Cybersecurity Framework (CSF) for public comment. The CSF was developed in response to President Obama’s cybersecurity executive order of February 2013 and is expected to be published early in 2014. A key objective of the CSF is ‘to encourage organizations to consider cybersecurity risk as a priority similar to financial, safety, and operational risk.’ Check out the 47 page CSF.
In a blog posting on the Industrial Defender website Venkat Pothamsetty has analyzed the CSF and its meaning for control systems professionals. While the CSF is a good framework for ICS security, it is ‘yet another framework to follow.’ The CSF means that ‘big data and analytics’ will increase in importance in identifying attack patterns by monitoring deviations from a baseline of network activity.
Speaking at the 2013 API Cyber Security event in Houston this month, a team from Lockheed Martin presented their analysis of ‘a new class of security threats from sophisticated and highly organized actors looking to steal intellectual property and disrupt operations.’ Lockheed’s Intelligence Driven Defense addresses the threats facing the oil and gas industry with ‘best practices to protect critical operations and lessons learned.’
Some help in the war on the hackers may come from Waterfall Security Solutions’ new FLIP technology which replaces control system firewalls with a stronger alternative. The FLIP leverages Waterfall’s unidirectional security gateway to strengthen oil and gas facilities IT and control systems and mitigate cyber attacks.
Meanwhile at a gathering hosted by IHS and Honeywell, former Homeland security secretary Michael Chertoff told oil and gas industry executives, ‘It no longer takes an army to fight a war. The top threat businesses face in the future will be from cyber attacks.’ Some 40% of all reported attacks in 2012 were directed at energy companies. Honeywell’s Roger Fradin added, ‘Cyber risks are constantly evolving, and we have to work together to find the right combination of solutions. Honeywell is at the forefront of developing new technologies and advanced cyber security solutions to help defend against cyber attacks, preserving the availability, integrity and confidentiality of industrial control systems.’ More from Fradin.
Tofino’s Eric Byres writes on an ‘avalanche’ of reports of new security vulnerabilities in DNP3 Scada systems. These were uncovered by researchers Adam Crain and Chris Sistrunk, using a new security test tool developed under the AEGIS Project. The NERC-CIP electronic security perimeter is seemingly ‘full of holes,’ specifically the ‘millions of physically insecure pad and pole devices around the world.’ Byres states that, ‘An oil well at the side of a prairie road [is a] potential entry point to a much larger critical infrastructure. All it takes is a test tool to find a backdoor in devices using protocols like Modbus, Ethernet/IP or Profinet.’
Speaking at the Society of Petroleum Engineers Digital Energy Technical Section (DETS) dinner event, Mehrzad Mahdavi (Dexa Systems) proudly announced that today, ‘digital technology is in daily operations’ and so the goal of the DETS has been achieved. Mahdavi attributed this success to the various SPE initiatives to ‘raise awareness’ of digital’s potential. How much of this is down to the DETS itself is a moot point as the DETS community home page on the SPE website has not been touched since 2007!
Diamond Drilling’s Moe Plaisance stole the show at the SPE plenary session on deepwater challenges with an impassioned plea to ‘slim things down’ in offshore drilling. Plaisance traced the history of offshore drilling from the 1953 Mr. Charlie rig that cost $1 million to today’s semi-submersibles. The 6th generation semis to appear next year will cost in the region of $700 million with a 10,000 ft water depth capability and dual 7 ram BOP stacks. For Plaisance, further progress will involve a change of tack, ‘we can’t just go on building bigger and bigger hammers!’ How’s this to be done? With new technology rigs, managed pressure drilling and monobore capability. As subsea kit gets more complicated, drillers need to know more about what is happening beneath their feet. More collaboration is needed. ‘If we need to pull something, we need to know like how big it is,’ not ‘OMG it broke! Can you be over there tomorrow?’
On the exhibition floor Steve Bowen presented Fluid Imaging Technologies’ novel system for real time drilling fluid monitoring. FIT’s FloCam actually came out in 1999 but only recently has image processing technology caught up with the requirement for super fast particle classification based on morphology and color—a total of 35 parameters are derived on the fly as the mud flows past the imaging device. More from Fluid.
Another intriguing device is 5D Oilfield Magnetics’ ‘Open Hole Net,’ a massive annular magnet that is positioned atop an open hole to catch any steel objects before they fall into the drill hole. Dropped objects not only cause wear and tear on drill bits but can also pollute the mud with metal fragments that affect directional drilling systems. More from 5D Oilfield Magnetics.
Oil and gas has in the past been categorized as a ‘technology timid’ and a ‘low to medium tech’ industry. Queensland University’s Rob Perrons decided to find out if this was true with a survey (SPE 166084) of innovation and innovators in oil and gas. Companies were quizzed on their main sources of innovation. Universities, government and trade publications came in some way down the list. Service companies appear to be largely self-reliant for innovation—reflected in a high level of patent intensity. One respondent observed, ‘intellectual property is our business.’
Analysis of non-conventional production potential is hampered by deliverability parameters that vary slowly and, so far, little production history to work with. Charles Vanorsdale (Saudi Aramco) circumvented these issues with an in-depth study of ‘classic’ wells in US shale basins where production from naturally fracked shales has been underway for some decades. Comparing forecasts made from the early years of production with ultimate production, Vanorsdale concluded that for single flow regime wells, production forecasts were conservative to good. For multiple flow regimes they tended toward over-optimism (SPE 166205).
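The sensitivity Vanorsdale investigates can be illustrated with simple Arps decline arithmetic: the same initial rate and decline give very different ultimate recoveries depending on the assumed b-factor (flow regime). The parameters below are illustrative only, not taken from the paper, and units are arbitrary.

```python
import math

def arps_rate(qi, di, b, t):
    """Arps decline: rate at time t given initial rate qi, initial
    decline di (per unit time) and b-factor (b = 0 is exponential)."""
    if b == 0:
        return qi * math.exp(-di * t)
    return qi / (1 + b * di * t) ** (1 / b)

def cum_production(qi, di, b, t, steps=10000):
    """Numerical cumulative production over [0, t] (trapezoid rule)."""
    dt = t / steps
    rates = [arps_rate(qi, di, b, i * dt) for i in range(steps + 1)]
    return dt * (sum(rates) - 0.5 * (rates[0] + rates[-1]))

# A steep shale-style decline: reading the early data as exponential
# (b = 0) yields a much smaller ultimate recovery than the long
# hyperbolic tail (b = 1.2) would deliver—one way a single-regime
# forecast can prove conservative.
eur_exp = cum_production(qi=1000, di=0.9, b=0.0, t=20)
eur_hyp = cum_production(qi=1000, di=0.9, b=1.2, t=20)
```

Both curves are nearly indistinguishable over the first year or two of history, which is precisely why early-time forecasts of shale wells carry so much uncertainty.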
Speaking in the Digital Oilfield technology session, Sushma Bhan described Shell’s e-Wellbooks (EWB) and its relationship to Shell’s global well, reservoir and facility management (WRFM) initiative. Shell is confronted with a technical data challenge of multiple data sources, incompatible stand-alone applications and data hoarded on shared drives and in Excel spreadsheets. Discipline silos make communications between drillers, production and reservoir engineers problematical. New data types—such as frac data—do not as yet have an owner. The WRFM initiative is a wide-ranging cross discipline effort to bring all of the above together with new systems, data loading and quality checks and new standard repositories. The EWB acts as a data integrator and delivery mechanism for the WRFM. Shell has built a ‘sustainable’ data ownership and governance around the system and claims that users no longer need to ‘know’ OFM, SAP or other corporate tools to access data. Users can now access rolled up data in the EWB rather than resorting to a spreadsheet. Shell’s technical data management has proved a key enabler for increased production (SPE 166339).
Total’s Raphael Henri-Bally presented Resqml V2.0, the latest manifestation of Energistics’ standard for the exchange of reservoir model data. Henri-Bally described Resqml V2 as ‘much more ambitious’ than V1, which was released in 2011. V2 leverages the Microsoft-backed ‘Open packaging convention’ (OPC) to bundle a set of related XML files into a single object—or as Henri-Bally describes it, like an ‘Ikea flat pack.’ Upon unpacking, the happy recipient of a Resqml V2 package will see not only the reservoir model, but also features like faults, wells and seismic surveys and interpretations. Objects in the package can be connected by chronostratigraphic and topological relationships. Resqml V2’s first release is scheduled for 2014 and will bundle Witsml data along with the model. In 2015 the plan is to add ‘traceability’ and completion objects. Resqml partners include BP, Energistics, IFPen, Geosiris, Total, Texas A&M and Paradigm (SPE 166486).
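The Open Packaging Convention underlying the ‘flat pack’ is essentially a ZIP archive of XML parts plus a content-types manifest. A minimal sketch of the idea, assuming hypothetical part names and trivially small XML payloads (a real Resqml package carries fully-populated schema-valid parts and relationship files):

```python
import io
import zipfile

def make_opc_package(parts):
    """Build a minimal OPC-style package: a ZIP of XML parts plus a
    [Content_Types].xml manifest. Part names here are hypothetical."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as z:
        z.writestr('[Content_Types].xml',
                   '<Types xmlns="http://schemas.openxmlformats.org/'
                   'package/2006/content-types">'
                   '<Default Extension="xml" ContentType="application/xml"/>'
                   '</Types>')
        for name, xml in parts.items():
            z.writestr(name, xml)
    return buf.getvalue()

def list_parts(package_bytes):
    """Unpack and list the parts the recipient would see."""
    with zipfile.ZipFile(io.BytesIO(package_bytes)) as z:
        return sorted(z.namelist())

# One package bundles the model together with its related objects.
pkg = make_opc_package({
    'reservoir_model.xml': '<Model/>',
    'fault_interpretation.xml': '<Fault/>',
    'wellbore.xml': '<Wellbore/>',
})
```

The single-file bundle is what makes the ‘Ikea flat pack’ metaphor apt: model, faults, wells and interpretations travel together and unpack as a coherent set.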
USC’s Iraj Ershagi introduced the smart oilfield scorecard session observing that ‘smart is no longer an add-on but an integral part of our business strategy.’ Chevron’s Warner Williams underscored the shift in emphasis as the ‘digital’ tag was dropped for this session. What is now key is the overall framework—for Chevron, the upstream workflow transformation (UWT) initiative, which applies lean sigma to the oilfield and behavioural baselines to ‘make sure folks do what we want them to do.’ Williams stressed the importance of data management—’if you don’t do this right you can forget the rest.’ Poster child for the UWT is the Petrotech portal, a Petroweb (1008) based one-stop shop/front end to data in the systems of record (SPE 166516).
Fareed Abdulla then presented Adco’s data-driven approach to production optimization. Adco uses artificial intelligence (a neural net) to build surrogate reservoir models of a mature field. This has resulted in a simple scheme where fine-tuned choke control mitigates water breakthrough—a cheaper solution than submerged pumps. The surrogate model approach also obviates the need for a computationally intense full field model. The approach initially met with some scepticism, which was countered by a successful demonstration on a subset of wells in the field.
Klaus Mueller offered a retrospective of Shell’s successful smart field journey recalling an early digital oilfield exhortation not to ‘automate your inefficiencies.’ Shell has developed a structured process that aims to deploy the appropriate level of smartness (ALoS). Today 80% of Shell’s production is considered to operate at ALoS. Shell’s new ventures team now regularly evaluates an asset’s potential for improvement with smart technology before acquisition. Shell now operates nine ‘collaborative work environments’ (CWE). The company has also deployed ‘WRM,’ a well and reservoir management toolkit and ‘Radar,’ an upstream data management infrastructure which has reduced the time taken to locate pressure data from ‘three days to 10 minutes.’ During the data improvement program Shell found that 16% of all logs never arrived in the office. Today contractors are only paid when all the data is in the right place and format. Latest in the smart stakes is the ‘smart mobile worker,’ an AV headset and real time communications link that allows operators to stay in touch with the CWE. Mueller concluded, ‘The digital oilfield is here. Soon there will be no other way of working.’
A debate followed on ‘soft factors’ and the degree to which domain silos have been breached by the smart movement. The consensus was that the silos are still there but that the CWE has done a lot to make the barriers more permeable. Earlier in the digital decade, IT and the business were at loggerheads. Now they are ‘joined at the hip.’ The ideal person in charge of the digital oilfield is a ‘petrotech with an IT background.’ Such individuals remain in short supply. Another pain point is cyber security, with an increase in frequency of attacks and vulnerabilities. Large PI System deployments came in for criticism. Maintaining a 500k tag system can be problematic.
National Oilwell Varco has announced version 2 of its Intelliserv wired drill pipe with 57k bits per second bandwidth. While not exactly broadband, the technology comfortably beats mud pulse telemetry, whose data rates are only a few bits per second.
Kicking-off the safety and risks session, Qianru Qi (USC/Viterbi) observed that the oil and gas industry’s accident record is not as good as it is often presented. According to the US National institute for occupational safety and health (Niosh), the fatal accident rate in oil and gas is seven times the US national average. For Qi, the answer lies in better safety technology and training, but here there have been no significant changes since the 1970s. Better monitoring could give early warning of deficiencies such as corrosion, pressure build up or toxic gas release. Remote and automated control stations are a solution. And safety is a constant battle against ‘human limits,’ notably the tendency to be dismissive of safety protocols (SPE 166412).
EBN’s Guido Hoetz presented results from a joint industry pilot project, conducted with help from Netherlands research institute TNO, on drilling geo-hazard prediction. This has resulted in the development of a pilot drilling hazard database (GeoDhaps) of incidents and root causes. The project was hampered by confidentiality issues and some sensitivity as to ‘exactly what went wrong.’ A proposal has now been submitted to Nogepa, the Netherlands operators association, for a full scale geo-hazard database (SPE 166254). More from the ATCE.
John de Wardt kicked off the 2013 SPE Drilling systems automation technical section (DSATS) meet, held in New Orleans earlier this year, with a recap of its objectives. Drilling automation is seen as transformative technology, with the potential to address the issues of an aging workforce and the current ‘severe attrition’ of drilling personnel. One poster child for DSATS is Rio Tinto’s mine of the future. Another source of inspiration is the Federal roadmap for robotics. De Wardt mentioned en passant that the SPE’s Connect social network has scrapped its SharePoint site and migrated to a Higher Logic-based portal.
Founder Nagesh Kulkarni described how Quarkonics is working to halve drilling times with the judicious application of a combination of full physics models and artificial intelligence. The technology, which is being tested on Talisman’s rigs, is claimed to make drilling ‘predictably uncertain.’
Canrig’s Pradeep Annaiyappa promoted his company’s rigsite data cloud offering (Oil ITJ May 2013) observing that even today, ‘WITS is the most common transfer standard because it is not on the rig’s control network and runs on a serial port.’ In general it is a challenge to integrate with a rig’s cyber systems and other fancy closed hardware. This means that less than 1% of drilling data actually leaves the rig. Witsml has been around for a while but has had little adoption—even amongst the consortium members, although BP has threatened suppliers in an effort to force compliance.
Moray Laing, SAS’ oil and gas head, straw-polled the audience to ask ‘Do we have a big data problem?’ Perhaps 20% thought we did. Laing is sure we do, citing tests of autodrillers where rapid variations of parameters such as torque and drag call for the application of ‘Semma,’ i.e. ‘sample, explore, modify, model, assess,’ enabled with SAS Enterprise Miner. This has been used on artificial lift and water injection projects with a healthy ROI. ‘We do have a big data problem. And an opportunity!’
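The Semma loop is easy to caricature in a few lines. Here is a toy sample-explore-modify-model-assess pass over torque readings that flags outliers by z-score; the thresholds and logic are illustrative only, not SAS Enterprise Miner’s implementation.

```python
import random
import statistics

def semma_anomaly_flags(readings, sample_size=100, z_cut=3.0, seed=42):
    """A toy SEMMA-style pass: Sample -> Explore -> Modify -> Model -> Assess."""
    rng = random.Random(seed)
    # Sample: work from a subset of the stream
    sample = rng.sample(readings, min(sample_size, len(readings)))
    # Explore: summary statistics of the sample
    mu = statistics.mean(sample)
    sigma = statistics.stdev(sample)
    # Modify: standardize every reading against the sample statistics
    z = [(x - mu) / sigma for x in readings]
    # Model: a trivial threshold model flags outliers
    flags = [abs(v) > z_cut for v in z]
    # Assess: report the anomaly rate alongside the flags
    return flags, sum(flags) / len(flags)
```

A run over 99 steady torque readings and one spike flags only the spike, with an anomaly rate of 1%.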
Greg Wood offered a compelling argument for Rigminder’s huge (55 to 180”) monitors to replace multiple displays currently used in the doghouse. Rigminder offers one big multi-purpose touch screen with a see through/heads-up functionality.
One comment in the ensuing debate was that the rig was like a Christmas tree—richly decorated with goodies on Christmas Eve. But when the drilling contractor wakes up on Christmas day, all the presents have gone—they’ve been taken away by the service providers! More from DSATS.
Landmark’s Janet Hicks has been elected to the Professional petroleum data management (PPDM) association board. Hicks is to serve as the board’s secretary.
Ikon Science has opened a real-time monitoring while drilling centre in Durham, UK.
Mark Fusco has retired as CEO of AspenTech. He is replaced by Antonio Pietri as president and CEO.
Richard Herbert is the new head of exploration at BP. He hails from Talisman.
Calsep has opened a new office in Moscow, Russia.
Common Data Access has launched UKOilandGasData.com, a single gateway to information previously in the Digital energy atlas and library and the CDA’s Well and Seismic data stores. CDA has also published Release 2.0 of its ‘Best practices for managing information transfer at the time of asset sales.’
Cortex Business Solutions has appointed Grant Billing to its board.
Claire Miller has been appointed CEO of the Energy Industries Council.
Jeffrey Lai is the new oil and gas advisor in Singapore for Intsok.
The GE Oil & Gas engineering team recently recruited Jesse DeMesa as CTO software.
Ipcos has appointed Chris Daniel and Pieter Kapteijn as independent members to its board. Daniel was previously with Honeywell, Kapteijn is also technical director at Maersk Oil.
GE Oil & Gas and Kuwait Oil Company have launched a new Competency Development training collaboration.
ISN has appointed a new CTO, Paul Downe, formerly with Dell.
Julian Bourne is the lead for the Fiatech/PCA JORD Scope D project.
LMKR has promoted Vince Molliconi to Executive VP Global Product Sales.
Steve Robb is now MD of Cimation’s Canadian operations. He hails from Weatherford’s CygNet software unit.
Former Juniper Networks and Microsoft executive Eddie Amos has joined Meridium as CTO.
NDB Asia Pacific has recruited geologist Min Xu, formerly with Exoma Energy. She will be based in Perth.
Former Esri Pipeline and Gas Utility Industry Manager, Rob Brook, has joined Coler & Colantonio as VP for Strategic Development.
The Trusted Data Manager (TDM) from EnergyIQ has been certified PPDM 3.8 ‘gold’ compliant.
Bob Potter, president of FMC Technologies, has retired. He is replaced by John Gremp, chairman and CEO.
Former president and CEO of Baker Hughes, Chad Deaton, has been elected to Marathon Oil’s board of directors.
Forum Energy Technologies has appointed Bill Boyle as Senior VP—Subsea Technologies. Bryan Suprenant has been appointed VP well intervention.
Lynn Kis has joined Ryder Scott Canada as a petroleum engineer. She was previously with AJM Deloitte. Geologists Jake Emberson and Gillian Rosen have joined Ryder Scott’s Houston office.
Weatherford International has appointed Krishna Shivram as Executive VP and CFO, and has also promoted Dharmesh Mehta to Executive VP and COO. Shivram hails from Schlumberger.
Jeffrey Maskell, of Westheimer Energy Consultants, has announced the sudden and unexpected death of his son William Jeffrey Maskell at the age of 25. William was a co-founder and director of Westheimer and headed up its marketing operations.
Former Texas A&M chemical process safety specialist Trevor Kletz has died. More on Professor Kletz’ prolific career from Wikipedia.
Advent International has acquired P2 Energy Solutions from its previous owner Vista Equity Partners.
Australian HSE software house CMO has completed a management buyout backed by UK private equity investor, Inflexion. Loek Van den Boog, formerly with Oracle, is now chairman.
Chevron Technology Ventures has launched CTV Fund V, a $90 million venture capital fund to focus on companies developing emerging technologies with ‘the potential to improve Chevron’s oil and gas business performance.’
Ikon Science has acquired the software, services and intellectual property of Bergen, Norway-headquartered Terra Geotech. Terra Geotech founder Eamonn Doyle has joined Ikon as VP real-time ops.
Rockwell Automation is to purchase vMonitor, a provider of wireless solutions to the oil and gas industry. vMonitor will be integrated into Rockwell’s control products and solutions unit.
RPS Group is to acquire Norwegian consultancy OEC and its subsidiaries.
SGI has acquired the assets of storage virtualization specialist FileTek. The agreement covers StorHouse and Trusted Edge software, FileTek’s customers, engineering team and services and support resources.
Silverback Enterprise Group has acquired master data management specialist Kalido.
Teledyne has acquired Aberdeen, Scotland based CD Ltd., a supplier of subsea inertial navigation systems and motion sensors for marine applications.
WEX is to acquire the assets of ExxonMobil’s EU commercial fuel card program.
Petrofac Training Services’ unit Oilennium of Norfolk, UK, has teamed with Edinburgh-based Acting Up World to offer an online suite of 'hard-hitting dramatic films that inspire oil and gas personnel to connect emotionally with their individual Health and Safety behavior.'
The short films and animations tell stories about people, avoidable accidents and their tragic aftermath. Oilennium MD Kevin Keable said, ‘If we can’t get participants to connect emotionally, they will not change their behaviour. By viewing these films people are much more receptive to safety training.’
One film, ‘Dead Jed’ explores the domino effect of letting negligent behaviors pass as acceptable. The ghost of an oil worker attends his own funeral and speaks to those involved in his death. Some of the stories already feature in Acting Up’s corporate training live performances.
Acting Up MD and founder Emma Currie added, ‘These stories can now reach and inspire safe behaviour around the world. We are exploring new ways to connect dramatically with those who work in the global oil and gas industry.’ Oilennium clients include Weatherford, Halliburton, Baker Hughes, Shell, Noble Drilling, Marathon and Dolphin Geophysical. More from Oilennium.
Writing on an Oracle blog, Melinda McDade provides some benchmark information from trials of Halliburton/Landmark’s ProMax seismic processing application running on a Sun cluster.
The test was run with 48 X6275 server blades in a Sun 6048 modular system connected by QDR Infiniband to a Lustre file system. The main tests involved scalability with increasing node count. For 3D prestack Kirchhoff time migration of a 71k trace dataset a 144x improvement was achieved going from 1 to 72 nodes. On a larger 283k trace volume, a 98x improvement was observed going from 1 to 96 nodes.
McDade attributes the ‘super linear’ scalability to hyperthreading (16 threads per node) and data caching. A 1.7x speedup was attributed to a recompilation of the source code for the current ProMax release using the latest Intel 11.1 compiler. McDade observes that current performance levels mean that ProMax can be run directly from Halliburton’s GeoProbe interpretation application to perform migrations on the fly, integrating logs and reservoir data from the OpenWorks database.
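The reported figures can be sanity-checked against the textbook definition of parallel efficiency, speedup divided by node count; anything above 1.0 is super-linear, here plausibly thanks to hyperthreading and caching.

```python
def parallel_efficiency(speedup, nodes):
    """Parallel efficiency = speedup / node count.

    Values above 1.0 indicate super-linear scaling, typically from
    extra hardware threads or better cache behavior per node.
    """
    return speedup / nodes

# Figures as reported for the ProMax benchmark:
assert parallel_efficiency(144, 72) == 2.0              # strongly super-linear
assert round(parallel_efficiency(98, 96), 2) == 1.02    # just super-linear
```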
A new white paper from American Industrial Systems (AIS), ‘Hazardous areas and classified locations,’ looks at safety protection requirements for electrical equipment deployed in hazardous areas. The white paper, authored by Santiago Consunji, is in part a sales pitch for AIS’s explosion-proof solutions for upstream and downstream oil and gas operations, but it also provides insights into plant safety and how to avoid fires and explosions.
Technical standards for explosion protection and safety are the subject of multiple regulations around the world. Equipment manufacturers are confronted with a plethora of regulations and are duty-bound to align their equipment with the latest standards. The paper provides a table of the legislative basis and standards for explosion-proof electrical equipment in different jurisdictions around the world.
In the EU, the EU committee for electro-technical standardization’s (CENELEC) EN 60079 and EN 61241 standards cover explosion protection. Other ATEX directives cover what equipment and work is allowed in environments with a potentially explosive atmosphere. In North America, electrical equipment deemed suitable for hazardous areas is certified by laboratories such as UL, MET, FM, CSA or Intertek. The American National Standard Institute (ANSI) coordinates US standards with international standards, so that American products can be used worldwide. AIS touch screens and panel PCs have application in drilling, on-site workstations, pipeline control and monitoring and refinery process operator control.
Noumenon Consulting’s new XmRDLManager promises ‘fast and efficient’ management of ISO 15926 classes. ISO 15926 is a standard for exchange of plant and facility data that has support from Norway’s POSC/Caesar Association and the US Fiatech standards body.
XmRDLManager exposes the ISO 15926 reference data library (RDL) for query in a GUI to provide an engineering view of reference data. The tool dovetails with Noumenon’s flagship XMpLant toolset and the XmVTE validating template editor. The neutral model comes either as an XML file or in OWL/RDF.
Absoft is to install and configure SAP for Norwegian drilling consultancy Ross Offshore. The companies are to develop an online portal in support of drilling and well operations.
Allegro has announced a strategic partnership with Chinese IT services provider NeuSoft to develop an IT solution for CNOOC’s LNG operations.
Petronas Carigali has entered into a 23-year oilfield service agreement with Baker Hughes to enhance recovery from the Greater D18 fields, offshore Malaysia.
Consultancy firm Competentia has secured a five year frame agreement with Statoil for consulting services relating to drilling and exploration.
Cortex Business Solutions has partnered with TCI Oilfield Factoring to offer direct invoice discounting.
Dril-Quip has entered into a five year global frame agreement with BP for the supply of subsea wellhead equipment and related services.
Flotek Industries and Gulf Energy affiliate Tasneea are to establish an oilfield chemistry and chemical research company in Oman. The new company will be called Flotek Gulf.
FMC Technologies has received a $340 million order from Tullow Ghana for the supply of subsea systems to the Tweneboa-Enyenra-Ntomme offshore development.
Foster Wheeler has been awarded a contract by Aramco to perform front-end engineering design of a gas compression project in Saudi Arabia.
Geospace Technologies has received an order worth US $29.4 million from a subsidiary of Norwegian Seafloor Geophysical Solutions Holdings for a marine data acquisition system comprised of 2,304 stations of the company’s OBX deep water seafloor nodes and related equipment. Geospace also won a US$18.1 million order from Dawson Geophysical for 9,000 stations of its three-channel GSX wireless recording system.
Prospectiuni and Geotrace are to offer ‘superior’ geophysical and interpretation services to E&P companies in Europe.
Honeywell has been selected by Preem AB to modernize the Preemraf Lysekil refinery in southern Sweden with Honeywell’s enhanced high performance process manager industrial process controllers.
Chinese IT solutions provider Huawei has supplied a remote control and communications system for the Sino-Kazakh natural gas project, operated by Asia Gas Pipeline. Huawei also deployed the ‘world’s first’ offshore oilfield LTE network (a.k.a. ‘4G’), for Norway’s Tampnet. Another ‘world’s first’ is the onshore oilfield LTE network in Xinjiang, China.
IBM has sent a corporate service corps on a pro bono mission in Angola, advising on how to better train small and medium sized businesses to tailor their services for the oil and gas sector.
Ramboll has standardized on Intergraph SmartMarine Enterprise for all of its onshore and offshore oil and gas projects. The standardization entails integration of several Intergraph tools, including Intergraph Smart 3D, SmartPlant Reference Data, SmartPlant Instrumentation, SmartPlant P&ID and SmartPlant Electrical.
Ipcos has been awarded a turnkey project by Lukoil Mid-East to design and build an integrated asset modeling solution using technology from OVS, Petex and OSIsoft.
Aveva has appointed GN Engineering Service and Solutions as its sales and support representative in Luanda, Angola.
Jason has selected Rock Flow Dynamics to contribute technology for the simulation of high resolution, high fidelity predictive models of subsurface reservoirs. Rock Flow Dynamics is to provide a cluster version of tNavigator to Jason customer KazMunaiGas Exploration Production of Kazakhstan.
RSI has signed an alliance with Share Oil & Services consulting group of Mexico, to provide geoscience consulting services.
Shell is using SAS Predictive Asset Maintenance software on its Perdido spar platform to extend equipment lifespan and run times.
Maersk Oil UK has joined Maersk Oil Qatar and Maersk Olie og Gas in using Serafim Future as a reserves management and production forecasting system.
Technip has been awarded a substantial (€100 to €250 million) contract by LLOG Exploration Offshore for the development of the Delta House field in the Mississippi Canyon area. Technip also won an ‘important’ contract from Qatar Petroleum for an offshore project.
Wood Group and Siemens have formed a joint venture comprising the Maintenance and Power Solutions businesses of Wood Group GTS, and Siemens’ TurboCare business unit.
The Global Reporting Initiative has launched the GRI Taxonomy 2013, updated for use with G4, the latest version of GRI’s sustainability reporting guidelines. The standard was developed with help from Deloitte and is available free. The taxonomy enables organizations to produce digital sustainability reports leveraging the GRI Guidelines.
The Oasis Emergency Management Technical Committee has approved the Emergency Data Exchange Language (EDXL) Distribution Element Version 2.0. The new spec describes a standard message distribution format for data sharing among emergency information systems. The DE uses a profile of the Geographic Markup Language (GML) and follows best practices for naming conventions.
The Open Geospatial Consortium and the PODS Association have signed a liaison agreement to work together to identify enhancement opportunities between the PODS data model and the OGC’s geospatial interoperability concepts.
The UK Companies House has launched its ‘Free accounts’ data product giving the public access to statutory accounts in both iXBRL and XBRL formats. The initiative is part of the UK government’s open data agenda.
Livingston, UK-based Asset Guardian Solutions (AGS) has been awarded a contract by Inpex for the provision of engineering IT asset management on the $34 billion Ichthys LNG project in Australia.
AGS is to provide a hardened IT control system management tool set to protect Ichthys infrastructure, including the LNG plant, central processing facility and floating production, storage and offtake vessel. The solution includes an inventory of automation system IT components and a structured management of change process for patching firmware and updating operating systems and applications.
The system provides an audit trail of control system modifications and consolidates multiple software applications into a single system that also generates reports to assure compliance with industry standards and regulations. Ichthys’ joint venture partners include Total, Tokyo Gas, Osaka Gas, Chubu Electric Power and Toho Gas. Production is scheduled to commence by the end of 2016. More from Asset Guardian.
Chinese oilfield automation provider Recon Technology has announced a new specialized supervisory control and data acquisition (Scada) system tuned to the needs of non-conventional oil and gas development. The system was developed to overcome ‘inefficiencies’ in current Chinese shale development, where equipment and services have so far underperformed. Recon Scada monitors and controls the operations of on-site equipment and transforms operations with an ‘intelligent’ production process.
Recon CTO Chen Guang Qiang said, ‘We have leveraged our experience and R&D investments in the new system. Non-conventional gas requires a high continuity of production. Upsets can cause wells to water-out and shut off prematurely. Our automation systems help resolve such issues by providing real-time situational intelligence. Early detection of well failures means that our teams can intervene in a timely manner.’
Schlumberger has just released ‘Quartet,’ a reservoir testing system using wireless telemetry (‘Muzic,’ what else!) to provide bi-directional downhole communications. Muzic uses mud pulses to transmit data to and from the test tool. For deepwater applications, repeaters are run in the test string to enhance signal to noise. The relatively low bandwidth offered by the mud pulse system is OK for well testing operations that last from days to weeks.
A deepwater well test trial for Petrobras in the presalt Santos basin compared real-time mud pulse data with data recorded on the test tool’s memory card. The real time data compared very well with the recorded data, even capturing short duration events such as the detonation of a tubing-conveyed perforating gun.
Schlumberger’s Sameh Hanna said, ‘When working in high-cost environments such as deepwater, it is important to be able to interact with downhole tools, manage wellbore events, validate and refine well tests in real time.’ The system enhances testing efficiency by making possible the isolation, control, measurement and sampling of the reservoir in a single run.
Energistics has released V1.0 of a new units of measure dictionary (UOMD) standard for public review. UOMD is intended to cover the needs of the oil and gas industry with units ‘in current use or which would be reasonably expected to be encountered.’ The Energistics UOM work group, which includes representation from PPDM and the SEG, has generated unit names, symbols, derived units and conversion factors with an eye towards use in software and databases.
Energistics and its forebear POSC have a long history of authoritative UOM work—from the Schlumberger/API RP66, through Epicentre and a prior POSC UOM spec, whose version 2.2 is baked into Witsml. The new Energistics UOMD V1.0 builds on this previous work, providing a comprehensive, if complex, infrastructure for coding units.
Way back in March 2004, Oil IT Journal called for a public units of measure (UOM) description and discovery mechanism in a seminal editorial and report from the W3C titled ‘A million miles of spaghetti are eaten every day.’ While the computing community shows concern over data types, it can throw caution to the wind when it codes its UOM—with sometimes catastrophic results. The UOMD is therefore a step in the right direction although it would have been nice to see some web services for testing and validating new software.
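To illustrate what a machine-readable conversion factor buys, many UOM specifications express conversion to a base unit as y = (A + Bx) / (C + Dx), which handles both scale factors and offsets in one record. A minimal sketch follows; the dictionary layout and field names are illustrative, not copied from the UOMD.

```python
# Toy units-of-measure dictionary. Each entry converts a value to its
# SI base unit via y = (A + B*x) / (C + D*x). Factors shown are the
# standard ones for feet and degrees Fahrenheit.
UOM = {
    'ft':   {'base': 'm', 'A': 0.0,     'B': 0.3048, 'C': 1.0, 'D': 0.0},
    'degF': {'base': 'K', 'A': 2298.35, 'B': 5.0,    'C': 9.0, 'D': 0.0},
}

def to_base(value, symbol):
    """Convert a value in the given unit to its base unit."""
    f = UOM[symbol]
    return (f['A'] + f['B'] * value) / (f['C'] + f['D'] * value)
```

With this form, 32 degF converts to 273.15 K and 10 ft to 3.048 m; the point is that the factors live in data, not in each application’s source code.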
BlueFire Equipment Corp. of Houston claims that its drill bit printing initiative is ‘well underway.’ BlueFire plans to manufacture its proprietary polycrystalline diamond cutter bits using a 3D printer. CEO Bill Blackwell said, ‘We will be integrating this leading edge technology into our manufacturing process. 3D printing could be a true game changer for our business.’
The rather curious release cites President Obama as saying ‘3D printing has the potential to revolutionize the way we make almost everything.’
Technologies on trial range from ultraviolet stereolithography of photocurable resin, through fused deposition modeling, to selective laser sintering.
Untutored as we are, it is unclear whether the materials used in 3D printing will be sufficiently hardened to withstand the rigors of drilling. If the 3D process is simply to produce prototypes for manufacturing, then why not use the design files in the computer aided manufacturing process? Unless BlueFire is printing the diamonds? That would be something. More from BlueFire.
Speaking at the 2013 Computer-Aided Process Engineering (Cape Open) annual meeting in Lyon, France, Gregor Tolksdorf (TU Berlin) presented ‘Mosaic,’ a web-based modeling and code generation tool. Mosaic uses either LaTeX or the Microsoft Word formula editor to define a process model in Mosaic’s symbolic notation. Once the full model has been specified, Mosaic auto-generates the computer code.
Code can be generated for various programming languages including C++, gPROMS, Fortran, Python and Matlab. Models are then tuned in an iterative process—by running the model and tweaking equations of state in the documentation. Users never touch the generated code manually.
The result is a modular modeling concept leveraging an enhanced symbolic notation to define platform independent models based on Java and XML documentation. All model components are stored in a web database to support team model development. Mosaic also generates code that leverages Cape Open physical properties for Matlab and gPROMS. The Mosaic project is supported by the Cluster of excellence ‘Unifying concepts in catalysis’ (Unicat) coordinated by TU Berlin and funded by the German Research Foundation. More on Mosaic and Cape-Open.
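Mosaic’s write-once, generate-everywhere idea can be caricatured with a string template per target language; the model is stated once in a neutral notation and source code is generated rather than hand-edited. The templates, function names and the ideal-gas model below are illustrative only.

```python
# Toy Mosaic-style code generation: one neutral (infix-string) model,
# multiple target languages. Templates and names are illustrative.
TEMPLATES = {
    'python': 'def {name}({args}):\n    return {expr}\n',
    'c': 'double {name}(double {cargs}) {{ return {expr}; }}\n',
}

def generate(name, args, expr, lang):
    """Emit source code for the model 'expr' in the requested language."""
    return TEMPLATES[lang].format(
        name=name,
        args=', '.join(args),
        cargs=', double '.join(args),
        expr=expr)
```

Generating the same ideal-gas model for Python and for C then takes two calls differing only in the `lang` argument, which is the essence of the platform-independent model store.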
San Francisco headquartered Talksum has announced a cross-domain data router for oil and gas applications. The Talksum data stream router (TDSR) is a hardware appliance that converts sensor and other data into ‘flexibly managed event streams.’ The TDSR sits at the well site and filters and aggregates data from multiple sources for transmission to headquarters and real-time operations centers. Talksum founder and CEO Alex Varshavsky explained, ‘Oil and gas operations depend on a range of technologies and service providers. The TDSR ingests disparate data types and intelligently routes them to real-time BI tools and databases at the appropriate location.’
The TDSR can be programmed to provide alerts on flow rate and pressures, casing seals, blowout preventer integrity, fracture gradient conditions and pore pressure, gas detection, and other safety and efficiency mechanisms, routing the relevant data to the respective locations. The TDSR data management solution offers a systems approach, integrating the multiple domains that together assess and potentially affect overall well integrity and safety. The TDSR also provides a cyber security solution and allows for changes in government regulations, standards and corporate policy controls. Talksum claims compatibility with RTU, PLC, DCS, Scada, Wits, LAS, Modbus, OPC, Witsml, Prodml and more, and a bandwidth of 100k events/second per appliance. More from Talksum.
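The filter-and-route pattern behind such an appliance can be sketched in a few lines; the event fields, thresholds and class below are illustrative, not Talksum’s API.

```python
# Minimal sketch of a data stream router: events fan out to every
# destination whose predicate matches. Fields and thresholds are made up.
class StreamRouter:
    def __init__(self):
        self.routes = []  # list of (predicate, destination) pairs

    def add_route(self, predicate, destination):
        """Send every event matching 'predicate' to 'destination'."""
        self.routes.append((predicate, destination))

    def ingest(self, event):
        # Fan the event out to every matching destination
        for predicate, destination in self.routes:
            if predicate(event):
                destination.append(event)
```

A pressure-alert route and a catch-all archive route are then just two `add_route` calls with different predicates, which is how one appliance can feed both real-time BI tools and historical databases.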
Niobrara NatGas is planning to develop a $4.2 billion project, ‘Digital Fort Knox,’ a 662-acre data center and energy production facility in northern Colorado. Niobrara general manager Craig Harrison explained, ‘The data center has its own natural gas supply that also powers a fuel cell farm, heating and cooling across the proposed complex. Our next step is to find the right high balance sheet organization to seize on the potential of this development.’
The site, engineered by CH2M HILL, will enable conventional, cloud computing data center development and innovative ‘microgrid’ development scenarios with a high level of energy security and reliability.
Niobrara improbably claims the site to be capable of ‘sustainable perpetual motion energy,’ with ‘ten to fifteen years of uninterruptable gas contracts.’ These will hedge the data centers’ main cost, energy, and provide protection from power outages. On site gas supply lines have a 1.5 billion cubic feet per day capacity. Power ‘could be’ augmented with renewables such as solar, fuel cells, energy storage and wind. More from Niobrara.