Back in 2013, the Pipeline Open Data Standards association (Pods) and the Open Geospatial Consortium (OGC) ‘agreed to work together to identify enhancement opportunities between the advanced geospatial interoperability concepts developed within the OGC’s consensus standards process and the Pods Association’s widely used Pods standard and data model. The PipelineML SWG is a result of this cooperation’. Since then, the two standards bodies appear to have drifted apart, although Pods has leveraged some OGC geospatial standards in its subsequent data modeling efforts.
The OGC has now independently released PipelineML as a ‘candidate standard’ to support the interchange of data pertaining to oil and gas pipeline systems. The initial release of PipelineML addresses two use cases, new construction surveys and pipeline rehabilitation. The spec is said to promote ‘traceable, verifiable and complete data management practices’ and to offer ‘lightweight aggregation’ with other systems. PipelineML leverages the OGC’s own LandInfra standards for right-of-way and land management. Future extensions will embrace the OGC’s utility data model and extend to cathodic protection, facilities and safety.
We were curious to know of possible cooperation or overlap between the two pipeline data exchange initiatives. OGC PipelineML workgroup co-chair John Tisdale explained, ‘PipelineML (PML) moves data between applications, devices and databases including Pods. But its scope is broader than the exchange of Pods/operational systems data. PML is designed to share data across the entire lifecycle of pipeline assets. For example, schematics for a new pipeline system in AutoCAD can be exported as a PML file and picked up by an inventory management system. Once the components have been purchased, results are output as a new PML file with additional manufacturer’s part information. This can be imported into construction management software and/or survey and mapping tools. The system can then provide as-built results to the operator as a PML file. PML allows data exchange between three Pods databases (relational, spatial, NextGen) as well as Esri APDM, Esri UPDM and other models and applications for risk, HCA, ILI, CP, NDE, design, construction, survey and mapping, operations, integrity and reporting.’
So it would appear that, rather like with PPDM and Energistics, the distinction between data at rest and data in movement (exchange) provides something of a rationale for there being two standards in a rather small space. But as we learned from Chad Corcoran’s (Andeavor) presentation at the 2018 Pods Fall conference, Pods too has an embryonic Data Exchange Specification (DES). DES is, or will be, a component of the Pods next generation model aka V7.0. DES is to facilitate data translations between databases, software systems, third party service providers and pipeline operators. The DES also enables system integration via service-oriented architecture (SOA) approaches. The DES is intended to standardize data and schema exchange between Pods databases across the pipeline industry. DES comprises three XML files: schema definition/rules, data and schema mapping. The DES is intended to evolve into a general-purpose standard format for Pods data and includes substantial metadata along with a ‘gml:id’, the OpenGIS handle for interoperability with other geographic schemas. The Pods 2018 Fall conference presentations are available here.
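The ‘gml:id’ handle is simply an XML attribute in the OGC’s GML namespace. As a minimal illustration (the element names below are hypothetical, as the Pods DES schema itself is not yet published), here is how a consumer might read it in Python:

```python
import xml.etree.ElementTree as ET

GML = "http://www.opengis.net/gml/3.2"   # OGC GML 3.2 namespace

# Hypothetical DES-style data file -- element names are illustrative
# only, not taken from the (unpublished) Pods DES schema.
doc = """<des:Pipeline xmlns:des="http://example.org/des"
                       xmlns:gml="{ns}" gml:id="PL-0042">
           <des:diameter uom="in">36</des:diameter>
         </des:Pipeline>""".format(ns=GML)

root = ET.fromstring(doc)
print(root.get("{%s}id" % GML))   # -> PL-0042, the interoperability handle
```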
Oil IT Journal’s very first mention of a ‘PipelineML’ was of a long-forgotten (and unrelated) 2002 initiative from POSC (now Energistics)!
A new white paper from Emerson, ‘Digital Twin: A Key Technology for Digital Transformation’ offers definitions and a checklist for would-be deployers of the twin paradigm. The digital twin is said to be a ‘key technology’ of Emerson’s Plantweb ecosystem. But the white paper stresses that automation system vendor-independence is the first criterion for twin deployment. Next comes ‘selective fidelity’, a pragmatic approach that ensures that components are simulated in a cost-effective manner. An open architecture is also important for ingestion of control system, historical and design data. Here, the protocol of choice is OPC UA and/or OPC DA.
But what exactly is the twin for? The white paper enumerates use cases as follows: access to control system information by distributed teams, the use of dynamic simulation for designing and building automation solutions and ‘virtual’ operator training. The twin also allows for process optimization and experimentation ‘without risk to operations’. The twin embeds accurate models of motors, drives, valves and instruments along with full physico-chemical models of plant and process. For Emerson, these include Emerson Mimic and Aspen Hysys, using the new linkage announced last year. In the life sciences industry, the twin is said to be ‘proven and accepted for offline testing of an automation solution’ and is also used to provide effective operator training and operating procedure development.
Comment: Back in the day, Oil IT Journal had something of an epiphany when we visited BP’s Humberside plant to check out the latest in process simulation. We were naively surprised at the time that the simulator was not running the plant. Emerson’s white paper makes it clear, contrary to others’ marketing spiel, that the digital twin does not ‘run the plant’ either. It is better conceived, at least according to Emerson, as an upgraded training simulator that is now used for offline optimization and experimentation. Doing anything resembling this in real time is, understandably, a little harder, possibly even illusory, as we intimated in our DT investigation last year.
The US Bureau of Economic Analysis recently reported that in the decade from 2006 through 2016, the US economy grew at a measly 1.4%/year, in part down to the 2008 financial crisis. However, there was a ‘bright spot’ in that the ‘digital economy’ grew at 5.2%/year, a whopping 67% over the period. My initial response was that ‘digital’ is really a kind of service industry to the rest of the economy. I asked the BEA if this meant that the digital economy is growing at the expense of the rest of the economy. The BEA responded kindly to my troll, stating, ‘No. It simply means that the digital part of the economy (as defined in the report) was growing at a faster pace than the overall economy over that specific period of time’. Right... I was not greatly the wiser. I fiddled around with Excel trying to squeeze some more insight from the numbers and gave up, still wondering a) how long it would take ‘digital’ to become 100% of the economy, b) how do you ring-fence ‘digital’ in the first place and c) does this even make sense?
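For what it is worth, question a) yields to back-of-envelope compounding, here in Python rather than Excel. The starting share of ‘digital’ is an assumption on our part, not a BEA-sourced figure:

```python
import math

# Overall economy grows at 1.4%/year, 'digital' at 5.2%/year (BEA).
# Assume a starting digital share of 6.5% of GDP -- an illustrative
# guess, not a BEA figure.
share = 0.065
ratio = 1.052 / 1.014          # digital's share of the pie grows at this rate
years = math.log(1 / share) / math.log(ratio)
print(round(years))            # ~74 years to reach 100% at these rates
```

Three-quarters of a century for ‘digital’ to swallow the economy, which rather illustrates the limits of the extrapolation.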
~
I attended the ‘Masters of Digital’ event this month. MoD is an annual event organized by Digital Europe, a Brussels-based trade body that represents the EU computer industry. DE members actually include a number of US companies, not least Google, Intel and Microsoft, all of which sponsored the MoD event. Huawei is also in DE but wasn’t a sponsor. MoD, like the EU itself, takes the view that a foreign company is sufficiently European to take part in EU taxpayer-funded R&D as long as it employs a large number of Europeans. That would be OK if it were not for the fact that much of the subsequent discourse turned on the ‘competitive advantage’ that all this digital R&D investment will bestow on Europe.
Mariya Gabriel, the EU Commissioner for digital economy and society, took to the stage to outline ‘Europe’s vision to boost digital innovation and enable digital scale-ups to thrive’. The EU lags behind in the world’s digital ecosystem. It is not a ‘center of digital innovation’. The web and other digital ecosystems are run by the US and China. So, what is the EU to do? It has to ‘master blockchain, quantum computing and launch moonshot projects’. And at the same time ‘put people first, framing innovation to align with EU values’. Europe needs to develop artificial intelligence ‘in a climate of security and confidence’ and to ‘make a competitive advantage of this value-based approach’. To achieve such and ensure the EU’s ‘rightful place’ in AI, the EU AI Alliance has been set up with 52 experts drafting guidelines for ‘ethical’ AI.
At the MoD, the DE’s ‘Future Unicorn*’ Awards (the ‘FU’ awards?) were presented to some rather unlikely candidates for unicornism. One, Finland’s MaaS Global, has developed ‘Whim’, a rather neat, multi-mode transport app. Another, a content management system from Umbraco (Denmark) is, by its own admission, unlikely to become a unicorn. CEO Niels Hartvig challenged the prevalent ‘obsession’ with unicorns and questioned whether they make for sustainable businesses. 80% of tech IPOs don’t make a profit and the unicorn model represents ‘extreme, capital intensive survival of the fittest’. Perhaps the EU should aim for a ‘true sharing economy’ where companies are not (just) evaluated on their financial worth, but also on ‘how much value is created elsewhere in society’. Such noble goals contrast with Silicon Valley’s invasion of privacy and China’s ‘invasive communism’. The EU GDPR is an ‘amazing achievement’ that will create privacy-focused companies that respect the citizen.
* Unicorns are, by convention, companies with over $1 billion market capitalization.
Schneider Electric France president Christel Heydemann agreed that much digital technology is perceived as a threat by citizens. She implied that for the EU, the digital battle was already lost in the consumer space. The bigger digital opportunity lies in business to business (B2B) exchanges and in industrial applications of digital technology. So the EU should focus its attention on its current leaders in industrial software, Schneider Electric, Siemens, SAP and others. Turning to the environment, Heydemann described electricity as the most efficient way of transporting energy and of ‘decarbonizing’. Here there are great opportunities in building automation. Buildings are ‘massively energy inefficient’. Green energy, automation and zero nighttime consumption are the way forward.
Speaking only a few weeks before Brexit, assuming it happens, Lord Ashton, UK Parliamentary under-secretary of state for digital stated that although the UK is leaving the EU and its digital single market, cooperation will still be needed on privacy, fake news and to ‘keep the internet as a global, multi-state, liberal enterprise’. We will need ‘common frameworks for uninterrupted data flows post Brexit’ and have a mutual interest in starting ‘data adequacy’ discussions asap. Fortunately, the UK data protection act is already aligned with the GDPR. ‘We need to use technology to make the bonds between citizens and governments stronger and to make sure that the UK and EU stay in touch’.
The rest of MoD was not too much of a disappointment; my expectations were low. We were regaled by sharp young start-uppers strutting their stuff and rather older EU buffs pontificating on this and that and encouraging more investment in ‘artificial intelligence and blockchain’. A great occasion for buzzword bingo. An ‘entertainer’ came on the stage and had the assembled throng singing ‘We are the masters... of digital’ to the tune of Queen’s ‘We are the champions’.
I was left with the impression of a rather undignified scrabble for EU cash. The sums involved are considerable. Investment by the EU taxpayer in digital ‘initiatives’ is countable in the billions of euros. In order-of-magnitude terms, this is probably comparable to Series A funding from venture capital in the US. So, which is best? Obviously, the US has produced the big shots so far. But not all IP in the US is generated in the private sector. As Ms. Gabriel stated in her address, the US DARPA is a poster child for the EU’s digital initiatives. So who do you think is best qualified to decide where to invest, government or private industry? That sounds like a troll. Let me backtrack: after all, US technological successes are not all out of the private sector, from the Panama Canal to the atom bomb to Darpa’s own internet. But there is one significant difference between the public and the private which comes into play when, as with many digital projects, there is failure. In private sector projects, failure is taken on the chin by the VC (and possibly its kleptocrat backers) while in public-funded failures, it is the taxpayer that gets the sucker punch.
The interesting facet of all this scrabbling for funds is how the various demands can be justified. How do you arbitrate between investing in the latest shiny new technology and, say, building a new hospital, or not even bothering to tax folks in the first place? How much of EU research is twisted and tuned to satisfy the politicos? Indeed, how do you arbitrate between spending money (public or private) on things digital and on anything else? How can you be sure that a growing digital segment of the overall economy is a good thing, as the BEA clearly thinks? I’m not going there right now, but I do have another thought for you.
Many years ago I used to listen to ‘Letter from America’, a BBC radio program where the late great Alistair Cooke kept the UK up to speed on goings-on on the other side of the pond. In one broadcast, possibly in the 1970s, Cooke reported on US census figures that showed that the number of hairdressers in the US had overtaken the number of steelworkers. This seemed at the time to be quite extraordinary and sparked off a long debate between me and my dad as to exactly what constituted a ‘proper’ job. We never did get to the bottom of that one. Perhaps in future years, increasing digital employment will seem as puzzling as the rise of hairdressers in the US. If you are making steel, selling hamburgers, producing oil and gas or whatever, then you may view the expense of cyber security, computer ‘upgrades’, ‘lifting and shifting’ to the cloud and a host of other ‘necessities’ as orthogonal to your business. Such activities may be ‘growing’ the digital sector of the economy, but probably do come at the expense of more directly productive investments. But to the economist, it doesn’t matter whether you are a steelworker, a hairdresser or a bug-fixing code monkey, a job is a job. Matter of fact, I need a haircut!
ARMA, the American Records Management Association kindly provided Oil IT Journal with a review copy of ‘Igbok’, its 2018 Information governance body of knowledge, a 200-page compendium of information governance best practices and advice. What is information governance? Igbok defines it as an attempt at ‘balancing the cost of management tools and human resources against an organization’s information-based risks and opportunities’. This is achieved with the implementation of ‘transparent and consistently-applied policies, practices, and controls that address information needs across the organization’.
That is quite a broad remit and encroaches on many data management-oriented tasks. In fact, there is an interesting historical division of labor between the administrative and legal-oriented tasks of the typical ARMA member, and the overlapping fields of IT and technical data management and governance. So, how do ARMA and the Igbok stack up against the more data and IT-related aspects of modern information management?
Our quick spin through the Igbok suggests that it does a pretty comprehensive job of addressing such issues. Igbok covers IM core concepts, the business value of the discipline and its cross-functional nature. Here, it offers insights as to how IM has evolved from its backroom tasks of organizing, filing, safeguarding and helping people find information to become the hub around which a constellation of information management tools and techniques revolve, although technical data managers may see things differently. Igbok advocates application of the Generally Accepted Recordkeeping Principles (Garp), ARMA’s own certification program, but can be read without reference to the Garp. Information retrieval also falls under the Igbok purview. Here the approach is definitely one of ‘proper classification with the appropriate metadata’ rather than free-form text search. Igbok encourages records management professionals to work with IT on backup scheduling and on deciding what information needs to be retained long term to meet legal, regulatory and business requirements. Format conversion and the removal of obsolete or redundant information are also addressed, as is the need for information protection throughout the information asset life cycle. Again, collaboration with IT and information security specialists is required to align information protection with regulations and corporate governance.
This review is turning into something of a check list which reflects the scope of the Igbok to an extent. 200 pages is probably a bit short to go into the depth required to implement all of the recommendations in the book, but there is more than enough food for thought and quite a few good questions to ask specialists elsewhere in the organization. Igbok’s main contribution to more IT-related practitioners may well be its coverage of the legal aspects of IM in a changing landscape. It points out that ‘protecting information that is under an organization’s direct control can be straightforward [ … ] but ensuring sufficient protection when information is transferred to a third party [ … ] such as a cloud-based service provider, requires extra diligence’. Cloud-based third parties must provide the same physical and virtual protection of information, in line with the organization’s policies. This is to be achieved via service level agreements. Igbok observes that negotiating such agreements with large service providers (read GAFA) is likely to be very challenging for all but the largest organizations with whom a significant volume of business is at stake. One suspects that getting cast iron guarantees from the GAFAs may be challenging even for the largest oil and gas companies.
Igbok equates ‘governance’ with a multi-disciplinary approach that spans information management, legal, risk/compliance, information technology, privacy, security and the business units. Citing author William Saffady, Igbok deprecates ‘a siloed approach, in which stakeholders operate independently and, in some cases, competitively’ as incompatible with effective governance. An interesting observation in a world where new silos and specializations are constantly being created.
Igbok opines that ‘any major planned IT project, such as migrating to the cloud, rolling out a new big data strategy, or implementing new security software, may influence the IG strategic plan and priorities. Consult with IT and information security groups to accommodate such projects.’ Consulting may be putting things mildly!
Igbok speaks from the standpoint of a large organization with a substantial IM department operating in parallel with technical data management and IT. It is likely to be a valuable resource for such IM purists, but perhaps even more useful for smaller organizations (or larger ones that have downsized) who are desperately trying to map a pathway through the maze of the ‘cloud’, big data, GDPR, backup retention and, who knows, one day a major ‘discovery’ episode*.
Purchase your copy of the Information Governance Body of Knowledge from the ARMA store, a snip at $105.
* For a good example of such, read our report of PG&E’s misery following the San Bruno explosion.
In a January 2019 webinar, Energistics provided an overview of its standards portfolio along with current and future plans. All Energistics standards now share a common technical architecture and consistent metadata, coordinate reference systems and units. The well data flagship, Witsml, now captures three data types: static reference data (used to move well header data between repositories), ‘snapshots’ used in daily reporting or fluid analysis, and ‘growing objects’ such as well trajectory, wireline logs and drilling parameters.
The Energistics Transfer Protocol (ETP) represents a shift from earlier XML/SOAP to true streaming of real time data over WebSockets (a companion to HTML5). Transfers are now ‘orders of magnitude’ faster with much lower latency. ETP can also be used to stream data between earth models and even from high-volume, field-based distributed acoustic sensing devices.
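WebSockets replace SOAP’s request/response round trips with a persistent, bidirectional channel over which the server pushes data as it arrives. The following minimal Python sketch conveys the pattern; the endpoint and subscription message are hypothetical, and a real ETP session would negotiate an Energistics sub-protocol and exchange Avro-encoded messages rather than plain JSON:

```python
import asyncio
import websockets  # pip install websockets

async def stream_channel(url):
    # One persistent, bidirectional connection -- no per-request round trip.
    async with websockets.connect(url) as ws:
        await ws.send('{"subscribe": "depth/bit"}')  # illustrative only
        async for message in ws:                     # server pushes updates
            print("new point:", message)

# Hypothetical rig-site endpoint, for illustration only.
asyncio.run(stream_channel("wss://rigsite.example.com/etp"))
```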
In 2019, Energistics plans to extend its data transfer work into the midstream/downstream area and to continue to develop new tools like the ETP dev kit from PDS, Equinor’s Dlis-io library (on Github) and FesAPI* from F2I-Consulting, an open source, multi-platform, multi-language API for the Energistics standards. Energistics is also looking at rising interest in the cloud and new data technologies like JSON.
A questioner asked if there was an open source tool to convert ‘vanilla’ Wits to Witsml (apparently there is not). Energistics CTO Jay Hollingsworth was pressed on JSON as a possible replacement for XML. This is going to happen for data exchanges, but Energistics is currently waiting for a definitive version of JSON Schema that supports validation. The new ETP is ‘full of JSON’ and, while the latest edition of Witsml is XML, logs are now embedded as JSON arrays. Hollingsworth also confirmed that Energistics is working on an OPC UA information model for Witsml and ‘real soon now’ for Prodml.
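By way of illustration, this is what JSON Schema validation of a log curve array looks like with the Python jsonschema package. The schema is made up for the purpose, not an Energistics artifact:

```python
import jsonschema  # pip install jsonschema

# Made-up schema for a log curve array -- not an Energistics artifact.
schema = {
    "type": "object",
    "properties": {
        "mnemonic": {"type": "string"},
        "unit": {"type": "string"},
        "data": {"type": "array", "items": {"type": "number"}},
    },
    "required": ["mnemonic", "data"],
}

curve = {"mnemonic": "GR", "unit": "gAPI", "data": [45.2, 47.8, 102.3]}
jsonschema.validate(instance=curve, schema=schema)  # raises on bad data
```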
* FesAPI is now being developed by a consortium led by Dynamic Graphics.
Allied Reliability has announced SmartCBM, a condition-based equipment maintenance solution. SmartCBM builds on Allied’s proprietary failure mode library, derived from an analysis of some three million components in 1500 facilities. The solution delivers early diagnosis of defects, identifies root causes of failures and prescribes corrective actions. SmartCBM can optionally integrate with PTC’s ThingWorx industrial internet platform for remote visualization of condition and process data.
Altair has launched SmartWorks, an internet of things and business intelligence platform that makes operational data from engineering applications accessible.
AFGlobal has announced the Performance drilling platform for managed pressure operations. PDP is a hardware and software combination for ‘closed-loop’ drilling. Capabilities include: applying surface back pressure, set point pressure control while drilling, automated ramp schedules during connections, early kick/loss detection and choke condition-based monitoring. PDP is built around AFGlobal’s rotating control device technology. The PDP toolkit allows operators to configure metering and measurement, task-specific intelligent control and ‘safely and efficiently maximize performance’. AFG has also announced DuraStim, a 6,000 HP, cloud-connected pressure pumping solution for hydraulic fracturing.
Berlin-based startup Appygas has released its eponymous solution for gas traders. In the EU, much data on gas tariffs is in the public domain, but often in unhelpful formats and inconsistent units, ‘making it tedious, especially for new market players, to get a full and correct market overview’. Appygas gives traders ‘efficient and reliable’ access to gas market data from some 40 EU sources, harmonized and aggregated according to a ‘best/newest’ data logic. An Appygas route calculator computes transport fees, classified by price, capacity type and operator. Appygas is backed by GRTgaz Deutschland. Register for a trial on Appygas and watch the video.
Aria Insights (previously CyPhy Works) integrates AI and machine learning software into its drones. The company’s intelligent, autonomous drones collect and analyze data to create actionable insights. Aria’s focus on artificial intelligence builds on years of experience. Aria’s ‘flying robots’ can identify information of interest, send an alert when new information is detected and push data to a digital 3D map. Target applications include maintenance operations in confined spaces such as around oil tankers or pipelines. Real time capabilities can monitor disasters and alert first responders. More from Aria Insights.
AspenOne users can now maintain engineering applications from the cloud. Microsoft Azure shops can deploy Citrix’s Virtual Apps Essentials. Alternatively, Nutanix’s Frame allows customers to deploy to either Azure or Amazon web services. AspenTech also reports support for AdoptOpenJDK in aspenONE V10.X. The AOJDK toolkit is an alternative to Oracle Java, whose T&Cs are changing. More from AspenTech.
The 2019 release of Blue Marble Geographics’ Geographic Calculator includes a universal copy and paste function, a new angular unit conversion tool, support for NADCON 5.0 and updated seismic survey conversion functionality.
OSIsoft has announced the Element Asset Framework Accelerator, a combination of software (from Element Analytics) and services that are said to speed building PI AF data models by a factor of ten. More from OSIsoft.
Ensco has unveiled its continuous tripping technology, a hardware and process control solution to automate movement of a drill string in or out of the well at constant speed. CTT is currently under test on the Ensco 123 jackup. CTT was developed in collaboration with NOV, Bosch Rexroth and Keppel FELS. Watch the (great) video.
Esri has released ArcGIS Earth for Android and iOS. Users can add their own KML and KMZ data, take screenshots while navigating, connect to ArcGIS Enterprise and identify features by tapping on the globe. More from Esri.
Fieldbit 5.0 promises ‘next generation’ augmented reality for technical field service personnel from a redesigned Android interface, integrated knowledge and ticketing systems and interactive chat/video with remote ‘over-the-shoulder’ coaching. The iOS edition adds image recognition and ‘Slam’, simultaneous localization and mapping.
GeoLogic Systems’ GeoScout V8.9 adds a drilling module along with data supplied by participating companies. GeoScout Drilling provides access to digitized tour report data, drilling time breakdowns and multi-well comparison.
Ikon Science’s RokDoc 6.6.1 delivers ergonomic improvements along with enhanced seismic wiggle trace drawing and the ability to create difference gathers for the analysis of, e.g., 4D time-lapse production-related effects, or to assess seismic data before and after pre-stack data conditioning workflows. A new plug-in creator reduces the effort required to generate XML scripts to ‘wire’ proprietary code into RokDoc. JiFi, Ikon’s joint impedance and facies inversion technology, now provides a full depth domain, amplitude and kinematic inversion capability.
iLandMan has issued an application programming interface (API) to let users integrate its lease management software with engineering, finance, mapping and other systems. iLandMan also offers direct connection to P2 Bolo, OGsys, Tobin, TGS, IHS, DrillingInfo and state conservation department data.
Sintef has released a new version of MRST, the Matlab reservoir simulation toolbox. The 2018b release adds a new GUI to the diagnostics module for postprocessing simulation output, automatic restarts and new sequential implicit compositional solvers.
RB Asset Solutions from auctioneers Ritchie Bros. introduces ‘data-driven’ asset management and disposal. The cloud-based solution offers high-volume customers a complete inventory management system, data analytics and dashboards, branded e-commerce sites and web-enabled external sales channels.
Schlumberger has announced several new software releases, OLGA 2018.1, PIPESIM 2018.1, CoreDB 2017.1 (sic) and ProSource 2018.1. Schlumberger has released Ocean for Petrel 2019 ‘field introduction 1’, now in a ‘commercial-ready state for plug-ins’. Developers can compile and submit their plug-ins for Ocean Store acceptance and release in Petrel 2019.1. A pre-commercial license and approval from Schlumberger are prerequisites. Schlumberger has also published a list of newly available plug-ins for Petrel.
Thermo Scientific (previously VSG) has released V10 of its Open Inventor toolkit, putting the flagship visualization toolkit onto a ‘modern’, Qt5-based rendering engine.
Pays International has kicked-off a seismic machine learning classification pilot study, and is offering a free, four-week study of a client’s seismic and well data.
Petrofac, with help from a client, has developed the Petrolytics dashboard, an artificial intelligence and machine learning platform for predictive maintenance.
Petrolink’s new manage by exception function helps drillers prioritize and manage the high volume of alerts generated during drilling.
The 18.4 release of Rock Flow Dynamics’ tNavigator adds local grid refinements for detailed modeling in the vicinity of XYZ fractures, multi-point statistics and velocity cubes computed from checkshot data. History matching is improved with multitarget optimization using a particle swarm methodology.
Rose & Associates has announced Unconventional rapid assessment (URA), a ‘comprehensive yet affordable’ software solution to screen for resources and assess low permeability investment opportunities. URA provides a range of unconventional KPIs including number of productive units, wells per section and in-place and recoverable resources. URA analytics can feed into R&A’s ProjectRA for cash flow analysis.
R21 of Seeq’s analytics toolkit adds a new scorecard and fast Fourier transform analytics. In oil and gas, FFT is used, inter alia, to identify frequency-based indicators such as process oscillations due to poorly tuned controllers or sticking control valves.
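As an illustration of the technique (our sketch, not Seeq’s implementation), a few lines of numpy suffice to pull a controller oscillation out of a noisy process signal:

```python
import numpy as np

fs = 1.0                        # one sample per second
t = np.arange(0, 3600, 1 / fs)  # an hour of process data
# Synthetic signal: a 50-second oscillation from, say, a poorly tuned
# controller, buried in measurement noise.
signal = np.sin(2 * np.pi * t / 50) + 0.3 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
peak = freqs[spectrum[1:].argmax() + 1]   # skip the DC bin
print(f"dominant oscillation every {1 / peak:.0f} s")  # ~50 s
```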
Siemens has integrated Calgary Scientific’s PureWeb interactive 2D/3D web tools with its Comos Walkinside 3D VR plant information viewer. PureWeb brings 3D viewing to any web browser or mobile device.
Yokogawa has announced a new cavitation detection system, a component of its Oprex plant asset management and integrity solution. The hardware and software solution detects at the transmitter the small, high frequency pressure fluctuations produced as bubbles collapse, providing earlier detection of cavitation than conventional detectors.
Landdox has released a new integration with ThoughtTrace’s ALI platform.
PermianChain Technologies has launched the prototype of its blockchain-based platform for trading potential oil and gas reserves. The platform is being developed on IBM’s Hyperledger Fabric.
Emerson has released Roxar Tempest 8.3, its integrated reservoir engineering suite. The new product suite strengthens Emerson’s end-to-end exploration and production software portfolio, comprising Paradigm and Roxar software solutions that help operators efficiently exploit both new and established reservoirs.
BHGE has launched Lumen, an integrated platform for drone-based methane monitoring.
PDS Group has announced the 2018.12 release of its Ava Clastics sedimentology and analogue database with new visualization and filtering capabilities across its Fakts, Smaks and Dmaks data sets.
Kappa Engineering has released Emeraude v5.20.
Schlumberger has introduced Concert, a well test solution that adds real-time surface and downhole measurements, analytics and collaboration capabilities to well testing.
Ansys Cloud has been released, providing ‘instant’ access to HPC from within Ansys’ products.
Oildex (now a Drillinginfo unit) has launched its Owner Relations Suite to provide owner support, lower costs and boost productivity.
Capgemini has launched Perform AI, a new portfolio of solutions and services to assist organizations in building and operating ‘at scale’ enterprise-grade artificial intelligence.
Hexagon has released CadWorx Structure professional 2019, adding workflows for structural element placement and tools to keep projects on schedule and budget.
In the old days, you would buy and unwrap software and run it on your own server. In the new paradigm of the cloud and web delivery, things are not so simple. A recent blog by Stewart Harper from GIS/ETL specialist Safe Software explains how to deploy its FME Server flagship in the cloud. A visit to the KubeCon tradeshow (8,000 attendees) convinced Harper that Kubernetes is now the de-facto standard for container orchestration with ‘staggering’ growth. Kubernetes as a service is available from the Amazon, Azure and Google clouds and Safe has tested deployment on all of them. Harper recommends Google’s Kubernetes engine as ‘the simplest and a good place to start’.
Kubernetes deployment is not for the small shop! It is recommended for larger organizations with a dedicated team seeking efficiencies in software development and operations. Kubernetes makes it easy to automate deployment of complex large-scale applications by ‘shifting infrastructure to code’, allowing for version control of the infrastructure and making development, staging and production ‘testable and reproducible’. Kubernetes makes a clear distinction between the operating system, application management and application code, and allows specialized teams to focus on each area. Kubernetes builds an abstraction layer on top of the cloud infrastructure, abstracting any cloud-specific settings into properties that you set. The approach promises ‘cloud-agnostic’ deployment. For more read Harper’s blog.
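‘Shifting infrastructure to code’ can be taken quite literally, as in this sketch using the official Kubernetes Python client. The image name and deployment spec are hypothetical, not Safe’s actual FME Server packaging:

```python
from kubernetes import client, config  # pip install kubernetes

config.load_kube_config()   # reads credentials from ~/.kube/config
apps = client.AppsV1Api()

# A two-replica deployment, declared as code rather than clicked together.
# The container image is a placeholder, not a real FME Server artifact.
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="fme-core"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "fme-core"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "fme-core"}),
            spec=client.V1PodSpec(containers=[
                client.V1Container(
                    name="fme-core",
                    image="example/fme-core:latest",
                    ports=[client.V1ContainerPort(container_port=8080)])
            ]),
        ),
    ),
)
apps.create_namespaced_deployment(namespace="default", body=deployment)
```

Because the whole specification lives in source control, development, staging and production clusters can be rebuilt identically on any of the three clouds.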
Safe also recently published an informative ‘how-to’ on the eight different ways in which spatial data and maps from a variety of sources and formats can be viewed in a web browser.
An article in Technology Record reports that BP is using Microsoft Azure-based machine learning to predict oil and gas recovery factors. Tedious work that previously took weeks can now be done in days or hours. BP’s ML-based recovery factor model is in daily use by hundreds of subject matter experts at BP.
BP’s enthusiasm for artificial intelligence has led to its BP Ventures unit chipping in some $5 million to Belmont Technology’s Series A financing round. The deal follows BP’s $20 million investment in Beyond Limits, another AI boutique. Belmont’s knowledge graph technology will ingest geoscience, reservoir and historic production data into a ‘robust’ knowledge-graph of subsurface assets. These can be interrogated with natural language queries while ‘AI neural networks interpret the results and perform rapid simulations’. The combined technologies underpin BP’s ‘Sandy’ AI platform, with Belmont’s ‘scalable knowledge-graphs’ feeding into Beyond Limits’ platform.
ELynx Technologies is collaborating with the University of Tulsa on the development and validation of ‘digital twins’ to predict the behavior of wells produced using artificial lift. The research is said to accelerate the shift from predictive maintenance to predictive optimization. The work is performed under the auspices of Tualp, the Tulsa University Artificial Lift Project. Elynx will contribute training data amassed during its monitoring of some 40,000 wells across major US drilling basins. Elynx data scientists and subject matter experts are to contribute a data-driven perspective to physics-based modeling of artificial lift processes. In return, Elynx expects accelerated development of new products including models for the latest breed of plunger-lift and ESP products.
Safe Software has added custom tools for embedding computer vision in applications via its FME 2019 development environment. A blog from Safe explains how FME uses the OpenCV computer vision and machine learning library in its family of RasterObjectDetector transformers.
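For a flavor of what object detection with OpenCV involves (an illustrative sketch, not the internals of Safe’s RasterObjectDetector), here is the classic detect-and-box loop using one of the pre-trained cascade models that ship with the library:

```python
import cv2  # pip install opencv-python

# Load a pre-trained Haar cascade shipped with OpenCV as a stand-in
# for whatever detector a given workflow might use.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("raster_tile.png")           # hypothetical input raster
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # detection runs on grayscale
objects = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in objects:                  # box each detection
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detections.png", img)
```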
IBM has teamed with Penguin Computing (both members of the OpenPower Foundation) to develop a hardware appliance dedicated to ‘intelligent simulation’. The appliance adds a ‘Bayesian’ optimization capability to an existing HPC cluster ‘of any architecture’ to improve processing capability. Researchers tell the systems to exchange data and the Bayesian appliance automatically designs smarter simulation instructions for the primary cluster. Cray is also working with IBM on the new approach. IBM also reports work on new knowledge graph technology capable of reading 500,000 documents per hour, ‘bringing order to chaotic data’ and establishing a corporate memory of HPC work. This web-based tool is currently available at no cost from IBM Zurich. The technology is being added to IBM’s AI/deep learning platform PowerAI.
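The idea behind Bayesian run-steering can be sketched in a few lines with the scikit-optimize package; this is a stand-in for IBM’s appliance, with a toy objective function in place of an expensive cluster simulation:

```python
from skopt import gp_minimize  # pip install scikit-optimize

# Stand-in for an expensive cluster simulation: returns a scalar
# 'error' to be minimized for a given parameter setting.
def run_simulation(params):
    x, = params
    return (x - 0.3) ** 2  # hypothetical response surface

# The optimizer proposes each next run from a Gaussian-process model
# of the results so far, rather than sweeping the parameter grid blindly.
result = gp_minimize(run_simulation, dimensions=[(-1.0, 1.0)], n_calls=15)
print(result.x, result.fun)  # best parameter found and its objective value
```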
Neo4J sees its eponymous graph technology engine as having a ’symbiotic relationship’ with artificial intelligence citing work by futurologist James Fowler as chronicled in his book Connected. For Neo4J, ‘we are on the cusp of a new Cambrian explosion of graph-powered artificial intelligence’.
Quantzig has blogged on the ‘top four’ benefits of advanced analytics in the oil and gas industry. These are ‘predicting the success rate of downhole operations’, ‘information delivery in real time’, ‘analysis of log runs’ and ‘predictive maintenance’.
At the 2019 ARC Advisory Group Industry Forum, Pioneer Natural Resources and Devon Energy reported use of Seeq’s analytics toolset for machine learning and industrial control system connectivity. Devon presented on the use of ‘advanced analytics’ to optimize shale completions while Pioneer’s work targets condition-based compressor maintenance.
The contrarians in us were excited when we came across the Proceedings of the National Academy of Science (PNAS) release on the ‘limits of deep learning’. PNAS’ Mitchell Waldrop describes the ‘much-ballyhooed’ artificial intelligence approach as ‘boasting impressive feats but still falling short of human brainpower’. Minor changes (aka adversarial attacks) to imagery can easily fool AI systems, a fact that has suggested to some researchers that ‘we’re doing something wrong’. This is a ‘widely-shared sentiment among AI practitioners, many of whom can easily rattle off a long list of deep learning’s drawbacks’. More puzzlement stems from the gross inefficiency of the training data paradigm, which comes off poorly when compared with human learning. ‘For a child to learn to recognize a cow, it’s not like their mother needs to say "cow" 10,000 times’. The opacity of the learnings from AI systems is also problematic and may be unacceptable in any circumstance, even when the answer is right. AI’s salvation may come from the adjunction of graph technology to deep neural networks. Graph networks have been generating a lot of excitement over the last couple of years. Such deep-learning systems have an innate bias toward representing things as objects and relations*. A graph network, then, is a neural network that takes a graph as input and ‘learns to reason about and predict how objects and their relationships evolve over time’.
* Is that back to the semantic web future or what?
In the EU, post the 2014 downturn, industry health has been ‘mixed’ and dependent on roles and geographies. There is action in the fields of artificial intelligence and machine learning, and around plug and abandonment and non-invasive inspection, all fields where GIS is, or could be, a part of the solution. Esri’s Danny Spillman observed that the advent of the digital transformation, data and analytics means doing things differently. A Dresner Advisory Services survey found that two thirds of respondents believe location intelligence to be critical for their business. While GIS may be losing its central role to data/analytics, ‘it is still part of everything you want to do in a digital transformation’.
Shell’s Rob Dunfey spoke to the ‘geographic advantage’ in Shell’s digital transformation, which he dates back to 2013. Today, this is ‘gathering pace’ with the creation of a digital center of excellence and a company-wide digital strategy. Dunfey agreed that geospatial has a significant role in digital transformation, the internet of things (IoT) and in artificial intelligence (AI) applications.
Spillman reported on other companies’ usage and the interplay between geography and digital transformation. ConocoPhillips ‘uses Esri to make maps’, but is also having conversations around Hadoop, R and Python, and has convinced its transformation folks that there is more to Esri than maps. GIS may not be the main player here, but it clearly has a role as a component of a system of reference. Additionally, as ArcGIS capabilities are recognized, the question becomes ‘how to use spatial analytics to drive insights’. In which context, surveillance imagery from drones needs a GIS/data strategy, as does geographically dispersed IoT/sensor data. And finally mobile, where Esri has no less than three strategies/offerings. Another new direction is the marriage of GIS and building information management (BIM) and visualization of the real world in 3D. You can now work with BIM files in GIS, adding context as ‘BIM and GIS coalesce in capital projects’. Esri is partnering with Autodesk to, for instance, design well pads with Revit. Format conversion is facilitated with the OGC’s I3S, indexed 3D scene format (more from Esri and on Github), which retains Revit attribute information.
Spillman wound up commenting that digital big data is ‘overhyped and misunderstood’. In reality, it comprises business intelligence, analytics, machine learning and artificial intelligence. All of which may be owned by different groups in an organization. Increasingly, Esri’s offering addresses real time situational awareness, for instance in oil and gas, for competitor analysis. Elsewhere, Esri’s Python API is used for real time activity and object detection. ArcGIS for the IoT has been used to map power outages, track assets or the weather and monitor production. Location intelligence is playing a key role in the digital transformation.
ArcGIS Enterprise Sites (AES) lets you build custom websites with your own content. Enterprise Sites create a tailored web page from corporate GIS data that is accessible to non-GIS users.
ArcGIS Analytics for IoT will be out real soon now. Analytics for IoT collates data from millions of sensors into a constellation of big data microservices and containers (Scala, Kafka, Spark and Play for map services) and spatio-temporal storage (under evaluation). The promise is for real time/big data GIS ‘as a service’. A ‘battery-efficient’ location tracking service leverages a NoSQL data store. A demo showed an inept golfer shanking left and right around the course, tracked with ‘a Strava-like functionality’.
Another NoSQL data store powers high performance big data visualization with ‘quantization’ that adapts display sampling on zoom. The promise is streaming performance on large data sets such as IHS Markit’s 4.7 million wells. The environment is a toolkit for ‘really big’ data that requires spatial awareness sans data movement. A ‘program-to-the-data’ philosophy leverages a Hive data lake with WebHDFS, U-SQL, Spark and HDInsight, all viewed in ArcGIS naturellement.
A GeoAI virtual machine (developed with Microsoft) combines Microsoft AI tooling with Esri geospatial technology, adding AI/ML to ArcGIS. A demo use case involved a communications exercise with residents living near a pipeline. A buffer area is created with standard geoprocessing. Inside the buffer zone, aerial imagery is used to create training samples tagged with ‘house’, ‘hospital’ and so on. The data set is exported for deep learning and analysis by a data scientist. The resulting ML-derived model is imported back into ArcGIS, which identifies similar structures in imagery. One GeoAI test identified 6,000 well pads from Sentinel-2 satellite imagery.
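The buffering step is plain geoprocessing. In Python with shapely (coordinates hypothetical) it reduces to a couple of calls:

```python
from shapely.geometry import LineString, Point

# Hypothetical pipeline centerline, coordinates in meters
pipeline = LineString([(0, 0), (2000, 0), (3500, 1200)])

# Standard geoprocessing: a 500 m corridor either side of the line
corridor = pipeline.buffer(500)

# Any residence can now be tested for inclusion in the buffer zone
residence = Point(1800, 350)
print(corridor.contains(residence))   # True -> inside the communication zone
```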
Rob Dunfey described digitalization as being ‘in Shell’s DNA’. Shell has a digital ‘center of excellence’, and a coherent approach that regards data as an asset. The business owns digital, and operates an in-house digital capability where the customer/user is central. Shell takes inspiration from new players in energy, Tesla and even Apple (selling surplus energy from its data centers). Disruption is both a threat and an opportunity. Current focus is on digital technology that is ‘reaching an inflection point’, such as the IoT. Here, Shell is working with Esri on a modern geospatial IoT platform, embedding GeoEvent into Shell’s MyMaps along with Scala, MongoDB, Kafka, Elastic, Python and Spark. Other aspects of the geo-digital transformation include robotics and drones for inspection and surveillance, mobile apps to locate pipelines and wearables with 360° photogrammetry for mobile Hazid*. 3D and BIM have been used at Shell’s Deer Park Refinery with a model hosted and accessed via MyMaps. Finance and competitive intelligence, aka ‘Facit’, leverages ML, machine vision and earth observation to map and monitor hot spots (as hedge funds are said to do in monitoring unconventional activity). In conclusion, geospatial is key to the successful deployment of several of these technologies. In the Q&A Dunfey was asked what new skills are required of the geospatial community. He suggested TensorFlow. Another questioner observed that many business apps have inbuilt spatial functionality; how does one avoid duplication of effort? This is a question Shell asks itself. ‘Does it matter what the endpoint is as long as underlying data is accessible?’
* Hazard identification studies.
Dominic Bull provided an update on BP’s OneMap. BP’s global GIS community of practice is working to sustain and grow the platform. There are huge opportunities for geospatial across the modern oil company. BP uses Esri technology to devolve some aspects of GIS innovation and development to ‘citizen development’, although ‘we don’t want people to run wild’. Not everybody is an expert in all aspects of every platform. BP now has standard roles for GIS professionals and analysts and has ‘moved away’ from its earlier geographical organization. BP has built a geographic information science and technology body of knowledge with a university consortium for GIS, leveraging Yammer and SharePoint.
BP has established IT&S standards for GIS data with quality controlled coordinates and data products in a ‘tips and tricks’ area. Github is used ‘quite extensively’ as are shared FME workspaces. Advanced analytics involves more interaction between GIS, geoscience and data science. Recent developments include a vehicle tracking service for Khazzan and ‘simops’ (simultaneous drilling and production operations) for Indonesia.
2018 saw the rollout of OneMap 4.0 with analytics, the cloud, big data and GeoIoT. In the Q&A, Bull stated that Voyager Search is used to scan geo data for compliance. Citizen app development has proved very successful, ‘the results are better than our own, it’s scary!’. Asked about the challenges of opening-up GIS systems, Bull acknowledged that people need to be aware of what they are doing, ‘most work atop our managed data.’
Tom Royston (now with DGTop) used to work for Addax Petroleum. Addax was acquired by China’s Sinopec Group, which eventually moved the whole company to Beijing. Addax’ data was housed in a range of applications (EnergySys, IDS, NeuraDB’s well header database, EDB, PSApphire, SeisQuest), almost all visible from an Esri front end. To move Addax’ data, Royston used a combination of the Azure cloud, ArcGIS and a PPDM database. An initial proof of concept involved a move to the cloud for the EnergySys and IDS data. PingOne identity management was found to be a key enabler for working in the cloud. Following the 2017 POC on Azure, things accelerated with the early closure of the Geneva HQ.
Royston did some work on cloud comparison but, in the event, Office365 proved an unavoidable front door-cum-trojan horse into the Azure cloud. This required a major network upgrade with dual connections, especially for Addax’s African units. Local internet breakouts allowed access points to be securely distributed, with Azure used as a ‘virtual’ data center. The migration involved a choice between ‘lift and shift’ and a re-platform. Addax went for the latter, with a move from Oracle on-site to SQL Server in the cloud. ArcGIS was re-installed with Esri’s Cloud Builder and the ArcGIS Desktops moved to Azure virtual machines. Now all is up and running on Azure in Hong Kong, accessed by the Africa and China teams. The original desktop app running in the cloud can be used across CNPC/Petrochina (not just Addax). Geodatabases are now in Azure so there is ‘no more need to back up’. Lessons learned included the fact that you ‘can’t just jump in’. You need to consider exactly what the objectives are. Regarding the business case, the cost of the cloud may hold some surprises re providers and price plans! If you can afford it, re-platform. In the present case, the vendor of the accounting system ‘would not play ball’. The cloud is not ‘out of the box’.
Cécile Noverraz works with the team building the controversial Nord Stream 2 gas pipeline from Russia to the EU. An onsite-deployed Esri ArcGIS/Portal master database is at the heart of the project, with a loosely-coupled document control system from Easy. The comprehensive development tracks a variety of cultural information on cable crossings, UXO, berms and more. A Geoevent server integrates AIS ship tracking data from ExactEarth, pulling data every 5 minutes. Hundreds of boats are involved in the project (30 in a single dredging operation), so ‘proximity incident reporting’ is key. The project is creating a high-quality data set that can be handed over to inspection and maintenance on project completion. Wish Software’s VisualGIS server (an ArcGIS extension) integrates ROV inspection videos with the mapping system. Nord Stream 2 has some 100TB of video imagery to manage and share. Noverraz is also planning to manage pipeline geometry in ArcGIS Pipeline Referencing. The Esri Portal and GIS technologies have been ‘deliverables’ from the start of the project and are now ensuring long-term data integrity.
Jeff Allen (Esri) presented on the latest technology in pipeline data management. ArcGIS pipeline referencing and web services make it easier for partners and users to address particular issues. An initial focus is asset management (GIS, MAOP, TVC…) but these are also used by other groups (integrity, operations, HSE, business). APR, ArcGIS pipeline referencing, manages point and linear data as a dense dataset atop the pipeline route. Core tables from APR can be embedded in PODS Lite and/or Esri’s utility and pipeline data model (UPDM). A web app allows for event editing, query and QC with rule-based protection of the underlying database. Different apps for different parts of the organization take the pressure off GIS editors. A use case involved an inline inspection pig survey, switching between schematics and imagery with above ground markers. Schematics can be captured in the utility network model. This, along with rule-based asset packages, can be used for oil, gas, liquid, electricity and more. The UPDM 2018 file geodatabase can be downloaded along with an oil and gas profile/package.
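Linear referencing locates events by measure along a route rather than by raw coordinates. A minimal Python sketch with shapely (illustrative only, not Esri’s APR API) shows the two core operations:

```python
from shapely.geometry import LineString, Point

# Hypothetical pipeline centerline, coordinates in meters
route = LineString([(0, 0), (500, 0), (500, 300), (1200, 300)])

# An event recorded by measure along the route, e.g. an ILI anomaly
# at 740 m from the launch point
measure = 740.0
anomaly = route.interpolate(measure)   # measure -> x,y on the route
print(anomaly.x, anomaly.y)

# And the inverse: snap a field-surveyed GPS point to a route measure
gps = Point(510, 260)
print(route.project(gps))              # x,y -> measure along the route
```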
Paul Cleverley’s research has led him to beware of cognitive bias, i.e. ‘group think’ that favors consensus over critical thinking. He has observed geoscientists talking about their work and watched which papers (in OnePetro etc.) they click on. He is less than enthusiastic about current text-search technology and advocates rule-based, supervised and unsupervised ML for text mining. 100 years of Geoscience World has been analyzed and spatialized from the text. This includes extracting numerical information, like ‘0.4ppm’ or 500mybp, from ‘dark’ data, i.e. numerical data hiding in text. Word2Vec and GeoDeepDive from U Wisconsin also ran. The approach lets you ‘see big patterns and develop new theories’. Cleverley acknowledged that there is a lot of technology ‘propaganda’ about. It is unlikely that machines can read as well as humans, but they can read a lot more. In narrowly-defined tasks, this can be very useful. In the Q&A, Cleverley observed that ‘training data dominates the equation’ and that PDF is a ‘pretty awful’ source for text. It is easier with Word but better still, extract the text itself, ‘curation and normalization are key’.
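Extracting such ‘dark’ numerical data can start as simply as a regular expression over quantity-unit pairs, as in this toy Python version of what Cleverley describes:

```python
import re

text = ("Source rock maturity peaked at 500 mybp with vanadium "
        "concentrations of 0.4ppm reported in the lower shale member.")

# Quantity-unit pairs: a number followed by a known unit token
pattern = re.compile(r"(\d+(?:\.\d+)?)\s*(ppm|ppb|mybp|%)", re.IGNORECASE)
for value, unit in pattern.findall(text):
    print(float(value), unit)
# -> 500.0 mybp
#    0.4 ppm
```

Real pipelines need far richer unit grammars and disambiguation, but the principle, turning prose into queryable numbers, is the same.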
Keith Winning presented a paper at the 2017 EPUG on data modeling of a major gas pipeline. Since Pipeline Data Solutions’ early adoption of the data modeling process, the company has built 11 data models for some 500km of pipeline and 1.5 million records. Historically, pipeline modeling for operations and integrity management was performed post construction. PDS proposes a ‘digital twin’ approach, building the model concurrently. This allows the model to be used in construction/project management and to assure data quality post-handover. But there are challenges, both internal (schema changes, how complete is the model?) and external, of data quality, out-of-sequence data arrival and managing change at the attribute level. To be sure, ‘you need to check and check’ and, if you can, incentivize contractors to provide data. Excel data, weld numbers scribbled onto a pipe in the field and many other issues make the process very error prone. Data acceptance and review is key although ‘you can’t ask a contractor to focus on trivia’, a pragmatic approach is needed for secondary data. PDS uses a PODS schema with an extra cross reference table to map different contractors’ tags. Esri Survey123 and Collector also ran. But really, ‘the construction industry has to change and incentivize good data delivery’, preferably delivered prior to releasing a section for hydrotesting.
Jostein Bjerkan described AkerBP’s MapInsight GIS portal. MapInsight supports geology, seabed and operations, interfacing with Office365 and SharePoint. AkerBP’s spatial services team has automated many data migration and ingestion processes. The move from ED50 to WGS84 brought a 3x speedup in mapping performance. AkerBP now creates its master data in GIS. Data can be consumed via a standards-based open source API for analytics. All is rolled-up in the Cognite data platform and a ‘massive data lake’. AkerBP likes to play with its data and has ‘thrown everything’ into Unity 3D, a gaming/rendering technology that connects GIS with Petrel Studio and enables 3D visualization of reservoir models, interpretation results, seabed data, pipelines and more. The video was underwhelming, ‘of course this is not implemented as a fully working tool yet…’
Mark Jones reported on Shell’s trials with Insights for ArcGIS, Esri’s ‘web-based data analytics for advanced location intelligence’. Jones sees synergies and commonality of purpose between GIS and AI. Insights for ArcGIS allows drag-and-drop combination of spatial data into an analytics environment. Shell has used the tool to mash-up its own proprietary data with data from third parties including Westwood Global Energy Analytics and Woodmac PetroView. This allows for spatial queries on licenses by round, aggregate reserves by operator, resource type and so on (i.e. all Oracle-style queries, apparently sans AI). It turns out that authoring dashboards is easy and Insights offers good enterprise IT integration. But is it really deployment-ready? For Jones the answer is... yes and no. It currently lacks reporting functionality and there are labelling issues as the solution moves away from full-blown GIS functionality. The close collaboration with advanced analytics can evolve though. Geospatial analysts are also data analysts. Insights can be a way of democratizing geospatial. Let the business ask the questions we don’t know to ask. Geospatial is a key lever in analytics and is ‘perfectly positioned’ to participate in the revolution. In the Q&A, Jones was asked how Insights compared with LogiInfo or Spotfire. These tools offer similar, but less spatial, functionality.
Rui Menezes provided an update on iGas Energy’s use of M-Files, comprising a ‘GIS and intelligent IM’ solution. A joint venture between Esri Finland and M-Files (also Finnish) has developed and embedded M-Files with ArcGIS Pro. Conversely, ArcGIS can now be launched from inside M-Files.
Advances in AI make searching historical data easier and more performant. Voyager Search helps connect the portals/dots, spanning SharePoint, ArcGIS Online, files and stuff in the cloud. This is achieved without moving or copying data, just by index building. Voyager adds ‘where’ to conventional ‘who, what, when’ search. This is achieved with a built-in machine learning-based enrichment platform. Natural language processing also ran. Voyager has, inter alia, indexed Reuters’ news articles and extracted location information from place names. More from Voyager.
Geocap’s Seismic Portal offers ‘Petrel to Esri’ integration, bringing seismic and well data from Petrel Studio to ArcGIS Pro. A subsurface 3D viewer pops up and displays geoscience data along with rig/platform and seabed kit.
Exprodat is helping clients migrate to Pro and showed off its skills with a drone video embedded in Pro. Pro was released in 2014 to replace ArcMap (which will run to 2023). It is ‘time to think about getting into Pro’. Exprodat offers migration services and customization with Python and SDK code to replace ArcObjects. More from Exprodat.
Conterra provides a security manager for ArcGIS that allows fine tuning of usage such that ‘everyone in the company uses what they need to do their job’. It can be hard to map all services to all users. ArcGIS Pro Server security provides service-level control. Conterra’s security manager extends the security concept with object/layer/spatial access control and intersecting spatial filters as required. Conterra also works on apps. The Conterra admin interface is integrated with the ArcGIS server manager. More from Conterra.
Woodmac PetroView for ArcGIS Pro provides intuitive data exploration along with business intelligence à la Spotfire. PetroView installs in parallel with Desktop and runs off the same data. Alternatively, the tool can run from inside Pro as a tab/add-in. PetroView performs impressive semi-intelligent queries across a big North Sea dataset. This seemed to show that North Sea production would be finished by 2027.
Thierry Gregorius (Getech) compered a Menti.com quiz by way of a closing ceremony. Menti allows the audience to ask free-form questions and vote on the answers. It was established that 24% were PUG first-timers, 11% had attended one event before and 66% were serial ‘many’ EPUG attendees. Esri was surprised to learn that in the parallel sessions there was a 10 to 1 preference for case studies over ‘technology’. Asked ‘Who is a data scientist?’, almost all answered ‘Something I’m aware of but can’t do myself’; none admitted to being a ‘true’ data scientist. Around half were part of a ‘digital transformation’, the remainder split equally between not being part of such a team and not even having such a team. Over half were moving to the cloud out of choice, the rest were either not moving or moving because they had to. There was a consensus that the key skills to learn today are Python and machine learning. Finally, a distinct preference was expressed for Paris as next year’s EU PUG location.
The presentations from the 2018 EU PUG are available here.
The Alberta Energy Regulator (AER) has named Gordon Lambert as interim President and CEO following Jim Ellis’ retirement. The board is now conducting a search for a permanent CEO.
Alexander Krane is now investment director at Aker ASA. David Torvik Tønne is Aker BP CFO. Lene Landøy is SVP strategy and business development.
Steve Sisneros has joined the American Petroleum Institute’s bipartisan advocacy team as VP external mobilization. Scott Parker is manager of rapid response. API has also hired Kevin Servick as central region campaign manager, Kenny Roberts as director of federal relations for midstream and Elia Quintana as director of stakeholder relations.
Mark Lomas is now director of Aqualis Offshore’s European operation. He hails from London Offshore Consultants. Simon Healy heads up the newly established office in Perth, Western Australia.
Former CEO of the Australian National Fabrication Facility, Rosie Hicks, has been appointed CEO of the Australian Research Data Commons.
Navin Mahajan is VP and treasurer at Chevron following the retirement of Randy Richards. Dale Walsh is to succeed retiree Joe Naylor as VP of Corporate Affairs. Debra Reed-Klages is now a member of the board of directors. Pierre Breber is to replace retiree Patricia Yarrington as VP and CFO. Mark Nelson is to succeed Breber as executive VP of downstream and chemicals. And Colin Parfitt, currently president of supply and trading, is now VP midstream.
The nonprofit Center for Sustainable Energy (CSE) has named Larry Goldenhersh as president. Mary McGroarty, current vice-chair is now chair of the board of directors.
Former New Zealand prime minister Helen Clark is to chair EITI, the extraction industries transparency initiative, replacing former Swedish prime minister Fredrik Reinfeldt.
Matt Hopkinson is the new EVP for the Oil and gas and infrastructure sector at Element Materials Technology. He hails from Bureau Veritas.
Equinor has opened two onshore support centers at Sandsli in Bergen. By 2021, all Equinor fields on the NCS will be supported by manned onshore centers in Bergen, Stavanger and Stjørdal.
Chairman Christopher Gaut is now CEO of Forum Energy Technologies, replacing Prady Iyyanki.
Elizabeth Wilkinson has joined Flotek Industries as CFO. She was previously with RGP. Matt Marietta has resigned as EVP of finance and corporate development and Richard Walton has stepped down as chief accounting officer. Walton stays-on as consultant to facilitate the leadership transition.
Tauseef Salma is VP Marketing and Technology at Flowserve. She hails from BHGE.
Joseph Herridge (Corning) and Dave Cunningham (Network Integrity Systems) are now members of the Fiber Optic Sensing Association (FOSA) board of directors. Mike Hines (OFS), Kent Wardley (Fotech Solutions) and JJ Williams (OptaSense) were re-elected for two-year terms. JJ Williams has been promoted to Chair.
Michael Jennings is now a member of FTS International’s board and will serve as chair of the audit committee.
Mark Richard, currently SVP of the Northern Region, is to succeed Jim Brown as president, western hemisphere at Halliburton. President and CEO Jeff Miller is now Chairman of the board of directors.
Hastings Equity Partners has hired Walter Pinto as senior advisor. Daniel Donaldson is to serve on the executive team of its Refractory Construction Services unit.
Beacon Offshore Energy is now a member of ‘HWCG’ (previously the Helix Well Containment Group), aka the Deepwater Subsea Well Containment Consortium.
Andrew Meyers is research director, oil and gas at IDC Energy Insights. He hails from Westwood Global Energy Group.
Philippe Bissat has been appointed global sales and marketing manager at IntOp.
The International Society of Automation has named Paul Gruhn (aeSolutions) as its 2019 president.
Robert Siahaan heads-up Katalyst’s new Subsurface Datacenter in Kuala Lumpur, Malaysia.
Marathon Oil president and CEO Lee Tillman is to succeed Dennis Reilley as chairman of the board of directors.
David Crombie has been promoted to COO and executive VP at Nine Energy Service.
The Open Geospatial Consortium has appointed Nadine Alameh as CEO and Bart De Lathouwer as president. Kumar Navulur (DigitalGlobe) and Frank Suykens (Luciad/Hexagon) are now members of the board of directors.
Sander Scholten is MD at Olyslager North America.
Stephanie Waters (Chevron) is the new president of PIDX, the Petroleum Industry Data Exchange. Wissam Kahoul (BHGE) is now an international ambassador.
Lisa Madden (ExxonMobil) and Adriaan den Herder (Nederlandse Gasunie) are now PRCI’s executive board members.
ProStar Geocorp has hired Darrell Williamson as chief sales officer. He hails from FyrSoft.
Gene Austin has been appointed President and CEO at Quorum Software. He was formerly Bazaarvoice’s CEO.
Preem founder, Mohammed Al-Amoudi, has been released from jail in Saudi Arabia. He was among the businessmen and political leaders rounded-up in November 2017 as part of an alleged corruption ‘purge’.
Radiflow has named Michael Langer as its new chief product officer. He hails from the Israeli Defense Forces where he founded and headed the IDF’s cyber defense operations.
Bryan Preston is now director of public affairs at the Texas Railroad Commission. Jason Modglin is the new director of public affairs for the Chairman’s office.
Olivier Le Peuch has been promoted to COO of Schlumberger.
Lisa Graham is to manage Seeq’s Analytics Engineering team.
Wael Sawan is now Shell’s upstream director succeeding Andy Brown. Neil Carson has been appointed director.
Craig Muir is to succeed Christian Brown as SNC-Lavalin president, oil and gas. He hails from Petrofac.
Bernd Gross assumes the position of CTO at Software AG succeeding Wolfram Jost. Stefan Sigg is chief product officer and Paz Macdonald, CMO.
Blockchain Venture for Oil Trading, VAKT has named Etienne Amic as its new CEO. He hails from Vortexa.
Altair has appointed Amy Messano as CMO and Ubaldo Rodriguez as SVP, Global Sales.
Brendan Warn is Senior VP, investor relations at Total. He hails from BMO Capital Markets.
Total has opened a new research and innovation center at France’s Ecole Polytechnique cluster in Saclay, near Paris.
Alexandra Pruner and Michael Grimm are now independent directors at Anadarko.
Oliver Ratzesberger is now President and CEO at Teradata.
Helge Beuthan is to succeed Uwe Salge as General Manager at Wintershall Middle East. Wintershall has expanded its state-of-the-art digital technology center at its site in Barnstorf, Lower Saxony.
Inpex has appointed Osamu Nozaki, Wataru Nojiri and Junichi Ishihara to various general manager/senior development coordinator level appointments on its Ichthys project. Seiya Ito is Ichthys project SVP, Yutaka Shimura is GM planning and coordination at the America and Africa division and Hitoshi Okawa is VP Ichthys and president of the Australia unit. The company is also to establish a wholly-owned research and technical subsidiary in Japan.
Yokogawa has appointed Hitoshi Nara as representative director and president and Takashi Nishijima as representative director and chairman.
The University of Oxford has launched a new MSc in Energy Systems.
Tracy Shimmield is now director of the Lyell Centre, a purpose-built £21m facility and joint venture between the British Geological Survey and Heriot-Watt University.
Mimi Stansbury has been promoted to OFS Portal SVP of Finance and Administration and will serve as a strategic advisor to the CEO.
Petrosys USA and Canada have recently moved into new premises and migrated most of their IT infrastructure to the Cloud.
PGS has opened a new office in Brazil.
Digital Plant Engineering is now a member of USPI and will also join the CFIHOS project. Ron Bouman will represent DPS on the management board of USPI.
Steve Laubach is now editor of the Journal of Structural Geology for North America succeeding Bill Dunne who remains in an advisory role.
eLynx Technologies has promoted Scott Haven to chief business development officer. Ryan Richison is CIO.
Guido Gabriel Piccone is now ExproSoft’s well integrity specialist.
Christian Pedersen joins IFS as chief product officer. He hails from SAP.
Bernhard Eschermann (ABB), Juergen Weinhofer (Rockwell Automation), and Fabrice Jadot (Schneider Electric) are now members of the OPC Foundation board of directors.
Here Technologies is now an OGC principal member.
Charles Nadeau is now CFO with Sewall.
Karen Keegans is now Rockwell Automation’s SVP, human resources.
Mercell Holding has hired Arild Nilsen as chief product officer. He hails from Schibsted. Fredrik Eeg is CFO. He was previously with Creuna.
View Software has named Sten-Roger Karlsen as CEO replacing Pål Einar Berntsen.
Roberto Castello Branco is the new CEO of Petrobras.
Matt McKinley is to lead global business operations at Coreworx.
The Carnegie Mellon Software Engineering Institute has named Tom Longstaff as CTO. He previously served with the ‘Asymmetric operations’ sector of Johns Hopkins University’s Applied Physics Lab.
Data release:
A 120 GB data set from the UK North Sea’s Greater Buchan Area is now available to download and/or order on media from CDA’s UK Oil and Gas Data system.
Deaths:
‘Fickert sheet’ creator Bill Eugene Fickert died on November 20th at the age of 94.
Geomechanical ‘trailblazer’ Arvid Johnson died in May 2018 at the age of 79.
Peter Benson has authored a 12-page ECCMA white paper, an overview of ISO 8000 and its role in data quality improvement. ISO 8000 sets out to assure data quality and portability between systems by correct ‘syntax and semantic encoding’. By requesting ISO 8000-compliant data, users of proprietary software can protect themselves from data ‘lock-in’. Another facet of ISO 8000, the Part 115 quality tag, adds a prefix to data items to disambiguate between different data sources. Companies can issue identifiers for their products or services and register them at the ECCMA SmartPrefix registry. A similar function is available to register legal entities.
The Linux Foundation has announced the LF Edge, a unified open source framework for the internet edge. LF Edge includes Akraino Edge Stack, EdgeX Foundry and the Open Glossary of Edge Computing, all formerly stand-alone LF projects. The LF Edge also includes a new, ‘agnostic’ standard edge architecture from Zededa.
The Open Geospatial Consortium has unveiled a suite of OGC validation tools, available on the OGC Beta Validator website. A new web feature service test suite is also available along with an updated GeoPackage.
The PPDM association is to release Version 3.10 (three dot ten) of its flagship Public petroleum data model, with support for completions, fracking and water management. The latest release also aligns PPDM’s coordinate reference systems with the authoritative EPSG dataset. Units of measure have been harmonized along with an improved data management capability.
The technical standards committee of the Society of Exploration Geophysicists (SEG) convened in Anaheim, CA last October. The TSC is now a part of the SEG’s ‘constituent engagement portfolio’ which is being put together to attract more activity and engagement. SEG-Y R2.0 has seen adoption since its January 2018 release, with Chinese and Norwegian regulators now requesting data in this format. Exxon and Shell have also expressed interest. But more generally, standards adoption is challenging, and a great deal of effort is required to ensure that the major vendors are aware of the standards and moving towards adoption. The TSC is also working with IOGP on common ground for the SPS and P1/11 navigation formats. A Repsol-led group of operators is looking at blockchain for seismic data. While ‘this may make sense’ to track ownership and contracts (qv Data Gumbo), the TSC does not believe that logs, positioning and seismic belong in the blockchain. The TSC is looking at connections between its own standards and the HDF5 format of Energistics’ Resqml. Finally, the TSC observed that it would be a good idea if the SEG’s own SEAM R&D group leveraged the latest SEG standards. More from SEG.
The Open Geospatial Consortium (OGC) has signed a memorandum of understanding with the newly-formed World Geospatial Industry Council (WGIC) to collaborate on the promotion of geospatial and location-based technologies to government, markets, and industries worldwide. The WGIC was set up to ‘create a common voice’ for the geospatial industry in its dialog with government and industry communities. Initial focus is on interoperability challenges between the geospatial and architecture, engineering and construction (AEC) markets. More from WGIC.
The European Securities and Markets Authority is introducing inline XBRL as the mandatory new reporting format for reporting periods commencing 1 January 2020. The effort includes a complete translation of the IFRS taxonomy into all 23 official languages used in the EU. More from Europa.
The XBRL organization has released a new ‘open information model’ along with an xBRL-JSON representation. The ‘syntax-independent’ model enables transformation between different formats, including XML, CSV, JSON and relational databases.
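To give a feel for the model, here is a single hypothetical fact in the draft xBRL-JSON shape, built and serialized in Python. Field names follow our reading of the draft open information model and may differ in detail from the published spec; the LEI and values are invented.

```python
# An indicative sketch of one fact in a draft xBRL-JSON-style representation.
# Field names and layout are our approximation of the draft spec.
import json

fact = {
    "facts": {
        "f1": {
            "value": "1234000",
            "dimensions": {
                "concept": "ifrs:Revenue",           # reported concept
                "entity": "lei:549300EXAMPLE00001",  # hypothetical LEI
                "period": "2020-01-01T00:00:00/2021-01-01T00:00:00",
                "unit": "iso4217:EUR",
            },
        }
    }
}

# The same fact serializes trivially to JSON; the dimension keys map
# one-for-one onto columns in a CSV or relational representation.
print(json.dumps(fact, indent=2))
```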
ANSI, the American National Standards Institute, and the International Society of Automation (ISA) have adopted FDT 2.0 as a standard for ‘open’, enterprise-wide device integration. FDT 2.0 will enable integration across various field buses and devices and ‘promote widespread implementation of standards-based automation solutions’. Regarding ‘open’, FDT 2.0 is a Microsoft .NET-only technology. Members include Chevron, Emerson, GE, Shell, Schneider Electric, Siemens and Yokogawa. More from FDT Group.
CII, the Construction Industry Institute, has inaugurated a pilot research program devoted to enhancing advanced work packaging.
What a wonderful picture the ‘Internet of Things’ conjures-up. Imagine being able to assemble information from different vendors' sensors at remote locations, and mash all the information up into your own monitoring, big data, artificial intelligence system or whatever. This is the sort of dream that gets the chattering classes going and has kicked off several attempts to occupy the standards landscape. It has also driven massive re-badging of vendors’ and integrators' offerings to 'align’ with the ‘emerging’ IoT (or the Industrial IoT if you like).
Before this quick run-through of some of these standards and solutions it is worth reflecting on exactly what it would take to achieve IoT nirvana. If you want to grab some data from a remote device, you need to know a lot more than just the ‘value’. Knowing the units of measure would be good too. In fact, there are all sorts of other bits of metadata (the sensor’s position, the local time, the instrument’s dynamic range, calibration status and other idiosyncrasies) which may be essential to a proper, unambiguous use of the data value. Some aspects of the newer standards do indeed allow for more metadata than previously, but all IoT solutions come with a significant gotcha. The more metadata you need and get, the bigger the message. At the receiving end, managing data from vast arrays of different sensors will create a significant processing overhead. The IoT is not magic.
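The trade-off is easy to quantify. The following back-of-envelope Python sketch compares a bare sample with a metadata-rich one; the payload layouts are our own invention, not any particular IoT standard.

```python
# A back-of-envelope illustration of the metadata overhead discussed above.
import json

bare = {"id": 4711, "v": 87.3}  # just a tag id and a value

rich = {
    "id": 4711,
    "v": 87.3,
    "uom": "degC",                       # units of measure
    "ts": "2019-02-01T06:30:00Z",        # timestamp
    "pos": {"lat": 58.97, "lon": 5.73},  # sensor position
    "range": [-40.0, 150.0],             # dynamic range
    "calibrated": "2018-11-15",          # last calibration
}

for name, msg in (("bare", bare), ("rich", rich)):
    print(name, len(json.dumps(msg).encode("utf-8")), "bytes")

# The 'rich' message is several times larger; at tens of thousands of
# samples per second that difference is network load, not pedantry.
```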
For the sake of this introduction to the IoT, we propose a three-tiered analysis: political, standards and protocol. At the political level, politicians, egged-on by the IT consulting community, dream-up a rosy picture of the merits of a super IoT ‘platform’. The standards bodies, often populated by vendors with a large installed, proprietary base to which disruption may be awkward, come up with something that sounds nice, but that is unlikely to cause them too much trouble. Finally, the skunk works artists and interoperability zealots look down into the protocol landscape and see if they can find anything relevant to their own needs.
One IoT birthplace is Germany’s Industrie 4.0, a politically-inspired attempt to sell ‘digitalization’ into an already highly digital market, the factory. The idea was that instead of having a lot of competing digital protocols on the factory floor, one overarching Industrie 4.0 system would allow manufacturers to be more ‘competitive’. The organization has defined Rami, the Reference architecture model for Industrie 4.0. But, from the latest ‘official’ Industrie 4.0 publication, it appears that the initiative is still struggling with proprietary IoT platforms from SAP, Bosch and others, even if these folks manage to get together for tradeshow interoperability demos. The I4.0 conclusion and outlook is ‘(today) few platforms are designed to make full use of the advantages and opportunities that platform-based business models offer. Most initiatives do not fully harness the power of the network effect. The (proprietary) platforms shield themselves from potentially harmful competition. The prevalent mindset is one of ‘platform protectionism’ and risk aversion’. That’s telling it like it is.
In parallel with Industrie 4.0 and its Rami architecture, a large group of (mostly) US companies formed the Industrial Internet Consortium to ‘bring together the organizations and technologies necessary to accelerate the growth of the Industrial Internet by identifying, assembling and promoting best practices’. Like Industrie 4.0, the IIC eschews anything that could be considered a communications protocol. So, nothing is ever going to ‘run’ on either Industrie 4.0 or the IIC. Early in 2018, the two bodies ‘announced the publication’ of a joint whitepaper, ‘architecture alignment and interoperability’, detailing ‘alignment’ of the two reference architectures*. A late arrival at the IoT ball is the World Wide Web Consortium (W3C), which has put forward its RDF/Linked Data work as ‘important to the field of graph data’ and the ‘web of things’. The W3C is working on a standardization effort to use graph databases as ‘an important enabler for the IoT, big data, data management and data governance’. The International Organization for Standardization has also got onto the IoT bandwagon with a 77-page ‘reference framework’ from its ISO/IEC JTC 1/SC 41 technical committee. The ISO reference architecture proposes a ‘common vocabulary, reusable designs and industry best practice’. The framework is available from the ISO Store, a snip at CHF198.
* We have been skeptical about reference architectures since we first encountered Mura, the Microsoft upstream reference architecture which seemed at the time to be a rather nebulous marketing concept.
Digging down into the IoT we eventually come across some communications protocols of interest. According to ARC Advisory, OPC UA is ‘well positioned’ as a basis for IoT solutions. OPC Unified Architecture is a platform and vendor-independent communication technology for secure and reliable data exchange across the different levels of the automation pyramid. We discussed the extent to which OPC UA supports unambiguous exchange of metadata with Matrikon France’s Antoine Capitaine, who confirmed that it is indeed possible to send units of measure and other metadata over an OPC UA network although, when tens of thousands of measurements are being broadcast, ‘there is not much point overloading the network with this information for each sample’. Metadata will more likely be recorded in configuration files. This is easier to imagine in a factory context, but it will limit data interoperability between networks. ARC is probably right in that OPC UA has application in the factory (and perhaps in the drilling factory) but it has not been the main contender for IoT-enablement in our reporting to date.
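For the do-it-yourselfers, reading a tag and its units over OPC UA is a few lines of Python. A minimal sketch with the open source python-opcua client follows; the endpoint URL and node id are hypothetical, and we assume the server exposes the tag as an AnalogItem carrying the standard EngineeringUnits property.

```python
# A minimal sketch using the open source python-opcua client.
from opcua import Client

client = Client("opc.tcp://plc.example.com:4840")  # hypothetical endpoint
client.connect()
try:
    tag = client.get_node("ns=2;s=Separator.Pressure")  # hypothetical node id
    value = tag.get_value()
    # Units of measure travel as an EUInformation structure on a child
    # property, not with every sample - the point made by Matrikon above.
    eu = tag.get_child(["0:EngineeringUnits"]).get_value()
    print(value, eu.DisplayName.Text)
finally:
    client.disconnect()
```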
Curiously, the IoT protocol that seems to have attracted most attention in the oil and gas space, at least in the US, is the venerable Mqtt (Message Queuing Telemetry Transport) spec. Venerable, because it was introduced by IBM back in 1999 and later offered to the Oasis standards body. In a report by Inductive Automation from the 2018 ICC conference, Mqtt co-inventors Andy Stanford-Clark and Arlen Nipper described how end users were ‘frustrated’ with proprietary automation systems that are hampering innovation. Mqtt is presented as an open source response to these proprietary systems with development support from the Eclipse Foundation. Mqtt is a pretty low-level spec which is generally augmented for IoT use with the Eclipse Tahu (formerly Sparkplug) platform. Tahu ‘addresses the existence of legacy SCADA/DCS/ICS protocols and infrastructures and provides a much-needed definition of how best to apply Mqtt into these existing industrial operational environments’. Tahu has been productized by, inter alia, Ubuntu as an IoT gateway framework.
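At the wire level, eavesdropping on a Sparkplug namespace is straightforward. A minimal sketch with the open source paho-mqtt client follows; the broker address and group id are hypothetical, and real Sparkplug payloads are protobuf-encoded, so decoding them would need the Eclipse Tahu libraries.

```python
# A minimal sketch of listening in on a Sparkplug-style Mqtt namespace.
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    # Sparkplug topics follow spBv1.0/<group>/<msg type>/<edge node>[/<device>]
    print(msg.topic, len(msg.payload), "bytes")

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.com", 1883)  # hypothetical broker
client.subscribe("spBv1.0/oilfield1/#")     # all message types for one group
client.loop_forever()
```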
Now this is getting interesting and it is indeed what we have observed at earlier Wellsite Automation events as process control engineers build their own hooks into their scada systems with the technology. But if you follow the Mqtt/Sparkplug breadcrumb trail you get to even more interesting stuff, if, that is, you are of the do-it-yourself disposition. Once you can connect to your expensive proprietary control systems you will probably want to build and deploy some of your own sensors. This is now ridiculously easy with devices costing a few tens of dollars, notably the Raspberry Pi, today’s ubiquitous Internet ‘edge’ device. The ‘edge’ implies a center, i.e. the cloud, and it is indeed the promise of cloud connectivity that has brought these skunky IoT projects along so far and fast. As an example, visit Walter Coan’s Hackster page which shows how to read data from a PLC right into the Azure cloud. Another is BigClown’s kit to build your own IoT devices such as a motion detector, climate monitor, flood detector and more.
So where does all of the above leave oil country process control standards initiatives such as ExxonMobil/The Open Group’s OPAF and Saudi Aramco’s ‘me-too’ Process Automation System PAS? OPAF sounds more like a reference architecture; it was described to Oil IT Journal as a ‘standard of standards’ that ‘will not duplicate work where useful standards already exist’. OPAF has just provided an update on its developing spec which we will be reporting on in our next issue.
In some ways, the advent of cheap processing power à la Raspberry at the edge may make the need for standard communications protocols less pressing. If a puck-like device can just get data from the control system into the cloud, then the protocol used in the moving is maybe not so important. There is one other consideration here. Industry’s love affair with open source has come a bit late in the day, as Microsoft now owns GitHub and IBM has snapped up Red Hat. And while Google and Amazon may push out lots of code into the open domain, they are still … err … Google and Amazon. It is hard to find open source stuff today that is not in one way or another an invitation to be locked-in to a commercial provider. Likewise, all that ease of getting your stuff into the cloud may herald a shift from ‘proprietary’ control systems to a golden handcuff tying you to your cloud provider. Of course, if you are using Azure or TensorFlow, the handcuffs are already on!
BGS, the British Geological Survey, has issued ‘3D GWV’, a 3D groundwater vulnerability assessment program to screen groundwater risks beneath oil and gas sites in England.
IOGP, the UK-based International organization of oil and gas producers has published the following:
IOGP Report N° 456 (Second Edition, 2018, 88 pages), Process safety, recommended practice on key performance indicators for the upstream oil and gas industry.
IOGP Report N°485 (November 2018), Standards and guidelines for well integrity and well control, a four-page enumeration of relevant standards.
IOGP Report N°594 Source control emergency response planning for subsea wells, a guide to mitigating and managing blowouts and spills.
IOGP has also announced project Safira, a focal point for its current safety initiatives in process, aviation, transport and life-saving rules.
Accenture’s Elfije Lemaitre blogged recently on the impact of digital investments on refineries and disaster management. Digital technologies can help refiners reduce refinery downtimes and optimize supply strategies to limit the disruptions caused by major weather events. Digital salvation comes from real-time feeds, digital tools for maintenance planning of damaged assets and digital investments to improve the agility and response times across the supply chain following a weather disruption.
The AFPM reports that modern computer-based controls of hazardous energy (e.g., mechanical, electrical, pneumatic, chemical, radiation) conflict with the US OSHA*’s lock-out/tag-out standard. The Agency has issued an RFI to help understand the strengths and limitations of the new technology, and its potential risk to workers.
* Occupational Safety and Health Administration.
The Gulf Research Program of the National Academies of Sciences is to award up to $10 million in new funding to support research projects that will advance understanding and facilitate improvement of safety culture in the offshore oil and gas industry. The monies come from the NAS’ Safer offshore energy systems initiative, following a 2018 workshop on ‘The human factors of process safety and offshore worker empowerment’. More from National Academies.
High Alpha has announced ‘Anvl’, its cloud-based software that is to ‘reinvent’ workforce safety for the digital age. Anvl captures environmental, process and sensor data, to enable point-of-risk intervention and a targeted safety experience for high-risk maintenance and service environments. Anvl is currently under test at the diesel behemoth, Cummins.
UK-based Oil Spill Response has signed a strategic cooperation agreement with TechnipFMC unit Genesis Oil and Gas Consultants to support its Subsea well intervention service (Swis) members who now have access to Genesis’s expertise in subsea systems engineering, installation, operations and computational fluid dynamics. The agreement follows from the work of the IOGP (above) and the Subsea Well Response Project consortium.
The Lloyd’s Register Foundation’s new ‘Discovering Safety’ program sets out to educate companies in global risk control and to bring a ‘step change’ in health and safety performance. Discovering Safety includes accident and incident investigation information accrued by the Health and Safety Executive, the UK health and safety regulator, over the last 40 years. More too from The Ashton Institute.
E2S Warning Signals has announced that its STExCP8 manual call points are SIL2 compliant to IEC 61508, allowing system integrators greater confidence when designing systems requiring an enhanced safety integrity level. The devices are part of the STEx family of explosion/flameproof audible and visual signaling devices for installation in harsh onshore and offshore environments.
Kongsberg Digital’s new K-Sim Safety solution trains officers in advanced firefighting with an interactive 3D walk-through animation of a tanker’s engine room and upper decks. KSS is based on a detailed 3D representation of a 152,000 tonne double hull Suez Max crude oil carrier with 7 decks. Training is in accordance with the STCW Code requirements. More from Kongsberg Digital.
DNV GL has certified Noble Energy for compliance with the SEMS (safety and environmental system) standard from the Center for offshore safety, an API RP 75-compliant standard for best practices in oil and natural gas drilling and production operations. More from DNV GL.
Equinor has acquired Epsis TeamBox for its ‘Geo Operation Center’ in Bergen.
Norwegian industry body EPIM has signed new frame agreements for its JQS Audit offering, a supplier register and qualification service for buyers in the Norwegian and Danish oil and gas sectors. New JQS members are Antenor AS, DNV GL and Wergeland Bedriftsutvikling.
Records management specialist Gimmal is to add AI-driven enterprise search from BA Insight to its eponymous solution.
GuildOne and R3 have teamed to provide ‘blockchain technologies’ on Amazon web services to the oil and gas sector. More from GuildOne.
The Dexpi P&ID standards body is to work with Korea’s Kyungpook National University on an image recognition project to create ‘intelligent’ P&ID data from scanned P&ID images.
Retailer Parkland Fuel has selected Metegrity’s Visions V5 enterprise inspection data management software. MVE has replaced Meridium at Parkland’s recently-acquired Burnaby refinery.
PetroDE and Western Land Services are integrating their land management and geospatial analytics software.
Read Cased Hole has signed an exclusive agreement with Advanced Logic Technology to develop and market ALT’s ABI-43 acoustic borehole imaging technology.
4Subsea and Ashtead Technology have signed a partnership agreement for global distribution of advanced sensor technology to the oil and gas industry.
CGG and OMV have extended their partnership for another three years to operate a dedicated center at OMV’s head office in Vienna. OMV will continue to benefit from onsite access to CGG’s subsurface imaging and reservoir characterization expertise and technology.
AspenTech’s Aspen Mtell software has been selected and deployed by Southeast Asia’s integrated petrochemical pioneer, IRPC PLC.
Bluefield Technologies has obtained funding from Village Global, an early-stage venture capital firm backed by successful entrepreneurs including Jeff Bezos, Mark Zuckerberg and Bill Gates, in the wake of a recent demonstration of its airborne optical methane leak detection system.
Engie is to implement Dell Boomi’s iPaaS (integration platform as a service) to ‘reinvent' its IT environment and accelerate its digital transformation.
Element Materials Technology has secured a three-year outsourcing agreement with TechnipFMC to provide on-site non-destructive testing services at its umbilicals manufacturing site in Newcastle upon Tyne, UK.
Energy Services Group has teamed with Allegro to provide oil and gas pipeline management and storage operators with expertise and automated solutions, including internet and EDI solutions from ESG’s Latitude unit.
ExxonMobil and IBM are to research and develop next-generation energy and manufacturing technologies using quantum computing. ExxonMobil is the first energy company to join the IBM Q Network.
Halliburton has signed two contracts with Eni to provide integrated drilling services at Eni’s Zubair Oil Field in Southern Iraq.
National Oilwell Varco has signed an initial, extensible three-and-a-half-year agreement with Equinor to provide its IntelliServ wired drillpipe across Equinor’s fleet of offshore drilling rigs.
HWCG has commissioned Magma Global on the ‘offset flexible riser’ project, an M-Pipe-based emergency well containment riser.
Infinite Software has been awarded a contract to migrate legacy AS/400 applications to an Oracle database on Linux from ‘one of the world’s largest’ oil and gas companies.
Woodside has awarded Jacobs Engineering a three-year brownfields engineering and procurement contract for the Karratha Gas Plant in Western Australia.
Aqualis Offshore has signed a five-year services agreement with McDermott to provide engineering reviews and marine warranty services for the company’s offshore operations in the Asia Pacific region.
Merkle Aquila has provisioned an Azure data lake for Total UK and has teamed on the development of cloud-based data analytics tools and processes to support automated reporting, improved analysis and ML use cases.
Chevron, Total and Reliance Industries have joined the VAKT Consortium to champion VAKT’s blockchain-enabled platform.
Novum Energy Trading Corp has deployed Infor SunSystems FMS (implemented by TouchstoneEnergy) to support its trading activities of refined oil products.
Petrofac has launched a new technology ‘Innovation Zone’ in Aberdeen, in collaboration with Accenture and its Industry X.0 offering.
SCA has entered into a joint marketing agreement with IHRDC to provide a ‘full menu’ of learning resources for oil and gas industry clients.
Siemens and Aker Solutions have partnered to create software applications and joint service offerings, including the development of industrial digital twins, and to further develop specific offerings for the oil and gas sector based on Siemens’ Comos engineering platform.
Siemens and Bentley Systems have launched a joint technology and service solution, hosted on Siemens’ MindSphere, to leverage IoT and digital twins for power plant owners.
Petoro has awarded Schlumberger a two-year software as a service contract for the provision of the Delfi cognitive E&P environment, as well as use of the Eclipse and Intersect reservoir simulators. Dyas Norge has awarded Schlumberger a similar contract for use of Delfi on its Fogelberg gas discovery on the Norwegian Continental Shelf.
The Subsea Integration Alliance, a worldwide non-incorporated partnership between OneSubsea, Schlumberger and Subsea 7, has been awarded an integrated subsea engineering, procurement, construction, installation and commissioning contract by Esso Australia.
Tape Ark and Seagate Technology have partnered for a data recovery project using ‘AI-driven technologies’.
TechnipFMC, in consortium with Malaysia Marine and Heavy Engineering, has signed a six-year agreement with Saudi Aramco for engineering, procurement, fabrication, transportation and installation of offshore facilities.
P2 Energy Solutions and ThoughtTrace have partnered to release a ‘Lease obligation intelligence’ solution for the oil and gas industry.
Uptake and Element are teaming up to develop an end-to-end AI solution to automate data integration, data science model configuration and the production of financially optimized insights for industrial businesses.
WEX has been awarded a contract to develop, support and manage a fuel card processing platform for Z Energy in New Zealand and will process Z Energy’s fuel cards over the next 5 years.
DNV GL is to pilot its Marv ‘multi-analytic’ risk visualization tool for SoCalGas. Marv monitors external corrosion and potential third-party damage to natural gas transmission pipelines.
Shell has chosen Wood and KBR to perform front-end engineering and design for the Crux field development and tieback to the Prelude floating LNG vessel.
Woodside has awarded four contracts to McDermott Australia, Subsea Integration Alliance, Saipem Australia, and Intecsea for front-end engineering design activities at the proposed Scarborough project. Woodside has selected Bechtel for FEED on its Pluto Train 2 Project.
Dubai-based ENOC and BHGE are to co-develop VitalyX, a real-time lubricant monitoring system.
BHGE has partnered with Petroleum Development Oman to open its first artificial lift systems assembly and repair facility in Oman to support PDO’s operations and other customers in the region. BHGE is also to build a state-of-the-art oilfield services facility in King Salman Energy Park to serve Saudi Aramco and the region.
Atos has delivered its Quantum Learning Machine, the ‘highest performing’ quantum simulator in the world, to the Science and Technology Facilities Council’s Hartree Centre in Warrington, UK.
Specialized Petroleum Technologies is now APS’ authorized channel partner and representative in the Republic of Azerbaijan.
Ikon Science has acquired Perigon Solutions, a wellbore data management and visualization specialist. Perigon’s iPoint solution adds a knowledge management component to Ikon’s RokDoc portfolio. The acquisition comes with ‘clear market, revenue and cost synergies’. Perigon’s technology will allow Ikon’s clients to leverage legacy rock physics and reservoir work and ‘derive quantitative, data-driven insights from past projects, analogues and global teams’ according to Ikon CEO Mark Bashforth. More from Ikon.
Rockwell Automation and Schlumberger have announced ‘Sensia’, aka the oil and gas industry’s ‘first*’ fully-integrated automation solutions provider. Sensia combines Rockwell’s control and information solutions with Schlumberger’s technology for reservoir characterization, drilling and production. Schlumberger will receive a $250 million payment from Rockwell at closing, funded by cash on hand, after which Sensia will operate as an independent entity, with Rockwell owning 53% and Schlumberger 47% of the joint venture. The new unit is expected to generate $400 million annual revenue with around 1,000 employees. Sensia CEO is Rockwell’s current CTO Allan Rentcome.
* The folks in Emerson/Paradigm may beg to differ.
Accenture has acquired Houston-based software house Enaxis Consulting, a provider of digital transformation services to the oil and gas vertical. Enaxis’ expertise includes data science and agile project delivery with ‘agile, safe and scrum’ methodologies. Enaxis’ annual leadership forum is billed as an invitation-only event for C-suite executives, futurists and academia.
ControlSoft has acquired eSimulation along with its joint venture partner, DxT3. eSimulation is a specialist in cloud-based optimization and operations management software for midstream natural gas gathering and processing companies.
DrillingInfo has acquired Cortex, a provider of accounts payable and receivable automation solutions for the oil and gas industry. DI has also acquired MineralSoft, a software platform for the management of mineral, royalty and non-operated working interests.
EnergyIQ has acquired PetroWeb’s EnterpriseDB, Navigator and Gateway Platforms. The software assets will integrate EnergyIQ’s Trusted Data Manager offering, adding spatial analytics to EIQ’s IQInsights search and visualization platform. Integration will be facilitated since both platforms are based upon the PPDM data model and have a similar architectural approach, according to Gina Godfrey, PetroWeb founder and CEO. PetroWeb’s data indexing and cataloging business continues with its GlobalSeismicLibrary portal and soon, a new well data product with tests, stratigraphy, geochemistry, cores and more from some 7 million wells across 100 countries.
Equinor has signed agreements funding basic research in five Norwegian universities and NHH, the Norwegian school of economics, with a five-year endowment of NOK 315 million. The largest funding is a NOK 19 million per year deal signed with NTNU.
VC Silver Lake is to acquire a majority stake in GE’s ServiceMax field service management software. ServiceMax was acquired by GE Digital in 2016. The companies are to ‘continue to advance’ integration of Digital’s ‘Predix’ asset performance management suite and ServiceMax.
Halliburton has acquired SmartFibres, a developer and manufacturer of downhole fiber optic pressure gauges. SmartFibres will integrate Halliburton’s production enhancement portfolio.
Hexagon has acquired J5 International, a supplier of operations management software for industrial sites. J5’s solutions are used by oil and gas customers to replace the ‘troublesome’ mix of paper, spreadsheets, databases and other scattered manual data collection methods.
Graph database boutique Neo4j has closed an $80 million series E funding round led by One Peak partners and Morgan Stanley. The company has now raised some $160 million.
OpenText has acquired Catalyst Repository Systems for $75 million. The deal follows its 2018 acquisition of Liaison Technologies.
Osprey Informatics, a provider of visual monitoring solutions for oil and gas, has raised $3.75 million from Shell Ventures, Evok Innovations and InterGen capital.
Chinese oilfield process control specialist Recon Technology has received a letter from the Nasdaq warning that it did not meet the minimum bid price requirement. Recon now has until July 15, 2019 to regain compliance (and avoid delisting) by maintaining a closing bid price of at least $1.00 for ten consecutive business days.
Seismos, a provider of real-time frac treatment and frac performance evaluation from acoustic flow metering, has secured $10.5 million equity financing led by Quantum Energy Partners.
Seitel has sold its Canadian seismic data library to Pulse Seismic in a CDN $53 million (plus an up to $5 million earn out) deal.
Risk management software house Sphera Solutions has acquired Aberdeen-based Petrotechnics, a provider of operational risk software for hazardous industries. Petrotechnics’ electronic permit to work functionality will integrate the SpheraCloud risk management and mitigation solution.
Houston-based Total Safety has acquired Clairmont, Alberta headquartered Vantage Safety Services, a ‘full-service’ provider of safety services to the oil and gas industry.
UniSea, a provider of software tools for operational support and HSEQ to the offshore oil and gas industry, has taken part ownership of Yxney, a ‘data-driven’ specialist in maritime energy efficiency.
Weatherford International has sold its surface data logging business to Excellence Logging for $50 million cash. Weatherford has also received written notice from the New York Stock Exchange that it is not in compliance with the NYSE’s continued listing standard. The company has six months to ‘cure’ the $1.00 per share deficiency.
Flow measurement specialist, Loveland, Colorado-based Western Energy Support & Technology has acquired ‘substantially all’ the assets of Houston-based L-K Industries, a manufacturer of oil centrifuges and petroleum sampling and measurement equipment.
Rockwell Automation has acquired Emulate3D, developer of automation systems simulation software.
Resman AS is to acquire Restrack, a tracer service provider, in a share purchase agreement.
Dassault Systèmes is to acquire the Elecworks electrical and automation design software product line from Trace Software.
The US Federal Energy Regulatory Commission is to transition from its current Visual FoxPro reporting software, now unsupported by the developer, to receiving reports in XBRL, the extensible business reporting language. Natural gas reporting will be impacted by the change.
The executive committee of the National Data Repositories (NDR) organization, comprising the Irish and four North Sea regulators is taking over the organization of the 2019 gathering from Energistics, which has organized the event since 2009. NDR was previously driven by an Energistics work group. The 2019 meet will be hosted by TNO in Utrecht, Netherlands. Energistics has ‘asked to participate’.
The Railroad Commission, the Texas regulator, has issued a guide for users of its W3A notice of intention to plug and abandon. There is even a video. The RRC has also launched RRC Online inspection lookup (RRC OIL), an online database of oil and gas inspection and enforcement data. RRC OIL enables public, statewide search of inspection and enforcement information, including notices of violation and intentions to sever leases.
Ryder Scott reports that the SEC may relax its reporting rules ‘to allow producers to report more reserves’. The current ‘five-year’ rule is considered by some to be hurting shale producers. At issue is the $833 billion owed by oil and gas producers to lenders ($400 billion by year-end 2019) and the impact of reportable reserves on loan terms.
Transport Canada has awarded an equivalency certification for Quantum Fuels’ ‘virtual pipeline’ trailers, allowing the compressed natural gas trucks to be registered and used in Canada. The high-capacity trailers carry up to 638 MCF at 5,000 psi, with trailer weights that can exceed 80,000 lbs. More from Transport Canada.
Chairmanship of the Canadian Western Regulators Forum (WRF) rotates to the Saskatchewan Ministry of Energy and Resources in April 2019. The WRF board has developed a strategic plan and is currently working on incident reporting and on the use of standards in regulation, indigenous engagement, transparency and pipeline regulation.
The 2018 Energy Conference Network Blockchain in Oil and Gas conference took place in Houston late last year. Blockchain appears to be gaining traction in oil and gas and indeed in other verticals. The ECN event proved to be a showcase for a variety of proofs-of-concept of the technology and gave some indications of how the blockchain community plans to link the secure exchange of a digital token with a real-world commodity or asset. Oil IT Journal readers will recall that we have expressed skepticism in regard to the logic behind this (see our 2018 editorial) and doubted that a digital token can ever be entirely and unambiguously attached to a real-world item in the face of serious attempts to cheat the system. We read through these presentations with interest to see if our skepticism would be assuaged.
Many presenters walked through the merits of blockchain. Terrahub’s Dan Giurescu summed these up as a way of stopping tampering with data, since writing is ‘append only’. Old data can be made obsolete but not forgotten, leading to auditability and ‘bootstrapped’ trust, all without a central authority. A blockchain can be stored in different places, providing ‘disintermediation’, ie removing the need for trusted intermediaries between untrusting parties, and automating contract fulfillment. Giurescu enumerated the vast number of exchanges in existence, both private (permissioned) and public, referring to Matteo Gianpietro Zago’s commentary on the ‘Internet of blockchains’, aka Web 3.0. In oil and gas, BHP Billiton is said to be ‘leveraging blockchain technology for supply chain management’.
A joint Microsoft and Quisitive presentation ran through the merits of blockchain and its potential for remote parties to share transaction details in real time and ‘immediately agree that the event is consistent with the terms of the contract(s)’. This is the ‘promise of blockchain, and what makes it fundamentally different than technology before.’ Connecting the tokens to the real world requires both parties to ‘together safely operate just one measuring device’ with the blockchain used to share data immediately between both parties. Somehow, ‘if a device detects contamination, that fact is recorded immediately and cryptographically signed such that it’s impossible to be tampered with.’ Such systems require periodic re-calibration by a certification authority. Certification is likewise digitally signed into the blockchain so that all stakeholders can agree on the update. Business rules, ‘contracts’ in blockchain terminology, can be implemented in code, stored on the blockchain and executed in response to certain events. Quisitive’s solution leverages Microsoft’s Ingestion Common Services and the Azure blockchain workbench. Quisitive’s poster child is a development of Enterprise Smart Contracts for an unnamed Midstream Energy Company. These allow the operator to detect abnormal operating conditions and ‘automatically dispatch a technician’ or shut down a section of pipeline. The Midstream blockchain solution also integrates with corporate financial systems for secure payment with transactions signed by both counterparties in such a way that both banks can observe and rely on the deal. Quisitive offers workshops, proof of concept and solution development in what is described as a commonsense approach to blockchain deployment.
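To make the ‘business rules as code’ idea concrete, here is a plain Python sketch of the kind of contract logic described above, checking a signed meter event against contractual limits. This is our illustration only, not Quisitive or Azure blockchain workbench code; the thresholds and field names are invented.

```python
# A plain-Python sketch of a blockchain-style business rule ('contract'):
# evaluate one signed meter event against contractual terms.
ABNORMAL_PSI = 1200.0  # invented contractual limit

def evaluate(reading):
    """Apply the contract terms to one meter event."""
    if not reading.get("signature_valid"):      # counterparties' device signature
        return "reject: unverified reading"
    if reading["pressure_psi"] > ABNORMAL_PSI:  # abnormal operating condition
        return "dispatch technician; shut in segment"
    return "record to ledger; release payment"

print(evaluate({"signature_valid": True, "pressure_psi": 1350.0}))
```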
Tony Giroti of the Energy Blockchain Consortium (EBC) acknowledged that, for blockchain in energy, ‘the jury is still out’, but there is already ‘lots of financial services usage’. The EBC is a non-profit consortium of energy and blockchain organizations and professionals committed to leveraging blockchain technology to solve ‘the most compelling problems in the energy industry’. The Consortium is developing ‘Catena’, an open energy blockchain framework, along with use cases, interoperability standards and a reference architecture. For more, visit the Consortium. The EBC recently signed an MoU with a sister organization, the Energy Web Foundation, to ‘jointly explore the opportunities, benefits and challenges of blockchain’.
For Rebecca Hofmann and Equinor, blockchain represents a means of breaking down the data-materials-documents silos in downstream manufacturing and sales. Future blockchain networks are to provide a single source of trusted real-time data. Hofmann provided a pointer to the 2015 blockchain backgrounder from the Economist. The expectation is for a more seamless way of working with a central source of truth, where business activities can self-execute and transactions are recorded ‘transparently in real time with no single point of failure’, making them more secure. Roadblocks to blockchain deployment include regulations, change management and the lack of standards. Possible use cases include royalty payments, hydrocarbon tracking, joint venture accounting, supply chain management, equipment and the environment! Equinor’s current approach passes through its Enterprise Ethereum Alliance involvement, the Swiss EnergyWeb Foundation and, in the US, the Oil & Gas Blockchain Forum. According to a report in Coindesk, as of November 2018, the blockchain platform built by Vakt Global is up and running, ‘facilitating the trade in crude oil between commodity firms’ and claiming to be the ‘first enterprise grade blockchain solution in the oil and gas market’. The Vakt blockchain was developed with help from Deloitte and ThoughtWorks.
David Womack presented IBM’s Global Finance blockchain that builds on the Hyperledger open source fabric. Maersk Shipping is on board. IBM is ‘co-innovating’ with SAP on an oil and gas version and is also working on its own blockchain for chemicals and petroleum. IBM claims some 500 blockchain ‘engagements’.
Marc Battistello presented PIDX’s answer to the absence of blockchain standards, a roadmap to update the PIDX Petroleum Industry Data Definitions with blockchain semantics for 2019 and beyond. PIDX is working with Chevron, OFS Portal and Amalto on various related projects.
Amalto is also developing its own ‘Ondiflow’ blockchain-based oil country field ticketing system. Field data sourced from IoT/SCADA devices is captured into a blockchain from whence orders/tickets/invoices and payments are represented in ‘smart contracts’. Compliance and reporting also ran. The solution allows for ‘tokenization’ and demand/cargo aggregation (as in the electricity markets). Energy tokens are to become ‘long term investment vehicles for retirement funds and indexes’. Somehow, Ondiflow promises ‘zero emission’ barrels! Ondiflow is a joint venture with ConsenSys.
Tetyana Colosivschi (ConsenSys) explained how the Enterprise Ethereum Alliance ‘full stack’ blockchain comes with development tools, a certification program for EEA apps and more. The EEA stack is a conceptual framework that characterizes and standardizes components from the Ethereum ecosystem. The standards ‘drive interoperability and avoid vendor lock-in’. An EEA Energy Group (aka the ESIG) is working to define and design industry-specific standards to make industry more efficient, cost-effective and sustainable. The ESIG is also developing a regulatory framework for the adoption of blockchain technology in the energy industry. VIANT also ran.
For Tibco’s Mike Myburg, blockchain is the next big thing in oil and gas. But writing smart contracts for today’s blockchain platforms is difficult. There is no standardization, capabilities vary and tooling is limited, all of which increases the likelihood of coding errors. Another issue is picking the winning technology from today’s vast number of choices. Enter Tibco’s project Dovetail*, a model-driven smart contract solution. This ‘digital wallet’ is to become the web browser of the blockchain, allowing users to view and manage transactions. In true Tibco style, point-to-point interactions are replaced with a blockchain-based bus spanning the supply chain.
* Project Dovetail builds on open source components and Tibco’s Golang-based Flogo Enterprise Studio open source engine.
Nick Spanos (Zap.org) presented ZAP’s Energy Ledger as a vehicle for tokenizing Venezuela’s oil! This would allow direct investment in oil via a blockchain whose tokens relate to smart contracts tied to flow meters. Zap is billed as ‘Venezuela’s missing link’, i.e. a solution to hyperinflation.
Bert Blevins (LINN Energy) tried to separate the hype from the current blockchain reality. The merits of the technology are many, but so are the current blockchain offerings (Blevins enumerated around 15). What is needed is blockchain-to-blockchain interoperation, perhaps made possible via the Azure blockchain workbench. Blockchain is not the only interesting technology in the automated supply chain. Blevins cited Scatter (signing and identity), Flow, and FlowXO (for event triggers). But beware, to date an estimated half a billion dollars has been lost in smart contract transactions, due to bugs in code, nefarious actors, exploits, and the ‘unforeseen consequences of the massive one-sided power of the smart contract’.
Comment: You can see how the blockchain brigade is trying to hook its digital tech into the physical world with the connection to the IoT, although it may be a bit naive to tie everything into a scada meter. Joint venture and royalty reporting are always more complex than they seem. Meters drift and break down, and there may never be enough data on the blockchain to stop some unscrupulous person siphoning off the crude or whatever. Folks further down the chain will have a hard time verifying this. In the best circumstances, verification itself may require technology beyond the ken of many partners in the chain. The complexity of the blockchain is compounded by the likelihood of having to interact with multiple versions of the same. One is tempted to ask exactly what the problem was that the blockchain is trying to solve. Also, curiously, there was no mention of the compute overhead and reportedly huge energy consumption that accompanies blockchain. So, where’s our skepticism today? Is blockchain a boondoggle? That, as they say, is a good question.
More from Energy Conference Network. You may also be interested in the upcoming US Oil & Gas Blockchain Forum March luncheon chez Chevron, Houston on March 21, 2019.
The US Transportation Security Administration (TSA) has issued a 30-page report on pipeline security that includes a chapter on cybersecurity guidelines for natural gas and oil pipeline infrastructure.
The US House of Representatives has passed legislation creating the Cybersecurity and Infrastructure Security Agency (CISA) within the Department of Homeland Security (DHS). Once signed by the President, this will create a new agency and federal leader for cyber and physical infrastructure security.
The US National Cybersecurity Center of Excellence (NCCoE) has teamed with the NIST Engineering Laboratory on a demonstrator for ICS security through behavioral anomaly detection. The results are available as a draft NIST Internal Report (NISTIR) 8219. Visit the project homepage. A word of warning about NIST and other US government agencies: last October, during the shutdown, the Computer Security Resource Center and all associated online activities were ‘unavailable until further notice’ due to a ‘lapse in government funding’. Open day for the hacking community?
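For readers unfamiliar with the behavioral approach, the following toy sketch illustrates the baseline-then-flag-deviations principle that such demonstrators automate. The traffic records and field names are invented; this is not the NCCoE implementation.

```python
# Toy behavioral anomaly detector for ICS traffic: learn what 'normal'
# looks like, then flag anything outside the learned baseline.
from collections import Counter

def build_baseline(training_events):
    """Learn the (source, destination, function) tuples seen in normal operation."""
    return Counter((e["src"], e["dst"], e["func"]) for e in training_events)

def detect_anomalies(baseline, live_events):
    """Flag any live event whose tuple never appeared during baselining."""
    return [e for e in live_events
            if (e["src"], e["dst"], e["func"]) not in baseline]

normal = [{"src": "hmi-1", "dst": "plc-3", "func": "read_coils"}] * 100
live = normal[:2] + [{"src": "laptop-9", "dst": "plc-3", "func": "write_register"}]
for event in detect_anomalies(build_baseline(normal), live):
    print("ALERT:", event)  # flags the unexpected write from laptop-9
```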
Illusive’s ‘deceptions everywhere’ cyber security approach works by planting fake information throughout the environment. Within the first few moves of an attacker’s search-and-advance process, the attacker will inevitably try to use a false item, triggering an alarm and capturing a system snapshot for forensic analysis. Responders know that an Illusive alert requires immediate attention; they can see how far the attacker has got and either take immediate action or continue to observe and analyze the attacker’s activity. More from Illusive.
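The underlying honeytoken mechanism is simple enough to show in a few lines. The sketch below, with entirely invented names and logic, captures the principle: no legitimate process ever presents a decoy credential, so any use of one is, by construction, attacker activity. It is not Illusive’s implementation.

```python
# Honeytoken sketch of the 'deceptions everywhere' idea: seed fake
# credentials among real ones and alert the moment a fake one is used.
import secrets

real_creds = {"svc_backup": "s3cret"}
# Plant decoy accounts that no legitimate process should ever present.
decoys = {f"admin_{i}": secrets.token_hex(8) for i in range(3)}

def authenticate(user: str, password: str) -> bool:
    if user in decoys:
        # A decoy was touched: treat as confirmed attacker activity.
        print(f"ALERT: honeytoken '{user}' used; snapshot system, notify responders")
        return False
    return real_creds.get(user) == password

authenticate("admin_0", "guess")  # triggers the alert
```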
The Carnegie Mellon Software Engineering Institute’s CERT Division has released Cyobstract, an open source incident response tool. Cyobstract is designed to help analysts quickly and efficiently extract artifacts from any textual source or collection of sources, such as incident reports and threat assessment summaries. Cyobstract was trialed on a cyber security dataset of Department of Homeland Security incident reports. Download the Cyobstract library from GitHub. The SEI has also published a white paper titled ‘Threat modeling: a summary of available methods’ that discusses twelve threat modeling methods targeting different parts of the development process. SEI has also released SEI-ACE for authentication and authorization of Internet of Things devices in edge environments. The SEI-ACE code is designed to run on resource-constrained, mission-critical Class 2 IoT devices, generally limited to around 50-250KB of storage.
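To give a flavor of what artifact extraction from free-text incident reports involves, here is a simplified regex-based sketch of the same idea. Cyobstract’s own API and patterns are richer and may differ; the patterns below are illustrative only.

```python
# Generic regex-based artifact extraction of the kind Cyobstract performs.
# Simplified sketch; not Cyobstract's actual API or pattern set.
import re

PATTERNS = {
    "ipv4":   re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "md5":    re.compile(r"\b[a-fA-F0-9]{32}\b"),
    "domain": re.compile(r"\b[a-z0-9][a-z0-9-]*\.(?:com|net|org|io)\b"),
}

def extract_artifacts(text: str) -> dict:
    """Return each indicator type found in a free-text incident report."""
    return {name: sorted(set(rx.findall(text))) for name, rx in PATTERNS.items()}

report = ("Beacon to 203.0.113.7 via evil-cdn.net, "
          "dropper md5 9e107d9d372bb6826bd81d3542a419d6.")
print(extract_artifacts(report))
```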
ClassNK and TÜV Rheinland have concluded a worldwide partnership agreement for marine and offshore cybersecurity services. The partnership is to develop a maritime cybersecurity certification scheme.
A Leidos cybersecurity blog introduces new passive monitoring capabilities in its Industrial Defender ASM flagship. Many ICS/SCADA systems were developed and deployed before the evolution of today’s cybersecurity threats. Passive monitoring deploys non-invasive network sensors that capture communication between SCADA and PLC devices, looking for possible threats. An ASM REST API supports integration with third party applications.
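The blog post does not document the API itself, but integration via a REST interface typically reduces to polling an alerts endpoint from the third-party side. The sketch below is hypothetical: the host, endpoint, fields and token are invented and are not Leidos’ actual ASM API.

```python
# Hypothetical sketch of polling an ASM-style REST API for new
# passive-monitoring alerts; endpoint and fields are invented.
import requests

BASE_URL = "https://asm.example.com/api/v1"  # placeholder host
TOKEN = "REPLACE_ME"                          # placeholder credential

def fetch_new_alerts(since_id: int):
    """Retrieve alerts created after the given id."""
    resp = requests.get(
        f"{BASE_URL}/alerts",
        params={"after": since_id},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

for alert in fetch_new_alerts(since_id=0):
    print(alert.get("severity"), alert.get("description"))
```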
In response to doubts about its Russian lineage, Kaspersky Lab has opened a data center in Zurich, Switzerland for its EU clients and launched a ‘Global Transparency Initiative’ to convince users that its technology is not being put to nefarious use. More from the Kaspersky Transparency Summit.
Speaking at the 2018 Upstream Intelligence Data-driven production conference, Erik Hawes from Morgan Lewis LLP explained the niceties of IP protection options for data collection and presentation systems. The first option is the ‘utility’ patent, i.e. the usual kind, for a new apparatus or method. These can be fairly broad, as, for example, APO Offshore’s US Patent No. 8,676,721 for a ‘computer implemented method for predictive analysis of multiple and varying topside equipment or topside systems used on one or more oil and gas field platforms … aggregating sensor data … [performing] … data analysis … [with a] … neural network …’. Such patents may be the best option for protecting IP in data collection and presentation systems, but obtaining one has become much more difficult since 2010, when the Supreme Court held that ‘abstract ideas’ are not patentable, even if they are new. To improve the chances of obtaining a patent, Hawes suggests patenting the apparatus itself rather than the method, and avoiding being ‘greedy’ by making multiple types of claims.
But a utility patent is not the only route to protecting IP. A ‘design’ patent, one that covers how a product looks, may be useful to protect a novel GUI. ‘Trade dress’, which is similar to a trademark, also protects non-functional, distinctive design features. ‘Copyright’, which protects the way an idea is expressed rather than the idea itself, can cover software code, but only if it does not qualify as a ‘useful article’; as such, copyright might apply to labels or elements of a dashboard. Finally, ‘trade secrets’ can provide a business advantage over competitors who are not supposed to know them, for as long as they are kept confidential.
The US patent office has granted Equinor (formerly Statoil) two patents on pressure-based fracture mapping technology. US Patent 10,030,497 covers a method of acquiring information on hydraulic fracture geometry for evaluating and optimizing well spacing on a multi-well pad, and US Patent 9,988,900 covers the geometric evaluation of hydraulic fractures from pressure changes. The technologies underpin Equinor’s Image Frac technology, now licensed exclusively to Reveal Energy Services.
In the latest chapter of the long-drawn-out tussle between WesternGeco and Ion Geophysical over the DigiFIN streamer positioning device, the US Supreme Court declined a petition by WesternGeco to rule on the propriety of the challenges that led to the patent trial and to the appeal board’s invalidation of the claims. WesternGeco had complained about an ‘incoherent’ Federal Circuit precedent. More on this from the National Law Review.
Cisco and the Massachusetts Institute of Technology have created an open archiving platform for prior art related to patents. The Prior Art Archive sets out to combat ‘low quality patents that waste money as companies fight litigation defending against patents that shouldn’t have been issued’. The patent examination process should stop patents from being issued on old or obvious technology, but the partners observe that just because technology is old doesn’t mean it is easy for a patent examiner to find. This is especially true in the computer field, where much prior art comes in the form of old manuals, documentation and web sites that have, until now, not been readily searchable.
Schneider Electric has paid $1 million to Houston-based Tatsoft to settle a patent dispute over Tatsoft’s FactoryStudio for Industry 4.0. Tatsoft was represented by Gregor Cassidy, PLLC.