Oil IT Journal: Volume 24 Number 3


Disruption? Be careful what you ask for!

While Accenture sugar-coats US retail fuel ‘disruption,’ the Russian market is getting uberized.

An Accenture blog describes the ‘disruption’ of the US fuel retail market. The retail landscape across the US is shifting. Consumers are expecting a personalized digital experience, ‘a connection with a brand identity’, and a sense of purpose or ‘belonging to a community’ (really!). Disruption is coming from electric and autonomous vehicles, ride sharing and even mobile fuel delivery. There are ‘competitive threats’ in the convenience market from companies like Walgreens, Starbucks, and Amazon Go, all of which are expanding their offers and targeting the next generation of consumers. Fuel retailers can remain competitive by ‘embracing disruption’ and evaluating how well their current business strategy aligns with customer and consumer behavior and purchasing history across channels and product categories. Leveraging advances in data storage, analytics and visualization can enable ‘ongoing evaluation and creation of insights’.

Comment: It is worth reflecting on what ‘disruption’ really means. Uber moved in on the taxicab market and has trashed what (depending on your jurisdiction) used to be a protected, regulated and rather inefficient industry, replacing it with low-cost independent drivers. True disruption is very painful to the incumbent. The Accenture study sugar-coats disruption in that it is presented as an opportunity to do more of the same, but to do it digitally.

Things are getting more interesting in Russia where a new app, ‘Turbo’ from Gazpromneft lets users buy gas in quantity when the price is low and then fill-up at any Gazpromneft station. The app even allows drivers to sell their pre-purchased gas at a higher price to a friend. Turbo raised $300k in a private placement last year. The app appears currently to be tied to Gazpromneft so may not be fully uberized. An app from Russian behemoth Yandex that provided a similar service to Lukoil was little more than a loyalty card and has now closed. But Yandex is working on another version of its Yandex Gazstations app and has plans to restart the project by ‘connecting new partners to it’. A truly disruptive development would see Yandex (or an extended Turbo) uberize Russia’s oils, creaming-off what is left of the retailers’ margin.

Gas purchase apps (GasBuddy, Gas Guru) in the US appear to be more of an information service, letting drivers know where the cheapest gas is available. A truly uberized retail market (in any country) would run into considerable opposition from incumbents who would undoubtedly be reluctant to share their margins with a third party. On the other hand, it could be that regulators might regard a refusal to sell as a rebirth of the cartel. Interesting times.


WesternGeco rolls-out global data discovery solution.

Schlumberger’s seismic unit demos Gaia, an impressive map-based interface to a vast array of third party data sources. Gaia is ‘built on’ Schlumberger’s Delfi E&P environment and a new ‘high performance’ digital map – but no, it’s not Esri!

Since WesternGeco went ‘asset light’ in 2017, you may have been wondering what the geophysical behemoth has been up to. Turns out that it has been busy reinventing itself as a software provider. At the 2019 EAGE Conference & Exhibition in London (report in our next issue), Schlumberger’s WesternGeco unit unveiled ‘Gaia’, a combination of a high-performance digital map for global data discovery and 3D visualization of basin-scale subsurface data. Gaia currently provides access to some 3 million km2 of 3D seismic surveys, 3 million km of 2D seismic lines, wells and other exploration data types from a partner network of seismic and well data providers.

Gaia is said to be ‘powered by Delfi’, Schlumberger’s ‘cognitive’ upstream data environment. But what impressed at the EAGE demo was the integration and delivery of well data from third parties: we spotted IHS Markit wells, along with TerraMetrics, Google and INEGI imagery. While map-based access to data is hardly revolutionary, the demo was certainly compelling, both in terms of the diverse data providers that are contributing to the Gaia ecosystem and of the ‘high performance’ digital map itself. This multi-endpoint mapping technology comes from MapLarge. MapLarge exposes a JavaScript API and pre-configured components for customizing map design. The API provides access to MapLarge’s redundant data center infrastructure and is claimed to enable imports of billions of records per second as well as ‘trillion record aggregate queries in milliseconds’.

WesternGeco has added functionality for streaming seismic data from the cloud into a ‘UX-infused’ knowledge management system designed to span the upstream opportunities pipeline. At the EAGE, Gaia was demoed with a zoom in and drill-down into the current Egyptian licensing round. 2D and 3D seismic data sets can be inspected in the viewer. In future editions, natural language processing technology will enable intelligent access to relevant reports and news articles (so far, only the seismic data viewer is available). Gaia comes at three access levels, Gaia Earth (look what’s there), Gaia Viz (check it out) and Gaia Pro (with full functionality of Schlumberger’s Delfi SaaS portfolio). Test drive Gaia here (registration required) or visit the Gaia homepage.


Editorial - from Fourier and Nyquist to ... kriging, big data and ... welding

Neil McNaughton ties-up some loose ends in his exposition of numerical sampling before weighing-up the merits of a digital education against the lost art of welding.

Back in my 2017 editorial ‘Digitalization,’ from Harry Nyquist to the ‘edge’ of the internet I said that I would return to the topic of mapping things of temporal and/or spatial extent. The issue is at the heart of ‘digitalization’ and is, or should be, of concern to those working in just about any science and engineering field. As the digital world takes over from reality, questions arise as to the fidelity of the digital representation, the ‘twin’ if you like.

To avoid ‘aliasing’, when sampling a time series or when making maps from discrete point measures of topography, gravity measurement or radar altimetry, it is necessary to filter-out higher frequencies. This leads to a ‘world view’ that is constrained by the sampling interval. A constraint that is OK-ish if you are sampling your data at a suitably high frequency. A seismic record every few meters will do nicely when mapping a large object of interest. But what happens when there is no seismic information? Perhaps because it is too expensive to acquire, especially now that those acquisition folks have gone “asset light”, or when seismics does not do a very good job of imaging the target. That does happen. Or, in the case of shale exploration, seismic acquisition may or may not fit well with the ‘factory drilling’ paradigm.
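
For readers who like to see the arithmetic, here is a toy numpy sketch (ours, not part of any product) of what under-sampling does: a 9 Hz sine sampled at 10 samples per second is, at the sample points, indistinguishable from a 1 Hz signal. The frequencies are of course arbitrary.

    # Toy illustration of aliasing: a signal sampled below its Nyquist rate
    # re-appears at a lower, spurious frequency. All values are arbitrary.
    import numpy as np

    f_signal = 9.0      # Hz, true frequency of the signal
    f_sample = 10.0     # Hz, sampling rate; Nyquist limit is 5 Hz, so we under-sample
    t = np.arange(0, 2, 1 / f_sample)
    samples = np.sin(2 * np.pi * f_signal * t)

    # At the sample points, the same values come from a (sign-flipped) 1 Hz sine:
    # the 9 Hz energy has 'folded back' below Nyquist.
    alias = -np.sin(2 * np.pi * 1.0 * t)
    print(np.allclose(samples, alias))   # True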

If there is no seismic, then you may just have to make do with well information. To keep things simple, I am going to consider vertical wells (remember those?) drilled at regular intervals across an area of interest. If there are relatively rapid lateral variations in the target, say with lateral changes in facies between wells, then you are potentially in a Nyquisty-kind of situation. One approach would be to contour say the formation tops (or net to gross or whatever you are looking for) only taking the well spots into account. This would produce a smooth, low-pass filtered picture which you may feel, especially if you have a geological background, is unrealistic.
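
To make the point concrete, here is a minimal scipy sketch (well locations and tops are invented) of that wells-only interpolation. Whatever the method, the result can only be a smooth surface: anything varying faster than the well spacing is simply not in the data.

    # Sketch: grid a formation top from a handful of (made-up) vertical wells.
    import numpy as np
    from scipy.interpolate import griddata

    # hypothetical well locations (x, y in km) and formation-top depths (m)
    wells = np.array([[0, 0], [5, 1], [2, 6], [8, 7], [9, 2], [4, 4], [7, 5], [1, 3]])
    tops = np.array([1500.0, 1520.0, 1480.0, 1510.0, 1530.0, 1495.0, 1505.0, 1490.0])

    # regular grid over the area of interest
    gx, gy = np.mgrid[0:10:100j, 0:8:100j]
    surface = griddata(wells, tops, (gx, gy), method='cubic')

    # the resulting map is inevitably a low-pass picture of the real surface
    print(np.nanmin(surface), np.nanmax(surface))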

Geologists may look at the well data and decide that, for a given sedimentary interval, the well data suggests a particular environment. Perhaps a shoreline with marine stuff on one side, and duly eyeball the extent of say, the facies of interest and locate future wells accordingly. This approach involves adding ‘prior’ information to the raw data, here the prior is the fact/notion that there is a limit between say, a sand-prone facies and deeper water shales.

As any geologist knows, this is an oversimplification driven by the low spatial frequency of the well data. Reality is likely to be much more complicated with inter-leaving facies, with outliers and in general, lateral changes that happen at a higher spatial frequency than the well data can capture. This is a situation that the folks involved in hard rock mineral exploration know well. The approach here is one of statistics. In fact, it is a specialist sub-area of statistics called geostatistics that was developed by South African mining geologist Danie Krige and later extended, notably in France at the Geostatistics Center of the Paris School of Mines and its commercial spin-out Geovariances. Note that the ‘geo’ in geostatistics is not geology but geography, in that these approaches have application in many other fields where there are spatial variables. The Paris Mines site gives examples including site pollution, air quality and epidemics, all of which are amenable to the approach.

To return to our geological example, there is one very convenient ‘prior’ that can be added into a study. Back in the 19th Century, one Johannes Walther observed that the vertical and lateral distributions of sedimentary facies are related. Walther’s ‘law’ allows us to use the vertical observation of facies down the well bore as ‘prior’ information to inject into a statistical model. As I was mooting this editorial, the folks at Agile posted an elegant description of the use of Markov Chain statistics to map/predict facies in a direct application of Walther’s law.
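
For the curious, a toy Python sketch (ours, not Agile’s code) of the kind of prior involved: a first-order Markov chain of facies transitions, estimated from vertical successions in the wells and re-used, courtesy of Walther, to simulate plausible successions elsewhere. The transition counts are invented.

    # Toy first-order Markov chain over facies. The counts are made up.
    import numpy as np

    facies = ['sand', 'silt', 'shale']
    # rows: current facies, columns: next facies (moving up the well bore)
    counts = np.array([[20,  8,  2],
                       [ 6, 15,  9],
                       [ 3, 10, 27]], dtype=float)
    P = counts / counts.sum(axis=1, keepdims=True)   # transition probabilities

    rng = np.random.default_rng(0)
    state, sequence = 0, []
    for _ in range(30):                              # simulate a synthetic succession
        sequence.append(facies[state])
        state = rng.choice(3, p=P[state])
    print(' -> '.join(sequence))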

Geostatistics, à la Krige, coming from the mining industry, does not have a sedimentological orientation so the sequence/stratigraphical approach is less obvious. Mining geostatistics’ prior is the assumption that we know something about the likelihood of a given value occurring at a given distance from any point in the sample space. The technique uses a ‘variogram’, a spatial probability diagram, to characterize and map a parameter in sub-observational detail. This may be useful if your data is varying in a more random manner, say something like the porosity in a vuggy limestone where there is less stratigraphical order to leverage as a prior.
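
Again for illustration only (real work would use a geostatistics package and real data), a bare-bones experimental variogram can be computed in a few lines of numpy. The porosity values below are synthetic and the lag bins arbitrary.

    # Bare-bones experimental (semi-)variogram from scattered 2D data.
    import numpy as np

    rng = np.random.default_rng(1)
    xy = rng.uniform(0, 1000, size=(200, 2))          # sample locations (m)
    z = 0.15 + 0.02 * np.sin(xy[:, 0] / 150) + rng.normal(0, 0.005, 200)  # 'porosity'

    # pairwise distances and half squared differences
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    gamma = 0.5 * (z[:, None] - z[None, :]) ** 2

    bins = np.arange(0, 600, 50)
    idx = np.digitize(d, bins)
    for i in range(1, len(bins)):
        mask = (idx == i) & (d > 0)
        if mask.any():
            print(f'lag {bins[i-1]:4.0f}-{bins[i]:4.0f} m  gamma = {gamma[mask].mean():.6f}')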

The notion of ‘characterization’ is important. Both Markov and Krige provide data where none was observed, either spatially or in the future. They fill in the missing information with data that has the same or similar characteristics as that observed elsewhere. Statistical techniques can be used to ‘characterize’ data from control systems which may be useful in a digital twin. I heard recently from a cyber security expert that the developers of Stuxnet had cut and pasted bits of plant time series data to spoof plant recordings. This actually made it rather easy to detect. It would have been harder to detect if the data had been ‘characterized’, i.e. made-up to be statistically similar rather than a copy.
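
A toy sketch (not the actual Stuxnet forensics) shows why verbatim cut-and-paste is so easy to catch: identical windows of samples recur exactly, something no genuine noisy sensor ever produces. A statistically ‘characterized’ fake would sail straight past this test.

    # Detect verbatim repeats in a (synthetic) sensor time series.
    import numpy as np

    rng = np.random.default_rng(2)
    series = rng.normal(50.0, 1.0, 2000)              # genuine-looking plant data
    series[1200:1400] = series[200:400]               # the 'spoof': a pasted copy

    window = 50
    seen = {}
    for start in range(len(series) - window):
        key = series[start:start + window].tobytes()  # exact-match fingerprint
        if key in seen:
            print(f'window at {start} repeats window at {seen[key]} exactly')
            break
        seen[key] = start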

Geostatistics today is a bit old hat and seems to have been overtaken by ‘data science’ with a proliferation of estimating and forecasting techniques used in time series (like production forecasting) and other fields. One interesting gotcha in the big data approach is that if, as might seem a good idea, you try to impose too much science, in the form of prior information, the machine learning may not work so well. The more constraints there are, the harder it is for the model to converge. Also, doing a lot of statistics does not change the fundamental issue that characterization does not equate to reality. An AI/ML model is judged on how successful it is in fitting data, not on how ‘scientific’ it is.

Finally, who is best qualified to do this modeling stuff? With today’s enthusiasm for AI and data science there is huge pressure for knowledge workers (geoscientists and engineers, even pumpers) to ‘learn Python’ and become data scientists. The notion that Python programming is a necessity to advance your career is now quite widespread, I’ve noted entreaties to ‘teach Python’ at business schools.

But geologists have a lot of other stuff to learn that may be more directly targeted at doing their job. They may have to do arduous trips to the Caribbean to commune with the sedimentological processes that are current day analogs for their reservoirs. There are thin sections to study, crystal forms to learn and vast bodies of knowledge to ingest. Learning your domain specialty sans big data and AI is still a noble goal. If nothing else, it helps decide if the priors that your modelers are using are correct and/or if the results that the machine spits out are plausible. But the pendulum of science and things digital has now, I suspect, swung too far in the direction of AI. In defense of this thesis I offer an observation from a completely different field.

France is a world leader in the field of nuclear power. It is also pretty good at churning out mathematical and scientific whizzes. I recently attended a presentation on the future of quantum computing where it appears that the government is ready to put a few zillion Euros into quantum computing R&D. Meanwhile, back in Flamanville, the latest nuclear build is a decade overdue and a few billion Euros overbudget. This is in part down to a failure in some high-tech welding that is now buried in concrete and which is going to have to be dug out and fixed. I may be crazy, but I see this as the way of the world. If it’s digital it gets attention and funding. Welding is a lost art.


ClimateNet, deep learning for climate research

US DoE Lawrence Berkeley National Lab to develop ClimateNet, a deep-learning approach to weather and climate forecasting. Oil IT Journal quizzes a Berkeley Lab researcher on the state of play in physics-based modeling vs. data-driven forecasts, ‘still a hot topic for research’.

Back in 2016, in his editorial ‘Watson and the weather’, Oil IT Journal editor Neil McNaughton speculated on the possibility that weather forecasting might prove to be a litmus test for artificial intelligence. The huge data sets acquired over decades ought to provide a great comparison between full physics based forward modeling and data driven prediction. It turns out that such questions are at the cutting edge of climate research, as a recent release from the Department of Energy’s Lawrence Berkeley National Laboratory showed. A Berkeley Lab team is developing ClimateNet to ‘bring the power’ of deep learning methods to identify important weather and climate patterns via expert-labeled, community-sourced open datasets and architectures. The release has it that the resulting deep learning models will identify complex weather and climate patterns on a global scale.

We spotted an opportunity to take a raincheck on the state of play in the match between deep learning and physics-based modeling and asked Berkeley researcher Prabhat why a data driven model might do better than a traditional, full physics-based forward model, and whether this would be true for other industries that use computer modeling, like oil and gas. Prabhat first pointed out that ClimateNet is purely aimed at post-processing climate model output (i.e. finding patterns in simulation data) and is not proposing to replace climate modeling with a data-driven ML/DL approach. He added, ‘When comparing data-driven ML/DL models to physics-driven models, we don’t have sufficient evidence to prove that ML will be universally better. If anything, ML methods will suffer from access to limited training data and may not generalize to regimes that have not been seen before’. We then invited Prabhat to read our 2016 editorial and comment which he kindly did.

Hi Neil,

This is probably a much longer conversation, but some quick observations:

Within the DOE, we’ve identified three broad areas for Deep Learning/Machine Learning: 1) DL for Data Analytics (think analysis of datasets from telescopes, microscopes, genome sequencers, etc.). Much of ‘Big Data Analytics’ technology is targeting this space. 2) DL for Simulations: this is about augmenting, enhancing and *maybe* replacing conventional PDE*-based solvers, which your article refers to as ‘forward’ models. 3) DL for control: this is about controlling experimental facilities (telescopes, light source beam lines) or computational facilities.

Your article was speculating about whether data driven methods (DL/ML/Watson/…) could replace conventional models. Clearly, this is a hot topic for research, and there aren’t conclusive trends at this point. We understand very little about the theory and generalization properties of DL and we don’t have ‘proofs’; training the DL system on a certain climate regime and asking it to make predictions in another climate regime could be problematic.

It is becoming clear that we shouldn’t throw away 40+ years of applied math research. Incorporating some notion of physics laws/principles into black-box DL models will likely be key. For more on this check out the paper below**. We also have a major paper under review (for the Gordon Bell Prize) that employs Deep Learning to learn solutions of stochastic differential equations.

Prabhat

* Partial differential equation.

** Deep Hidden Physics Models: Deep Learning of Nonlinear Partial Differential Equations, Maziar Raissi 2018 - https://arxiv.org/pdf/1801.06637.
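
For readers wondering what ‘incorporating physics into the loss’ might look like in code, here is a deliberately tiny PyTorch sketch, ours rather than anything from the paper above: a small network is fitted to sparse observations while a second loss term penalizes the residual of a known ODE, du/dt = -ku. All values are invented.

    # Toy 'physics-informed' loss: data misfit plus ODE residual.
    import torch

    k = 0.5
    t_data = torch.tensor([[0.0], [1.0], [2.0]])         # sparse observations
    u_data = torch.exp(-k * t_data)                       # synthetic 'measurements'
    t_phys = torch.linspace(0, 3, 50).reshape(-1, 1)      # collocation points
    t_phys.requires_grad_(True)

    net = torch.nn.Sequential(
        torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)

    for step in range(2000):
        opt.zero_grad()
        mse_data = ((net(t_data) - u_data) ** 2).mean()   # fit the observations
        u = net(t_phys)
        du_dt = torch.autograd.grad(u.sum(), t_phys, create_graph=True)[0]
        mse_phys = ((du_dt + k * u) ** 2).mean()          # ODE residual as a 'prior'
        (mse_data + mse_phys).backward()
        opt.step()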

In a possibly similar vein, we came across a recent NIST investigation into Explainability in Artificial Intelligence and Machine Learning in a computer security context.


Total’s Cognitive Search

Service provider Sinequa has developed an information retrieval system running in the cloud leveraging domain-specific knowledge representation.

Mathilde Fourquet presented Total’s ‘Cognitive search’ platform at the 2019 ‘Big Data Paris’ conference. Cognitive search seeks to develop the ‘transversality’ of information retrieval by leveraging unstructured textual data across different fields of expertise and to provide support for new uses of information, including natural language queries. Cognitive search is said to improve the user experience for all Total employees. Total is a long-time advocate of best practices in data science, having engaged France’s Sinequa to develop a refining and chemicals information portal back in 2008. In 2016 Total awarded Sinequa a service contract to develop a dedicated platform to handle some 50 million (now up to 150 million) engineering and geoscience documents.

The work has involved the consolidation of multiple text processing and portal initiatives around the company into an information retrieval competency center for the Total group. The center is developing technology for dynamic categorization of 20 million geoscience documents, using AI to classify scanned documents by well, basin and field. Natural language processing has been applied in a proof of concept on rotating equipment. This has produced a 70% hit rate for ‘relevant’ document retrieval thanks to a major, domain-specific ontology and semantic description effort. This is now being extended to ten new domains with improved query and results. Initially the solution was deployed on premise but encountered performance issues. It is now running in the cloud.
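
Sinequa’s ontology-driven classifiers are of course proprietary. As a purely hypothetical illustration of the kind of document categorization described, a few lines of scikit-learn suffice for a TF-IDF based classifier (the documents and labels below are invented).

    # Hypothetical sketch of classifying documents by domain (well / basin / field).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    docs = [
        "end of well report, casing points and formation tops",        # well
        "basin-scale source rock maturity and burial history study",   # basin
        "field development plan, waterflood pattern and facilities",   # field
        "wireline logs acquired over the reservoir section",           # well
    ]
    labels = ["well", "basin", "field", "well"]

    clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    clf.fit(docs, labels)
    print(clf.predict(["drill stem test results and completion summary"]))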

One so far unsolved problem is cross-discipline retrieval. Total is looking for a broader industry-wide solution that looks at the big picture. This should allow for natural language queries to retrieve relevant documents, images or videos. Total is now working with Sinequa on a federated search platform and a new interface, tuned to new use cases. For Fourquet, ‘the key is to build a model and user experience that integrates with our own DNA and problem set’.


Agile Scientific announces Open Subsurface Stack

A collection of open source Python tools is mooted as interest in open geoscience software rises.

Agile Scientific has announced the Open Subsurface Stack, a collection of open-source Python tools for earth scientists. Agile CEO Matt Hall blogged recently on the rise of open source software in the upstream, observing ‘open data platforms are appearing all over the place’ and that ‘there seems to be a renewed appetite for open source subsurface software’. Hall warns that ‘the word open has been punted around quite a bit recently, but you have to read the small print’. A robust and trusted stack will require more than code. Testing, documentation and support will be key. Agile’s community already has a ‘proto-stack’ with Python routines for ‘segyio, lasio, welly, and bruges’, all available on the Software Underground.
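
For those who have not tried the proto-stack, a minimal taste of two of its components, lasio and segyio, is shown below. The file names are placeholders and we assume the LAS file carries a gamma ray curve.

    # Read a well log with lasio and a SEG-Y volume with segyio.
    import lasio
    import segyio

    las = lasio.read("example_well.las")          # placeholder file name
    print(las.curves)                             # list the available log curves
    gr = las["GR"]                                # gamma ray values as a numpy array

    with segyio.open("example_survey.sgy", ignore_geometry=True) as f:
        first_trace = f.trace[0]                  # samples of the first trace
        print(len(f.trace), "traces,", len(first_trace), "samples each")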

Comment: Agile’s efforts are laudable but they collide somewhat with the recently announced Open Subsurface Data Universe from Shell and The Open Group. Looking further afield we have Halliburton’s Open Earth community and the announcement of ‘open source’ software from Schlumberger. The Schlumberger and Halliburton platforms may expose code, but they are, or will be, somewhat tied to their respective software platforms. We understand that the OSDU’s code base will be opened up to non-members later this year.


Emerson and Total co-present OpenDB at EAGE

Resqml-based database solves multi-disciplinary, multi-vendor reservoir studies challenges with a standards-based solution spanning the subsurface value chain.

Speaking at the 2019 London EAGE (more in our next issue), a joint presentation by Total and Emerson/Paradigm provided an update on the OpenDB project, a database for reservoir engineering applications that leverages Energistics’ Resqml data standard. OpenDB has been used in several multi-vendor proofs of concept involving either file-based data exchange with Resqml or streaming data between tools with ETP, the Energistics transfer protocol. OpenDB, whose development is steered by a consortium of operators, focuses on capturing the results of multidisciplinary reservoir studies. Total uses the tool to ‘encourage’ the organized storage of validated study results, to promote collaboration, smooth handoff of relevant data and results between disciplines and to enable interoperability in a heterogeneous multi-vendor application environment (OpenDB is even designed to be self-contained and ‘independent from Emerson applications’).

Flagship trials include the ongoing Resqml SEG Kepler field pilot* which leverages file-based data transfer and cloud data synchronization (Google, Amazon). Another trial involves microservices leveraging the ETP Websocket asynchronous communication with an Azure data lake. Total concluded that the efficient exchange of data and results is a key factor in determining a successful outcome in a reservoir study. OpenDB solves multi-disciplinary, multi-vendor reservoir studies challenges by providing data management based on open standards that covers the whole subsurface value chain including reservoir modeling and simulation while avoiding vendor ‘lock in’. OpenDB is also said to be an integral part of Emerson’s digital transformation strategy.

After the EAGE, Oil IT Journal interviewed OpenDB product manager, Emerson’s Alice Chanvin.

Oil ITJ - Is the source code for OpenDB available and if so, in what language/database technology?

Chanvin - The source code could be made available to a consortium member under some conditions, but it is not open source. OpenDB is available on Oracle and PostgreSQL.

Oil ITJ - Will it be available to consortium members and non-members?

Chanvin - Today, OpenDB is only available to consortium members. When it goes officially commercial it will be available to any customer.

Oil ITJ - Does OpenDB resemble in any way (coverage/style/inspiration) the PPDM database**?

Chanvin - No, it is based on Energistics’ standards and really wants to focus on domains that are traditionally not, or poorly, covered by traditional data management systems, principally reservoir modeling and reservoir engineering (in addition to seismic interpretation).

Oil ITJ - What exactly is the relationship between OpenDB and the OSDU? Or what might it be?

Chanvin - Today, there is no official relationship between OpenDB and OSDU. Emerson is active in OSDU. We feel it is important for OSDU to adopt Energistics standards and believe OpenDB can play a role in the OSDU Data Platform. We will provide more information as both OSDU and OpenDB consortium evolve.

* More information on this presentation is available from Energistics.

** Back in 2014 the first version of OpenDB was described as ‘blending’ PPDM and Resqml.


AI big data news

BP’s algorithms replace pumpers. BP seismic guru havers between computational physics and ‘possibly overhyped’ machine learning. US NSF 2019 update on AI R&D and standards. Matlab’s predictive maintenance toolbox.

It is customary for those advocating the application of artificial intelligence (and not just in oil and gas) to conclude a presentation with reassuring words along the lines of “of course AI will not replace you, it will ‘free you up’ to concentrate on more productive work”. This is not how things worked out for BP’s Wyoming-based pumpers whose jobs, according to an article in Forbes, are being ‘taken over by algorithms’. BP has deployed San Francisco-based start-up Kelvin’s AI to monitor real-time data streams and optimize production from its Wamsutter oilfield. The result is that BP needs 40% fewer workers to keep its natural gas flowing in Wyoming. What’s more “field techs are now getting trained in Linux and Python.”

In a ‘Seismic Sound-off’ podcast from the Society of Exploration Geophysicists BP’s John Etgen opined briefly on the yin and yang* of conventional computational physics (CP) based seismic and novel machine learning (ML) approaches. With some understatement, Etgen said that there is a chance that the promise of ML is overhyped. ‘I'm not going to predict that the CP approach is dead. But it is drawing itself into a corner where every new advance requires an order of magnitude increase in compute power and makes the fundamentals of the inverse problem much more complicated to solve. We are making life harder for ourselves. But I'm not sure the ML stuff is going to deliver everything we want either. There’s good work to be done to figure out what that optimum pathway actually is. Other data intensive domains that are not math/physics based, where you cannot "prove" what the truth is, are moving towards ML. We scientists and engineers are not moving as fast in that direction as some others are. We’ll see how it turns out. I don't know.’

* See our Watson and the weather update elsewhere in this issue for more on this.

The Computer and Information Science and Engineering division of the US National Science Foundation has published a short introduction to the US National AI R&D Strategic Plan. AI began back in 1956, when a small group of computer scientists and mathematicians met on Dartmouth’s campus and first coined the AI term. Investment in AI is now a national priority, with notably a 2019 White House Executive Order on ‘maintaining American AI leadership’ and the American AI Initiative. The 2019 update to the National AI R&D strategic plan describes work done to develop AI-relevant standards including P1872-2015 (Standard Ontologies for Robotics and Automation) and the standardization program of ISO/IEC JTC 1 SC 42 on AI. Those looking to benefit from the NSF’s largesse need to visit the awards searches page.

For a quick introduction to predictive maintenance, download the free Matlab handbook, a 17 page walk-through describing how to identify condition indicators and discriminate between healthy and unhealthy machine states and how to develop, train and deploy ML models in production. The software is available as a trial.


2019 CFIHOS Annual Meeting

Capital facilities information handover standard meeting hears from Yokogawa on Yamal LNG, Autodesk on DEXPI piping and instrumentation standard, Equinor’s ILAP project.

While global warming and sea level rise are not particularly germane to a discussion of Cfihos, the capital facilities information handover standard, the town of Amersfoort, where Yokogawa recently hosted the annual Cfihos board meeting, is a Dutch talisman for sea level rise. The Dutch phrase ‘Amersfoort by sea’ is shorthand for what might happen if Holland’s pumps stopped pumping*. As we reported in our last issue, the main business of the day was the mooted/impending transfer of the Cfihos intellectual property from USPI-NL to the UK IOGP’s JIP 33** technical standards committee. The fact that Cfihos’ future was unclear allowed the meeting to cover other, equally interesting, non Cfihos activities.

Elbert van der Bijl and Patrick Kools, both from Yokogawa Europe, teamed to present Yokogawa’s contribution to the Arctic Russian Yamal LNG project, ‘a project where there were no standards’. As a subcontractor to the Technip/JGC Corp. Yamgaz joint venture, Yokogawa was the main automation contractor for Yamal and provided process control, safety instrumentation and alarms from its Integrated Production Management System (IPMS) portfolio. Yamal is quite a beast, the equivalent of around two Groningen gas fields. Yokogawa designed and delivered the IPMS during construction, integrated with the Quantum control system and OSIsoft PI. Delivering first gas went well but delivering the first invoice required ‘a new way of thinking’ for the Russian partners. An IBM middleware bus and PI OPC connectors connect 16 apps from 16 vendors across Aveva’s engineering data warehouse, Meridium work management, Lims and gas management systems. ‘All without standards I must say!’ Each app provider tested its own integration before acceptance. In the end, there were 48 point to point integrations, each a joint effort between business owner, app vendor and middleware vendor. Ownership was shared between the business and the information architecture team. Yamal now has a single version of the truth. In the Q&A, Anders Thostrup observed that ‘you made a standard on the fly’. Note however, that the Yokogawa system was delivered for operation of Yamal. So this activity is all post-handover.

Comment – Note however that neither Total nor Technip made use of Cfihos during construction. Yamal could be taken as a great example of the centrality of middleware as opposed to a protocol definition. It also may be telling us that interoperability is less of a problem now than it used to be.

Reiner Meyer-Rössl (Autodesk) presented on DEXPI, data exchange across the plant lifecycle. Dexpi’s focus is the piping and instrumentation diagram (P&ID), one of the most important documents in the plant. P&IDs are 2D drawings along with equipment lists and structures. They are currently delivered as either Autodesk DWG files, on paper, in Excel or as PDF. There are multiple systems and workflows for dealing with P&IDs that are hard to relate to one another. DEXPI’s aim is to migrate from a document to a data handover. The concept already has currency in vendor solutions such as Siemens Comos P&ID, AutoCAD P&ID and X-Visual. These can also re-use DEXPI-formatted data in other applications. Dexpi holds a bi-yearly hackathon, the most recent was chez eVision in The Hague. Dexpi is work in progress, P&ID is just a start. The Dexpi spec V1.2 is a free download from the website. There is also a Dexpi validator from AixCAPE. More too from GitHub.

Robert Skaar reprised Equinor’s ILAP project as ‘a slightly different approach from Yokogawa’. All vendors (Yokogawa, ABB, Siemens, Oracle...) want to sell you their own stuff ‘at great cost’. We don’t want to pay more! Oils need to exchange data and schedules between SAP and Oracle seamlessly but ‘sometimes we pay 2x to 4x the price – and get poor quality’. Every industry in the world is paying over the odds for lower quality data. Why do we accept this? Habit? Most industries (banking, airlines with Amadeus) have done interoperability since the 1970s. Not so in oil and gas. ILAP (integrated lifecycle asset planning) started in 2012 under the auspices of the IOGP/ICCC. Its findings are published as ISO 15926-13 and are delivered as an API for Primavera, SAP, Safra, Excel and MS Project, built on common planning theory and practice. Implementation strategy is simple, users need to require ILAP in contracts. The system went into production at Equinor in May 2018. ENI, ConocoPhillips and AkerBP are in late phase pilots. ILAP V1.5 came out in December 2018; ILAP 2, aka ILAP-as-a-Service, will be delivered ‘real soon now’. Other related Equinor projects include ILAC (project control), ILAR (risk) and READI (documentation).

* Anecdotally, we heard that a couple of days of power outage could put half the country underwater!

** See also the letter from IOGP elsewhere in this issue.


On the transfer of CFIHOS to IOGP

Oil IT Journal received a clarification from the International oil and gas producers' association regarding the transfer of the Capital facilities information handover standard.

To IOGP

Ladies, Gentlemen,

I attended a meeting of the USPI-NL Cfihos standard earlier this year and am in the process of writing a short report on the event. I learned that Cfihos, the capital facilities information handover standard is to be handed over to the IOGP where it will come under the JIP33. I would be extremely grateful if you could confirm this and perhaps provide a few lines on your plans for Cfihos in its new home.

Regards Neil McNaughton
Editor, Oil IT Journal

To Oil IT Journal

Neil,

Your enquiry has been passed to me because I am leading the Task Force that is managing the transition of the CFIHOS development from USPI, which has been administering it until now, to IOGP. This reflects a desire by the CFIHOS members to accelerate the development of CFIHOS and expand its scope in the supply chain by accessing the additional resources and funding that IOGP can provide. The CFIHOS work will be conducted under a new IOGP Joint Industry Project (JIP 36). All current and prospective CFIHOS members will be invited to participate in the new JIP 36, including EPC Contractors, equipment suppliers and software providers.

CFIHOS currently has approximately 70 participants. A further advantage of bringing CFIHOS into the IOGP standards development work is the linkage to JIP 33: the information requirements in those technical standards can be directly aligned with the CFIHOS Reference Data Library and Data Model, facilitating the industry’s transition to more digital work processes. CFIHOS version 1.4 is to be released shortly. The next steps in the development of CFIHOS will be governed by a new CFIHOS Steering Committee that will be elected by the CFIHOS members participating in JIP 36 later this year.

CFIHOS will continue to be a collaborative venture between Oil and Gas Owner/Operators, other process industry Owner Operators (mainly Chemicals and Nuclear), EPC Contractors, Equipment Suppliers and Software Providers, all of whom are providing valuable input to ensure that CFIHOS is a practical standard that can be used by all stakeholders to increase efficiency – reducing both cost and cycle time.

Regards, Robert Talbot
Chief Project Management Engineer
ExxonMobil Global Projects Company


Software, hardware short takes

Aker Solutions ‘ix3’. AspenOne V 11.0. Axis Core. Bluware Teleport. CGG MagCube. DGB OpendTect. DNV GL Veracity Deep Search. Emerson Geolog 19. EIA State energy portal. Esri ArcGIS Earth 1.9, AppStudio for ArcGIS 3.3. FutureOn FieldTwin. Geophysical Insights ML workbench. Inductive Automation’s Ignition 8. Ikon Science RokDoc 6.6.3, iPoint 2019.2. INT IVAAP 2.3. IHS Markit Kingdom 2019. KBC Visual MP3.7. Lloyd’s Register IP 2019. Orbital Insight GO. PDI Marketing Cloud. Petrotechnical Data Systems Ava Structure. PetroWeb Global Well Library. PrecisionAnalytics Energy. Rock Flow Dynamics tNavigator 19.2. Siemens XHQ V6.0.

Aker Solutions has announced ‘ix3’, ‘a software and digital services company’. Ix3’s flagship is ‘Integral’, a digital twin platform that collates engineering, manufacturing and test data with live-streamed operational data to accelerate field development and optimize asset performance.

AspenOne V 11.0 sees the introduction of Aspen GDOT dynamic optimization software and new prescriptive maintenance, planning and scheduling solutions.

Axis Energy Services has announced Axis Core, a data acquisition and predictive analytics system for safer, more efficient completion and workover operations.

Bluware’s Teleport and VDS compressed seismic data format have demonstrated sustained 200 gigabytes per day streaming of seismic data from a Polarcus vessel.

CGG Multi-Physics has announced availability of pre-calculated MagCube in-field referencing models for measurement while drilling surveys over seven US onshore basins and plays. MagCube combines a global geomagnetic reference model with local magnetic survey data to deliver dip, inclination and total field values at depth, providing a reference frame for directional drillers.

DGB has announced a new seismic machine learning plugin for its OpendTect Pro flagship. The new tool links seismics to the R&D world of Python, TensorFlow, Keras and Scikit Learn. Watch the video.

DNV GL’s Veracity Deep Search performs search across business-sensitive data, from text to imagery, discovering and presenting queried data and information ‘within seconds’. DNV GL has also released a new version of ISRS, its International Safety and Sustainability Rating System, which now covers ISO 55001, ISO 50001 and ISO 27001 standards for asset management, energy management and information security.

Emerson has released Geolog 19, the latest version of its formation evaluation software suite.

The US Energy Information Administration has announced the State Energy Portal, providing ‘timely state-specific energy information in one place’.

Esri’s ArcGIS Earth 1.9 includes enhancements to legacy KML/KMZ file types, support for 3Dconnexion SpaceMouse for 3D navigation and usability enhancements. AppStudio for ArcGIS 3.3 has also been released with new 3D viewer sample app, support for building Android 64-bit apps and an extension for Visual Studio code.

FutureOn has announced FieldTwin, a cloud-based digital twin of a subsea field from first concept to first oil and beyond.

Geophysical Insights has announced new applications in the Paradise machine learning workbench. New deep learning/convolutional neural network technology has been applied to seismic facies classification and fault detection.

Inductive Automation’s Ignition 8 release adds ‘Perspective’, a ‘pure-web’, mobile solution built specifically for industrial applications that enables building of full SCADA, HMI, alarming systems, and other applications from a mobile device.

Ikon Science has launched RokDoc 6.6.3 and iPoint 2019.2 with higher data volumes, faster well and seismic data processing and new reservoir optimization through time-lapse modelling and ‘Deep QI’ machine learning.

INT has released IVAAP 2.3, with cloud storage connectors and integration with ArcGIS and PPDM.

IHS Markit has released Kingdom 2019 with direct connect and query to Enerdeq, a modern look and feel and more consistency across IHS Markit desktop applications.

KBC (A Yokogawa Company) has released Visual MESA Production Accounting 3.7 for hydrocarbon processing and chemical plants. VM-PA automates the capture, balance and tracking of complex-wide systems to reduce operating costs.

Lloyd’s Register has released IP 2019, its modular formation evaluation suite, a ‘capable and customizable’ solution to share and interpret well logs and other data types. The basic bundle delivers data management, calculation and deterministic workflow capabilities with 20 further optional modules.

Orbital Insight has announced ‘GO’ its commercial geospatial analytics product. GO is a multi-source intelligence platform that ingests and contextualizes millions of data points from satellite imagery, SAR sensors, geolocation data from nearly 800 million connected devices, and other sensors to provide observations based on the user’s criteria. GO’s computer vision and data science algorithms monitor various activities, including oil supply, in near real-time.

PDI has launched the PDI Marketing Cloud, adding marketing and loyalty capabilities to its ERP, logistics and fuel pricing portfolio.

Petrotechnical Data Systems has released Ava Structure, a structural geology investigation platform that helps determine fault extensions and reservoir compartments. A post-processing function creates optimized inputs for structural and property modelling workflows.

PetroWeb has released its Global Well Library, a catalog of information on nearly seven million oil wells worldwide.

Drone operator PrecisionHawk has announced PrecisionAnalytics Energy, an aerial mapping, modeling and inspection platform that uses AI to automate analysis of aerial data including oil well pads or utility towers.

Rock Flow Dynamics has released tNavigator 19.2 with support for multiple GPUs in isothermal compositional models and a new Python API for Network Designer.

Siemens XHQ Operations Intelligence Software V6.0 adds ‘edge’ functionality, secure access to remote data, an HTML5 client and support for server virtualization.


SAP in Oil and Gas 2019, Milan

Significant ‘going green’ component at SAP in Oil and Gas. Move to cloud well under way, but which cloud (SAP, AWS …?) and how (components, SaaS, lift and shift...?). Shell’s determination to see componentized SAP in the public cloud. Amazon spins-up an SAP instance ‘in hours’. PG&E’s ‘imperfect’ blockchain PoC. Shell’s SAP warehouse. BloombergNEF on the energy transition. Innogy’s SAP/HANA deployment. Equinor’s DigitalInc unit. BP dumps Maximo for custom SAP work manager. AI/ML developments from Anyline, Aperio and SAP.

SAP’s Peter Maier described the world energy business as ‘under construction’ with pressures from climate, deregulation, decentralization and digitalization. Diesel/petrol engines are ‘hugely under pressure’. Germany is to sunset coal. Elsewhere global megatrends will impact utilities, energy, mobility and the global supply chain. All of which is fine for SAP which has digital solutions for all of the above. Maier encouraged corporates to ‘join the S/4HANA movement’ and make SAP information ‘open and transparent’ but with the caveat that ‘this is a non-trivial task’. SAP is developing a new maintenance solution with AI and predictive analytics. With regard to the path into the cloud, ‘most of you are on the left, just starting out’. Companies can expect to spend ‘at least 10 years in the middle’, with a hybrid on-premise/cloud solution. Moving to the cloud is a complex process. SAP business consultants are there to help ensure a clean cloud solution on arrival. Today, big utilities are running on S4, ‘I could not say this 2 or 3 years ago’. Maier recognizes that there is a big desire to run on the ‘hyperscalers’, Azure, AWS and Google Cloud and gave a ‘clear commitment from our side’ to help with the migration. Celonis process-to-pay metrics solution got a shout-out even if, as Maier said ‘we could have done this ourselves’ (in Ariba). But ‘when I saw Celonis Process Miner, it blew me away’. The tool is used by Shell, Exxon, BP, Total and Schlumberger. Qualtrics was also on-stage with its transformational forecourt experience. The company combines operational ‘O-Data’ with ‘X-Data’; i.e. user and customer experience data.

Green investor Peter Molengraaf provided a state-of-play of the electricity business. He believes that the currently overlooked low voltage side of the business (domestic/consumer) is where the action is or will be soon. Solar photovoltaic (PV) and wind will soon be cheaper than existing fossil assets. To date, all forecasts have underestimated the potential for cost reduction of renewables. So, we will be building renewables as fast as you like, except that most grid operators will have problems handling the new energy sources. Politicians are unlikely to block this cheap, local energy such that wholesale prices will be low. Some expect that demand shifting technology (batteries for day/evening use and storage with hot water, hydrogen, synfuel) will rise. But storage may not be necessary when electricity is cheap every day and it will be very hard for storers to compete with direct energy users. Users will be in a position to arbitrate and will be in charge of the energy transition. For utilities, investing in renewables becomes risky. Renewables will be ‘grid-like’, generating low returns. The ‘inversion point’ comes when new renewable capacity is cheaper than existing fossil generation. Molengraaf observed that here, ‘every year, the EIA is wrong in the same direction’. The transformation brings IT opportunities, managing other parties’ assets like EV charge points, heat pumps, solar installations and battery storage on forecasting platforms with very high granularity. There will be hundreds of thousands of substations so medium/low voltage monitoring will be key. Comment: This sounds like the telcos at the advent of the internet when they turned around a commodity business into a money-spinning value add.

Frank Westerhof provided an update on Shell’s journey to S4/Hana on the public cloud, described in Oil IT Journal last year. Shell has been a SAP customer for 40 years (SAP was founded in 1972) and is now a ‘deep partner’. Shell’s own implementations (through to 2017) created economies of scale with single global instances for its main lines of business and there is no business case for further consolidation. Shell has also ‘done the cloud bit’, 12 years ago. Now the focus is on global process excellence in the face of a changing world of energy, especially in the EU. Shell has committed to decarbonize its own and its clients’ portfolio, with the implication that by 2030, ‘we will have to be the biggest power company in the world’. As Westerhof revealed last year, only 15% of its IT can be considered ‘competitive’; for the remaining 85%, ‘we want to be a consumer’. The endgame, ten years out or so, is a shift to the public cloud. Almost all software providers (except SAP!) offer a SaaS*-based model. Shell is convinced that everything will move in this direction, with stages from on-premises to a private, single-tenant on-premise cloud, then a switch to a hybrid solution, or perhaps the more attractive route straight from on-prem to the public cloud. But is software ready? No. Shell is working with SAP in the hope that the move will be made in one go. This is a paradigm shift away from the single global instance, ‘no longer the holy grail’. ERP is non differentiating so the plan is for a cloud version of SAP S4/Hana that is being developed in the Cloud Consortium for Oil & Gas, a consortium with Equinor, Apache, Devon, BP and Chevron as members. This will enable a move to a multi-tenant environment across Shell and its customers (B2B) and consumers (B2C). Previously as scope expanded, so did complexity. This can be avoided with a simple ERP for a simple upstream asset. Shell does not want either to be the ‘complex’ tenant or too small to be worth bothering with. Shell is aiming for a ‘balanced integration with 50 or so Shell tenants’ and a standard configuration. This is ‘a greenfield journey and a huge opportunity’ to move from process to data centricity. Shell now has three S4/Hana tenants live on the public cloud, New Energies, the pension fund and its Samco asset management unit. However, not all SAP-to-SAP integration is available out of the box which currently involves a high degree of friction and customization. The vision is for data centricity with internal and external connectivity, leveraging open standards, and AI-driven ‘touchless’ processes. Comment: Adding AI to ERP is a coming-home for SAP’s founders who left IBM back in the early 1970s where they were working on … AI.

Shell’s presentation was a great introduction to a ‘public cloud’ provider in the person of Mert Dogu, business lead for SAP on Amazon Web Services. Dogu offers to ‘spin up an SAP system in hours’ and enable interaction between SAP data and other sources such as drones, sensors and digital twins and use machine learning to ‘predict upcoming events that need fixing’. SAP is at the heart of the digital transformation for many oils as is the shift to the cloud. But the migration to S4/HANA has not really happened yet and partners and customers are asking if they should go cloud first, or stay on Oracle, or go to S4 or brownfield, on prem, in colocation or what. These decisions are long term and costly which is where the ability to trial software on AWS comes in with the AWS/SAP FAST program. SAP can also connect to a S3/AWS data lake or into IoT/ML with Greengrass on AWS. In a statement that appeared to counter Westerhof (above), Dogu stated that SAP uses AWS to provide SaaS functionality, at least for NS2 and Concur. BP and GE Oil & Gas run SAP on AWS. BP is retiring data centers, ‘removing technical debt’ in the form of 3,000 apps and 7,000 servers and has ‘saved 30% on TCO’. More SAP/AWS case studies here.

Tanya Doyle of Pacific Gas & Electric teamed with SAP’s Raik Kulinna to present a proof of concept of blockchain for supply chain traceability. As a regulated business, PG&E needs to be sure that its supply chain is fully qualified and all materials traceable. It can be hard to get documentation from suppliers and PG&E is reliant on vendors to get datasheets from manufacturers. Blockchain is said to ‘verify that what we received came from the manufacturer’. A manufacturer can post quality data to the blockchain for all stakeholders to access. This makes it possible to identify materials that may be fraudulent or of questionable pedigree. The work is now being developed in an industry consortium. More from the SAP blockchain in energy booklet. In the Q&A Doyle was quizzed as to the difficulty of connecting a digital blockchain record with an actual physical asset (see our editorial on this apparent blockchain flaw). PG&E recognizes that the system is currently ‘imperfect’ but hopes that the blockchain-to-physical world gap can be bridged with bar codes or other means of identification.
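
SAP’s blockchain service is naturally a lot more involved, but the traceability idea itself fits in a few lines of Python: each quality record is chained to its predecessor by a hash, so a record altered after the fact no longer verifies. A toy sketch (invented data, not SAP’s implementation) follows.

    # Toy hash-chained ledger of supplier quality records.
    import hashlib, json

    def add_record(chain, record):
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        body = {"record": record, "prev": prev_hash}
        body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        chain.append(body)

    def verify(chain):
        prev_hash = "0" * 64
        for block in chain:
            expected = {"record": block["record"], "prev": prev_hash}
            digest = hashlib.sha256(json.dumps(expected, sort_keys=True).encode()).hexdigest()
            if block["prev"] != prev_hash or block["hash"] != digest:
                return False
            prev_hash = block["hash"]
        return True

    ledger = []
    add_record(ledger, {"heat_no": "H123", "yield_mpa": 450, "supplier": "Acme"})
    add_record(ledger, {"heat_no": "H124", "yield_mpa": 455, "supplier": "Acme"})
    print(verify(ledger))                   # True
    ledger[0]["record"]["yield_mpa"] = 300  # tamper with an earlier record
    print(verify(ledger))                   # False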

Royal Dutch Shell operates warehouses around the world and, in over a million trips, moved 2.5 billion tonnes of cargo in 2017. Stephan Treuer described how Shell performed a market scan to look at barcode reading warehouse apps for its Prelude floating LNG development. A couple of POCs and another market scan saw Shell sign with Innovapptive in 2018; it is now in the final stages of developing a solution that includes Cognex scanners and Zebra barcode printers. Three different SAP systems (G-SAP, BluePrint and ChemSAP) are involved and now ‘all paper-based stuff is on the mobile’. The development has extended Innovapptive’s ‘market standard’ system to oil and gas. Innovapptive provides a SaaS, multi-tenant, cloud solution adapted to the distributed design and build environment of Prelude, the largest floating vessel in the world. Prelude’s Darwin onshore supply base warehouse operates on SAP Inventory Management, currently a paper-based system. Darwin will go live with Innovapptive’s mInventory in Q3 2019.

Despite the ‘SAP in Oil and gas’ title, the event proved strong on the energy transition. Albert Cheung from Bloomberg’s New Energy Finance unit stated that electric vehicles (EV) and batteries are ‘coming of age’. Worldwide, some 2 million EVs were sold last year, around 1-2% of sales. In Norway 50% of sales are EVs, in China 7%. Last year, conventional sales dropped, heralding ‘Car Wars episode 3, the Empire strikes back’ as VW enters the EV market. The price of lithium ion batteries is down 85% in the last 8 years. The next megatrend is that a clean power system is in sight. California, Spain and others are committed to 100% renewable power in a generation. Costs have come down rapidly. Solar PV is 90% cheaper than 10 years ago. Renewables are now the cheapest source of electricity in many countries. For new builds, renewable power will soon be cheaper than fossil. PV and storage is already competitive with gas in the southern US. Overall the world can get to about 70% zero carbon by 2050 with ‘peak coal’ in 2027. Gas capacity will increase but gas burn not so much. Oil is at risk. EVs and more efficient motors are to ‘wipe out’ 8 mm b/d by 2040 although others (Aramco, BP) do not expect the bottom to fall out of the market. Digitalization, the new ‘strategic imperative’, is to impact the upstream, downstream and utilities. The social and political conversation is changing. While a green new deal in the US has met with criticism from Republicans it has created space for conversation. In the EU, climate change is considered real. In the UK, from 2025, new homes will not be connected to the gas grid. Norway’s sovereign fund is out of upstream oil. Shell now has emissions targets linked to executive pay. This social and political stuff moves markets.

Arno Hagmans, from EU green energy company Innogy, described ‘S4i’, a S/4Hana ‘digital core’ deployment joint venture with SAP that went live this year. Innogy is a user of AI/ML but wanted to consolidate its efforts around a central database aka the digital core. This needs to be simple to use; previous SAP implementations have become too complex, too customized over decades and disconnected from the SAP roadmap. ‘We will not do this again, there will be no more custom code unless absolutely necessary’. SAP Activate provided a formal approach, leveraging Solution Manager and best practices. Innogy’s IT practitioners thought that ‘sprint methodologies will never work with SAP’ but it proved possible to deliver something in three weeks, keeping it ‘simple and standardized’. The system has now been running for three months and is working well, although Innogy is struggling with change management and training of its 4,000 users. ‘We started a bit late!’. Innogy’s digital core operates in the Hana enterprise cloud and is being complemented with mobile apps, AI for incoming payments and Ariba. The S4i project also involved management consultant Horvath Partners and the SAP project management office and quality assurance consulting services.

Eirik Solberg, who is ‘DigitalInc tribe lead’ chez Equinor, set out the goals for an ‘autonomous supply chain’ by 2025. This is to include 3D printing, supply chain ‘control towers’ and ‘smart contracts’ on blockchain/IoT. The internal incubator, aka DigitalInc, sets out to produce a ‘minimum lovable product’ in 10 weeks. The prize is Equinor’s 144 billion NOK spend in 2018. The aim is for ‘collaboration and trust between IT and the business’, something that has been lacking in the past. There will be no more handing over 100-page requirements documents. SAP ECC (ERP central component) is at the heart of the initiative. Accenture Latvia is involved.

Andreas Skubski presented BP’s work management transformation. Eight years back, BP had over 200 ERP systems. These have now been consolidated to some 20 SAP instances. IBM Maximo was previously used as BP’s work management system, which caused some challenges in data transparency and difficulties with bi-directional alignment with SAP. A review of the situation led to the decision to replace Maximo and go ‘end to end’ with SAP. BP then realized ‘what we were missing and how much effort was wasted in maintaining all these interfaces’. But it was not possible to do work management with the standard SAP WM. Further complications came from legal issues around joint venture company codes. BP operates maintenance crews across JVs so a plant can’t be assigned across multiple company codes. This has been fixed with a plant revamp/staging layer work around. BP’s management said, ‘if we spend to enhance SAP, can you make it sexy?’ This resulted in a new Fiori GUI and a redesign of work management with Fiori concepts. This has now become a major project with a mobile app that allows employees to create work through a validate, plan, schedule, execute and close process. The new solution has cut 60 pages of SAP documentation down to 6. In 26 weeks, the whole work management app was redesigned, still connected to the SAP Cloud back end. The new Fiori GUI shows everything on one screen, irrespective of source.

Exhibitor Anyline has a neat solution to meter reading with a mobile phone. A joint venture with OMV has developed a machine-learning based app that captures meter serial number and reading on old, including some very old, meters. The system was trained with videos of typical meters. OMV field personnel are now using the solution, which is displacing ‘clunky’ purpose-built data recording devices that require values to be punched-in. Anyline is also used by BMW to read sheet metal ids in polarized light.

Aperio has developed a set of digital ‘fingerprints’ of sensor data from a PI System. These are used to spot cyber events, a broken, flatlining meter, or correlations between sensors. The system can be used to clean data prior to ingestion into a SAP Hana database. Aperio can ID nefarious activity, such as Stuxnet-style spoofing of PI data by copying and replaying historical recordings.
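
By way of illustration (this is not Aperio’s algorithm, and the window sizes and thresholds are invented), a crude ‘fingerprint’ check on a historian time series might flag a flatlining meter or a replayed historical segment along the following lines:

# Illustrative sketch only, not Aperio's method. Flags a flatlining meter, or a
# recent window that correlates suspiciously well with older history (a possible
# sign of replayed data). Window sizes and thresholds are invented.
import numpy as np
import pandas as pd

def is_flatlining(series: pd.Series, window: int = 60, tol: float = 1e-6) -> bool:
    # A healthy sensor should show some variation over the last 'window' samples.
    return series.tail(window).std() < tol

def looks_replayed(series: pd.Series, window: int = 60, threshold: float = 0.999) -> bool:
    # Compare the latest window against every earlier window of the same length.
    recent = series.tail(window).to_numpy()
    history = series.iloc[:-window].to_numpy()
    for start in range(len(history) - window + 1):
        if np.corrcoef(recent, history[start:start + window])[0, 1] > threshold:
            return True
    return False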

Finally, we chatted with SAP’s Simon Grabowski who was demonstrating a … demonstrator. The small enclosure is designed to show off components of SAP’s Intelligent Enterprise and the digital factory. In the configuration on show, a Raspberry Pi-based enclosure housed a model pump and level detector and a connected safety helmet. These feed data into SAP Asset Manager – enabling work order generation as required. The helmet captures biometrics (a ‘sweat factor’ derived from humidity and body temperature) via a Bosch BME 680 multimodal sensor. Should the worker be required to dismantle the pump, an optical scanner can perform a rough-and-ready image recognition of the pump and return the part number. The AI is constrained by the context of what system is being checked out. The system could suggest a fix by comparison with historical records of similar kit. Another possibility would be to use SAP Copilot (voice recognition) to ask, ‘what fool serviced this pump last?’ (these were not his words). Grabowski has built a dozen of these units to demo various functions for potential apps.
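
For the curious, reading the helmet’s temperature and humidity on a Raspberry Pi takes only a few lines, assuming the open source bme680 Python driver; the ‘sweat factor’ heuristic below is our own invention, not SAP’s:

# Sketch: read temperature and humidity from a Bosch BME680 on a Raspberry Pi and
# derive a crude 'sweat factor'. Assumes the open source bme680 Python driver;
# the scoring heuristic is invented for illustration.
import bme680

sensor = bme680.BME680()

if sensor.get_sensor_data():
    t = sensor.data.temperature   # degrees C
    rh = sensor.data.humidity     # percent relative humidity
    # Hypothetical heuristic: hotter and more humid inside the helmet = higher score.
    sweat_factor = max(0.0, (t - 30.0) / 10.0) * (rh / 100.0)
    print(f"T={t:.1f} C, RH={rh:.0f}%, sweat factor={sweat_factor:.2f}")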

More from the conference organizers TA Cook.

* Software-as-a-service.


ABC Wellsite Automation 2019, Houston

Devon on move to MQTT. Anadarko’s NI RIO-based infrastructure for large sites. Kepware on wrapping legacy scada with MQTT. Cirrus Link’s three-way path to MQTT-enabled IoT. Extreme Telematics and the Oilfield IoT Consortium. ABB, optimization through automation. Technip’s UCOS software-based scada.

The fifth American Business Conferences Wellsite Automation Conference was held earlier this year in Houston. We have selected Brandon Davis’ presentation on the modernization of Devon Energy’s scada systems as the scene setter for this internet of things-focused event. Devon is using high-tech IoT solutions to speed access to real-time data from the field. Today, operations need a robust network that includes drilling rigs and frac vans and that spans on-location networks, corporate WiFi, cameras and fiber-optic sensing. Fracking operations make for even bigger demands on the network. OEMs are delivering new products and capabilities. Newcomers are pitching the sector with IoT solutions that do more stuff in one device, promising more wells, more meter runs and more optimization. The programming model is shifting from legacy scada. Edge devices are currently used for protocol conversion but there is growing interest in having them replace more legacy PLC/RTU functionality.

Today’s scada is problematic: systems are interdependent, integration is hard and data latency is an issue. Devon is moving to a ‘future state’ broker-based scada architecture that provides protocol conversion (to MQTT) along with edge computing and storage. This has decoupled applications and now auto-generates tags and structures, easing application integration and assuring real-time data delivery. Node-RED runs on the frac skids with a Postgres database for local storage.
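
A minimal sketch of the publish side of such a broker-based architecture, assuming the Eclipse paho-mqtt client; the broker address, topic layout and tag names are illustrative, not Devon’s:

# Sketch: publish polled tag values to an MQTT broker, one self-describing topic
# per tag. Assumes the Eclipse paho-mqtt client; broker address, topic namespace
# and tag names are illustrative, not Devon's.
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="frac-skid-01")
client.connect("broker.example.local", 1883)
client.loop_start()

tags = {"tubing_pressure_psi": 1450.2, "casing_pressure_psi": 980.7}
for name, value in tags.items():
    topic = f"field/pad-12/frac-skid-01/{name}"
    payload = json.dumps({"t": time.time(), "v": value})
    client.publish(topic, payload, qos=1, retain=True)

client.loop_stop()
client.disconnect()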

The edge approach has proved easy to deploy on Modbus and Allen Bradley applications. Modbus mappings make device data accessible where MQTT is not available natively, although measurement and log data still need attention. While data brokers are readily available and easy to implement, subscribers are the ‘least mature piece’ in the oil and gas sector. Few scada hosts and historians can currently subscribe to MQTT messages, although some vendors have this on their roadmap. Meanwhile, Devon has successfully built its own subscribers which, Davis believes, ‘will be key to integrating real-time data systems across the corporation’.
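
On the subscribe side, a bare-bones consumer of the kind Devon built might look like the following (again paho-mqtt is assumed, and write_to_historian() stands in for whatever historian or database API is actually in use):

# Sketch: a minimal MQTT subscriber forwarding field data toward a historian.
# paho-mqtt assumed; write_to_historian() is a placeholder, not a real API.
import json
import paho.mqtt.client as mqtt

def write_to_historian(tag, timestamp, value):
    print(tag, timestamp, value)  # stand-in for a real historian write

def on_message(client, userdata, msg):
    data = json.loads(msg.payload)
    write_to_historian(msg.topic, data["t"], data["v"])

client = mqtt.Client(client_id="corporate-subscriber")
client.on_message = on_message
client.connect("broker.example.local", 1883)
client.subscribe("field/#", qos=1)
client.loop_forever()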

For Rogier Pouwer, Anadarko is moving to larger, multi-well pads that enable economies of scale. But large pads and high well counts introduce potential problems of control and safety, knock-on effects from process upsets and the complex logistics of staged construction and frac schedules. Anadarko uses a National Instruments RIO-based infrastructure to integrate skid packages with the main control system. MQTT is being investigated as a further communications option. Cyber security is a constant preoccupation as control systems move ‘from a world where you build higher and thicker walls to one where you need to be able to continue to be productive while constantly being infiltrated’. Anadarko has been investigating wireless devices with a view to reducing costs and making future I/O expansion easier. The trials showed marginal cost savings but demonstrated that wireless communications are very reliable. Pouwer observed that current operations technology has not caught up with IT standards, leading to stranded data and network bottlenecks. This, while users need more data and expect to be doing more with it, leveraging data science, analytics and machine learning.

Steve Sponseller from PTC Kepware provided use cases to demonstrate the value of MQTT. Kepware’s Kepserver sits between in-field data sources and IT/historian/data lakes, providing multiple outbound protocol conversions including MQTT. MQTT ‘wraps and extends’ legacy scada systems and enables ‘tightly integrated, loosely coupled’ applications as opposed to today’s single-vendor ‘behemoths’. One producer replaced around 90 ETL jobs per day with an MQTT-enabled publish and subscribe model. Now data goes (via MQTT) into Apache ActiveMQ, MuleSoft (for integration with ERP) and a Cloudera data lake for analysis with Spotfire and R Studio. The pub/sub paradigm has been hugely beneficial.

Arlen Nipper (Cirrus Link) delved deeper into the merits of MQTT. Cirrus Link’s big opportunity came when Steve Koenig at Phillips 66 wanted a better way to utilize his new TCP/IP-based VSAT system and bridge the field to SAP. MQTT is ‘simple, efficient, stateful and open’. It is great because it left a lot of stuff out, including security! So deployment involves a few other considerations. Nipper offers a three-step route to IoT enablement: 1) decouple! Connect devices to infrastructure, not to applications; 2) demonstrate a ‘superior’ operations technology solution; and 3) provide a ‘single source of truth’ for all equipment tags. Cirrus Link recommends using MQTT Sparkplug and ‘real’ middleware such as its own Chariot MQTT Server. A 2018 survey from the Eclipse Foundation is said to demonstrate the primacy of MQTT, which has now overtaken HTTP for IoT applications.
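
For readers unfamiliar with Sparkplug: the specification (now Sparkplug B) layers a fixed topic namespace, birth/death certificates and a protocol buffer payload on top of plain MQTT. Topics take the form spBv1.0/{group_id}/{message_type}/{edge_node_id}[/{device_id}], so an edge node coming online might publish spBv1.0/WestTexasOps/NBIRTH/frac-skid-01 with its full tag list, followed by spBv1.0/WestTexasOps/DDATA/frac-skid-01/flowmeter-3 messages as device values change (the group and node names here are, of course, illustrative).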

Mark Scantlebury (Extreme Telematics) weighed the pros and cons of edge computing versus the cloud. While the cloud offers ‘infinite’ storage and processing power, it is costly to get high frequency data over the cellular network or satellite. The edge offers fast, low power hardware with built-in modems and can be cost effective for custom analytics. ‘Full stack’ edge processing solutions mean you buy hardware, apps and monitoring service from one vendor. These closed systems will collect and manage your data and sell you access. But such providers are not, in general, domain experts. On the other hand, buying your own hardware means developing your own solutions. Other possibilities in-between are emerging, leveraging the fact that edge computing is 10x cheaper and 10x faster than current PLCs. Again, MQTT is key, offering integration with other IoT devices such as GPS, accelerometers and modems. Scantlebury introduced the Oilfield IoT Consortium, a group of operators, technology companies and manufacturers/service companies whose mission is to create a collaborative environment between oilfield stakeholders that wish to adopt IoT. More from www.OilfieldIoT.org.

For those who are still after a single-vendor solution, Braden Robinson presented ABB’s ‘optimization through automation’ offering. This offers domain-specific solutions such as ABB’s Plunger Analysis System 2.0, pattern recognition software that uses loading cycle history to recommend setpoints, or to pinpoint deficiencies in well performance. An operator training system helps new operators learn how to tweak plunger lift without putting the well at risk, and to trial ‘what if’ scenarios.

Andrew Cappello presented UCOS, TechnipFMC’s ‘user-configurable open system’, a software-based, object-oriented control system that comes with pre-configured applications. On the basis that ‘software eats hardware’, Technip sees its software as replacing today’s PLC environments and flow computers that ‘unnecessarily bind flow calculation to proprietary hardware’. UCOS works across cellular, satellite, wired and Sigfox networks and is also integrating IoT technologies like MQTT, AWS IoT Core and Greengrass. Use cases include tank level monitoring, automated well testing and multi-phase metering.

Next year’s ABC Wellsite Automation Houston conference is planned for the 28/29th January 2020. More from American Business Conferences.


Folks, facts, orgs

Alerian, Argus, Arria, Aspen Technology, BCCK, Belmont Technology, Black & Veatch, Bluware, Cognite, Energistics, ENGlobal, EPAM, First Reserve, Flotek, FutureOn, Helix, Intrepid Financial Partners, Noble Corporation, Oceaneering, Osprey Informatics, OspreyData, Pixel Velocity, Siemens Oil & Gas, PRCI, Ryder Scott, SEG, Sercel-GRC, Society for Petroleum Data Managers, StormGeo, Tellurian, TIBCO, Unique Group, Vectra Capital, Schlumberger, McKinsey, Accenture, EVRY, Esri, ConocoPhillips, Katalyst, DNV GL, Fugro, Inpex.

David LaValle is Alerian’s CEO, succeeding Kenny Feng who is transitioning to a board and advisory role.

Argus has appointed David Fyfe as chief economist. He hails from Gunvor, a commodity trading company.

Bryan Zwahlen has joined Arria as SVP of partner services.

Georgia Keresty is now a member of Aspen Technology’s board of directors.

Don Tyler is now BCCK’s director of engineering.

Alan Cohen is now executive advisor at Belmont Technology, an AI startup/platform for geosciences and reservoir engineering.

Black & Veatch management consulting has appointed Paul Maxwell as MD transactions, Kevin Cornish as senior MD, growth and performance and Joe Zhou as MD business technology and architecture.

Alexandra Mouton has joined Bluware as marketing manager. She was previously with Foster Marketing.

Francois Laborie heads-up Cognite’s new locations in Austin and Houston.

Edo Hoekstra (Schlumberger) is now a member of Energistics board of directors.

SVP John Kratzert is to lead ENGlobal’s newly opened office in Denver.

Robert Best is now director and principal consultant at EPAM. He was previously with Infosys.

Barbara Baumann and Paul Wood are now senior advisors with First Reserve.

Mark Lewis is SVP of global sales and business development at Flotek. He hails from Baker Hughes.

Michael O’Sullivan is now FutureOn Americas president. He was previously with Bluware.

Alisa Johnson is to retire as Helix’s EVP, general counsel and corporate secretary. Ken Neikirk has been promoted to SVP, general counsel and corporate secretary.

Michael France is MD and co-head of investing at Intrepid Financial Partners. He was recently with First Reserve.

Barry Smith is SVP operations at Noble Corporation. He hails from Atwood Oceanics.

Charles Davison has rejoined Oceaneering to succeed Clyde Hewlett as COO. He was most recently with Fairfield.

Mark Slaughter is to succeed Rob Logan as Osprey Informatics’ CEO. He hails from eFrac Well Services.

OspreyData has named Kenneth Collins as VP product optimization and services. He hails from EP Energy.

Pixel Velocity has hired Patrick Talley as EVP of sales, Bob Chunn as head of marketing, Isaac Shi as CTO and Matt Cassell leads the finance team.

Arja Talakar is now CEO of Siemens Oil & Gas. He was previously responsible for Siemens Saudi Arabia. Klaus Patzak is now Siemens AG managing partner for the portfolio companies. Mariel von Schumann, chief of staff and head of governance and markets, is to leave the company. Pallavi Chelluri is to lead the new MindSphere application center in India.

PRCI has elected Mark Piazza (Colonial Pipeline Company) chair of the research steering committee, succeeding Francois Rongere (PG&E), who will remain as past chair. David Chittick (TransCanada) is vice chair.

Guale Ramirez is President at Ryder Scott Petroleum Consultants succeeding Chairman and CEO Dean Rietz. Miles Palke and Tosin Famurewa are the newly elected board members. And Larry Connor and Herman Acuna have been promoted to EVP.

John Koehr is the new SEG executive director.

Guido Hagedorn is to manage the new Sercel-GRC facility in Tulsa, Oklahoma.

Registration for membership of the Society for Petroleum Data Managers is now open.

Carsten Mortensen is the new StormGeo non-executive chairman.

Sempra retiree, Octávio Simões is now Senior Advisor to Tellurian’s CEO.

Dan Streetman has been appointed CEO at TIBCO succeeding Murray Rode who is now Vice Chairman of the Board.

Matthew Gordon is Unique Group Regional VP for Europe and UK, based in Aberdeen.

Scott Kereiakes is now head of crude oil at Vectra Capital. He hails from Morgan Stanley.

Jamie Cruise is data management technology and solutions lead at Schlumberger. Hairel Dean Abd Samad is geoscience team leader.

Anosh Thakkar is Senior Advisor at McKinsey & Company. He hails from Shell.

Maggie Montaigne is senior advisor upstream oil and gas at Accenture Strategy. She was previously with ConocoPhillips.

Duncan Irving is now oil and gas consulting principal at EVRY. He hails from Teradata.

Dhowal Dalal has been promoted to senior GIS consultant at Esri UK.

Ikhide Longe has been promoted to supervisor, information management and geospatial services at ConocoPhillips.

Patrick Meroney has been promoted to VP US operations and professional services at Katalyst Data Management.

Ronald ten Cate is business leader hydrocarbon at DNV GL Oil & Gas.

Mehdi Belrhalia is business development manager at Fugro. He was previously with Ikon Science.

Masaki Ogihara is now Inpex’ senior coordinator, technical planning and coordination. Kazuyoshi Arisaka is general manager of the Japan Oil unit.


Done deals

Aker Solutions and FSubsea team on FASTSubsea. Aqualis Offshore acquires Braemar’s offshore business lines. Bluware acquires Kalkulo. Data Gumbo gets cash from Equinor, Aramco. Dietsmann boosts AI, robotics. ENGlobal gets compliance extension from NASDAQ. FireEye acquires Verodin. Greene, Tweed bags Lancer Systems’ fiber portfolio. Origin Energy invests in Intertrust. Novi Labs announces financing. OAG Analytics closed funding round. Rhône Capital to acquire Schlumberger businesses. Sphera Solutions acquires SiteHawk. SymphonyAI acquires Azima. Tachyus Series B round. Tieto and Evry merge. Weatherford in chapter 11.

Aker Solutions and FSubsea have created FASTSubsea, a multiphase subsea pumping specialist, combining Aker’s multiphase hydraulic technology with FSubsea’s Hydromag permanent magnetic pump. Aker and FSubsea will each hold 50% of outstanding shares in the new company.

Aqualis Offshore is to acquire Braemar Shipping Services’ offshore, adjusting and marine business lines. The company’s head office will be located in London. David Wells continues as group CEO.

Bluware has acquired Oslo-based Kalkulo AS, a provider of machine learning, data analysis and modeling services to the oil and gas sector. Renevo Capital advised Kalkulo on the deal.

Oil country ‘Blockchain-as-a-Service’ boutique Data Gumbo has received $6 million in a series A funding round ‘co-led’ by Equinor Technology Ventures and Saudi Aramco Energy Ventures.

Dietsmann Smart Robotics Lab, a 100%-owned subsidiary of the Dutch Dietsmann Group, has made a ‘substantial investment’ in Austrian Taurob GmbH, a developer of waterproof, ruggedized and ATEX-certified robots. The companies envisage the replacement of human inspection and maintenance missions in remote and hazardous locations and the general adoption of robotics in the oil and gas industry. Concomitantly, Dietsmann is to launch two start-up companies: Dietsmann Smart Robotics Lab and Dietsmann Smart Data Lab, both based in Boussens, south-west France, the historical heartland of French oil and gas.

ENGlobal Corp. has been granted a 180-day extension by the NASDAQ to regain compliance with the $1 minimum bid price rule.

FireEye has acquired Verodin, a specialist in the validation of cyber security controls in an approx. $250 million transaction.

Greene, Tweed has acquired Lancer Systems’ fiber optics product portfolio. Lancer’s fiber optic connectors are deployed by oil and gas service companies for use in wellhead outlets.

Australian Origin Energy has made a ‘strategic investment’ in Intertrust, a specialist in securing and managing the exchange of energy data.

Novi Labs has announced a financing round led by Cottonwood Venture Partners and seed investor, Bill Wood. Austin-based Novi provides machine learning software to help oil and gas companies design wells and optimize development programs.

Houston-based AI specialist OAG Analytics has closed its second round of growth funding from Rice Investment Group.

Wellbore Integrity Solutions, an affiliate of private equity firm Rhône Capital is to acquire Schlumberger’s fishing and tubulars businesses and assets in an approx. $400 million deal which includes the DRILCO, Thomas Tools, and Fishing & Remedial services units.

Sphera Solutions has acquired SiteHawk, a provider of software and data management solutions for chemicals in, inter alia, oil and gas.

SymphonyAI has acquired Azima Global, a provider of machine condition monitoring and reliability solutions for the process and manufacturing industries. The company is to rebrand as Symphony AzimaAI. Oppenheimer advised Azima on the transaction.

Tachyus has raised $15 million in a Series B round led by Cottonwood Venture Partners. Tachyus provides data-driven production optimization via Data Physics, its cloud-hosted software that blends AI and physics. Tudor, Pickering, Holt advised Tachyus.

Finland’s Tieto Corp. is to merge with Norway’s Evry ASA. Evry shareholders will own some 37.5% of the combined company and will receive some €200 million in cash. Tieto expects to issue 44.3 million new shares in the deal.

Weatherford has reached an agreement with its senior noteholders on a financial restructuring via a ‘pre-packaged’ Chapter 11 process that will reduce its long-term debt by over $5.8 billion. The package ‘contemplates’ $1.75 billion in new financing and up to $1.25 billion in additional post-emergence finance. Weatherford has also sold its surface data logging business to Excellence Logging for a $50 million total consideration.


AI in health, safety and the environment

Lloyd’s Register’s Safety Scanner reveals ‘hidden insights’ in text HSE reports.

A publication from Lloyd’s Register, ‘How to optimize your HSE strategy with Artificial Intelligence*’, makes some bold claims for AI in health and safety. The announcement covers LR’s ‘Safety Scanner’, a natural language processing tool that accesses information ‘locked away’ in free-text descriptions of HSE incidents, providing clues to immediate and root causes. The Safety Scanner can also use sensor data to monitor worker fatigue and heat exhaustion to understand risk setting and behavior, ‘providing previously unobtainable granularity of actionable insights into the HSE processes’. We challenged LR to come up with some examples of such ‘hidden insights’ that have been revealed by the Scanner. LR’s Ran Merkazy kindly provided the following (which we have edited).

Although it’s early days, evidence is starting to collect. Here are some examples, some of which are soon to be released as case studies. Our most recent deployment is fatigue monitoring at a Middle Eastern airport airfield services operation where management suspected that fatigue played a role in the rising number of accidents, such as airfield trucks colliding with airplanes, one of which recently caused a fatality. We tracked fatigue levels using wearable sensors and compared the data with thousands of hours of benchmark data to find that daytime average fatigue was between 3 and 4 times the expected benchmark. This result was a powerful motivator for management to come up with solutions.

Another example of surprising, otherwise unobtainable insight emerged when we used the scanner on HSE reports at a global B2B services organization with thousands of field operatives. HSE management was well aware of the risks from ‘falls from height’, ‘slips & trips’ and ‘traffic incidents’, but our analysis also tagged many ‘unknown’ items for further investigation. Data analysis showed that the workforce was experiencing a large number of medical emergencies, such as blood pressure issues and age-related strain injuries like back problems or sprained joints, particularly among older workers. We are now using analytics on data from multiple sources (training, geographic, weather, time) to identify other patterns. Early findings are interesting, showing, for instance, that it is the change in temperature (not how hot or cold it is) that counts. The day of the month can also be an early indicator of increased risk.

We are now engaged in a big AI-powered project with a national oil producer from Asia (I can’t name it, but it’s as big as they get, on a global scale), which is asking us to deal with a massive amount of incident information and to mine for insights in textual descriptions. Here we are going further with the analytics to understand which safety barriers fail most often and what leading indicators can be identified in daily reports. Interestingly, this project demonstrates our capability to deploy our AI in different languages. An early finding is that the Heinrich pyramid ratio may not be quite the gold standard of HSE practice that it is often thought to be.

The Safety Scanner can be deployed as a plug-in to third party HSE software so clients can leverage this type of insight, without needing to change anything in the way they collect or track their HSE information.
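
To give a flavor of the free-text mining involved (this is emphatically not LR’s Safety Scanner; the training snippets and categories below are invented), a toy classifier tagging incident descriptions with a cause category can be built in a few lines of scikit-learn:

# Toy illustration of mining free-text HSE reports (not LR's Safety Scanner):
# a TF-IDF + logistic regression classifier tagging incident descriptions with
# a cause category. The training snippets and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reports = [
    "operative slipped on wet grating near pump house",
    "driver reported dizziness and high blood pressure during shift",
    "scaffold board gave way, worker fell two metres",
    "forklift reversed into parked truck at loading bay",
]
labels = ["slip_trip", "medical", "fall_from_height", "vehicle"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(reports, labels)

print(model.predict(["crew member twisted ankle on icy stairway"]))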

* Read Lloyd’s original 2017 paper, How to optimize your HSE strategy with Artificial Intelligence and the recent announcement of the Safety Scanner.


Cyber security round-up

Siemens strengthens OT cyber posture in deal with Google Chronicle. NIST’s Risk management framework 2.0. Schneider Electric joins Cybersecurity Coalition. AFPM’s Cybersecurity 101 for refining and petrochemicals. Beyond Trust’s Microsoft Vulnerabilities 2019. McAfee Grand Theft Data II. UK ‘Petras’ National Centre of Excellence for IoT Systems Cybersecurity. MIT ‘IT security is largely impotent in protecting critical infrastructure’.

Siemens has announced the ‘Charter of Trust’, a suite of minimum cybersecurity requirements that are now included in all new contracts. The requirements will apply primarily to suppliers of security-critical components such as software, processors and electronic components for certain types of control units, with the goal of protecting Siemens’ digital supply chain against hacker attacks. Siemens has also collaborated with TÜV SÜD to address the growing risk of cyberattacks on critical infrastructure by providing digital safety and security assessments, as well as industrial vulnerability assessments, to global energy customers. More from Siemens. In yet another cyber deal, Siemens has partnered with Chronicle, an Alphabet unit, to provide industrial monitoring and detection for the energy industry. The partners are to provide a single integrated platform and managed service that leverages analytics to ‘centralize and unlock the value of security data’. The system will leverage Chronicle’s Backstory platform to provide visibility across IT and OT systems and to ‘confidently act’ on threats. More from Siemens.

The US NIST has released its ‘next generation’ Risk management framework, RMF 2.0, aka NIST Special Publication 800-37. RMF offers a ‘holistic methodology’ to manage information security, privacy and supply chain risk. The executive summary states that ‘As we push computers to the edge, building a complex world of interconnected information systems and devices, security and privacy risks (including supply chain risks) [are] topics of great importance. The increase in complexity of the hardware, software, firmware, and systems within the public and private sectors (including US critical infrastructure) represents a significant increase in the attack surface that can be exploited by adversaries. Moreover, adversaries are using the supply chain as an attack vector and effective means of penetrating our systems, compromising the integrity of system elements, and gaining access to critical assets’. The 187-page publication provides a ‘disciplined, structured, and flexible process’ for managing such risks, along with management training activities to prepare organizations to execute the framework.

Schneider Electric has joined the Washington DC-based Cybersecurity Coalition. The CC is developing consensus-driven solutions that promote a robust cybersecurity ecosystem with the development and adoption of cybersecurity innovations and by encouraging organizations to improve their cybersecurity. Schneider’s cybersecurity by design approach is exemplified by its EcoStruxure IoT offering, said to align with the US NIST cybersecurity framework. EcoStruxure cyber security is being enhanced through a global partnership with cybersecurity boutique Vericlave whose encryption technology is to ‘further secure and protect’ customers’ critical IT and OT systems. More from the Cybersecurity at Schneider Electric white paper.

The American Fuel & Petrochemical Manufacturers (AFPM) have published a short blog titled Cybersecurity 101 in refining and petrochemicals. Author Dan Strachan warns of ‘radicals’ who are out to disrupt manufacturing and cause chaos at refining and petrochemical facilities. The AFPM’s Cybersecurity Subcommittee works around the clock to keep these folks out of IT and control systems. The AFPM is also a member of the Department of Homeland Security’s Industrial Control Systems Joint Working Group and the independent Cyber Resiliency Energy Delivery Consortium. AFPM also sponsors the Department of Energy’s annual Cyberforce competition.

Beyond Trust has just published its 2019 Microsoft Vulnerabilities Report, the sixth edition. While Windows 10 was touted as the ‘most secure Windows OS to date’ when it was released, Microsoft continues to report vulnerabilities, with twice as many reported in 2018 as in 2013. In 2018, some 169 ‘critical’ vulnerabilities were reported across all Windows editions. Beyond Trust reports that 85% of these could have been mitigated by removing admin rights from end users. Critical vulnerabilities in Microsoft’s latest ‘Edge’ browser have increased six-fold since its inception two years ago. Beyond Trust observes that in the near future, Edge will have a Chromium-based engine, meaning that both Google Chrome and Edge could have the same flaws at the same time, leaving no ‘safe’ mainstream browser to use as a mitigation strategy*. Vulnerabilities in Microsoft Office continue to rise year over year, hitting a record high of 102 in 2018. Here, removing admin rights would mitigate 100% of critical vulnerabilities in all Microsoft Office products. The question then arises as to how to restrict access and still ‘maintain a positive user experience’. Beyond Trust advocates leveraging ‘POLP’, the principle of least privilege, to mediate between security and productivity. The report also lists the ‘Top 4’ security mitigations as determined by the Australian Signals Directorate, viz. application whitelisting, patching applications, restricting administrative privileges and patching operating systems. You probably knew this already ... but have you done it yet?

* What about Firefox?

A new report from McAfee, ‘Grand Theft Data II: the drivers and shifting state of data breaches’ finds that data breaches are getting more serious, with almost three-quarters of all breaches requiring public disclosure and/or affecting financial results. The top three vectors for ‘exfiltrating’ data are database leaks, cloud applications and removable USB drives. While insider theft is down 6% from 2015, it still accounts for 45% of all incidents. IT is implicated in 52% of breaches. While cloud applications and infrastructure do not generate a disproportionate amount of breaches, IT professionals are most concerned about Microsoft OneDrive, Cisco WebEx, and Salesforce.com. While 61% said that executives expect more lenient security policies for themselves, a similar number believed that such leniency results in more incidents. Security technology continues to operate in isolation, with 81% reporting separate policies or management consoles for cloud access security brokers (CASBs) and data loss prevention (DLP). However, over half of respondents have yet to install (or properly configure) at least one of these.

The UK has announced a National Centre of Excellence for IoT Systems Cybersecurity, aka ‘Petras’ (for privacy, ethics, trust, reliability, acceptability and security). Petras is to research the opportunities and threats that arise as technologies like edge computing, artificial intelligence and machine learning move from centralized systems to being run at the periphery of the internet and local IoT networks. Petras received a £13.85 million award from the UK Strategic Priorities Fund. The program aims to ensure that Internet of Things systems are safe and secure as more critical applications emerge, making for increased vulnerability to sophisticated cyber threats.

The US CERT National Insider Threat Center has published the sixth edition of its Common sense guide to mitigating insider threats. The report covers new research on unintentional insider threats and workplace violence, alongside fresh insights on the primary categories of insider threat: intellectual property theft, information technology sabotage, fraud, and espionage. The report also expands its organizational practices for mitigating insider threats to include positive workforce incentives, and it maps these practices to recent standards and regulations. The study includes analysis of more than 1,500 insider threat incidents across public and private industries. CERT director Randy Trzeciak observed ‘Many organizations feel insider threats are a greater risk to critical assets than external threats.’

The Spring 2019 edition of MIT’s Energy Futures, the MIT Energy Initiative bulletin, includes a five-page spread on Protecting our energy infrastructure. Using a new, holistic approach called ‘Cybersafety’, an MIT team has shown that today’s energy systems are rife with vulnerabilities to cyberattack—often the result of increased complexity due to high interconnectivity between devices and the greater use of software to control system operation. A spectrum of factors influence system operation, from physical design to operator behavior and managerial actions. Cybersafety provides a framework for studying how interactions among such factors affect system safety, and points to specific steps a company can take to harden its facilities. Recent events have demonstrated that traditional IT security measures are largely impotent in protecting critical infrastructure from advanced cyber adversaries. There is an urgent need to identify and mitigate cyber vulnerabilities, as future cyberattacks could cause unimaginable disruptions such as interrupting the flow of fuels or shutting down the US electric grid.


IBM releases Amdados code base for oil spill modeling

EU ‘Adaptive meshing and data assimilation for the Deepwater Horizon oil spill’ project claimed as ‘new modeling paradigm’.

IBM has released ‘Amdados’ (adaptive meshing and data assimilation for the Deepwater Horizon oil spill) as open source software under the Boost software license. The Amdados pilot set out to develop ‘data assimilation’ and ‘adaptive meshing’ computing techniques to improve the accuracy of numerical solutions.

During the Deepwater Horizon blowout, the US authorities collected huge volumes of data on the extent and evolution of the oil spill. Previous research is said to have made use of some of the data, but a system that harnesses the full potential of the dataset by integrating it with ‘accurate, adaptive models and meta-models’ is yet to be put in place. The Amdados pilot application combines data assimilation and adaptive meshing techniques with complex models to simulate the Deepwater Horizon accident at an ‘unprecedented’ level of detail.

While advection-diffusion codes for transport phenomena (such as oil spills) exist and are well developed, a novel one that embeds AM-DA and is scalable enough to harness all the available data is claimed to improve understanding of the impact of the Deepwater Horizon accident. The initial objective was to model affected areas at 4 meter resolution using the entire NOAA dataset collected during the event. However, the full-scale environmental study (the total volume of modelling outputs over a two-month time window runs to some 100 petabytes of data) is not feasible at the current level of compute performance.
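
For reference, the transport model at the heart of such codes is the advection-diffusion equation for oil concentration C, in its generic form:

\[ \frac{\partial C}{\partial t} + \nabla \cdot (\mathbf{u}\,C) = \nabla \cdot (D\,\nabla C) + S \]

where u is the (wind- and current-driven) velocity field, D the diffusivity and S a source term representing the release. Data assimilation nudges the modelled C toward observations as they arrive, while adaptive meshing refines the grid where concentration gradients are steep.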

The computational costs for a complete model of the spill imply something in the region of one to one hundred ExaFLOPs per time step. The researchers hope that the prototype and open source code base will constitute a benchmark for years to come in terms of both operational response planning and large datasets for environmental analysis. Beyond the Deepwater Horizon incident itself, the planned work, through the use of a coupled DA and AM modelling approach, will ‘create a novel paradigm for operational oil spill response systems, particularly in areas with complex and highly sensitive ecosystems’.

Amdados was developed as part of the EU AllScale research and innovation program, endowed with some €3.3 million in funding under Horizon 2020 grant agreement No 671603.


Weatherford plans to up its digital game despite Chapter 11

New Production Intelligence 4.0, Victus managed pressure drilling solution announced as company seeks protection from creditors. Forbes analyst sees warning to ‘whole oil services sector’.

Like a flower blooming before it dies, Weatherford has been busy in the digital domain as the company heads off into Chapter 11 bankruptcy. A new ‘Production Intelligence 4.0’ solution sees Weatherford’s ForeSite lift control and production optimization solution deployed at the ‘edge’, i.e. the well site. ForeSite is available in both hosted and Google Cloud-based installations, along with CygNet IoT/SCADA software and MQTT data feeds.

Weatherford also unveiled Victus, an intelligent managed pressure drilling solution based on the company’s experience in some 7,600 MPD operations. For floating drilling vessels, Victus includes a new automated MPD riser system that has cut installation time from two days to less than 20 minutes.

Weatherford’s ForeSite and CygNet platforms monitor and optimize 460,000 wells around the world, along with 125,000 miles of oil and gas pipeline, processing some 30 billion data updates daily. Not enough, however, to save the company, which has just filed for Chapter 11 bankruptcy. A Reuters report has it that Weatherford ‘never recovered from the 2014 oil price collapse’ despite efforts to sell assets and pare debt. The company recently disposed of its surface data logging and laboratory services businesses (see Done deals).

Forbes analyst Dan Eberhart sees this as a warning to the whole oil services sector, which he describes as ‘in the tough position of being the first to get hurt when oil prices plunge and the last to profit when a recovery finally happens’. Eberhart reports that share prices for bigger, stronger rivals like Schlumberger, Halliburton, and Baker Hughes are ‘trading at multi-year lows with dismal earnings outlooks’. Weatherford plans to continue operating its businesses ‘without any disruption to its customers and other partners’.


Sales, partnerships, deployments

Accenture, AspenTech, Black & Veatch, Bluware, University of Illinois, eDrilling, Emerson, Halliburton, Hunting, KBR, Kongsberg Maritime, DOF, SINTEF, NORCE, McDermott, Zamil Offshore, Mi4 Corp., Microsoft, Orbital, Samson AG, Bulloch Technologies, P97 Networks, Parabolt, QCI, Enerserv, RealWear, Honeywell, Stress Engineering Services, RPS Group, Schneider Electric, SitePro, Schlumberger, TechnipFMC, Unique Group, Add Energy, BHGE, C3.ai.

Accenture and Netherlands Gasunie have built and deployed a new digital natural gas transport management system to manage the Dutch gas grid, which transports natural gas, hydrogen and green gas from biomass. The GTMS uses gas management software from Schneider Electric. Accenture is also working with Oxy to implement transformational technologies to improve operations across all of its business units.

Rompetrol is to implement AspenTech’s software suite, including Aspen Mtell and Aspen Fidelis Reliability to accelerate its digital transformation.

Black & Veatch has received a full notice to proceed on conversion of the Golar LNG owned LNG carrier, Gimi, into an FLNG vessel for BP field development work using its patented PRICO liquefaction technology.

Anadarko and Bluware are combining their expertise to develop machine-learning and cloud-enabled geophysical workflows. They are also working on seismic data preparation to directly feed TensorFlow, and on interactive deep-learning training for data scientists and geoscientists.

BP America has opened an information technology center at the University of Illinois Research Park, to explore digital solutions and modernize IT services.

Gazpromneft-Yamal, a Gazprom Neft subsidiary, has implemented eDrilling’s industry-leading solutions software to optimize its drilling operations in the Tyumen region.

Emerson has been awarded a five-year contract with BP to provide predictive maintenance and operational support services on the Clair Ridge platform and on the Glen Lyon FPSO.

Halliburton has been awarded a three-year (with a two-year extension) contract by Shell for drilling services that integrates multiple product offerings and technologies at Brazil’s Campos and Santos Basins.

Hunting’s Titan division has signed an exclusive distribution contract with Well-Sun to market its products and technologies (with the exclusion of the People’s Republic of China).

KBR has been awarded a Technical Verification and Open Book Estimate EPC contract by Pieridae Energy Limited for a two train 10 MMTPA LNG facility at Goldboro, Nova Scotia.

Kongsberg Maritime, DOF, SINTEF Ocean and NORCE have formed a partnership to develop a decision support system for offshore vessel operations. The DSS aims to reduce fuel consumption and greenhouse gas emissions for complex offshore operations, while streamlining fleet-wide maintenance.

McDermott has partnered with Zamil Offshore to provide Saudi Aramco with comprehensive offshore brownfield EPCI solutions and asset maintenance services.

Mi4 Corporation’s ‘Productioneer’, an oil and gas field data capture and reporting platform, is now available from Microsoft’s Power BI App Source marketplace.

Orbital has secured an order for an undisclosed number of units of its GasPTi-F sampling and analysis system from distribution partner Samson AG, for use by a large industrial user of natural gas in Spain.

Bulloch Technologies and P97 Networks have formed a business partnership to enable mobile payments across the Bulloch point of sale network.

YPF has adopted Parabolt’s Kyduk platform that allows companies to search and develop new ideas in-house, ‘systematizing and accelerating’ the process of innovation.

QCI and Enerserv have partnered to offer turnkey wellhead and artificial lift penetrator product solutions and services.

Shell has chosen RealWear’s hands-free head-mounted RealWear HMT-1Z1 platform. The devices are being deployed through Honeywell in 12 countries and 24 operational sites.

Stress Engineering Services and RPS Group have implemented NeoSight and OceansMap in a pilot for Shell.

Schneider Electric is now one of BP’s main electrical contractors via a five-year global frame agreement covering deployment of Schneider’s EcoStruxure architecture across BP’s upstream projects.

SitePro has partnered with Microsoft to provide real-time data and system alerts to oil and gas operators.

Schlumberger has deployed its oil and gas petrotechnical software suite on the Google cloud platform to perform seismic processing, interpretation and subsurface modeling.

Woodside and Schlumberger are collaborating on cloud-enabled digital technology deployment and cross-domain R&D.

OMV has signed a strategic partnership to ‘accelerate its digital transformation’, leveraging Schlumberger’s current and future digital solutions.

Neptune Energy has awarded TechnipFMC an integrated EPCI contract for its Norwegian Duva and Gjøa P1 projects.

Unique Group is to exclusively represent Add Energy to provide asset and maintenance management consultancy and software solutions to energy and manufacturing companies in the UAE.

BHGE and C3.ai have partnered to deliver AI solutions across the Oil and Gas industry.


Standards stuff

CO-LaN reports new Cobia interfaces. Energy Pipelines CRC wound down. Energistics investigates microservices in the cloud. Inline XBRL viewer open sourced. CoST teams with OCP on infrastructure data standards.

The CO-LaN management board has approved requests for consultancy services relative to the development of Cobia*-based interfaces from KBC Advanced Technologies (for its MultiFlash thermodynamic server) and Heat Transfer Research (for a new heat exchanger model). The services will be provided by AmsterCHEM. In a separate announcement, Michael Halloran has been renewed as consultant to CO-LaN until May 31st, 2020.

* Cape-Open binary interop architecture.

Australia’s Energy Pipelines Cooperative Research Centre (CRC), a collaboration between the members of the APGA Research and Standards Committee and researchers at the Universities of Adelaide and Wollongong, Deakin University and RMIT University, is to cease activity as its 10 year funding expires. Its assets will be transferred to Future Fuels CRC to allow for continued access to research reports and further development of IP.

The Energistics RESQML team is turning its attention to microservices in the cloud as operators push vendors to deconstruct monolithic apps into microservices. Microservices will need standard ways to interact, an area where Energistics can help with its ETP protocol.

An inline XBRL viewer, developed by Workiva, has been released as open source software. The viewer reveals inline XBRL-tagged content in financial documents, ‘unlocking the value of data’. Download the code from GitHub. The XBRL organization has also released some test case samples for Inline XBRL, notably an extract from the Global Legal Entity Identifier Foundation’s (GLEIF) 2017 Annual Report.

CoST and the Open Contracting Partnership announced the Open contracting for infrastructure data standard (OC4IDS) at the 2019 OECD Infrastructure Governance Forum. The new standard for infrastructure transparency supports disclosure and monitoring of infrastructure projects through identification, preparation, implementation and delivery. OC4IDS combines OCP’s guiding principles of open contracting and open data with CoST’s specific knowledge of what to disclose during the project cycle. The launch culminates a ‘long journey’ for CoST as it moves from ‘cumbersome’ paper-based processes at public procuring entities to a systematic tool for disclosing user-friendly open data, available in real time.


PODS V7.0 roll-out

Pipeline data standards organization releases new data model and ‘formal’ data exchange spec.

PODS, the Pipeline open data standards association has released Version 7.0 of its pipeline data model for comments. Speaking at the 2019 PODS user conference, Chad Corcoran (Andeavor) and Pete Veenstra (Pivvot) presented the new 7.0 major release along with PODS Lite, a free subset of the PODS 7.0 model. PODS 7.0 represents a simplification of the data model that no longer mandates linear referencing for feature location and provides a formal data exchange specification.

PODS 7 leverages the ‘next generation’ PODS initiative, a modern approach to database design and deployment. A single PODS 7.0 conceptual model is delivered in Sparx Enterprise Architect, describing business concepts, entities and relationships. The conceptual model is used to derive a logical model, an agnostic framework for documenting and extending the model. Finally a physical model can be implemented in a database platform of choice (a specific RDBMS or geographic information system such as an Esri geodatabase). The approach allows both PODS 7.0 and PODS Lite to interoperate with, for instance, Esri’s ArcGIS Pro pipeline reference model.

The comprehensive model includes a rigorous treatment of units of measure and business rules such that, for instance, a UOM can be applied to specific routes (addressing, say, USA/Canada cross-border issues) or to various descriptions of pipe coating thickness. Business rules contain information on how records are managed when a feature location is moved or altered. PODS 7.0 and PODS Lite are due for final release in August, with business rule and data exchange documentation to follow in Q3 2019. More from PODS.


Offshore and onshore reliability database goes digital

New front end from Satodev provides access to Oreda running in the DNV GL Veracity cloud.

The Oreda (offshore and onshore reliability database) was initiated in 1981 by the Norwegian regulator the Petroleum Safety Authority (formerly NPD, the Norwegian Petroleum Directorate). Since then, data from more than 292 installations, 18,000 equipment units with over 41,000 failure and 77,000 maintenance records has been collected and analysed. The databank also includes 55,000 subsea components with close to 3,000 failures and 1,000 maintenance records. The OREDA handbook is said to be the leading source of reliability data for the global oil and gas industry.

Now the Oreda membership is to make the dataset more accessible and interactive via DNV GL’s Veracity data platform. French IT service provider Satodev has developed a front end to the data, ‘Oreda@Cloud’, that lets members and external users filter and analyze the reliability database. The new solution will be distributed through the Veracity marketplace. Purchase Oreda@Cloud on the Veracity web store. More from Oreda and LinkedIn.


Bureau Veritas teams with Shell TechWorks spin-out on digital bolt management

Cumulus cloud-based maintenance and construction platform enhances inspection, minimizes fugitive methane risk.

The interface between things digital and the real world is where the digital transformation really is taking place. Bureau Veritas has brought the real and virtual worlds closer together with the launch of a ‘digital integrated bolted joints management solution’. The digital bolt addresses testing, inspection, and certification services and is powered by Cumulus’ Smart Torque system.

The digital bolt will allow inspectors, contractors and operators to better manage safety-critical bolting assemblies, providing facility and operations managers with data to improve work quality, and will help minimize fugitive emissions and mitigate the risk of major accidents due to leaks. The solution provides real-time data and traceability during maintenance and shutdown activities, and a digital means of planning, tagging and tracking flange management activities with full accountability of contractors at the individual worker level.

Cumulus Digital Systems connects workers, tools and data to a cloud-hosted software platform to manage maintenance and construction workflows. The platform collects data from digitally-enabled tools in the field to provide a single source of truth for real-time quality assurance and progress tracking, replacing expensive and time-consuming inspections. Cumulus is a spin-out from Shell’s Cambridge technology center, Shell TechWorks.

Cumulus CEO Matthew Kleiman observed, ‘Manual work at industrial sites has not changed significantly in the last 50 years, so the opportunity to innovate an analog process is exciting.’ BV’s Geert Jan De Vries added ‘Our oil and gas clients are committed to zero leaks at their facilities. This integrated solution has been proven successful to support this initiative’. The solution will integrate BV’s industrial inspection services portfolio and be delivered to clients worldwide. More from Bureau Veritas and Cumulus.


Capital project software news

Accenture believes there is ‘more to do’ in standards for oil and gas capital projects. Aibel awards Trimble a major contract for Equinor’s Johan Sverdrup P2 project. US BIM standards encroach on ISO/Cfihos construction standards terrain.

A 9-page pamphlet from Accenture advocates digitally-enabled capital projects that ‘help oil and gas companies design and build plants 20 to 25% faster, at 15 to 20% less cost’. Accenture advocates an ‘agile’ operating model that ‘defines the strategic and technical capabilities and management processes that are needed, by whom and where’. On standards, Accenture is somewhat critical, citing the IOGP JIP 33, which is standardizing ‘dozens of procurement specifications for key items’. However, ‘when it comes to capital projects, there is much more to do’. Cfihos, which addresses this issue, was not mentioned, although ‘digital tools’ are said to enable data sharing in a ‘standard tool-agnostic format’. We did ask Accenture for more on the tool-agnostic format: no reply to date.

One possible candidate for such is Trimble’s Tekla building information management (BIM) software that was recently deployed by Aibel on the Johan Sverdrup P2 oil field construction project. Tekla’s software spans planning, construction, operation and maintenance. It is standards-based, but, as far as we can tell, leverages standards emanating from the US building and information management space rather than the ISO 15926/Cfihos community*. Equinor’s Johan Sverdrup is one of the largest projects on the Norwegian continental shelf. Aibel evaluated and performed a proof of concept of Tekla Structures and several other BIM solutions with attention to interoperability with offshore design software systems, cross-team global collaboration, domain expertise and ability to automatically generate detailed shop drawings. Local reseller EDRMedeso helped in creating workflows around Tekla Structures that integrate with the plant design management system. The Tekla model sharing collaboration tool supports distributed teams in Norway, Singapore and Thailand.

* We received clarification on this issue from Trimble SVP BIM Leif Granholm who confirmed that ‘Tekla uses ISO standard IFC, ISO 16739 which is good for structures, but not 15926 which focuses on process equipment, not structures. CFIHOS is not an ISO standard and is not used’.

Comment: Many stakeholders are involved in construction and each brings its own world view and (possibly) standards to the table. BIM is an interesting area. We see a parallel here with the subsuming of the US Fiatech into the CII BIM community last year.


HansonWade 2019 Applied Data Analytics Upstream, Houston

ConocoPhillips ExtraTrees ML for SAGD. Apache on cost functions in forecasting. Schlumberger’s data-driven prognostics and health management. Woodmac’s new Analytics Lab.

Christopher Olsen presented a detailed data-driven approach to steam allocation optimization at ConocoPhillips’ Athabasca oil sands ‘Surmont’ acreage, which uses the steam-assisted gravity drainage (SAGD) process. The objective was a full-field tool that reacts to live process data and well performance to optimize steam allocation in real time. A massive amount of data was available, on wells, injection, pressures, logs and (a lot) more ‘closely tracked’ process variables. A comprehensive analysis of data acquisition and tests of various ML-based optimizations led to an approach based on a virtual flow meter and recurrent neural networks. RNNs are said to excel at forecasting sequences, whether the next characters in a text string or the next set of values in a time series. The RNN was trained on historical data for several key variables and used to forecast what should happen over the next several hours. The output feeds other ConocoPhillips models to predict oil/water rates (a virtual flow meter fills gaps in the data). A first production version was released in late 2018 using ExtraTrees (a variety of random forest) based forecasting. This has been updated to incorporate deep learning, and models now automatically retrain on new data with reinforcement learning.
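
As a minimal illustration of ExtraTrees-based forecasting of the kind used in that first production release (this is not ConocoPhillips’ code; the synthetic series and lag length are arbitrary), one can predict the next value of a process variable from its recent history:

# Minimal sketch of ExtraTrees forecasting from lagged process data, in the
# spirit of the first Surmont release. Synthetic data, not ConocoPhillips' code.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 40, 2000)) + 0.1 * rng.standard_normal(2000)

lags = 24  # use the previous 24 samples to predict the next one
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]

model = ExtraTreesRegressor(n_estimators=200, random_state=0)
model.fit(X[:-200], y[:-200])
print("holdout R^2:", model.score(X[-200:], y[-200:]))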

David Fulford (Apache) introduced his presentation on the importance of cost functions in production forecasting with a recap of Anscombe’s Quartet, a statistics classic on the multiplicity of statistical interpretations that can be derived from the same data, and on the importance of graphics in an analysis. To evaluate a machine learning algorithm, a quantification of ‘goodness of fit’ is required, aka a cost function. The choice of cost function(s) can be as important as the choice of predictive model. Understanding the data model gives insight into appropriate choice of cost function. Uncertainty is a fundamental characteristic of modeling. A ‘best fit’ is not the same as a best forecast. It does not mean that only one set of model parameters fits the data!
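
A toy demonstration of the point: fitting the same exponential decline to the same noisy data under a squared-error versus an absolute-error cost function yields different ‘best fit’ parameters (data and model below are synthetic, not Apache’s):

# Toy demonstration that the 'best fit' depends on the cost function.
# Fit q(t) = qi * exp(-d*t) to synthetic noisy data under two objectives.
import numpy as np
from scipy.optimize import minimize

t = np.arange(36.0)
rng = np.random.default_rng(1)
q = 1000.0 * np.exp(-0.05 * t) * (1.0 + 0.15 * rng.standard_normal(t.size))

def model(p):
    qi, d = p
    return qi * np.exp(-d * t)

l2 = minimize(lambda p: np.sum((q - model(p)) ** 2), x0=[800.0, 0.1], method="Nelder-Mead")
l1 = minimize(lambda p: np.sum(np.abs(q - model(p))), x0=[800.0, 0.1], method="Nelder-Mead")

print("least squares qi, d:", l2.x)
print("least abs dev qi, d:", l1.x)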

Schlumberger’s Enterprise Solutions unit is a long-time user of machine and deep learning for real-time predictive maintenance of its frac pumps, as principal data scientist Jay Parashar explained. Schlumberger’s prognostics and health management (see also the PHM Society) uses data-driven PHM as opposed to reliability-based maintenance. The ‘state of the art’ PHM system encodes time-series data in a polar form and ‘performs a gram matrix-like operation’ on the resulting angles. Convolutional neural nets also ran. While ‘PHM has impacted the definition of maintenance in a big way’, machine learning results still ‘need to be explainable’.
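
The ‘polar form plus gram matrix-like operation’ described is essentially a Gramian angular field, which turns a one-dimensional time series into an image that a convolutional network can consume. A minimal sketch (not Schlumberger’s code; the pump channel is synthetic):

# Sketch of the 'polar form + gram matrix-like operation' described above:
# a Gramian angular (summation) field that turns a 1D time series into a 2D
# image for a convolutional network. Not Schlumberger's code.
import numpy as np

def gramian_angular_field(x: np.ndarray) -> np.ndarray:
    # Rescale to [-1, 1], map to angles, then take pairwise cos(phi_i + phi_j).
    x = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x, -1, 1))
    return np.cos(phi[:, None] + phi[None, :])

pump_pressure = np.sin(np.linspace(0, 6, 128))  # stand-in for a frac-pump channel
image = gramian_angular_field(pump_pressure)
print(image.shape)  # (128, 128), ready to feed a CNN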

Preston Cody presented Woodmac’s new Analytics Lab, a data consortium that works along the lines of similar data sharing efforts in insurance and banking. The Lab uses a ‘give to get’ model where members agree to exchange data with certain other members of the lab. Members only receive comprehensive analyses if they contribute data themselves. Woodmac acts as an independent facilitator that manages and harmonizes data across multiple sources, without bias, and provides ‘statistically sound’ analytics. The Analytics Lab is backed by a cloud-based platform and robust capabilities that enable computationally intensive analytics on large datasets. Woodmac’s parent company Verisk adds its computer security expertise that complies with major security and governance regulations. More from HansonWade.

