Oil IT Journal: Volume 25 Number 6


The Open Footprint Forum

PIDX Fall Conference hears from OSDU champion, Shell’s Johan Krebbers, on another standards initiative backed by The Open Group, the Open Footprint Forum. OFF is to ‘track and reduce’ greenhouse gas emissions with an open source initiative to ‘stop wasting time and effort’.

Speaking at the online 2020 PIDX Fall Conference, Shell’s Johan Krebbers announced The Open Group’s new Open Footprint Forum (OFF). The OFF is to provide a mechanism for tracking and reducing greenhouse gases and other emissions. Krebbers observed that the various stakeholders (WEF/GRI/CDP etc.) have so far failed to come forward with clear standards on what data to store. This makes it challenging to track emissions consistently. ‘Everybody works on their own and it is virtually impossible to aggregate and compare emissions’.

The OFF has launched ‘to stop wasting more time and effort’ and is to create an open source, Apache 2.0 licensed industry standard to track corporate emissions in a standard way. The OFF is to work across industries and provide standards for a data platform, architecture and reference implementation. The open, vendor-neutral standards for consistent measurement of environmental footprint data will facilitate sharing of footprint data across the supply chain, and foster an ‘open and robust ecosystem’ of software and services for environmental footprint data capture, management, and reporting.

Krebbers presented a straw-man high level architecture of the proposed platform. The stack envisages data access from public APIs, a near real time capability with OpenID user control, a machine-to-machine capability and microservices-based orchestration providing calculation services, metrics and units of measure. Third party apps (Krebbers cited Consensys’ blockchain-based CarbonX) can join the OFF via a marketplace. At the top of the stack is an ‘HTML5/native’ GUI. A linked ‘blockchain for proof of origin’ is also mooted for the OFF.
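By way of illustration only, here is a minimal Python sketch of the kind of ‘calculation service’ the stack envisages: activity data multiplied by an emission factor, with the unit of measure carried alongside the value. Names, structure and the factor are placeholders, not OFF deliverables.

    # Hypothetical sketch of an emissions calculation service - not an OFF specification.
    from dataclasses import dataclass

    # Indicative emission factor (kg CO2e per litre of diesel); a real service would
    # source factors from a governed reference data set.
    EMISSION_FACTORS = {("diesel", "litre"): 2.68}

    @dataclass
    class Activity:
        fuel: str
        quantity: float
        unit: str  # the unit of measure travels with the value

    def co2e_kg(activity: Activity) -> float:
        """Emissions (kg CO2e) = activity data x emission factor."""
        factor = EMISSION_FACTORS[(activity.fuel, activity.unit)]
        return activity.quantity * factor

    print(co2e_kg(Activity("diesel", 1000.0, "litre")))  # 2680.0 kg CO2e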

As in Krebbers and Shell’s earlier standards initiative, OSDU, The Open Group was selected to provide ‘a legal framework under which all companies can work together in a transparent fashion to get the free standard to market’. OFF phase one is to cover greenhouse gases. Phase two will extend the work to other environmental issues such as land spill, water, etc.

The Open Group is encouraging companies to sign up to the OFF and help lead development, lower the cost of measuring and managing their environmental footprint and ensure that the OFF standards and data platform will meet their needs. ‘Without the Open Footprint Forum, organizations would be left to acquiring or building incompatible and expensive custom solutions that don’t interoperate throughout their supply chains’. There are currently 22 corporate members of the OFF. These include BP, Chevron, Equinor, Shell and many service providers.

More on the OFF from The Open Group.

Comment: Krebbers and TOG are entering a space where there are already some established players. To name but two we have OGCI, the Oil and Gas Climate Initiative which, along with EY, is collecting and checking environmental data from its members, Shell inter alia. And there is the United Nations Environment Program whose ‘geo-referenced, remote-sensing and earth observation information integrated environmental statistics and data’ can be viewed on a very grand ‘World Environment Situation Room’.


Amazon Cloud supports Halliburton, Explor

A historical backgrounder on Landmark’s migration of OpenWorks to DecisionSpace365 in the cloud. iEnergy hybrid cloud achieves SOC 2 certification for client data management. Canadian Explor reports ‘breakthrough’ seismic processing in the cloud.

Amazon recently provided* some insights into how Halliburton’s Landmark Graphics unit has deployed its flagship OpenWorks geotechnical database in the cloud. Back in the day, OpenWorks was sold and deployed in-house on an Oracle database. With the advent of Landmark’s DecisionSpace365, a cloud hosted database was required. Enter Amazon Web Services (AWS) Aurora, a ‘MySQL and PostgreSQL-compatible’ relational database built for the cloud. This now underpins Landmark’s Data Foundation for the DecisionSpace365 platform. The cloud-native data store can ingest and manage data ‘at-scale’, delivering a ‘fully managed model that can be customized to many industries’.

The move involved a refactoring of the OpenWorks schema and code to target both an on-premise PostgreSQL database and the cloud-hosted Aurora database. Taking Oracle out of the equation meant that ‘Landmark could sell its products at a lower price point’. Landmark’s Amanda Smith reported an ‘up to 30%’ performance hike from Aurora, although it’s not clear what Aurora is being compared to.
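For readers unfamiliar with the target environment, a minimal sketch of client code against an Aurora PostgreSQL cluster using the standard psycopg2 driver. The endpoint, credentials and schema are hypothetical; this is not Landmark’s Data Foundation code.

    # Query an Aurora PostgreSQL cluster with psycopg2 (placeholder endpoint and schema).
    import psycopg2

    conn = psycopg2.connect(
        host="openworks-demo.cluster-xxxx.us-east-1.rds.amazonaws.com",  # hypothetical
        dbname="openworks", user="readonly", password="********", port=5432)

    with conn, conn.cursor() as cur:
        cur.execute("SELECT well_name, total_depth FROM wells WHERE total_depth > %s", (3000,))
        for name, depth in cur.fetchall():
            print(name, depth)

    conn.close()  # the 'with' block commits/rolls back but does not close the connection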

Landmark is in the process of obtaining service organization controls (SOC) 2 certification for both its infrastructure and solutions—meaning an independent certified public accountant is determining whether the company has the appropriate service organization and controls safeguards and procedures in place to manage customer data. SOC 2 certification involves two separate reports: a Type 1 report, which measures system and control design based on SOC 2 criteria, and a Type 2 report, which measures operational effectiveness. Halliburton Landmark has received a Type 1 report for its iEnergy hybrid cloud solution. Type 2 is scheduled for completion in 2021.

Because Aurora is a fully managed service, AWS is responsible for database operations including software patching, setup, configuration and backup, and completes those tasks faster than the Halliburton Landmark team could when self-managing the original database. Software updates that took weeks or months with traditional databases now only take an hour or two with Aurora.

Aurora has meant that Landmark now operates an E&P solution without needing to be an expert in platform technology. Smith added, ‘We are relying on AWS to provide the tools, and we add our E&P expertise to deliver solutions.’

In a separate announcement, Canadian geophysical boutique Explor reports ‘breakthrough’ seismic data processing using Landmark’s seismic processing** in the Amazon cloud. In the proof of concept trial, multiple benchmarking tests demonstrated ‘an 85% decrease in CDP sort times, an 88% decrease in CDP FK Filtering times and an 82% decrease in pre-stack time migration’. The latter on a 922 gigabyte, 165 million-trace dataset.

* The release includes events that happened ‘eight years ago’ so we apologize on behalf of Amazon for the late arrival of this ‘actualité’.

** More precisely, ‘Seismic Processing, a DecisionSpace 365 cloud application powered by iEnergy on AWS’. Which we take to mean ProMax.


Run interpretation software in the cloud? Maybe.

A forced ‘trial’ of Office in the Cloud leaves editor Neil McNaughton preferring the old way of running applications on your computer. Quizzing some data managers, he finds them similarly perplexed by the rush to the cloud. But that is where the seismic world is heading, with the promise of interpretation software running in the cloud alongside its data. Will a fast internet connection ever replace hundreds of gigabytes of local RAM? An ongoing experiment in computer gaming should provide the answer.

Times are hard for the oil and gas industry, but you have probably heard enough about that already. If you haven’t, read ‘Industry at large’ in this issue. Times have been quite hard for us here at The Data Room, with a computer crash which has meant working on a SharePoint/OneDrive file system in the cloud. Which has at least given me some editorial fodder. But first, let me expand the subject to the cloud in general.

A couple of months ago I quizzed a small sample of data managers in an offhand manner as to why? Why the cloud? Why bother? Our last issue was replete with tales of microservices, Kubernetes and such, and indeed with this issue, yet more cloudy stuff, considerations of cloud object storage and so on. All of which are orthogonal to the business of either geoscience or producing oil and gas. Nobody in their right mind would want to inflict all this IT arcana on themselves without some clear benefits.

My data managers did not have any great ideas as to why the industry is rushing headlong to the cloud. I guess the unstated aim is to reduce or eliminate the cost of an on-premise data center. One of my interlocutors confessed that although the initial cloud decision may be unclear, once your data is in there, there are all sorts of things you can do with it. A post-facto argument that may be hard to sell to management.

On the question of cost, in this issue, you can read our summary of Andy James’ (Bluware) excellent exposition on the niceties of cloud costs and how, for voluminous seismic data, these can easily explode. Old-timers (like me), whose experience goes back to the tape storage and document management outsourcing efforts of the 1980s, recognize the bait and switch. Low costs to take all your documents off-site. Then come the big bucks to access, move and (especially) change providers.

It seems to me that the whole industry is striving to adapt itself to the current technology, terms and conditions of today’s clouds which of course are very likely to change over time. Perhaps when the GAFAs start paying all the taxes they owe!

Anyway, to get back to my own enforced experiment. I have been working with Office365 since 2014 when I wrote my ‘The Cloud. How the IT world is slowing us all down!’ editorial. In the interim, I have been working in the old-fashioned way, running all the Office apps on my local machine and letting OneDrive act as a real-time backup mechanism. This proved quite satisfactory bar a couple of chunks of lost work on dodgy internet connections.

When my workstation went down (power/disk/video card or all three?) I figured that as it had been going for over 10 years it had given me a good run for my money. I hoped also that my next machine would last as long and thought that, instead of buying whatever they happened to have in-store at Office Depot (not much these days), I would order a decent box from Dell. Covid and the holidays meant waiting a couple of weeks. So, this issue has been prepared on the unlikely combination of an old (but wonderfully designed) Mac Mini Server running Ubuntu Linux and accessing OneDrive/SharePoint via Firefox. Somehow this feels very cloudy – a lightweight endpoint with all the smarts in the cloud. How does it feel? Well, I concluded my 2014 editorial with the following...

If you are not yet in the Office cloud you might like to know that Office 365 brings you two versions of everything. One to run in the browser and another on your desktop. The online version is clunky and idiosyncratic. Keyboard shortcuts? You may as well forget them. Even cutting and pasting from within the same email brings up a dialog along the lines of ‘are you sure you want to do this?’ Click to open a document or change folders, the cloud lets you know that it’s ‘working on it…’ The web-based version appears to have been designed to slow usage down to a crawl. Perhaps that was the point.

Six years later, I don’t have a lot to add to this. The user experience has evolved some. Not necessarily in the right direction. SharePoint doesn’t seem to be able to access the clipboard and instead suggests a steampunk CTRL-C/CTRL-V. Using the mouse is idiosyncratic, to say the least. Selections disappear or take ages to enact. Ages that is, until you try to select text in Word that requires scrolling. Then the cursor skedaddles off to the end of the document before you can stop it. Navigating the file system in either OneDrive or SharePoint is, frankly, grotesque. I usually do a lot of moving and renaming files as I work my way through the masses of raw information that we receive for each issue. Even saving a Word document to a folder of one’s choice is totally obscure. Maybe I need training for all of this. That would be a first for me in almost 40 years of personal computing! Anyhow, diddling around with the file manager is a prime example of ‘if it ain’t broke don’t fix it’ not!

James’ exposition on seismics in the cloud goes beyond cost analysis to explain the rationale of a new seismic data format adapted to the cheaper object storage format. He makes it clear that this is not so that data can stream fast into your local workstation. It is so that data can stream fast from storage to an application in the cloud. I must be a bit slow on the uptake but this was quite an eye-opener for me. What I now see is that OSDU is not just a shift to cloud-based data, but a shift to data and applications running in the cloud. This has been something of a holy grail of IT since long before the cloud was called the cloud. Will it work? Will the cloud ever be able to deliver the performance of today’s seismic workstation from just an HTML5-enabled dumb terminal? I really don’t know. But there is a great experiment going on right now that is worth watching, in computer gaming. As people are rushing out to buy the latest PlayStation or Xbox loaded with RAM and GPUs (or perhaps an even more powerful gaming PC), Google is working on Stadia, its cloud-based gaming platform. So, there you have it. If your kids are asking for a Stadia subscription next Christmas instead of a new PlayStation then OSDU will be the place to be.


Industry at large

Ryder Scott on ‘gut punch’ to industry. Shell slashes costs … and the green team resigns! PwC on Equinor’s $21 billion US loss. Accenture’s mammoth Guide to Decarbonizing the Industry. Cegal shape-shifts to greener future. Schlumberger moves exec bonus goalposts … and fires 21,000. Deloitte on the Future of Jobs in Oil and Gas, ‘home-based work not transitory’. EU report conflates digital transformation and green deal. Shale binge trashes US reserves. Rystad’s plausible if pessimistic forecast for fossil fuels. The Data Room’s own 2cents ... on the energy density of batteries, Avtur and hydrogen.

Ryder Scott’s annual reserves conference was held on Zoom this year. CEO Dean Rietz described the current situation in the oil and gas industry as ‘like a gut punch’. Industry has been hit by the pandemic, dwindling demand, low oil and gas prices, bankruptcies, ‘continued slander’ against fossil fuels, unrelenting pressure to reduce carbon footprints and election-year politics. Rietz traced the 12-month oil price forecasts made at earlier conferences. These revolved around a $59-a-barrel prediction last year, down from $75 in 2018. ‘No one could have predicted the events leading up to today’s $40 price’. ‘Until we start getting back to normal, perhaps after a Covid-19 vaccine, we won’t see an increase in demand anywhere close to recent years’.

As the FT’s Anjli Raval reported, Shell is ‘slashing costs to shape-up for the energy transformation’. Project ‘Reshape’ sees some 9,000 job losses, making for some $2.5 billion of annual savings. Shell is to ‘slowly lift spending on low-carbon tech while sustaining its legacy oil and gas operations.’ It seems like Shell is not moving fast enough in the direction of green. The FT subsequently reported a ‘wave of resignations’ as key green executives quit over a split as to how far and fast the oil giant should shift towards greener fuels.

PwC has authored a report ‘Equinor in the USA’, a review of Equinor’s US onshore activities and learnings for the future. The review was prepared at the behest of the Equinor board. Equinor has invested over $10 billion in US shale gas and oil, notably with the $4.7 billion acquisition of Brigham. But soon, challenges in land management, production revenue accounting, joint venture accounting and procurement resulted in the company ‘losing control over critical business support processes’. Between 2007 and 2019, Equinor recorded a $21.5 billion loss on its US activities. PwC found that ‘corporate oversight should have been stronger and did not sufficiently reflect the underlying risks of the business’. Today, the internal control environment in the US organization is ‘significantly improved’.

Accenture has produced a mammoth report on ‘Decarbonizing Energy, From A to Zero’, a practical guide to navigating the decarbonization process. The report has it that ‘storm clouds have been gathering over the oil and gas industry for years and it’s time to heed the warning signs. Advances in renewable energy technologies threaten the industry’s relevance’. Accenture presents ‘the only three viable options’ for oil and gas companies. These are: 1) ‘Decarbonization Specialists’ operating a ‘clean, high-margin portfolio’ of oil and gas assets, 2) ‘Energy Majors’ with a ‘broad reach’ into the energy system of the future and 3) ‘Low-Carbon Solutions Leaders’, a new type of asset-light energy company. The study warns against adopting a hybrid role in the transition, ‘as some integrated oil companies have done’. They will almost certainly fail. Industry can take some comfort from the fact that hydrocarbons will continue to play a key role in supplying energy well beyond 2050: the endgame is not an energy system without fossil fuels, which still provide ‘close to 50%’ of 2050 energy. There is something for everyone in this 179-page document.

Oftentimes, the IT folks like to conflate the energy and digital ‘transformations’. Cegal has spotted an opportunity here and is shape-shifting its digital future, taking its experience of working with cloud technology in the oil and gas space to other industries, renewables, ocean industry and beyond, ‘responding to the increasing need for digitalization and secure data access through the cloud’.

Covid and the industry downturn do not seem to have impacted some executive remuneration. As Patrick Temple-West reported in the FT, corporate bonus plans are being redrafted to prop up pay as ‘performance metrics are being switched and missed targets ignored’. The FT gave Schlumberger a shout-out in this context. Schlumberger shares are down 60% this year. Bonuses are no longer determined on an earnings-per-share metric. Instead, a more favorable adjusted EBITDA has been used. Schlumberger is cutting some 21,000 jobs, about a fifth of its workforce.

The FT’s Myles McCormick summarized the results of a Deloitte study of the Future of work in oil and gas and chemicals. Seemingly, seven out of ten US oil and gas jobs are ‘not coming back as a humbled industry overhauls the way it operates’. Some 107,000 jobs were lost in US oil, gas and petrochemicals between March and August. Rystad Energy’s Matthew Fitzsimmons was quoted as saying that the crash has kickstarted digitization initiatives which ‘will lead to some of these traditional jobs not coming back.’ We checked out the Deloitte study where we also learned that ‘The trend of home-based workers shouldn’t be seen as transitory’. Deloitte, echoing Accenture, proposes ‘four levers of transformation’ for oil’s energy transition: integrated human-machine collaboration, recoded careers and organizational agility. All of which ‘could push operators into the future’. Deloitte also argues (somewhat unconvincingly) that ‘the consequences of the pandemic have reinforced the call for long-term decarbonization and a solid energy transition’. Something that oil and gas strategists will have to take into consideration. The 30-page Deloitte study is replete with analysis and advice on greening an industry in a ‘great compression’.

The EU is another conflater of digital transformation and the ‘green deal’ and has proposed a ‘how to spend it’ digital investment plan for Europe. Digital ‘has enormous potential to facilitate the transition to a low carbon circular economy’. A GESI* study has digital technologies as offering a potential 20% reduction of global CO2 emissions by 2030. Time to short the data center providers?

We have plagiarized the FT far too much in this piece, so we refrain from telling you too much about how the shale binge has ‘drilled the heart out’ of US reserves, as Derek Brower reported. Let’s just say that Quantum Energy Partners’ Wil VanLoh told the FT the dirty secret of shale, ‘too much fracking has sterilized a lot of US shale’.

Speaking at the EAGE Virtual Annual Conference (report in our next issue), Jarand Rystad offered a plausible if pessimistic forecast for fossil fuels. New energy technologies are so competitive that the transformation will happen regardless. Policies will determine the speed of the transition. There will be zero carbon in the 2060/70s whatever happens, and ‘strong support’ for decarbonization in the 2040/50s. Rystad monitors global road traffic where Covid has brought major disruption, with a 30% decline in the UK. A ‘huge, unprecedented decline in oil demand’. Rystad forecasts some recovery in 2021 for road traffic; aviation will take longer. All demand will be back in 2022/23. So, there is ‘new upside’ at that point, ‘a short but welcome’ break for the oil industry. But it will not last long. Rystad sees ‘severe downside’ in 2024 and from then on, a structural shift with EVs, and many other sectors switching to renewables and especially batteries as a backup to solar/wind. 2030 will see the full effect of the transition.

Finally, in a context where demand is down, and the industry is confronted with pressure from the environmental movement, a couple of cents worth of comment from The Data Room. An article in Scientific American included an interesting factoid which argues a little against Rystad. Today’s best electric batteries have an energy density of 250 watt-hours per kilogram. On the other hand, Avtur (jet fuel) has an energy density of 12,000 watt-hours per kilogram. This is why electric planes are unlikely to go very far any time soon. This is not quite the whole story as the energy density of hydrogen is 33,000 watt-hours per kilogram. That’s just physics and chemistry. What is a little bit more contentious is an off-the-cuff quote from yet another article in the FT where we learn that Morgan Stanley analysts have calculated that ‘per unit of energy produced, renewables require up to five times as much capital expenditure as oil and gas projects’. Or in other words, ‘get woke, go broke!’.
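For those who like to check, the arithmetic behind the factoid:

    # Energy densities quoted above, in watt-hours per kilogram.
    battery, avtur, hydrogen = 250, 12_000, 33_000
    print(avtur / battery)     # 48.0  - jet fuel packs ~48x more energy per kg than the best batteries
    print(hydrogen / battery)  # 132.0 - hydrogen ~132x, before counting tanks and fuel cells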

* Global Enabling Sustainability Initiative.



Machine learning classifies fossil pollen grains

NIST-backed team applies convolutional neural network to high resolution microscopic imagery.

US researchers*, publishing in the Proceedings of the National Academy of Sciences, have applied machine learning to the identification of fossil pollen grains. The team developed and trained three machine-learning models to differentiate between several existing Amherstieae legume genera and tested them against fossil specimens from western Africa and northern South America dating back to the Paleocene, Eocene and Miocene.

Taxonomic resolution is said to be a major challenge in palynology that limits the ecological and evolutionary interpretations possible with deep-time fossil pollen data. The NIST-backed team used optical ‘super resolution’ microscopy and machine learning to create a quantitative workflow for producing palynological identifications and hypotheses of biological affinity. Three convolutional neural network classification models were developed: maximum projection, multi-slice, and fused. After training on modern genera, the models were run on the fossils. All models achieved average accuracies of 83 to 90% in the classification of the extant genera. The majority (86%) of fossil identifications showed consensus among at least two of the three models. The study supported the hypothesis that Amherstieae originated in Paleocene Africa and dispersed to South America during the Paleocene-Eocene thermal maximum (56 Ma).
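The ‘consensus among at least two of the three models’ test is easy to picture. A toy sketch, with placeholder genus names rather than the published workflow:

    # Consensus among >= 2 of 3 classifier predictions for a single fossil grain.
    from collections import Counter

    def consensus(predictions):
        """Return the genus predicted by at least two models, else None."""
        genus, votes = Counter(predictions).most_common(1)[0]
        return genus if votes >= 2 else None

    print(consensus(["GenusA", "GenusA", "GenusB"]))  # GenusA
    print(consensus(["GenusA", "GenusB", "GenusC"]))  # None - no consensus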

While one might debate the exact meaning of machine learning and AI, using the computer to classify fossils has a long history. Earlier work, termed ‘numerical taxonomy’, dates back to the 1960s. See for instance, John Schrock’s intriguing book review on Amazon.

* A US National Science Foundation-funded team at the Smithsonian Tropical Research Institute, the University of Illinois at Urbana-Champaign, the University of California, Irvine and collaborating institutions.


SEG – ‘adjust quicker to machine learning’

SEG Seismic Soundoff hears from ExxonMobil seismic guru on use cases for ML in seismic processing and interpretation. ML is good for some common tasks but not a ‘silver bullet’.

In Episode 94 of the SEG’s Seismic Soundoff podcast, Andrew Geary (51 Features) interviewed Mehdi Aharchaou (ExxonMobil) on ‘The case to adjust quicker to machine learning for geophysics’. Aharchaou reported that outside of our industry, use of supervised ML is now ‘more reasonable’. It does not apply to all problems and should not be considered a ‘silver bullet’. In geophysics, popular applications of ML include data QC, currently a labor-intensive area that could be (more) automated.

Siamese neural networks are useful at the common seismic task of comparing two objects. ‘Similarity is at the heart of what we do’, comparing two or more results from a processing sequence and providing an objective assessment of improvement. ‘Edge-aware’ filtering is another objective, streamlining and simplifying the imaging workflow and allowing for comparison of observed and simulated results or of near and far trace AVO.
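For the curious, a generic sketch of the Siamese idea: one shared encoder applied to two inputs and a distance converted into a similarity score. The ‘encoder’ below is a trivial stand-in for a trained network, not ExxonMobil’s implementation.

    # Generic Siamese-style comparison: one shared encoder, two inputs, one similarity score.
    import numpy as np

    def encoder(window: np.ndarray) -> np.ndarray:
        """Shared embedding applied to both inputs (placeholder: a few simple statistics)."""
        return np.array([window.mean(), window.std(), float(np.abs(np.fft.rfft(window)).argmax())])

    def similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.exp(-np.linalg.norm(encoder(a) - encoder(b))))  # 1.0 = identical embeddings

    rng = np.random.default_rng(0)
    before = rng.standard_normal(500)                 # e.g. a trace window before reprocessing
    after = before + 0.1 * rng.standard_normal(500)   # ... and after
    print(similarity(before, after))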

Deep learning can be used to extend seismic bandwidth, learning from an ocean bottom data set and reconstructing the low-frequency content of conventional recording. Likewise, high-frequency data from shallow acquisition can be used in training to enhance regular data, making for synthetic broadband. AI trained on Kirchhoff and reverse time images can be used to avoid prohibitively expensive reverse time migration.

But progress to date has been disappointing. We need better adoption and more impactful use. All this work has been done in the last five years. More time is needed to catch up on developments in ML/AI, especially computer vision. ML should not be seen as a hammer looking for nails. Not all problems are candidates and some geophysical processes are already automated and have no need for ML. There is also the challenge of moving from proof of concept to at-scale deployment. We are currently just scratching the surface.

More in The Leading Edge Special section: Machine learning and AI (October 2020).



2020 Oil and Gas Machine Learning Symposium

The Advertas/Geophysical Insights-backed online event hears Paradise use cases from RockServ and Idemitsu Norge. Southwest Research Institute’s SLED, smart leak detection. AgileDD’s open source Tabio toolkit for data retrieval from scanned documents. Yokogawa’s ‘intelligent’ GOSP gas oil separation plant. IBM’s Production Optimum Advisor.

Ali Bakr (RockServ) presented a multi-disciplinary workflow for unsupervised machine learning based interpretation of a 3D seismic survey across the Shell-operated Sr Field, offshore Nile Delta. Conventional mapping with Hampson-Russell spectral decomposition was followed by ML using Geophysical Insights’ Paradise software. Attributes were ranked with principal component analysis and classified using Paradise’s self-organizing map. The approach confirmed the main targets and located a ‘significant other channel’ that was undetected by conventional spectral decomposition.

Sharareh Manouchehri (Idemitsu Norge) also presented a Paradise case history on the Wisting field in the Norwegian Barents Sea. Wisting is a very shallow target that has been surveyed with an ‘ultra high resolution’ PCable survey. This uses short-offsets only, so AVO was not an option. Instead, Paradise was used to perform PCA/SOM on the 1 ms data. The result was a ‘clear improvement in reservoir characterization as compared to traditional quantitative interpretation’.
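For those wanting to experiment, the generic PCA-then-SOM attribute classification described in both talks can be approximated with open source tools (scikit-learn and the MiniSom package). This is an analogue, not Paradise itself, and the attribute data below is random.

    # Open-source analogue of the PCA + self-organizing map workflow described above.
    import numpy as np
    from sklearn.decomposition import PCA
    from minisom import MiniSom  # pip install minisom

    attributes = np.random.rand(10_000, 8)                   # stand-in for 8 seismic attributes per sample
    scores = PCA(n_components=3).fit_transform(attributes)   # rank/reduce the attribute set

    som = MiniSom(8, 8, 3, sigma=1.0, learning_rate=0.5, random_seed=42)
    som.train_random(scores, 5000)                           # unsupervised training

    # Each sample is assigned to its best-matching SOM node, i.e. a 'neuron class'.
    classes = np.array([som.winner(s) for s in scores])
    print(classes[:5])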

Heath Spidle presented ‘SLED’, the smart leak detection system developed by SWRI, the San Antonio-based Southwest Research Institute. SLED provides ‘automated, unmanned detection and quantification of fugitive methane emissions’ and gives an early indication of an unexpected emission. Drone-based sensor data requires special attention as the motion and viewpoint of the camera defy static pattern recognition algorithms. SWRI’s ‘powerful deep learning algorithms’ offer low false positives and pinpoint ‘otherwise invisible’ emissions. 96% precision and 2% false positive rates are reported. Tools of the trade include the FLIR MWIR infrared camera and the NVIDIA Tegra embedded/mobile GPU. SLED/M is the methane detection variant. Another system, SLED/C, detects crude oil leaks. More from SWRI.

Amit Juneja (Agile Data Decisions) presented ‘Tabio’, an open-source toolkit for detecting and extracting tabular data from scanned documents. Tabio uses machine learning to locate tables, keying on text alignment, spacing and the mix of numbers and letters. Tabio is released as open source under the MIT license ‘for the benefit of the data management community’. Tabio development was sponsored by Total, Technip, Saipem, Schlumberger, Subsea7 and IFPen.

Mustafa Al-Naser (Yokogawa and King Fahd University of Petroleum and Minerals (KFUPM)) presented an ‘intelligent approach to GOSP (gas-oil separation) and enhanced oil recovery’. GOSP plants are traditionally operated at fixed conditions, ignoring ambient temperature variations, leading to lower recovery. The project set out to optimize setpoints at the high and low pressure production traps. A GOSP dynamic simulator was built by Yokogawa’s OmegaLand unit to investigate optimal settings for different ambient temperatures. ‘Artificial intelligence techniques’ determine the optimum pressure required to maximize production.

Crystal Lui (IBM Canada) presented the Prediction-Optimization Framework, an AI-based tool developed for a major Canadian oil sands operator (Lui was previously with Suncor). The solution uses a system-of-systems approach that spans data silos across the Syncrude process, flagging potential process upsets before they happen. Scenario generation integrates mass balance and machine learning to generate production plans based on dynamic events and operational objectives.

More from the Oil & Gas Machine Learning Symposium.


Software, hardware short takes

New releases, updates from Novi Labs, Cegal/Blueback, Emerson/Paradigm, CGG GeoSoftware, Ceetron Solutions, Rockware, Weatherford, Wood Mackenzie, OspreyData, Quantum Automation, DSI, Quorum Software, Assai, TRC Consultants, Aucerna, AspenTech, Siemens, Gexcon, nVent, Yokogawa Electric, Blue Marble, CGG/Sercel, OriginLab, University of Manchester.

Geoscience/Reservoir

Novi Labs has released V1.0 of the Novi Data Engine, a machine learning toolkit for unconventional development. The NDE provides custom data-processing algorithms and data ingestion and management tools. NDE understands complex data such as spacing, stacking and timing. More in the Novi Labs video.

Cegal/Blueback’s Python Tool Pro adds Python-scripted data science to Schlumberger’s Petrel. PTP lets geoscientists run machine learning algorithms and data science workflows against their Petrel data and includes tools for data visualization and code sharing.

Emerson/Paradigm has released version 19.2 of its Emerson E&P software suite, available either on-site or cloud-hosted. New features include GPU acceleration of the EarthStudy 360 Imager. Seismic migration can be performed in parallel with data loading, providing a performance boost for large ultra-dense data sets. The new release includes applications co-developed with Repsol under the ‘Kaleidoscope’ alliance, including seismic-guided velocity, structural tensor filters and partial stacking AVO. The Echos seismic library now supports Python scripting. Enhancements to SKUA-GOCAD accelerate reservoir modeling workflows. More from Emerson.

Emerson has also released V20 of Geolog, its petrophysical and formation evaluation software. A new ribbon-like toolbar and menus with intuitive icons guide users. The release offers faster access to data, processing tools and views to visualize and interpret results, along with deterministic petrophysics functionality for thin beds, shale plays and the CIS market.

CGG GeoSoftware’s Jason Workbench 10.2 brings enhanced display options, upgraded QC and monitoring, and more user-friendly interfaces. New scripts and Jupyter notebooks have been added to the Python machine learning ecosystem. HampsonRussell 10.6 includes an interactive radon analyzer, an AVO interpretation cross plot template and a new inversion algorithm. PowerLog 10.2 for petrophysical interpretation includes patented outlier detection to improve curve data quality. InsightEarth 3.6 now includes WellPath, an interactive well path planner for large multi-well pads or platforms.

Ceetron Solutions has announced ResInsight 2020.10, a ‘major release with many new features’. Check out what’s new in the free, open source reservoir simulation post processor here.

Rockware’s WellCAD v5.4 has a new velocity analysis workspace and a ‘core shifter’ for depth match of core and log data. More on RockWare’s oil and gas software here.

Weatherford has rolled out ForeSite Sense reservoir monitoring, providing ‘instant intelligence to production optimization’. ForeSite Sense displays critical downhole data on pressure, temperature and flow in real time. Custom ‘pods’ tune the toolkit to particular production regimes such as shale or deepwater. ForeSite is a DAS (distributed acoustic sensor) system that has already been deployed in some 7,000 wells. More from Weatherford.

Wood Mackenzie’s new Lens Upstream Optimization (an extension of its Lens hosted big data/analytics solution) supports merger and acquisition specialists with a holistic view of the economic impact and tax implications of M&A on company portfolios. Companies can add their own data to the system using the Lens Direct API service.

Operations

OspreyData has launched a ‘digital field quick start’ program to help oil and gas producers transition to digital oilfield operations. The AI-based DFQ optimizes operations across the major artificial lift types. DFQ builds on OspreyData’s Production Unified Monitoring solution that provides detailed well views, lift-specific tools such as Dynacards and pump performance curves, and OspreyData’s rapid event highlighter, a data labeling and collaboration tool. More from OspreyData.

Quantum Automation has announced QCloudServer industrial internet of things (QCS IIoT) system for remote monitoring and control. QCS IIoT assembles heterogeneous hardware, software and networking technologies for IIoT projects. End users and systems integrators can connect to edge-source data, transmit it on-site or to the cloud, aggregate and log the data, perform calculations and analysis as needed, and deliver mobile/web visualization. QCS IIoT connects with any user-supplied edge systems supporting MQTT. Otherwise, Quantum’s QRTU provides connectivity to multiple PLC and edge devices. QCS IIoT can be deployed onsite, but Quantum recommends cloud-hosting on AWS.
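For the record, MQTT connectivity from an edge device is a few lines of Python with the paho-mqtt 1.x client. Broker, topic and payload below are illustrative placeholders, nothing Quantum-specific.

    # Minimal MQTT publish from a hypothetical edge device (paho-mqtt 1.x client API).
    import json
    import paho.mqtt.client as mqtt

    client = mqtt.Client(client_id="rtu-042")
    client.connect("broker.example.com", 1883)  # hypothetical broker
    payload = json.dumps({"tag": "separator_pressure", "value": 12.7, "unit": "bar"})
    client.publish("site/padA/telemetry", payload, qos=1)
    client.disconnect()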

Document management/supply chain

DSI has released a new version of Cloud Inventory, its cloud-based inventory management solution. The flexible, ‘low-code/no code’ approach lets users adapt supply chain processes as their businesses evolve. More from DSI.

Quorum Software’s latest myQuorum DynamicDocs cloud-based document management system for oil and gas is now natively integrated into the myQuorum platform, providing robust, industry-specific document management alongside a comprehensive suite of transactional, operational and accounting solutions. The integration bridges the upstream paper-to-digital divide with workflows that span standard business processes and applications.

The 9.7 release of Assai’s engineering document management solution clarifies various aspects of the document workflow, user roles and status. AssaiWEB now supports CAD format template generation. Document handover has been simplified, heralding the arrival of a new ‘handover to project’ functionality to roll out real soon now!

Economics

TRC Consultants is offering a preview of the latest version of its PHDwin economics package. PHDwin V3 boasts an intuitive and customizable user interface, improved calculation and reporting speeds, new graphics that include a ‘bendy B-factor’, the ability to graph revenue, expenses and investments, and scenario management that captures qualified data for ‘what-if’ analysis. Check out the presentation.

Aucerna’s Enersight 2.14 connects assets to corporate teams with ‘file-less’ integration to PlanningSpace. A new solver brings a performance improvement and reduces model uncertainty. Water recycle workflows have been improved and well list data management simplified. The embedded Val Nav fiscal regime capability now supports evaluation of facilities.

Downstream

AspenTech has released AspenONE V12 with added artificial intelligence and cloud delivery. The industrial solutions democratize the application of AI and represent a ‘step towards the self-optimizing plant’. Hybrid AI/physical models embed engineering first principles and domain expertise to deliver ‘comprehensive, accurate models at scale’.

In a similar vein, Siemens’ latest Mindsphere uses AI to optimize maintenance of rotating equipment. A ‘predictive service assistant’ provides early warning of drivetrain anomalies, recognizing fault patterns such as misalignment or defective bearings. When a problem is detected, a warning is issued along with a recommended due date for a fix.

Gexcon’s flagship explosion and dispersion modelling software, FLACS-CFD 20, has been re-written from the ground up to offer ‘better integration and digitalization capabilities, improved workflows and greater flexibility’. FLACS-CFD 20 represents ‘more than 40 years of extensive modelling and validation work based on real-life testing’.

nVent Electric has released the nVent Raychem Supervisor, an internet of things platform to connect, control and monitor temperature-critical assets. The first deployment is as the Raychem pipeline supervisor for predictive maintenance of temperature-sensitive pipelines. The heat trace monitoring software delivers performance trends and actionable data insights to enable the safe and efficient operation of vital heat tracing infrastructure. Supervisor monitors nVent Raychem electronic temperature control products from centralized and remote locations. A new Elexant 9200i controller adds wireless connectivity to the platform.

Yokogawa Electric has released V3.4 of its OmegaLand dynamic simulator, the core element of Omega Simulation’s plant training simulator. New functionality transforms panoramic images captured with a 360-degree camera into a true 3D virtual reality simulation environment. OmegaLand can integrate with control systems, equipment and other simulators to create credible ‘digital twins’ of plant for training and ‘what-if’ design analysis.

Miscellaneous

A new release of Blue Marble Geographics’ GeoCalc Online hosted geodetic parameter repository and transformation engine now includes a point-to-point calculator for general purpose coordinate conversion and geodetic calculations in the browser.

CGG/Sercel has released S-lynks, a solution for real-time monitoring of the structural integrity of buildings and infrastructure. S-lynks monitors structures with Sercel’s ‘ultra-sensitive’ QuietSeis sensors. Data is streamed to the cloud for processing and remote analysis. The solution is marketed in a joint venture between Sercel and engineering consultancy Apave.

The 2021 edition of OriginLab’s data analysis and graphing software adds over 75 new features, apps and improvements, enhancing Origin’s ease-of-use, graphing, analysis and programming capabilities. A new Origin Pro package provides Python access to Origin objects from an IDE with syntax highlighting, debugging, and a package manager. New apps include Neural Network Regression, Taylor Diagram, rank by fit reports, optimization solver, TDMS and Yokogawa WDF connectors. Download a free trial version here.

Not all AI runs in the cloud or needs big data. The University of Manchester (UK) has announced Scamp, a stand-alone camera and edge processor that performs image classification at 17,500 FPS ‘without a CPU or GPU’. The camera itself runs a convolutional neural network. After training, the model runs on an onboard pixel processing array. The researchers have used the Scamp for hand gesture recognition and plankton classification at speeds of 2,000 to 17,500 frames per second, all while consuming less than 1.5 watts of power.


PNEC 2020 Online

Ecopetrol deploys Kadme Whereoil cloud platform. PPDM professionalizes the data managers. Troika on seismic formats, SEG and OSDU. Rice University curriculum evolves towards data science. ExxonMobil at forefront of seismic data management. Total moots hybrid, on premise/cloud data solution, downplays OSDU. Denodo data warehouse for Oxy/Anadarko. Apache Airflow data ingestion for Schlumberger’s Delfi. Noble Energy cleans data with InnerLogix. Bluware on the true costs and gotchas of data in the cloud. Shared data enhances Woodmac Analytics Lab model. LEK Consulting compares digital maturity across industries.

Gustavo Londono presented NOC Ecopetrol’s cloud-based upstream data platform that leverages Kadme’s Whereoil* technology to provide ‘fast and complete’ data access and advanced natural language search. Ecopetrol’s decade-old legacy system had ‘reached its limits’ and the time spent looking for geoscience data was back up to ‘around 80%’. Two years ago, the company started work on a transformation and contracted with Kadme for the map-based, in-cloud solution. The automated data consolidation platform includes an NLP capability across 2.5 million documents and 800k logs. Data enrichment adds quality flags, deduping, text from PDF/scans and georeferenced documents for an area of interest (AOI) search. Data migration began in October 2019 and the system launched to 800 concurrent users a year later. Ecopetrol’s legacy systems have been decommissioned and the company is preparing for ML/AI models in a mature data lake. The solution also manages check-out and return of physical data objects. Teams collaborate via the cloud.

* Kadme’s Whereoil is also used by YPF as we reported last year.

Cynthia Schwendeman (BP) and Patrick Meroney (Katalyst Data Management) outlined work done in PPDM’s Professional Development Committee. The PDC has surveyed some 500 data managers across industry, finding ‘high variability’ in job descriptions. PPDM has taken it upon itself to standardize these and provide guidance on roles and responsibilities, from business data owner/steward, business analyst and project manager to data scientist, with ‘six simple role descriptions’. Subject to board approval, the PDC approach is to extend into midstream/downstream and renewables.

Jill Lewis (Troika) observed that the standards from organizations such as the IOGP, SEG and Energistics are all free. The navigation standards from the IOGP’s ‘great mathematical minds’ have earned world-wide recognition. These are now shared positioning standards across IOGP, SEG and Energistics, with a special mention for the latter’s units of measure work. The question now arises as to why the industry at large has not ‘moved with us’. Many still use SEG-Y Rev 0! Rev 1 was released in 2000 but not much used. In 2017, Rev 2 came along with a machine-readable implementation suited for automation and machine learning applications. Turning to OSDU, Lewis observed that OSDU release 2 supports both SEG-Y Rev 2 and (the somewhat competing) OpenVDS format. Overall, take-up for Rev 2 has been ‘pretty poor’ but support is coming from some NDRs with, notably, its inclusion in the NPD Yellow Book. In-country regulations may also mandate storage in a neutral format such as SEG-Y. Lewis expressed disappointment that OSDU was not engaging more with the SEG standards committee.
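Readers wondering what revision and headers their own SEG-Y files actually carry can check with the open source segyio library. A sketch, the file name being a placeholder:

    # Inspect a SEG-Y file's textual and binary headers with segyio.
    import segyio

    with segyio.open("survey.sgy", ignore_geometry=True) as f:  # placeholder file name
        print(segyio.tools.wrap(f.text[0]))                     # the 3200-byte textual header
        print("sample format code:", f.bin[segyio.BinField.Format])
        print("traces:", f.tracecount, "samples/trace:", len(f.samples))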

Dagmar Beck (Rice University) gave the Tuesday keynote presentation on how university education is evolving to meet workforce needs. These center on a need for data scientists that are ready to work in the ‘era of big data that encompasses a wide range of industries and businesses’. Rice University’s professional science master’s programs (PSM) prepare post grads for management in technology-based companies, governmental agencies and not-for-profits. These programs combine study in science or mathematics with coursework in management, policy, or law. The science and business programs were initiated in 1997 by the Sloan Foundation, and Rice was one of the first to get Sloan funding, for its subsurface geosciences program among others. The PSM programs evolve as per industry desiderata; today this means energy data management, data science and data governance. All of which is now bundled into the Rice Energy Data Management Certificate program. The data science component is provided by the Rice D2K Initiative.

Yuriy Gubanov put ExxonMobil ‘at the forefront’ of revolutionizing data management practices. Today ExxonMobil is developing a cloud infrastructure to move its seismic data from storage ‘in salt mine caves’ to long-term geoscience archives in the cloud. ‘Blob storage has replaced boxes of tapes and hard copies’. Exxon’s seismic data is marked for ‘indefinite retention’, stored on tape, but often in legacy formats that can be hard to read. A large amount of data is ‘underutilized’. Gubanov advocates digitizing everything in open formats to avoid proprietary lock-in.

The biggest challenge for cloud storage is up/download in this complicated architecture. Data egress charges can be high and security is an issue. But the cloud offers a lot in terms of APIs and a virtual desktop infrastructure. Exxon has evolved a ‘cloud first’ architecture with data access via an API. Tapes go into cloud blobs (binary large objects). The cloud is making data ‘ready for ML’.

Having said that, Exxon is ‘not yet done’. There are challenges. Some divisions hold back on cloud adoption. The cloud is not free. Data sovereignty can be an issue in some jurisdictions where it may be necessary to run an application in a local data center. There are also additional use cases coming out of nowhere which may challenge assumptions economically and technically. Today, Exxon has migrated terabytes, but there are ‘petabytes left!’ QC needs more work and third party integration needs fine tuning.

And (speaking of additional use cases) there is OSDU which ‘prompted us to look at our architecture’. There is overlap of several components, but much is complementary and ‘we may make our APIs OSDU compatible’. Gubanov concluded that the move to the cloud is both an enabler and an ‘inspirational goal’. But one that ‘both IT and the business are committed to’.

In the Q&A Gubanov elaborated on the relationship with OSDU. Ingestion and cloud services overlap, but workflows are complementary as is the API. Some stuff that is not compatible will need some refactoring, ‘mostly on our side’. The main geoscience workflows are not cloud based which makes for data egress charges. Exxon is experimenting with ‘express style’ connections to the cloud to optimize up/down load. Exxon is also migrating complete workflows to the cloud to minimize data movement.

Hilal Mentes presented Total’s ‘innovative approach’ to seismic data management. Total has developed in-house, on-premise data stores for seismic (DMS), well data (DMW) and interpretation results data storage (IRDS)*. The current system works well but is said to scale poorly and is ‘not amenable to AI/ML’. The ‘to be’ situation envisages ‘more autonomy, collaboration and standardization’. In other words, ‘a data lake’. The idea is for a hybrid, on premise/cloud solution as a future replacement for DMS.

Mentes cited the work of OSDU in the context of a cloud-based, single source of the truth. The OSDU API promises a path to learning and sharing best practices for a cloud migration. However, the OSDU plan in Total has been cancelled**. The future is less clear: ‘We will probably start using a cloud solution, it may be OSDU’. More work is needed to develop ‘an efficient data platform that will open the way for the digital transformation’.

* The databases are a component of Total’s Sismage-CIG (Geoscience and Reservoir Integrated Platform)

** One issue cited by Mentes is whether an OSDU-based cloud system will be able to handle real-world seismic data loads and what the performance would be compared to Total’s current systems.

Ravi Shankar (Denodo) presented a case study of Denodo’s work for Oxy/Anadarko evolving a ‘traditional’ data warehouse into a ‘logical’ data warehouse. The old data warehouse is not so good at capturing unstructured and other novel data types. Attempts to create data lakes in Hadoop have led to more data silos. The ‘single source of truth’ remained elusive. ‘Only 17% of Hadoop deployments are in production’. Enter the logical data warehouse (LDW), a recognition that all data cannot reside in a single location*. The data lake and data warehouse are complementary and the LDW overlays both. Denodo’s LDW/data virtualization allows for data abstraction and per-user/persona-based access. Queries are run across original data in situ and are said to be faster than Hadoop. A multi-cloud potential is claimed.

More on Denodo’s work with Anadarko here.

* The LDW has echoes of many earlier virtual data stores and data virtualization efforts such as OVS, Petris (now Schlumberger), Tibco and others.

In a similar vein, Anubhav Kohli (Schlumberger) reported on the shift from on-premises to cloud-based data warehouses and unstructured data lakes. These have created problems for data managers trying to combine existing workflows with the new platforms. Enter Apache Airflow, an open-source workflow manager originally developed by AirBnB and now used by Adobe, Google, ING and others. In Airflow, workflows are represented as directed acyclic graphs, collections of different tasks that can be run sequentially or in parallel. DAGs are suited to modern deployment mechanisms such as Docker and Kubernetes.

A ‘data to Delfi’ workflow showed how the Airflow GUI provides click-through access to tasks and code. A DAG is a collection of tasks, dependencies and operators. These can be assembled into a workflow using Python, with metrics on performance etc. Airflow has been selected as the main workflow engine for OSDU, the Open Subsurface Data Universe.
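By way of illustration, a minimal Airflow 2.x DAG with three Python tasks chained using the ‘>>’ dependency operator. Task bodies are placeholders; this is not Schlumberger’s Delfi ingestion code.

    # Minimal Airflow DAG: three placeholder tasks run in sequence.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():  print("pull LAS files from source")        # placeholder task body
    def validate(): print("check curves, units, well IDs")     # placeholder task body
    def load():     print("push to the target data platform")  # placeholder task body

    with DAG(dag_id="demo_ingest", start_date=datetime(2020, 11, 1),
             schedule_interval=None, catchup=False) as dag:
        t1 = PythonOperator(task_id="extract", python_callable=extract)
        t2 = PythonOperator(task_id="validate", python_callable=validate)
        t3 = PythonOperator(task_id="load", python_callable=load)
        t1 >> t2 >> t3  # the directed acyclic graph: extract -> validate -> load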

In the Q&A, Kohli was pressed on Delfi data ingestion. It turns out that the example shown was more of a demonstration of Airflow functionality. Creating pipelines from various sources and technologies can become quite complex. Schlumberger’s Delfi exposes its own ingestion services for different file formats like CSV, LAS, DLIS, documents and SEGY. But Schlumberger does use Airflow and has built multiple end-to-end Airflow pipelines using Google Composer inside the Google cloud. These connect with third party data sources, adding asynchronous retry mechanisms, load balancing and alerts for job completion. Schlumberger’s DAGs ingest nearly 50 million records and have produced ‘an overall 64% reduction in man-days’.

Ankur Agarwal described Noble Energy’s use of Schlumberger’s InnerLogix data cleansing toolset to automate data loading. Noble has data stored in multiple applications including WellView and Prosource. The question is, which is the system of record? All give a different answer! Noble is addressing the problem by adding rule-based data quality management from InnerLogix to its data in ProSource. The system applies business rules such as ‘no well without a UWI, lat/long’ across applications from different vendors, taking tops from Petra into WellView and synchronizing Petrel Studio with Petra and other applications.
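The ‘no well without a UWI, lat/long’ style of rule is easy to picture. A toy pandas check, by way of illustration only, not InnerLogix:

    # Toy rule check: flag wells missing a UWI or lat/long (illustrative data).
    import pandas as pd

    wells = pd.DataFrame({
        "uwi": ["05-123-45678", None, "05-123-99999"],
        "lat": [40.1, 40.2, None],
        "lon": [-104.9, -104.8, -105.0],
    })

    violations = wells[wells[["uwi", "lat", "lon"]].isna().any(axis=1)]
    print(f"{len(violations)} well(s) fail the rule")
    print(violations)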

Andy James (Bluware) asked rhetorically, ‘If cloud storage is so cheap, why isn't everyone moving their petabytes of seismic data to the cloud?’ Talk is of the economies of scale associated with cloud storage, but few oils are ‘jumping on the cloud storage bandwagon’. Moving petabytes of seismic data to the cloud is difficult, figuring the true cost of managing large data sets in the cloud is hard and the exercise does not, in itself, add business value.

In a large oil company, seismic data can be 85% of the total data volume and is likely stored across multiple locations and formats including tape. So how much would a move to the cloud cost? Key to the cloud is the object store which is cheap but differs from regular file storage. A petabyte-month on AWS costs something like $24k in hot storage, $4.2k in cold. Then there is the question of the user experience for a typical geophysical workload which conventionally likes data to be close to compute resources. This can be achieved by moving the workstation into the cloud. Indeed, virtual workstations ‘have been around for a while’. On the desktop, 60 frames/second in the browser ‘is no problem’.
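Taking the per-petabyte-month object storage figures quoted above at face value, the annual arithmetic for, say, 10 PB of seismic runs as follows.

    # Annual cost of 10 PB at the per-PB-month object storage rates quoted above (indicative only).
    pb = 10
    hot, cold = 24_000, 4_200                    # USD per PB-month, as quoted
    print(f"hot:  ${pb * hot * 12:,.0f}/year")   # $2,880,000/year
    print(f"cold: ${pb * cold * 12:,.0f}/year")  # $504,000/year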

But geoscience apps don’t work with the cloud’s object store, they work with files, SEG-Y etc. Unfortunately, file systems in the cloud are very expensive. On Amazon, a NetApp file server in the cloud can be $400k/PB/month. You should not fixate on the cost of archival in the cloud, the true cost is for usage in the cloud. One solution is to abandon the file format and use Bluware’s OpenVDS and VDS formats. These bricked formats, which work with pre and post stack data, leverage the cheap object storage and offer fast encode/decode to application-specific formats. Data is read from cheap, scalable object storage and streamed with Bluware’s ‘Fast’ decoder into an application running on a virtual machine in the cloud.
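The reason a bricked format suits object storage is that a client can fetch just the bytes it needs. A generic ranged read with boto3 illustrates the access pattern; bucket, key and byte range are placeholders and this says nothing about OpenVDS internals.

    # Generic ranged read from S3 object storage: fetch one 'brick' without downloading the object.
    import boto3

    s3 = boto3.client("s3")
    resp = s3.get_object(Bucket="seismic-demo",          # placeholder bucket
                         Key="survey/volume.vds",        # placeholder key
                         Range="bytes=1048576-2097151")  # a 1 MiB chunk
    brick = resp["Body"].read()
    print(len(brick), "bytes streamed")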

In the Q&A, James was pressed on cloud costs. He acknowledged that there is a lot of uncertainty as to how much will this cost in the long run. There is a ‘fixation’ on archive cost, but the cloud vendors charge more for data egress which should be avoided*. Another issue is the relationship between VDS and the SEG’s standards. Jill Lewis (Troika and SEG Standards Committee) invited Bluware to ‘join us in making SEGY cloud ready. Open VDS is not an open format as it sits behind an API that hides the actual data that is being written’.

* A similar situation existed (and probably still exists) with physical document and tape data storage. Vendors take your data off your hands for relatively little. But getting it back is a different price point!

At PNEC 2019, Wood Mackenzie presented its Analytics Lab*, a cross-industry offering that encourages companies to build data consortiums. This year, Woodmac’s James Valentine presented the results of a pilot project in the Bakken shale that demonstrated the value obtained from analytics across a ‘broader, higher fidelity dataset than is available publicly or to any single operator’.

To combat disconnects between data scientists and subject matter experts, analytics-derived data models need to be explainable. Selecting the best predictive features is crucial. X/Y (lat/long) features are ‘prime offenders’ in the analytical model that ‘bring all sorts of variables along for the ride like geology, operators’. Variables that don’t add value need to be removed. ‘Kitchen-sink style’ ML is unsuitable for decision making as it is extremely easy to overfit the data, resulting in million dollar failures. Explainability comes from interaction with the model. What happens when you hold out one operator, or hold out last year’s data? Observe the changes in the output and see the limits of the model as error rates go way up. Data sharing between operators (even of a small data set) may provide very different and valuable experimental results.
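The ‘hold out one operator’ test maps directly onto grouped cross-validation. A scikit-learn sketch on synthetic data, not the Woodmac model:

    # 'Hold out one operator' as grouped cross-validation (synthetic data).
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    rng = np.random.default_rng(1)
    X = rng.random((300, 6))                                # completion/geology features (stand-ins)
    y = X @ rng.random(6) + 0.1 * rng.standard_normal(300)  # synthetic EUR-like target
    operators = rng.integers(0, 5, 300)                     # which of five operators drilled each well

    scores = cross_val_score(RandomForestRegressor(random_state=0),
                             X, y, groups=operators, cv=LeaveOneGroupOut())
    print(scores)  # one score per held-out operator shows how well the model generalizes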

Very large data sets are needed to train a reliable model. Enter the Woodmac/Verisk Data Consortium. The Bakken proof of concept trial was based on some 250 million data points and was ‘a real eye opener’ for participants, showing a huge data quality opportunity to homogenize frac fluid volumes and units of measure between public data and operator data. The final model included commercial information on well costs from other Woodmac units.

* As we reported earlier in Oil IT Journal.

Houston-based LEK Consulting has surveyed a broad range of industries, including the upstream, with regard to digital success and how companies can ‘stay ahead of the curve’. Across the board, leaders are pressing their advantage mostly in two areas: ways of working (automation, remote monitoring) and digitized operations (planning, procurement, supply chain). In both areas, the difference between good and bad is huge, and the gap is likely to widen. The imperative is particularly great in the upstream, a ‘challenging environment’ that is ‘behind the curve’, although the upstream is ‘about on par with heavy industry’ and the situation is less dire when the level of complexity and the high cost of failure are taken into account. ‘Applying AI is hard, development is done relatively infrequently. The upstream is not your average industry. Upstream is making steady progress given its problem set’. One difficulty stems from the decades of previous E&P digital initiatives that have ‘not much to show for them’. LEK’s Gujral cited BP as a digital success, with its rapid prototyping and deployment of sensors to detect fugitive emissions and a partnership with an AI startup to optimize workover frequency. Suncor also got a shout-out as ‘managing to a specific digital P&L’. Download the LEK Survey here.

More from the PNEC Conferences home page.




Folks, facts, orgs ...

Borr Drilling, University of Cambridge, Chemical Safety and Hazard Investigation Board, Digital Guardian, DNV GL, Energistics, Engage Mobilize, Foster Marketing, FutureGeo, Digital Twin Consortium, LYTT, McDermott, Metegrity, Pason Systems, Petrofac, PPDM, ProPetro, PTC, Interstate Oil and Gas Compact Commission, SEC-DERA, TechnipFMC, Sword Venture, IOGP, INPEX, Norwegian Petroleum Directorate.

Christoph Bausch is to succeed Francis Millet as CFO at Borr Drilling. He was previously with Weatherford.

Clare Shine was named Director of The University of Cambridge Institute for Sustainability Leadership (CISL). She succeeds Polly Courtice.

President Trump has appointed Bruce Walker as Senior Advisor at the U.S. Chemical Safety and Hazard Investigation Board (CSB).

Cybersecurity VP, Tim Bandos, is now Chief Information Security Officer (CISO) at Digital Guardian.

Former Telenor CEO Jon Fredrik Baksaas is DNV GL’s new Chair of the board.

Guilhem Dupuy (Total) is now an Energistics board member.

Lisa Miller (Spearhead) has joined the Engage Mobilize Board of Directors.

Mackenzie Lee has been named as a digital associate at Foster Marketing. She is a recent graduate from the University of Louisiana at Lafayette.

Tom Backhouse’s Terrafirma company has launched ‘FutureGeo’, to ‘inspire a generation of geoscientist [ ... and form a ... ] modern, inclusive, diverse and connected geo-community’.

Autodesk, GE Digital, and Northrop Grumman have joined the Digital Twin Consortium as founding members. Membership now approaches 150.

Daryn Edgar is LYTT’s new CEO.

Neil Bruce has been appointed to the McDermott board.

Martin Fingerhut is now Metegrity’s CEO. He succeeds Adrian Met who continues as Chairman.

Dick Alario has stepped down from his short-term role as Executive Vice Chairman of DistributionNOW. He continues as a director.

Celine Boston has been appointed CFO at Pason Systems. She hails from CES Energy Solutions.

Sami Iskander is the new Executive Director of Petrofac, replacing retiring co-founder Ayman Asfari, who stays on as a non-executive director.

David Hood (geoLOGIC systems), Jamie Cruise (Schlumberger) and Ali Sangster (IHS Markit) have been re-elected to PPDM’s board of directors. Sue Carr (Katalyst Data Management), Kolleen Kidd (retired), and Daniel Perna (EPAM) are now members of the board.

David Schorlemer is now ProPetro’s CFO, succeeding Darin Holderness, who is leaving the company. Schorlemer hails from Basic Energy Services.

Troy Richardson has been appointed EVP and COO at PTC. Mike Bethea heads the new PTC location in Mexico.

RRC Commissioner Wayne Christian has been appointed Vice Chairman of the Interstate Oil and Gas Compact Commission.

Mike Willis, newly appointed Associate Director in the SEC Division of Economic and Risk Analysis (DERA), will lead the SEC’s new data office.

TechnipFMC has named Arnaud Pieton President and CEO-elect, Technip Energies. Jonathan Landes has been promoted to President, Subsea. Margareth Øvrum (Equinor) is now a member of the board of directors.

Douglas Frisby is to lead US operations as Business Unit Director at the newly opened Sword Venture office in Houston.

Adri Postema (JIP33) is now the IOGP’s Engineering Director.

Shusuke Katori has been appointed General Manager EMEA at INPEX London. Hideharu Yonebayashi is now Project General Manager, Abu Dhabi Projects Division.

We're hiring

The Norwegian Petroleum Directorate is hiring. The ten vacant management positions include IT development and operations, finances and joint services, data management, shelf analysis and other ‘technical disciplines’. More from the NPD.


Done Deals

Aker Solutions merges with Kvaerner. AqualisBraemar acquires LOC Group. Aucerna acquires Previso. Baker Hughes acquires Compact Carbon Capture. Caterpillar acquires Weir Group Oil and Gas. CGG emerges from financial restructuring. Dassault Systèmes bags NuoDB. Dresser acquires Flow Safe. EQT gets stake in ThinkProject. Genasys completes Amika Mobile purchase. geoLOGIC Systems acquires SubsurfaceIO. Hexagon acquires PAS Global. Inspirit Capital buys Lloyd’s Register Energy creating Vysus Group. Novara GeoSolutions rebrands as CHA Integrated Solutions. Pelican Energy Partners has bought Baker Hughes’ pressure control business. Petrosmith has acquired Wellflex Energy Solutions. Quorum Software has acquired Landdox.

Aker Solutions has completed its merger with Kvaerner. The refocused combination offers solutions to reduce emissions from oil and gas installations and for the delivery of renewable energy production facilities.

AqualisBraemar is to acquire 100% of the shares in LOC Group, a specialist in oil and gas loss prevention and management services, and a marine and engineering consultancy.

Aucerna has acquired Previso Software, a boutique provider of integrated production modeling and compositional gas modeling software.

Baker Hughes has acquired Compact Carbon Capture, a provider of carbon capture based on a ‘rotating bed’ technology.

Caterpillar is to acquire Weir Group’s Oil & Gas Division in an all cash $405 million transaction. Weir provides pressure pumping, pressure control and aftermarket services to upstream oil and gas customers.

Following the early settlement in full of all its creditors, the Commercial Court of Paris has acknowledged completion of CGG’s safeguard plan. This ruling puts a ‘definitive end’ to CGG’s financial restructuring process that began in 2017.

Dassault Systèmes, with a current 16% interest, is acquiring the remainder of NuoDB equity. NuoDB provides a cloud-native, distributed SQL database.

Dresser Natural Gas Solutions has acquired Flow Safe, a manufacturer of spring-operated and pilot-operated high-performance pressure relief devices.

EQT has acquired a majority stake in ThinkProject from TA Associates and founder Thomas Bachmaier. ThinkProject is an EU-based provider of construction intelligence solutions for architecture, engineering, construction and owner-operators.

Genasys has completed the acquisition of Amika Mobile, a Canada-based provider of critical communications, situational awareness and emergency management products. Amika Mobile has been renamed Genasys Communications Canada.

geoLOGIC Systems has acquired SubsurfaceIO, a provider of cloud-based mapping and analytics solutions for oil and gas.

Hexagon has acquired PAS Global, a provider of operational technology integrity solutions. The acquisition is to form a new cybersecurity-focused business segment within Hexagon’s PPM division.

Lloyd’s Register has sold its Energy business unit to Inspirit Capital, creating a new standalone engineering and technical consultancy renamed Vysus Group.

Novara GeoSolutions, a provider of technology services and asset management solutions, has rebranded as CHA Integrated Solutions.

Pelican Energy Partners has completed the acquisition of Baker Hughes’ surface pressure control flow business. The new company will be renamed Vault Pressure Control.

Petrosmith has purchased Wellflex Energy Solutions, an EPC management company specialized in modular production equipment.

Quorum Software has acquired Landdox, a provider of cloud-based land management software. Landdox is to form the land management foundation of Quorum Upstream On Demand, a multi-tenant SaaS suite designed for upstream companies.


2020 IoT in Oil and Gas Virtual Conference

Energy Conferences Network Internet of Things in Oil and Gas hears from ExxonMobil on OPAF (Open Process Automation Forum) function block successes and on harmonization with NAMUR. Chevron warns on risks of ‘consumer’ devices in the workplace. McDermott’s ‘Gemini’, a major deployment of Dassault Systèmes 3D Experience. Parsley Energy’s WITSML/Cold Bore Technology/Halliburton technology stack.

David DeBari provided a backgrounder on ExxonMobil’s Open Process Automation Program, which is developing standards-based, open interfaces for the industrial internet of things. The work is performed under the auspices of The Open Group’s Open Process Automation Forum (OPAF). For more introductory material see our previous coverage and the OPAF home page. DeBari announced the harmonization of the OPAF work with similar initiatives from the EU NAMUR standards body. NOA, the NAMUR Open Architecture, is described as offering a monitoring and optimization capability ‘separate from and parallel to the control system’. Harmonization also extends to the NAMUR MTP (Module Type Package), a middleware layer that abstracts proprietary control systems for orchestration. The NAMUR specifications have just been published as Recommendation NE 171.

OPAS interoperability is achieved using ‘function blocks’, software objects developed according to the IEC 61499 standard. In one trial development, Siemens, Yokogawa, Schneider and Rockwell developed independent process control function blocks using only the interfaces provided and with no access to each other’s code. A ‘cohesive’ application was built from the four developers’ function blocks, demonstrating that a written interface description is sufficient to ensure correct use, and that future intellectual property can be protected via pre-compiled target libraries. OPAS is currently under test with ExxonMobil, Yokogawa and others, with field trials to demonstrate technical readiness planned for 2021-2023.
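The following is a conceptual sketch, in Python rather than IEC 61499 Structured Text, of the interface-only contract that made the four-vendor trial possible: blocks are developed against a published interface and composed without access to each other’s internals. The class and signal names are invented and are not OPAS artifacts.

```python
# Conceptual illustration (not IEC 61499 code) of composing independently
# developed function blocks that share only a published interface description.
from abc import ABC, abstractmethod

class FunctionBlock(ABC):
    """The published interface: typed data in, typed data out."""
    @abstractmethod
    def execute(self, inputs: dict) -> dict: ...

class VendorAPid(FunctionBlock):           # developed by vendor A, source not shared
    def execute(self, inputs):
        error = inputs["setpoint"] - inputs["measured"]
        return {"output": 0.8 * error}     # toy proportional control

class VendorBScaler(FunctionBlock):        # developed by vendor B, source not shared
    def execute(self, inputs):
        return {"output": min(max(inputs["output"], 0.0), 100.0)}  # clamp to valve range

def run_chain(blocks, signal):
    """A 'cohesive' application wired from blocks known only by interface."""
    for block in blocks:
        signal.update(block.execute(signal))
    return signal

print(run_chain([VendorAPid(), VendorBScaler()], {"setpoint": 50.0, "measured": 42.0}))
```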

After the event we asked DeBari about possible overlap between OPAS and NAMUR, in particular, whether the collaboration was achieved through accretion or rationalization. He replied, ‘NAMUR and O-PAS are complementary architectures. [ they can ] fit together, do not conflict in their design, and are basically addressing different parts of an automation system. Accretion is really the better concept for this harmony’.

Chevron’s Michael Lewis warned of the risks of consumer IoT in the workplace. Increasingly, employees bring connected devices like fitness trackers and smart watches into the office. Virtual/augmented reality devices, drones and medical implants may likewise expose a company to novel threats. These include monitoring and tracking of a wearer’s activity. PINs and confidential information may be intercepted. Lewis suggests ‘abstinence’ as a cure: turn off unneeded features, do not wear the device when working in public areas or, better still, do not work in public areas. Currently, the risk from most wearables is low, although the possibility of data exfiltration over Bluetooth by a malicious insider is considered ‘medium’ risk. Drones can be used to intercept or disrupt communications. Conversely, drones themselves are at risk of attack, by GPS spoofing and jamming. Lewis cited the NIST/NCCOE Mobile Threat Catalogue as source material for many of the above risks.

Vaseem Khan presented McDermott’s Gemini XD Collaboration PLM*, deployed on its capital projects. The industry has a mixed record of delivering complex projects on time and on budget. Digital technologies are coming to the rescue with a combination of a digital twin (aka a single version of the truth) and a digital thread that runs across a project from inception to decommissioning. Gemini XD, McDermott’s name for the Dassault Systèmes 3DExperience platform, talks to other software tools from Hexagon and AVEVA and, according to Khan, has simplified ‘confused and complicated project information flows’. Addressing the terminology, Khan defines the ‘twin’ as a digital manifestation of the plant and the ‘thread’ as the connecting project data flows. Together they provide an integrated view of a project across the functional silos. The digital thread also provides traceability from the digital twin back to the requirements, parts and systems that make up the asset. Watch the video here for more.

* Product/Plant Lifecycle Management.

Shaji John presented Parsley Energy’s real-time solution for well completion and operations. WITSML drilling and completion data streams into a WITSML data store, which feeds both Halliburton’s OpenWells operations reporting software for drillers and an analytics data store for business users. These systems operate in hybrid storage spanning Parsley’s data center and the cloud. Expert systems perform activity detection and create an automated operations log from sensor data. Summary data is captured to Halliburton’s Engineering Data Model and integrated with completion data to provide data-driven completion efficiency reports.
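A minimal sketch of the kind of rule-based activity detection described above, turning raw completion sensor readings into an operations log. The channel names, thresholds and data file are hypothetical, not Parsley’s or Halliburton’s rules.

```python
# Toy rule-based activity detection over completion sensor data, in the spirit
# of the automated operations log described above. Thresholds are hypothetical.
import pandas as pd

def classify(row):
    if row["pump_rate_bpm"] > 5 and row["wellhead_psi"] > 3000:
        return "pumping"
    if row["wireline_depth_ft"] > 0 and row["pump_rate_bpm"] < 1:
        return "wireline"
    return "idle"

readings = pd.read_csv("pad_sensors.csv", parse_dates=["time"])  # hypothetical feed
readings["activity"] = readings.apply(classify, axis=1)

# Collapse consecutive identical activities into start/end log entries
change = (readings["activity"] != readings["activity"].shift()).cumsum()
ops_log = readings.groupby(change).agg(activity=("activity", "first"),
                                       start=("time", "min"), end=("time", "max"))
print(ops_log)
```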

Parsley makes extensive use of Cold Bore Technology’s SmartPAD server to aggregate data at the well site from CBT (Cold Bore Tech) devices on pumps and at the well head. These include a 15 kpsi-rated valve pressure sensor, a safety beacon that automatically signals a hot zone, and others. Frac, wireline, pump down and flow back data is aggregated from service providers, again with auto-event detection and activity classification. CBT/SmartPAD KPIs are accessible from tablets or cellphones. An IoT edge gateway uses the MQTT/AMQP protocols to talk with the cloud for in-cloud analytics.

More from the Conference home page.


GO Digital Energy Oil and Gas Middle East

McKinsey survey finds digital initiatives floundering. Petroleum Development Oman ARV project improves SAP data quality with machine learning. AVEVA befriends the data monster.

Anders Brun presented the results of a McKinsey survey of 1,800 execs working in asset-intensive industries. There is strong support for ‘digital and analytics’ (DnA): 91% believe it will ‘materially change their industry’. Brun offered some choice quotes from major oils’ C-suites on the expected benefits of DnA. Across the O&G value chain, McKinsey sees a ‘transformational agenda’ emerging, centered around 15 main themes. However, the survey found that overall, only 17% were actually sponsoring large digital initiatives and as few as 2% reported seeing ‘material and sustainable benefits from DnA’. So what is going wrong?

Brun cites a number of failings: a mindset that focuses on technology rather than business impact; the difficulty of navigating a technology jungle where ‘everybody offers everything’; organizational governance that is not fit for purpose; and a general lack of usable data. Many firms are struggling to deliver value due to data challenges, and 72% cited managing data as one of the top challenges to scaling data and analytics impact.

In an indictment of 20-plus years of data management initiatives, McKinsey found a lack of buy-in from business leaders and data governance that exists only ‘on paper’ as opposed to being actioned. Data architecture development requires much more time and investment than realized and data itself is too often perceived as ‘IT stuff’ as opposed to a business asset. Finally, there is a lack of data talent such as data architects, data engineers and data visualization experts.

Oil and gas companies are keen on executing digital pilots and proofs of concept but fail to go the ‘last mile’ and realize the actual benefits. What is needed has been formulated as the McKinsey DnA Tech-Enablement Playbook for accelerating digital transformation. This includes findings from the top-quartile digital performers who, for instance, spend over 50% of their budget on changes to the ‘last mile’ of business processes. Other recommendations include an operating model designed around collaborative, cross-functional teams, with clear accountability and decision-making pushed down to the working-team level. Download the Playbook for Utilities.

Aleksandr Zykov showed how Petroleum Development Oman has used machine learning to improve data quality in an asset register verification (ARV) project. Incomplete and erroneous values in PDO’s SAP Asset Register were a contributing cause of process safety incidents. The ARV project set out to align SAP technical objects with process engineering flow schemas and P&IDs. Attempts to do this manually were time consuming and error prone. PDO developed an ARV process improvement toolkit that learns from previous object matches and provides quality control of incoming technical data.

The model uses regular expression matching between SAP functional locations and P&ID tags along the safety-critical systems pipeline. An analysis of weighted matches determines which of various possible values is statistically most likely to be correct. Object matching time is now down to 0.09 sec/object and the whole set of 180,000 process containment equipment tags can be QC’d in one pass. PDO is now looking to apply the toolkit to other data matching exercises such as real-time operations PI-to-SAP plant maintenance data. The process has identified hitherto undocumented functional locations, which have been fixed with ‘focused data harvesting’.
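For the curious, here is an illustrative sketch of weighted regular-expression matching between a SAP functional location and candidate P&ID tags. Tag formats, weights and scores are invented for the example and do not reflect PDO’s actual toolkit.

```python
# Illustrative weighted regex matching between a SAP functional location and
# candidate P&ID tags. Tag formats and weights are invented for the example.
import re

WEIGHTS = {"unit": 0.2, "equip_class": 0.4, "sequence": 0.4}
TAG_RE = re.compile(r"(?P<unit>\d{2})-(?P<equip_class>[A-Z]{1,3})-(?P<sequence>\d{3,4})")

def score(floc: str, pid_tag: str) -> float:
    """Weighted agreement between the tag parts embedded in two identifiers."""
    a, b = TAG_RE.search(floc.upper()), TAG_RE.search(pid_tag.upper())
    if not (a and b):
        return 0.0
    return sum(w for part, w in WEIGHTS.items() if a.group(part) == b.group(part))

candidates = ["31-PSV-1201", "31-PV-1201", "32-PSV-1201"]
floc = "OM01-31-PSV-1201A"
best = max(candidates, key=lambda tag: score(floc, tag))
print(best, score(floc, best))   # the statistically 'most likely' match wins
```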

Interestingly, in the light of the McKinsey presentation above, Amit Kar (Aveva) thinks that the ‘data monster’ is your friend. Once you acknowledge this, like a good friend, the monster will tell you the truth about yourself. But this is a relationship that needs to stay fresh! Kar, citing a different McKinsey survey, forecast that operations would be ‘fully infused’ with AI in the next few years with a ‘+122% impact on cashflow’. Companies making AI progress by 2030 will see a +10% impact while the AI laggards will get -23%, due to eroding competitiveness. In any event, Aveva is there to capture plant data in its ‘living’ digital twin with added cloud data storage and AI capabilities. Although Kar did not cover this in his talk, Aveva’s acquisition of OSIsoft, the developer of the ubiquitous PI System, has hiked Aveva’s data street cred somewhat.

More from GO Digital.


Sales, partnerships, deployments ...

CGG Smart Data Solutions sale and contract with PDO. Aker BP to support general purpose data centers. Tommeliten Alpha FEED for Aker Solutions. Norwegian Research Council award to Applied Petroleum Technology. Aramco IKTVA MoUs. Arria NLG for Total. JLL supports BP net zero. Datagration now Microsoft partner. DNV GL and Bluewater test hybrid FPSO digital twin. Easy Aerial adds GPSdome to drones. BP awards automation contract to Emerson. Bell Geospace teams with Transparent Earth Geophysics. Neptune Energy to deploy DecisionSpace 365 well construction. Implico teams with Minsait. iPIPE awards tech partnership to Orbital Sidekick. Lummus Digital formed. Novara GeoSolutions now Esri ‘release-ready’. Pro-Frotas, Ipiranga hire Radix. RigNet Intelie Live for Permian Basin. ProFlex teams with Siemens Energy. Sercel WiNG for Paragon. Siemens Energy teams with Bentley Systems. Schlumberger for KOC, Suncor, ANPG. Agora Edge AI for Petronas, Ecuador. Performance Live for PTT E&P. Nutanix for Total. KBR for USGS EROS. Equinor teams with SINTEF.

CGG’s Smart Data Solutions has been selected by an unnamed* ‘major international energy company’ for the digitization and multi-year storage of its legacy data.

CGG has secured a 3-year extension to its land seismic imaging services contract with Petroleum Development Oman.

Aker BP is to leverage its HPC experience in support of general purpose data centers.

Aker Solutions has secured a FEED contract from ConocoPhillips for modifications on the Ekofisk installations that will tie-in the Tommeliten Alpha development.

Applied Petroleum Technology has been awarded funding by the Norwegian Research Council to develop image digitalization and associated AI techniques.

Aramco has signed MoUs with Shell, Suzhou XDM, Shen Gong, Xinfoo and Supcon (China) and Posco (South Korea) to expand its flagship ‘IKTVA’ program (in-Kingdom total value add) to increase local content and boost domestic supply chains.

Arria’s NLG technology has been deployed in a financial planning and analysis use case for Total’s internal reporting. French professional services partner Demain.ai helped close the deal.

JLL has signed a multi-year agreement to provide workplace and real estate solutions to support BP’s transformation and net-zero carbon ambitions.

Datagration is now a Microsoft partner providing cloud solutions to the oil and gas industry.

DNV GL and Bluewater have partnered in a test of hybrid digital twin technology to optimize the structural safety of the vessel and enhance risk-based inspection of a North Sea FPSO.

Easy Aerial is to integrate InfiniDome’s GPSdome solution for GNSS/GPS signal protection in its line of military-grade drones.

Emerson has secured a $14 million contract to provide automation technologies for BP’s new Azeri Central East offshore platform in the Caspian Sea.

Bell Geospace and Transparent Earth Geophysics (previously CMG Operations) are teaming to offer data acquired from the latest GTz airborne gravimeter.

Neptune Energy has awarded Halliburton a three-year contract for its DecisionSpace 365 well construction suite. Moving to the cloud-based solution will enable Neptune to incorporate artificial intelligence, machine learning and data analytics to ‘solve upstream challenges and support the company’s overall digital transformation’.

Implico and Minsait have teamed to ‘guide and support’ oil and gas companies in Latin America, the Iberian Peninsula and Italy in their SAP downstream solutions-based digitization endeavors.

Orbital Sidekick has been awarded a 12-month technology partnership agreement by the Intelligent Pipeline Integrity Program (iPIPE) to deploy its satellite-based hyperspectral pipeline monitoring service over the Bakken and Permian basins.

Lummus Technology and TCG Digital have formed Lummus Digital to implement digital analysis and operating solutions to refining, petrochemical and gas processing assets.

Novara GeoSolutions has been awarded Esri’s ‘release-ready’ specialty status.

Pro-Frotas, a digital fuel supply management start-up, in partnership with Ipiranga, has hired Radix Engineering and Software to create an application and web portal for the management and control of fueling.

RigNet’s Intelie Live suite of machine learning solutions and software has been selected by an unnamed Permian Basin independent.

ProFlex’s digital Pipe-Safe leak detection technology is to combine with Siemens Energy’s IoT to minimize leaks.

Paragon has selected Sercel’s WiNG land nodal acquisition system to conduct seismic surveys across the USA.

Siemens Energy and Bentley Systems have jointly launched asset performance management for oil and gas. APM4O&G combines Bentley’s AssetWise with Siemens’ technology and service expertise in maintenance operations and planning.

Kuwait Oil Company has awarded Schlumberger a five-year $109 million contract to implement Petrel and other petrotechnical applications.

Suncor Energy has signed a multi-year agreement to use the Schlumberger Delfi E&P suite. The agreement includes a heavy oil R&D collaboration on digital technologies.

The Angolan Agência Nacional de Petróleo, Gás e Biocombustíveis (ANPG) has chosen Schlumberger for its ‘first-ever’ digital transformation project.

Schlumberger has implemented its Agora edge AI and IoT solutions on its own production projects in Ecuador.

Petronas has deployed the Agora platform on mature assets to improve its wellsite safety and productivity while reducing greenhouse gas emissions.

Schlumberger has deployed the Performance Live digitally connected service on four rigs for PTT E&P onshore Thailand.

Total has implemented Nutanix’ solutions to streamline administration and enhance automation.

KBR has secured a $300 million recompete for five-years of scientific, engineering and technical services with the US Geological Survey’s Earth Resources Observation and Science (EROS) Center.

Equinor and SINTEF have signed an extendible four-year strategic collaboration to facilitate the exchange and development of ideas and ‘radical solutions’. The agreement covers offshore wind, marine systems, energy systems and modeling.

* You would think that in this day and age and circumstances the ‘major’ client would have the good grace to let poor old CGG use its name!


Standards stuff

DNV GL RP for the digital twin. Last call for Energistics ETP v1.2. EU CSV Validator. IIC’s edge computing framework. Opto22, ‘all you need to know about MQTT’. OGC rolls-out cloud standard for EOS data. PPDM ‘What is a Facility’ feasibility study. The Open Group, IOGP sign MoU.

DNV GL has produced a recommended practice on quality assurance of digital twins in oil and gas. The RP was developed in collaboration with TechnipFMC. DNVGL-RP-A204 helps assess whether a digital twin will deliver on stakeholders’ expectations, establishes confidence in the twin’s data and computational models and determines an organization’s readiness to ‘work with and evolve alongside a digital twin’. The approach was piloted on 10 projects with companies including Aker BP, Kongsberg Digital and NOV Offshore Cranes. The RP is claimed to ‘provide clarity on the definition of a digital twin’. Clarity in this context would indeed be useful in a domain where there is a case to be made for the term ‘digital twin’ being marketing speak for a simulator. To learn more, you will have to pay up, as the RP is part of a subscription-based portfolio of nearly 80 different standards and RPs for oil and gas. More from DNV GL.

Energistics has published the final release candidate of the Energistics Transfer Protocol (ETP) v1.2 for review and comment. ETP v1.2 adds new and improved capabilities for real-time data streaming, with a comprehensive set of messages to support the reliable, two-way flow of data between applications for oil and gas development and operations workflows. Interested parties can download the release candidate schemas and specification to test and comment on the standard before its formal release. The three-month review period ends February 15, 2021. Petrotechnical Data Systems is providing an updated .NET devkit for ETP v1.2. More from Energistics. Energistics (formerly POSC) is celebrating its 30th anniversary with the offer of a Covid-19 special membership price cut of 30% for 2021.

The EU’s Interoperability Solutions unit ISA² has unveiled an online CSV* validation service. Validation is driven by Table Schema, a popular specification for tabular data, allowing flexible configuration of the validation of CSV content. The CSV validation guide is the latest in a suite of guides for setting up validation services for a variety of formats including RDF, JSON and XML. A step-by-step guide to the process is also available.

* Comma separated values – a simple Excel output format.
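The Table Schema mentioned above is a simple JSON description of columns and constraints. The sketch below shows what such a schema might look like for an emissions-style CSV; the field names are invented and the resulting file could in principle be supplied to any Table Schema-aware validator.

```python
# A minimal Table Schema (the specification the ISA2 validator is driven by)
# written as a Python dict and saved as JSON. Column names are illustrative.
import json

schema = {
    "fields": [
        {"name": "facility_id", "type": "string", "constraints": {"required": True}},
        {"name": "report_date", "type": "date"},
        {"name": "co2_tonnes",  "type": "number", "constraints": {"minimum": 0}},
    ],
    "primaryKey": ["facility_id", "report_date"],
}

with open("emissions.schema.json", "w") as f:
    json.dump(schema, f, indent=2)
# The resulting JSON can be used to check CSV content for missing columns,
# malformed dates or negative values.
```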

The Industrial Internet Consortium has published a technical report defining a framework for distributed edge computing. Those rushing headlong to the cloud will be interested to learn that ‘moving computing from the cloud to the edge’ (i.e. in the opposite direction) ‘increases the performance, trustworthiness, and efficiency of industrial IoT applications’. Distributed computing at the edge provides system architects and implementers with a distributed computing framework that ‘moves the capabilities of data center-based cloud computing closer to intelligent IoT devices at the edge’.

A recent blog published by Opto 22 promises ‘everything you need to know about getting started with MQTT’. MQTT is an open publish-subscribe communications protocol that is now ‘the most commonly used IoT-specific communications protocol’. Together with the Sparkplug B specification, MQTT ‘can form the backbone of industrial IoT infrastructure’.
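For readers who want to try MQTT, here is a minimal publish/subscribe sketch using the Eclipse Paho Python client (1.x-style API). The broker address, topic and payload are placeholders, not Opto 22 or Sparkplug B code.

```python
# Minimal MQTT publish/subscribe using the Eclipse Paho client (paho-mqtt 1.x).
# Broker, topic and payload are placeholders for illustration only.
import json
import time

import paho.mqtt.client as mqtt
import paho.mqtt.publish as publish

BROKER = "broker.example.com"                     # placeholder broker
TOPIC = "site/pad42/pump1/discharge_psi"          # illustrative topic naming

def on_message(client, userdata, msg):
    print(msg.topic, json.loads(msg.payload))

sub = mqtt.Client()
sub.on_message = on_message
sub.connect(BROKER, 1883)
sub.subscribe(TOPIC)
sub.loop_start()                                  # background network loop

publish.single(TOPIC, json.dumps({"value": 8450.2, "t": "2020-12-01T12:00:00Z"}),
               hostname=BROKER, qos=1)
time.sleep(1)                                     # give the subscriber time to print
```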

A ‘novel, standards-based technique’ from the Open Geospatial Consortium* is claimed to simplify access to earth observation (satellite) data. An ‘application-to-the-data’ principle has simplified app development for cloud-deployment. A new architecture, developed with Docker and web services, allows software developers to package and share applications on any cloud platform. More from OGC.

* Assisted by the European Space Agency and Natural Resources Canada

PPDM’s ‘International Petroleum Data Standards’ initiative has kicked off a Reference Values List work group to develop an initial set of lists for evaluation by the membership. PPDM has also published a ‘What is a Facility?’ feasibility study. More from PPDM.

The Open Group and the IOGP have signed a memorandum of understanding to ‘foster oil and gas industry collaboration for the development of common standards’. A key focus area for the new collaboration is OSDU, the Open Subsurface Data Universe. The MoU covers terminology, models, and systems for describing and managing spatial referencing of geospatial data.


OSIsoft 2020 EAME Oil and Gas User Conference

A teaser for OSIsoft’s well integrity management system. ENI’s PI System-based e-Digital Oilfield (e-DOF) game changer. Spirit Energy’s eureka moment. Shell’s Golden PI Tags data quality improvement project.

Mike Horrocks gave a foretaste of OSIsoft’s well integrity management system, a proof of concept that is being developed with help from International Well Integrity Ltd. Current well integrity management systems tend not to run with live operational data and may only react to well events after they have happened. PI System’s exception-based surveillance, Asset Framework (AF) analytics and Notifications can readily be used in this context. The new solution will be presented in a webinar, ‘Well Integrity Management through Real-Time Data’, early in 2021.

Luca Cadei presented ENI’s e-DOF, a PI System-based real-time digital oilfield and a ‘game changer’ in upstream asset management. e-DOF is configured from state-of-the-art, off-the-shelf components. Design and assembly are performed by a team of ICT specialists and business users. The system acquires over 350 million values per day. Embedded simulators and predictive tools enable production optimization and better decision making. Site data on OPC servers is captured and forwarded through a PI aggregator, and ‘historicized’ in ENI’s ‘Green Data Center’. The Center houses, inter alia, HPC 5, a 52 petaflop supercomputer that is N° 6 in the TOP 500. Graphical ‘operating windows’ show how equipment performs with respect to ‘correct and acceptable’ KPIs.

ENI’s data science teams have complete access to all data, including PI AF template attributes, calculations, KPIs, summary information and event frames. Only relevant PI System data is ingested into the big data environment: ‘There is no need to perform utopic, complete PI data ingestion into the data lake’. Approved data sets are sourced through PI AF to all AI models and data scientists work in the same environment as used for data discovery. Analytics results are fed back into operations dashboards via PI AF and PI Vision. The PI data infrastructure has enabled successful use of data science findings. Cadei ended with an image of a huge PI Vision display with ‘everything a control room operator needs’. The display, nicknamed ‘the Monster’, shows some 230 PI variables on screen. ‘It is now possible to anticipate process upsets, asset integrity issues or deviation from plant optimized parameters’. e-DOF is considered to be ‘the main enabler of ENI’s digital transformation’.

Glen Milne spoke of a ‘eureka moment’ when a visit from OSIsoft persuaded Spirit Energy of the potential of PI AF. This led to the creation of an asset dashboard that increased daily usage of the PI System. ‘People started talking about performance and started solving business challenges.’ The dashboard provided a live overview of the well stock, some of which had not been visible for years. The next step was to address high-value, low-effort problems. For instance, high temperature trips in the process plant were causing significant production loss. Spirit worked with Merkle Dentsu to develop a predictive model that worked with the production loss reporting system to identify equipment ‘bad actors’ and allow operators to take preventive action. Data from over 400 sensors was ‘feature engineered’ to create a 30-billion-point data space. Merkle worked with subject matter experts at Spirit Energy to create a target variable that identified unrecorded trip events. A random forest classification was used to create a model that now provides an indication of a trip event in under two minutes. Spirit and Merkle received the 2020 DataIQ award for the most innovative use of AI*.

* DataIQ also awarded the Oil & Gas Technology Center an award for its missed pay project.
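By way of illustration, here is a sketch of the shape of such a random forest trip classifier, using scikit-learn. The data file, feature set and the engineered ‘trip_within_30min’ target are hypothetical, not the Merkle/Spirit Energy model.

```python
# Sketch of a random-forest trip classifier of the kind described above.
# File, feature names and the 'trip_within_30min' target are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

data = pd.read_csv("process_features.csv")
X = data.drop(columns=["trip_within_30min"])
y = data["trip_within_30min"]            # engineered target flagging (unrecorded) trips

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```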

Peter Van den Heuvel reported that Shell’s PI System, with some 35,000 users, is the largest in the world. In the face of a digital landscape that is growing in complexity, Shell is trying to simplify things and speed connections to PI Server. Currently, some twelve million events per second stream into PI servers. Data management is growing in importance as Shell leadership now considers data an asset and PI one of the main databases. A lot of Shell’s PI data is of very poor quality, and business decisions are made quickly: if a meter is flatlining, there is a risk of making a decision based on the wrong data. Shell has a whole team working on PI data quality and data management, an issue that was highlighted at last year’s OSIsoft user conference by Pat Kennedy, who put PI data quality at the top of his list. With support from OSIsoft, Shell has kicked off a ‘Golden PI Tags’ data quality improvement project which will decide which tags are used in decision making.
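A toy example of the kind of check a ‘golden tag’ quality screen might include, flagging a flatlining meter from its recent history. The tag, file name and window length are hypothetical, not Shell’s implementation.

```python
# Toy flatline screen for a candidate 'golden' tag: a sensor whose value has
# not changed over a recent window is suspect. Window and tag are hypothetical.
import pandas as pd

series = pd.read_csv("fi_1201_flow.csv", parse_dates=["time"], index_col="time")["value"]
# Range over the last 120 samples (e.g. two hours of one-minute data)
rolling_range = series.rolling(window=120).apply(lambda w: w.max() - w.min())
flatlined = rolling_range == 0

if flatlined.any():
    first = flatlined[flatlined].index[0]
    print(f"flatline from {first}: exclude tag from decision dashboards until checked")
```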

More from OSIsoft.


KBC’s Value Chain Manifesto

100 years of optimization in oil and chemicals from Taylorism through linear programming to ‘molecule management’. Today, the Integrated Asset Model/digital twin constitutes the ‘heartbeat of the plant which drives all other applications’. Autonomous operations ‘empower the plant to run, learn and adapt to a changing environment’. AI works best in tandem with first principles.

KBC (a Yokogawa company) has just produced a 53-page ‘Value chain optimization manifesto’ that encourages operators to ‘digitalize with purpose’. The manifesto is based on KBC/Yokogawa’s experience of working with the world’s largest organizations in energy and chemicals. The Manifesto spans upstream oil and gas, LNG, refining and petrochemicals.

The Manifesto advocates a holistic approach to optimization that considers complete value chains, such as the one extending from the reservoir, through the gathering system and top-side processing constraints, and on to market demand. Process improvement dates back to (at least) 1911, when Frederick Taylor wrote ‘In the past the man has been first; in the future the system must be first’. This led to operations research into logistics and the use of mechanization to improve labor-intensive processes. Computers and ERP systems came to the fore in the second half of the 20th century, integrating siloed databases. Today, supply chain management is an ‘overarching operations management activity that dictates operations delivery performance’.

The Manifesto drills down into topics such as asset optimization through ‘molecule management’, a concept that spans reservoir engineering through refinery feedstock selection and yield optimization. 20th Century linear programming has given way to more accurate non-linear approaches, able to incorporate system-wide dynamics, although linear models ‘remain useful’.
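For those who have forgotten their operations research, here is a toy blending problem in the classic linear programming mold, solved with SciPy. Prices, sulfur specs and bounds are invented for illustration and are not taken from the Manifesto.

```python
# Toy linear program in the refinery-blending mold: choose barrels of two
# feedstocks to minimize cost subject to a blended sulfur spec and a demand
# floor. All numbers are invented for illustration.
from scipy.optimize import linprog

cost = [62.0, 55.0]                 # $/bbl for crude A (sweet) and crude B (sour)
A_ub = [
    [-0.2, 0.7],                    # 0.3*xA + 1.2*xB <= 0.5*(xA + xB): blend <= 0.5 wt% sulfur
    [-1.0, -1.0],                   # xA + xB >= 100 kbbl demand, rewritten as a <= constraint
]
b_ub = [0.0, -100.0]
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 80), (0, 80)], method="highs")
print(res.x, res.fun)               # optimal barrels of each crude and total cost
```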

The Manifesto’s goal is operational autonomy of the asset, achieved through increased plant automation and an evolution of the business model. ‘Achieving autonomous operations involves empowering the plant to run, learn, adapt and thrive in tomorrow’s environment; whatever that might be’. Enter the first principle-based Integrated Asset Model (IAM), operationalized with real-time data to create a digital twin around which ‘holistic, cross-functional convergence in understanding and action across organization silos can occur’. The IAM digital twin constitutes the heartbeat of the plant which drives all other applications.

Artificial intelligence observes asset behavior patterns and attempts to correlate these with positive or negative outcomes such as energy savings, yield improvements or machine failures. AI models are simple to use, fast to execute and do not require deep chemical, mechanical or electrical engineering knowledge. While some claim that AI provides similar capabilities to the simulators of the digital twin, their successes are limited to simple problems where a correlation-based model is good enough to represent reality. The main reason for the failure of AI is ‘too many false positives’. AI can succeed at the machine level, but fails at the machine-plus-processes level. Better results have been obtained using first principles-based approaches in tandem with AI. ‘Cognitive’ AI can be trained to interpret the results of the rigorous asset simulation model, ‘homing in on what the problem or opportunity might be and presenting the engineer with a triaged, reduced set of options to explore further’.

Plant knowledge management that connects information from a variety of sources is needed to provide context and enable interpretation and understanding. Achieving meaningful levels of knowledge is a challenge for current methods. Semantic web technologies such as knowledge graphs show promise and allow for entity pairing, for example between assets (refinery, oil field, chemical plant), operating conditions, feed type, product slate, shift and so on, utilizing data from a variety of underlying sources.
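As an illustration of entity pairing, here is a minimal knowledge graph built with rdflib. The namespace and terms are invented, not an industry ontology.

```python
# Minimal knowledge-graph sketch of 'entity pairing': link an asset to its
# feed, operating condition and product slate as RDF triples. The namespace
# and property names are invented for the example.
from rdflib import Graph, Literal, Namespace, RDF

PLANT = Namespace("http://example.com/plant#")
g = Graph()
g.add((PLANT.RefineryA, RDF.type, PLANT.Refinery))
g.add((PLANT.RefineryA, PLANT.processesFeed, PLANT.ArabLightCrude))
g.add((PLANT.RefineryA, PLANT.hasOperatingCondition, PLANT.WinterGasolineMode))
g.add((PLANT.RefineryA, PLANT.hasProductSlate, Literal("gasoline, jet, diesel")))
print(g.serialize(format="turtle"))   # the triples, ready for query or linking
```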

Industry is still largely dependent on human-based decision making and the associated tacit knowledge of experienced operators. However, demographics are changing as top-flight, experienced engineers approach retirement and fewer petroleum engineers graduate. The net loss of tacit knowledge needs to be managed and the ‘rules of thumb’ need to be codified, validated and institutionalized.


2020 PIDX Virtual Fall Conference

The Petroleum Industry Data Exchange organization’s Fall 2020 online event heard from Diamond Key on TIDE, its Terminal Information Data Exchange data integration platform. Sullexis’ use case for standardizing carbon emissions data. ChaiOne’s Velostics, a PIDX-based solution for inbound fuel terminal logistics.

Lily Chen (Diamond Key International) advocates a pragmatic approach to downstream digital transformation, with special reference to terminal loading and management. Digital transformation is no longer just a buzzword but the ‘linchpin of survival’. Moreover, Covid-19 has sped up digital transformation, notably through pressure on business continuity and the need for remote/home working.

DKI provides end-to-end operations support to petrochemical terminals across 28 countries. Chen reported that ‘60% of operators don’t know how to handle the data monster’. DKI’s downstream digital twin helps make the best investment decisions in terms of both financial viability and operational impact. Different investment options may have the same ROI but very different effects on operations.

DKI’s Terminal Information Data Exchange (TIDE) enables multiple sources of data to be integrated into information dashboards, performance trend tables and analyses to provide insights into day-to-day operations. TIDE is particularly effective in understanding and managing terminal alarm flooding. Alarms are ‘slowdowns hidden in plain sight’ and make for a useful proxy for operating efficiency. More alarms indicate inefficient operations, and many are avoidable.
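A toy alarm-rate screen along these lines, counting alarms per console in ten-minute bins; the threshold and data file are illustrative, not a DKI/TIDE setting.

```python
# Toy alarm-flood screen: count alarms per operator console in 10-minute bins
# and flag sustained bursts. The threshold and file are illustrative only.
import pandas as pd

alarms = pd.read_csv("terminal_alarms.csv", parse_dates=["time"])
rate = (alarms.set_index("time")
              .groupby("console")
              .resample("10min")
              .size()
              .rename("alarms_per_10min"))
floods = rate[rate > 10]            # bursts are 'slowdowns hidden in plain sight'
print(floods.sort_values(ascending=False).head())
```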

Jeff Diaz (Sullexis) presented a use case for standardizing carbon emissions data. Companies are under pressure to reduce their carbon footprint, but externalities mean that this can be very hard to evaluate. A ‘large offshore operator’ contacted Sullexis to identify carbon footprint reductions in the paint it uses extensively for maintenance. This proved tricky as the supply chain was largely outsourced and encompassed many factors outside the operator’s sphere of influence.

Sullexis managed not to get ‘too deep in the weeds’ by avoiding a full life cycle analysis of the paint. Instead, Diaz proposed a bottom-up approach of ‘working with what you know’. These best-efforts, pragmatic calculations may not hold up under scrutiny, and more standards for carbon reporting and capture are needed. In which context, check out Johan Krebbers’ talk at PIDX, which we have promoted to this issue’s lead.

Jay Tchakarov presented ChaiOne’s ‘Velostics’ PIDX-based solution for inbound fuel terminal logistics ‘in the age of Covid’. Current tanker loading is often manual, complex and error prone. Velostics is a smartphone app for contactless data transfer. Integration between business partners is starting to catch up with the consumer world, à la the Starbucks Gold Card app. The app provides a ‘boarding pass experience’ for truckers: loading and unloading are monitored by flashing a QR code. The system integrates with the terminal management system via PIDX data exchange, with order and shipment data translated into a location-based QR code, configurable with fields for customer, ship-to, channel, etc. QR codes are pushed to drivers via text or through the app. At the gate, the code is scanned, identifying the driver and creating the PIDX documentation. A saving of a minute or so per truck loading is claimed. Customers include Aramco, BP, Chevron, ExxonMobil, Marathon and many others.
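A sketch of the QR code step, turning an order/shipment stub into a scannable image with the Python ‘qrcode’ package. The payload fields are illustrative placeholders, not the PIDX document schema or Velostics’ format.

```python
# Sketch of turning an order/shipment stub into a scannable QR code using the
# 'qrcode' package. Payload fields are placeholders, not PIDX or Velostics data.
import json
import qrcode

payload = {"order_id": "SO-48213", "ship_to": "Pasadena Terminal, Bay 7",
           "customer": "Example Fuels LLC", "channel": "rack"}
img = qrcode.make(json.dumps(payload))      # encode the JSON stub as a QR image
img.save("load_pass_SO-48213.png")          # shown to the driver as a 'boarding pass'
```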

See also Johan Krebbers’ presentation on The Open Group’s Open Footprint Forum which we report on in this issue’s lead.

More from PIDX.


Bentley backs FutureOn

Digital Twin solution receives cash injection from the Bentley Acceleration Fund. FieldTwin to combine with Bentley’s iTwin upstream platform. Bentley’s ‘Chief Acceleration Officer’ seeks other partners.

Norwegian start-up FutureOn has received financial backing from the Bentley Acceleration Fund for Oil and Gas Digitalization. FutureOn’s cloud-based FieldTwin provides oil and gas engineers with a single source of data truth for offshore field planning, installation and operations. The company argues that traditional ‘digitalization’ approaches threaten companies’ long-term viability as they start slowly and involve significant upfront expenditure. IoT devices, smart sensors and robotics are expensive and their ROI is difficult to assess.

FieldTwin is claimed to be a more efficient and immediate digitalization strategy for oil and gas companies, providing data-driven solutions to improve work processes and increase data accessibility and usability. The solution is currently deployed operationally in real-world fields and is claimed to have cut pre-FEED* field planning by ‘at least 60%’. FieldTwin leverages ‘comprehensive security measures’ developed by Google, Microsoft and Amazon. The cloud eases data integration and breaks down legacy systems’ barriers.

The company has secured an investment from the Bentley Acceleration Fund and established a strategic partnership with Bentley Systems to ‘accelerate the digitalization of the oil and gas industry’. FieldTwin is to combine with Bentley’s iTwin platform. The combo will deliver a ‘next-generation’ digital twin for upstream project design. The use by both partners of open web standards is said to facilitate complex integration and customization.

Oslo-headquartered FutureOn was spun out of Xvision, part of the EXP group. The project received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No 876696, a.k.a. Field Twin, the ‘first centralized data platform for efficient and collaborative development of offshore energy projects’.

Bentley Systems’ Acceleration Fund was founded in 2020 to invest in new and incremental participants in open ecosystems to advance infrastructure digital twins. The Fund is chartered to ‘accelerate the creation and curation of digital twins’ with minority investments and acquiring and expanding digital integrators. Bentley’s ‘Chief Acceleration Officer’ Santanu Das welcomes queries from potential ecosystem participants on the Fund home page.

* Front-end engineering design.


On sale!

A ‘retail round-up’ of solutions from Professional Data Solutions (sales to Chevron, EG Group), IBM Services (Indian Oil ePIC platform), Petrosoft and Bulloch Technologies.

Chevron has extended its 16-year partnership with Professional Data Solutions for the provision of back office and head office solutions for deployment across the Asia Pacific region. PDI provides ERP, fuel pricing, supply chain logistics and cloud-based marketing solutions for convenience retailers and petroleum wholesalers. The deal includes the next generation of PDI Envoy back office and head office software solutions. PDI technology underpins CaltexGo, Chevron’s customer-facing mobile app to reduce wait times at the fuel pump and in the store. PDI owns and operates the Fuel Rewards loyalty program.

EG Group has likewise chosen PDI as its provider of ERP, marketing cloud, fuel pricing and logistics solutions. EGG is a UK-based gasoline and convenience retailer with thousands of sites across Europe, North America and Australia. The company is currently ‘exploring’ the use of PDI Marketing Cloud Solutions, an industry-specific solution for retailers that combines back office, promotional and loyalty data to ‘attract and retain customers’. EGG recently acquired the US-based c-store chain Cumberland Farms as part of an ongoing global expansion strategy.

IBM reports that Indian Oil Corp. has achieved a ‘strategic digital transformation milestone’ with the go-live of its Project ePIC Platform (PeP), which will support some 12,400 distributors across the subcontinent. The distributors can now use the Indian Oil One mobile app and portal, developed by IBM Services as part of Indian Oil’s Project ePIC, an integrated platform for customer relationship and distribution management. The platform ensures real-time updates to inventory, orders and invoices, reducing the time to order fulfillment. More from IBM India.

Petrosoft and Bulloch Technologies have teamed to provide a back-office solution to a ‘global oil company’s’ Canadian convenience stores. Petrosoft’s CStoreOffice has been paired with Bulloch’s BT9000 c-store point-of-sale (POS) software, which serves thousands of outlets throughout Canada and North America. Bulloch provides card and contactless EMV* payments without the need for a third-party payment system. More on the partnership from Petrosoft.

* Europay, MasterCard, and Visa.


Cyber special: SolarWinds!

Oil IT Journal does its own quick-fire investigation into the nefarious high-profile breach. And finds some rather good advice on preventing hacks ... from SolarWinds itself!

As everyone knows by now, cyber security specialist FireEye discovered a supply chain attack that ‘trojanized’ SolarWinds’ Orion business software updates in order to distribute malware. It appears that SolarWinds’ deployment is pretty widespread. The company has one unnamed Australian operator as a user of its network technology. No doubt there are others busying themselves with the patches and fixes.

What intrigued us (of course we were looking for dirt!) was a short item on the SolarWinds website namely, the SolarWinds Cyberthreat Guide: Seven types of internet threats and how to help prevent them.

This, you will note, is SolarWinds’ advice to its clients. We quote: ‘As a technology professional, you must be realistic about the chances of defeating a persistent threat from a group that could be relatively large and contain some truly skilled hackers. The sort of company that draws the ire of these groups is usually a close-to-enterprise-level organization that may have significant cyber-risks due to political, cultural, religious, or ideological products or services. Chances are a company like this will already know the appropriate configuration of systems that must never be on the internet’.

SolarWinds offers some examples of infrastructures ‘that probably should not be connected to the internet’. These include military/governmental classified computer networks, financial computer systems, like stock exchanges, life-critical systems, such as nuclear power plants, computers used in aviation and computerized medical equipment and finally, industrial control systems, such as SCADA in oil and gas fields. SolarWinds observes that ‘Sadly, many of these critical systems are being connected to the internet without even basic security solutions in place. You may need to help a business implement security solutions to ensure the benefits of connection to the internet do not introduce vulnerabilities with significant consequences if exploited’.

In contrast to this rather good advice, we have regular entreaties from the IT brigade for ‘convergence’ of IT and OT systems. The digital twin/internet of things movement is predicated on the connectivity of SCADA systems into the network. The shale gale has created a multiplicity of connections between field devices and the cloud. We don’t know anything about how the hack might have affected operations beyond the ‘supply chain’. But if the trojan did manage to worm its way from corporate IT systems into operations, the folks pushing ‘convergence’ should be carrying the can!


© 1996-2021 The Data Room SARL All rights reserved. Web user only - no LAN/WAN Intranet use allowed. Contact.