Oil IT Journal: Volume 24 Number 5


Accenture’s requiem for the digital refinery

Survey finds falling gains from digital technology despite increasing investment. Has digital been oversold?

We are usually skeptical of surveys conducted by vendors and consultants that are designed to demonstrate a need for more of whatever they are selling. But what of this intriguing conclusion from a survey by Accenture, which found that only 3% of refiners reported seeing significant value (defined as over $100 million) from digital technology?

‘Digital Re-Definery’*, Accenture’s third annual study on digital technology in the refining industry, was based on a survey of 145 oil industry professionals, including C-suite executives, functional leaders and engineers at refineries globally. The survey found that, ‘over the past year the number of refiners reporting that digital technologies delivered a margin improvement of more than 10% in refining operations dropped from 11% to 3%; the number reporting that the technologies delivered margin improvements of 7-10% dropped from 19% to 11%; and the number reporting that digital delivered margin improvements of 2-6% dropped from 46% to 38%’.

The report continues with ‘furthermore, only 3% of refiners (compared with 6% in 2018) reported seeing significant value from digital, defined as over $100 million […] the challenges of achieving scale for digital initiatives across an asset base is stopping refiners from capturing the full value that digital can offer.’ According to the study, only 28% estimate that digital is driving $5 million or more in value for their refining business.

Accenture puts a positive spin on the numbers, citing the current ‘next big thing’, i.e. advanced data analytics, as the top digital technology du jour, followed by ‘platforms’, internet of things sensors and edge computing. Paradoxically, digital spending continues to increase, with 56% of respondents reporting that they invested ‘more or significantly more’ in digital technology than a year ago, mainly in production planning, maintenance and capital projects. But refiners are now more realistic in assessing their digital capabilities: only 44% of respondents this year categorized their use of digital technologies in refining operations as ‘mature or semi-mature’, down from 48% last year.

Accenture has it that people are the cornerstone of success. But people are also responsible for underperforming digital transformation: ‘the number of refiners citing resistance from their people and the culture in their organization as a barrier to wider digital deployment rose sharply this year, to 48%, from 33% last year’.

Accordingly, refiners appear to be addressing these issues. Five in six (83%) are taking actions to address the convergence of information technology (IT) and operational technology for their refining operations, including changing the role of IT, creating new organizational structures for digital, forming steering committees, creating a new C-level position, or a combination of the above.

According to Accenture’s Tracey Countryman, ‘Refiners are still working out how to optimally deliver results within site operations. There is no one answer on how to best organize the integration of your information technology and operational technology, as this depends on company culture and leadership strategy. Having launched many proofs of concept, refiners must now revisit the operational processes to enable scale and pace.’

* The online survey was conducted in January 2019 by OGJ (Oil & Gas Journal) Research, supported by Accenture Research. More from Accenture.

Comment: As we wrote back in 2016, AI, ML and so forth have a venerable ancestry in oil and gas. Back to the very earliest days of computing in fact. What has changed since the early days is the ballyhoo surrounding digital technology and the claims for a constantly increasing value to be derived from computerized operations. It is hard to ringfence and report on the monetary value of a particular technology, but the ‘value’ cited in the Accenture study of a few million dollars worldwide is indeed small when compared with the ‘billions’ promised by early practitioners of the digital oilfield, or the further billions ‘left on the table’ by those still futzing with pre-AI physics-based methods. These numbers are also rather paltry when compared with the bill for digital. No, we don’t know how much Accenture charges for its services. But a report in the FT has it that BP has signed a $1.2 billion, 10-year contract with the ‘secretive’ Silicon Valley data-analysis company Palantir!


A Digital Factory in Paris

Total aims for $1.5 billion ‘value’ per year by 2025

Total is to open a Digital Factory in Paris early in 2020 to house some 300 developers, data scientists and other experts to accelerate its digital transformation. The Factory is to leverage the capabilities of digital tools to create value across all of Total’s businesses.

Total needs new digital solutions to improve its operations, in terms of both availability and cost, notably in managing and controlling energy consumption. The Factory will also help extend Total’s new distributed energies business and reduce its environmental impact. Total’s ambition is to ‘generate as much as $1.5 billion in value per year for the company by 2025 through additional revenue and reductions in operating or investment expenses’.

CEO Patrick Pouyanné opined, ‘Digital technology is a critical driver for achieving our excellence objectives. The Digital Factory will serve as an accelerator, allowing Total to systematically deploy customized digital solutions. Artificial intelligence, the Internet of Things and 5G are revolutionizing our industrial practices, and we will have the know-how in Paris to integrate them in our businesses as early as possible. The Digital Factory will also attract the new talent essential to our company’s future.’

The Factory is headed up by Frédéric Gimenez, chief digital officer. Teams will comprise developers, data scientists, architects and specialists in agile methodologies, who will work with operating personnel from Total’s different businesses in the 5,500-square-meter facility located in the center of Paris. Establishing the Factory follows the signing of partnership agreements with Google on artificial intelligence and geosciences and with Tata Consultancy Services on Refinery 4.0.


What do you say when the boss asks for your opinion?

Neil McNaughton recalls his own wrong answer along with a historical boardroom-driven technology fail. Pondering BP’s billion-dollar contract with Palantir and the rebranding of the SPE ‘information’ special section as ‘analytics’, he postulates that AI group think has taken over the industry.

Back in the mid-1980s when I was a geophysicist, the boss sauntered into my office and asked, ‘Neil, what would you think of a digital geophone?’ At the time, seismic recording had already been digital for twenty years or so, but the analog-to-digital conversion took place in the recording truck. I was scratching my head trying to see the point in having all that fancy electronics distributed across the length of the spread in every single jug. What about the bandwidth, the conversion range, how many bits etc.? My answer was non-committal. Which of course was the wrong answer! The right answer, when the boss comes in proposing to make a major acquisition, whether it is for the manufacturer of a new geophone, or a purveyor of the latest digital technology for machine learning/artificial intelligence, is ‘Yes boss, that’s a great idea!’

Anyhow, the boss took no notice of my agonizing and went on to acquire the geophysical company (Input-Output). The digital geophone took another decade or two to see the light of day, so my skepticism was not completely misplaced. Today, you can even build your own digital geophone with a kit from Raspberry (Pi) Shake. But I digress.

Back in the day, deals like this were made on the golf course, struck with a handshake between colorful individuals who either possessed considerable financial means or who were capable of smooth-talking others into ponying-up their cash. But not all such ‘top-down’ deals, i.e. ones that suddenly appear in the boardroom rather than from in-house specialists, are done on the golf course. A spectacular instance of a top-down deal was done in France in the 1970s when the top brass of what was then Elf Aquitaine got into a huddle with leading French politicos and a scurrilous Italian ‘inventor’. The great ‘avions renifleurs*’ scandal began when Aldo Bonassoli managed to get the ear of France’s SDECE intelligence agency. This led to top secret tests of his ‘gravity wave’ device that was claimed to detect oil directly**. The tests were performed in the presence of Elf’s top brass, and even France’s president, Valéry Giscard d’Estaing, was involved, but without any of Elf’s geophysicists in attendance. In the end nothing came of the ‘technology’, but Elf and, one imagines, the French taxpayer were parted from a considerable sum.

I’m not party to how these deals are done today, but I am pretty sure that the ‘top-down’ approach to major decision-making is at least as prevalent today. In the field of computing it has been honed to a fine art as company bosses not only make deals, buying startups and betting on the next big thing, but also boast about them.

A report in the Financial Times has it that the new CEO, Bernard Looney, wants BP to be ‘the leading digital upstream business’. As a part of this drive, the FT reveals that BP has ‘expanded its relationship with Palantir Technologies, the ‘secretive’ Silicon Valley data-analysis company, with which it has a $1.2 billion 10-year contract for its data integration platform’. Palantir’s technology is at the heart of a ‘digital twin’ of BP’s global infrastructure, performing simulations to determine optimum routing for BP’s production and optimizing maintenance. Looney is quoted as saying that ‘immediate access to data is vital to attract the next generation of employees, it is the older hands that are having difficulties adjusting. […] The problem that we have is that people have been working a certain way and believe that answer that they’ve got is right. […] Anywhere in our system where there are tons of data. There is value that is being unrealized.’

Now I am not saying that Palantir is the next ‘avion renifleur’, but there are some uncomfortable parallels here. A secretive company with black-box technology***. A boss who disses in-house expertise. The promise of great benefits. What is new is the way that this staggeringly large investment in software is broadcast to all and sundry. In the early days of the digital oilfield, BP excitedly spoke of a ‘billion dollar’ bottom line addition. Now the boast is of a billion dollar software spend!

The paradigm of massive data lying around with unrealized value comes straight from the marketing material of the consultants and IT vendors. As a geophysicist, and an oldie, I feel almost personally responsible for this terrible state of affairs. But I am scratching my head to think of how the massive amount of seismic data ‘at rest’ could be usefully put to better use and exactly what value is unrealized. Most professionals have an approach that involves doing just the right amount of processing or other work. The new paradigm has it that you have never done enough. That black box is always ready to re-run some big data algorithm and come up with the next insight that was ‘invisible to the human eye’.

The AI revolution in its latest manifestation is old enough now for us to expect to have seen some great results. Here at Oil IT Journal we have been tracking progress in the AI field over the last few years. What we see is continued enthusiasm for trying stuff, especially the stuff that is more or less freely available from Google and Microsoft. The results? Well, we have seen proofs of concept that show image recognition of scanned logs and NLP on document retrieval that are interesting. But accuracy is usually quite low, often in the 60-70% range. This means that AI is applied in special circumstances like a data room where there is a need for quick, rough and ready results.

I’m just back from the Calgary SPE ATCE and am in the process of unpacking my notes and looking through some presentations. I will report back in a future issue on the state of play in upstream AI. But so far, nothing staggeringly revolutionary to report. What did come over loud and clear from the ATCE is the group think that has it that the AI revolution is already mainstream. To reflect this apparently obvious ‘fact’, the SPE Special Section for ‘Management and Information’ has been split into two disciplines. While splitting out ‘Management’ makes sense (it was always puzzling to group this with ‘information’ and not very many managers seemed to be onboard), the rebranding of ‘information’ as ‘data science and engineering analytics’ seems to be an acknowledgement that all that old-world scientific computing stuff is out of the window.

* Wikipedia: The Great Oil Sniffer Hoax.

** Bonassoli’s device is one of many scams/inventions that your average geophysicist is confronted with during their career.

*** Some put it even less politely.


Review: Reproducibility and Replicability in Science

The 250-page report from the National Academies Press investigates the reproducibility of scientific experiments and publications, citing an eminent geophysicist as father of the reproducibility movement and new boosts to reproducibility from Docker, the cloud and the Jupyter notebook.

A new publication, Reproducibility and Replicability in Science* (R&RiS) from the US National Academies Press investigates the reproducibility of scientific experiments and publications. The overarching theme is that often, single experiments are taken as demonstrating some finding or other, which subsequent studies find not to be true. In the introduction, the to-and-fro of the benefits or otherwise of margarine as a healthier alternative to butter is cited, along with changing advice from the medical profession on the merits of daily doses of baby aspirin to reduce the risk of heart attack. R&RiS is a ‘consensus study’ from an impressive collection of Foundations and Academies, with backing from Congress and support from the Alfred P. Sloan Foundation.

What caught our attention in the report were the references to preeminent geophysicist Jon Claerbout, whose work on seismic processing led to his launch of the reproducibility movement (still current with Madagascar). With special reference to data and compute-intensive scientific work, Claerbout observed that minor mistakes in code can lead to serious errors in interpretation and in reported results and proposed that both data and code should be openly shared so that results could be reproduced. Claerbout is quoted as saying, ‘An article about computational science [. . .] is merely advertising of the scholarship. The actual scholarship is the complete software development environment and the complete set of instructions which generated the figures.’

As an example, an image processing scientific workflow may involve user interaction that a subsequent investigator may not be able to replicate. Reproducibility zealots thus eschew interactive programs ‘unless they include the ability to arrive in any previous state by means of a script’. Others likewise deprecate the ubiquitous spreadsheet as prone to non-reproducible results. ‘The use of spreadsheet software impairs reproducibility because spreadsheets conflate input, output, code, and presentation. Spreadsheets inhibit one’s ability to make a record of all steps taken to construct a full analysis of the data, and they are notoriously hard to debug’.
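By way of illustration (ours, not the report’s), the script-based alternative to the spreadsheet might look something like the following minimal Python sketch, in which input data, processing code, derived output and the figure are kept separate and the whole chain can be re-run from the command line. The file names and the rolling-mean ‘analysis’ are hypothetical placeholders.

# Minimal sketch of a script-first workflow of the kind the report advocates:
# every step from raw data to figure is recorded in one re-runnable file.
# File names and the rolling-mean 'analysis' are hypothetical placeholders.
import pandas as pd

RAW = "raw_measurements.csv"      # immutable input data, kept under version control
FIGURE = "figure_1.png"           # regenerated by the script, never edited by hand

def analyze(df: pd.DataFrame) -> pd.DataFrame:
    """Toy processing step: a 12-sample rolling mean of the 'value' column."""
    out = df.copy()
    out["smoothed"] = out["value"].rolling(window=12, min_periods=1).mean()
    return out

def main() -> None:
    df = pd.read_csv(RAW)                         # input
    result = analyze(df)                          # code
    result.to_csv("processed.csv", index=False)   # output, kept separate from input
    result.plot(y=["value", "smoothed"]).get_figure().savefig(FIGURE)  # presentation

if __name__ == "__main__":
    main()   # running 'python workflow.py' reproduces every figure from the raw data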

There are better ways of capturing the scientific workflow. Workflow management systems, such as that developed by CERN for its investigations with the Large Hadron Collider, can capture and store data and workflow provenance automatically. These systems link results with the computational processes that derived them. The Chimera system (developed for the Sloan Digital Sky Survey) likewise captures and automates a complex pipeline of transformations on the data by external software. The Open Science Framework, developed by the Center for Open Science, is a cloud-based project management tool that emerged from efforts to replicate psychological research and is now used in other fields.

The misunderstanding and misuse of statistical significance testing is a particular source of non-reproducibility. As recently as 2016 the American Statistical Association, noting that in its 177 years of existence it had never previously taken a stance on a matter of statistical practice, published its six principles on the use of the P-value test, ‘in the hopes that they would “shed light on an aspect of our field that is too often misunderstood and misused in the broader research community.”’ This year the ASA published a special edition of its official journal, The American Statistician, titled ‘Statistical Inference in the 21st Century: A World Beyond P < 0.05.’
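A toy simulation (again ours, not the ASA’s) illustrates one of the misuses the six principles warn against: run enough significance tests on pure noise and ‘discoveries’ appear at roughly the nominal 5% rate.

# Toy illustration (not from the report) of one common misuse of p-values:
# test enough noise-only hypotheses and some will come out 'significant'.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_tests, n_samples = 1000, 30
false_positives = 0
for _ in range(n_tests):
    a = rng.normal(size=n_samples)   # two samples drawn from the SAME distribution
    b = rng.normal(size=n_samples)
    _, p = stats.ttest_ind(a, b)
    if p < 0.05:
        false_positives += 1
print(f"{false_positives} of {n_tests} noise-only comparisons gave p < 0.05")
# Expect roughly 50 'discoveries' despite there being no underlying effect at all.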

There are now also tools for reproducing research results, notably ReproZip, which creates a reproducible package of the whole computational sequence for execution in the cloud without additional software. More generally, virtual machines that encapsulate an entire computational environment, from the operating system up, can enable reproducibility, so long as the source code is made public. The combination of virtual machines and public cloud has proved valuable for reproducibility in several domains, such as microbial ecology and bioinformatics. Docker containers are another route to reproducibility, as witnessed by the Software Sustainability Institute’s June 2017 workshop on Docker Containers for Reproducible Research.

Jupyter interactive computational notebooks are another technology supporting reproducible research, enabling researchers to fully narrate their analysis with text and multimedia content. Notebooks can be shared with other researchers to reproduce computations. According to R&RiS, ‘scientists are increasingly adopting Jupyter for their exploratory computing, sharing knowledge within their communities, and publishing alongside traditional academic papers’. The LIGO gravitational wave team published Jupyter notebooks that reproduced the analysis of the data and displayed the signature of a binary black-hole merger.

Several institutions have contributed guidelines to the open data movement: the FAIR (findable, accessible, interoperable and reusable) data principles from the Lorentz Center in the Netherlands, the Transparency and Openness Promotion guidelines and those of the Association for Computing Machinery.

R&RiS dives into reproducibility in geoscience, a rather harder and more contentious task when applied to the forecasting of natural hazards and of ‘notoriously difficult to predict’ extreme events of low probability but high consequence. Here scientific forecasts are expressed as probabilities, involving iteration of forecasting models over many cycles of data gathering, model calibration, verification, simulation and testing.

In conclusion, R&RiS advises that members of the public and policy makers have a role to play to improve reproducibility and replicability. When reports of a new discovery are made in the media, one needs to ask about the uncertainties associated with the results and what other evidence exists that the discovery might be weighed against. Anyone making personal or policy decisions based on scientific evidence should be wary of making a serious decision based on the results, no matter how promising, of a single study. Similarly, no one should take a new, single contrary study as refutation of scientific conclusions supported by multiple lines of previous evidence.

Curiously there is no mention (outside of the copious list of references) of big data, artificial intelligence or machine learning, fields which would all merit a close inspection as to their ‘reproducibility’. There is no mention either of ‘fake news’. But both AI and fake news make up the subtext that dare not say its name, politely hiding inside R&RiS’ exhortations.

* National Academies of Sciences, Engineering, and Medicine. 2019. Reproducibility and Replicability in Science. Washington, DC: The National Academies Press. ISBN: 978-0-309-48616-3.


A look at evolving downhole communications technology and the ‘start of the digital age’!

Oil IT Journal investigates options for communications between the drillbit and the surface. We look at Halliburton’s latest QuickPulse, the evolving NOV IntelliServ wired drill pipe and the new kids on the block, AkerBP and TDE Group with the Powerline Drill String.

Introduction

Communications between the surface and the drill bit are problematic, and yet they play a key role in downhole situational awareness and in applying real-time drilling optimization solutions. Currently almost all communications are achieved through acoustic pulses in the mud column. This is reliable technology that does not require any particular kind of drill pipe. It is unfortunately also very low bandwidth. This can be fixed in two ways. If real time is not important, high frequency information can be recorded to downhole storage and replayed when the tool is pulled out of hole. But higher-frequency, real-time data mandates a special wired drill pipe. We take a look at some new offerings in the measurement-while-drilling space from Halliburton, NOV and AkerBP/TDE to see just how fast is fast in modern downhole comms.

Halliburton QuickPulse

Halliburton’s latest contribution to MWD is QuickPulse, an automated directional gamma service that provides ‘quick and reliable downhole information at extended depths’. QuickPulse combines directional, vibration and gamma ray sensors with a strong transmission signal that overcomes most downhole interference. The system prioritizes critical measurements. Bandwidth as such has not been announced, but a complete survey is said to take 24 seconds to transmit while ‘toolface updates’ can be delivered every 3 seconds. It is clear that we are talking a few bits per second here. This may be fine for many operations, but if you need higher real-time sampling rates, you will likely need to look to a wired drill pipe.
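Some back-of-the-envelope arithmetic supports the ‘few bits per second’ estimate. The payload sizes below are our assumptions for illustration (Halliburton has not published them); the wired-pipe rates are those cited in the sections that follow.

# Back-of-the-envelope check on 'a few bits per second'. Payload sizes are
# assumptions for illustration only; Halliburton has not published them.
survey_bits = 60          # assumed: inclination + azimuth + toolface + status words
survey_time_s = 24        # quoted transmission time for a complete survey
toolface_bits = 8         # assumed: a single coarse toolface word
toolface_time_s = 3       # quoted update interval

print(f"survey rate   ~ {survey_bits / survey_time_s:.1f} bit/s")      # ~2.5 bit/s
print(f"toolface rate ~ {toolface_bits / toolface_time_s:.1f} bit/s")  # ~2.7 bit/s

# Compare with the wired drill pipe rates cited in the following sections
for name, bps in [("IntelliServ (2009)", 57_000), ("TDE PDS", 500_000)]:
    print(f"{name}: {bps:,} bit/s, i.e. ~{bps / 2.5:,.0f}x mud pulse telemetry")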

IntelliPipe to IntelliServ

Our first encounter with wired drill pipe was back in 2002 when Grant Prideco and Novatek announced IntelliPipe. IntelliPipe derived from a US Department of Energy funded experiment to offer ‘megabit bandwidth’ for MWD. (Incidentally, mud pulse telemetry was likewise developed with help from the DoE, back in the 1970s.) Our next encounter with the technology was in 2006 when IntelliPipe had morphed into IntelliServ and was being marketed by Halliburton along with Grant Prideco. At the time the bandwidth was still quoted as ‘around a megabit’. But this had dropped to a more realistic 57 kbps in 2009 when we reported on Apache’s trials of IntelliServ offshore Australia. In the same year a new IntelliServ joint venture kicked-off between Schlumberger and Prideco’s parent National Oilwell Varco. Today, it appears that NOV has repatriated the IntelliServ technology. It would not appear to have been the roaring success that was expected back in the early 2000s. By 2016, only around 130 wells had been drilled with wired drill pipe. But some successes have been reported, notably on the Norwegian Martin Linge field.

TDE’s Powerline Drill String

Most recently, interest in wired drill pipe has picked up with the announcement from Austria-based TDE Group of the ‘Powerline Drill String’ (PDS) that delivers both a high-speed data telemetry system and electrical power. TDE and joint venture partner AkerBP have tested the system in two weeks of drilling at the NORCE/Ullrigg test site in Stavanger. The tests demonstrated simultaneous transmission of data at 500 kbps (around 10x IntelliServ) and 300 watts of power. TDE Group’s Abdelrhani Lamik said, ‘Multivendor BHA tools can now be operated using electrical power from the surface without batteries or turbines. The quality of the downhole data streamed at 500,000 bit/s is unprecedented. A gap has been filled and the digital age can start!’


US baseline oil and gas production to decline 35% in 2019

AI-based study by IHS Markit delivers well-by-well production forecast for one million North American wells.

IHS Markit has ‘harnessed the power’ of artificial intelligence (AI) to predict future production for each of the nearly one million currently producing oil and gas wells in its North American database. The results confirm the staggering rate of decline of US shale wells: overall onshore base production will decline by 35% during the next 12 months. This compares with base decline rates of 5% to 14% for global petroleum systems and under 15% a decade ago in the US.

Comparing 2017 to 2019, IHS Markit calculates that the onshore oil base decline (i.e. without taking account of new wells) nearly doubled. Base production declined by 1.8 million barrels of oil per day or 28% in 2017 but will fall by a further 3.5 million bopd (35%) in 2019. IHS Markit’s Raoul LeBlanc commented, ‘The treadmill that producers are fighting is moving very fast. As producers come under pressure to restrain investment, this decline rate is becoming the main factor that promises to slow the explosive US production growth we’ve witnessed the past few years.’

Russell Roundtree added, ‘Most engineers don’t have the time they once did to conduct in-depth, well-level, decline-curve analysis on even a handful of wells. The prospect of rapidly analyzing thousands of wells is attractive. This derived dataset becomes a valuable addition to the oil and gas community and to financial investors who need to assess a company’s risk and future performance.’

IHS adds, more positively, that while shale wells experience a ‘breathtaking plunge’ of 65% to 85% in first-year production, the dynamic (of steep declines) is sustainable because of high initial productivity. Also, ‘much more intensive’ hydraulic fracturing is not generally leading to faster decline rates.
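For readers who want to see where such first-year numbers can come from, here is a minimal Arps hyperbolic decline sketch. The parameters are illustrative assumptions, not IHS Markit’s, but a steep initial decline of this kind reproduces a first-year drop in the quoted 65-85% range.

# A hedged sketch of Arps hyperbolic decline behind the numbers cited above.
# The parameters (qi, Di, b) are illustrative assumptions, not IHS Markit data.
def arps_hyperbolic(qi: float, di: float, b: float, t: float) -> float:
    """Arps rate q(t) = qi / (1 + b*Di*t)^(1/b), with t in years and Di in 1/year."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

qi, di, b = 1000.0, 3.0, 1.0      # assumed: 1,000 bopd initial rate, steep nominal decline
q1 = arps_hyperbolic(qi, di, b, 1.0)
q2 = arps_hyperbolic(qi, di, b, 2.0)
print(f"first-year decline:  {(1 - q1 / qi):.0%}")   # ~75%, within the 65-85% range
print(f"second-year decline: {(1 - q2 / q1):.0%}")   # much gentler thereafter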

IHS’ AI-based tool ‘learns’ from adjacent wells in the same reservoir to improve forecasting accuracy. Analysis is also enhanced by automated partitioning of a well’s history into meaningful stages: workover, re-stimulation, pump installation, etc. The automated well forecasting technology detects changes to production patterns and makes a forecast using only relevant data. The technology is smart enough to recognize interventions and to compensate for them. Results are delivered via IHS’ Performance Evaluator package.


The Open Group, IT4IT, Shell and … OSDU

An Open Group blog reprises Shell’s deployment of its IT4IT reference architecture. IT4IT covers digital transformation and the move to the cloud. Just like the Open Subsurface Data Universe!

The Open Group’s official blog reports on the ‘interesting case’ of the use of its IT4IT in Shell’s digital transformation. A major outsourcing initiative saw Shell’s IT transferred to three global service providers. Shell then had to develop its own interfaces to the outsourcers to track IT incidents. TOG quotes Shell’s Mary Jarrett as saying, ‘Shell faces challenges around matching IT capabilities to core business needs and reducing IT spend. Technological developments like cloud computing, IT consumerization, and big data add complexity, and we [ are ] increasingly stretched to respond to rising demand and a need for greater agility’.

The IT4IT Reference Architecture is said to provide ‘holistic guidance’ for the implementation of IT management capabilities. The latest (V2.1) release is a 190-page document with some 70-plus sections offering a highly granular approach to IT service delivery. TOG positions IT4IT as a ‘peer’ to other reference architectures such as the Object Management Group’s NRF/ARTS retail operational data model and the eTOM business process framework. IT4IT is said to allow IT ‘to achieve the same level of business predictability and efficiency that supply chain management has allowed for the business’.

A January 2016 white paper describes how Shell has leveraged IT4IT across its 140,000 desktops with support from some 10,000 IT staff (including contractors). The TOG blog is essentially a reprise of the 2016 white paper. Shell’s IT outsourcing began in 2008. All of which raises the question of the extent to which Shell is still using the IT4IT framework. Now here’s a thing. Shell’s first port of call when seeking a home for its ambitious Open Subsurface Data Universe was, of course, TOG. Moreover OSDU, like IT4IT, has the cloud at its focal point. How much of the IT4IT DNA will migrate across to OSDU is a question that we hope to be able to answer in our next edition when we report from the TOG Amsterdam member meeting.


Open Industrial Digital Ecosystem Summit

NIST-hosted event hears from Mimosa and OAGi luminaries. Highlights include OIIE, the Open Industrial Interoperability Ecosystem, BP’s interoperability story (too many oil and gas standards!), the Industrial Ontologies Foundry and ‘OAGIS Lite’.

The US National Institute of Standards and Technology (NIST) hosted the ‘Open Industrial Digital Ecosystem Summit’ earlier this year at its national cybersecurity center in Rockville, Maryland. The event, sponsored by Mimosa and OAGi, set out to explore supplier-neutral, standards-based interoperability to improve operational efficiencies.

Mimosa CTO Markus Stumptner teamed with Matt Selway to present ‘Standardizing standards-based interoperability’ and OIIE, the ‘Open Industrial Interoperability Ecosystem’. OIIE is a ‘framework and architecture for defining and describing standardized and standards-based ways for how systems should interoperate’. It will/should (?) support digitalization and supplier-neutral, COTS/open source plug and play interoperability. OIIE is a ‘refinement’ of concepts that Mimosa has been developing for five years.

OIIE components include a use case architecture, connectivity and services architecture, data and message models along with ecosystem administration functions. The aim is for a supplier-neutral digital ecosystem specialized for process industries (including oil and gas). But, observed Stumptner, ‘The major suppliers of IT infrastructure and industrial applications all want their ecosystem to be the ecosystem!’

The venerable OGI Pilot has been a testbed and proving ground for the OIIE with a focus on EPC to O&M* handover of engineering data. The current pilot phase covers the additional fields of information requirements for greenfield and brownfield data. The OIIE deploys SDAIR, a structured digital asset interoperability register with tag identifiers and mappings. Mimosa CCOM (common conceptual object model) serves as an information model for asset data exchange.

Ken Dunn presented the BP Interoperability Story**. BP is aiming for plug and play exchange of asset and equipment information sans coding, such that ‘information need only be entered once, and made available to all stakeholders’. The BP Interoperability Program will deliver integrity of asset information across a wide range of operations systems and partners through collaboration with software vendors and other owner operators, driving adoption of the OIIE. BP is planning OIIE production deployment asap using the OGI Pilot environment. A key brick in BP’s asset O&M landscape is SAP’s Asset Intelligence Network (AIN). This is connected to engineering tools, notably from Bentley and Yokogawa, using the OIIE CCOM protocol. The OIIE SDAIR (structured digital asset interoperability register) captures the current state of plant assets. BP is to sponsor the next phase of the OGI Pilot, working with Yokogawa, Bentley and SAP to develop the architecture and implement a minimum viable product. Dunn concluded that ‘there are too many overlapping information standards in the oil and gas industry, most of which are not broadly adopted’. A dozen or so owner operators have established the ISSC, the information standards sub-committee of the IOGP, to provide a ‘unified voice’ of the industry on information standards and to help with adoption.

OAGi Board Chairman Garret Minakawa introduced a new strategic initiative, the Industrial Ontologies Foundry (IOF), a group that is working to create a set of open ontologies to support manufacturing and engineering industry needs and advance data interoperability. More on the IOF in our report on the 2019 Industrial Ontologies workshop elsewhere in this issue.

Open Applications Group VP operations Michael Figura observed that the OAGIS standard can be a ‘little intimidating, especially for small organizations’. Enter OAGIS*** Lite, a slimmed-down version of the standard which can be subsequently upgraded ‘with zero rework’. OAGIS Lite will leverage a subset of the most popular BODs (business object documents). Instead of some 1,200+ BOD schemas there are 5 to 10, and documentation is ‘smaller and more approachable’.

* EPC: engineering, procurement and construction contractor. O&M: operations and maintenance.

** In fact, Dunn’s presentation was not available at the time of writing. This short summary has been taken from a presentation made to Mimosa in December 2018.

*** Open Applications Group Information Standard.


Energy transition outlook 2019, oil and gas

DNV GL’s crystal ball sees oil demand peaking in the mid 2020s as gas takes over as the world’s main energy source. A peek into the digital future of oil and gas sees further de-manning supported by robotics and AI. 3D printing, drilling automation, wired drill pipe and wearables will also come to the mainstream. Maybe!

DNV GL’s Energy transition in oil and gas is a spin-out of its global 2019 Energy Transition Outlook (ETO), a comprehensive 296-page report on the global energy transition out to 2050. The shorter, 96-page ETO oil and gas (ETOO&G) edition uses the same data and DNV GL’s ‘system dynamics feedback model’, implemented with ISEE Systems’ Stella modelling tool. DNV GL is forecasting oil demand to peak in the mid-2020s, at which time the world’s largest source of energy will be natural gas. While DNV GL’s production forecasts are model based, the chapter (in the shorter ETOO&G) on the digital transformation of the oil and gas industry is more of an editorial.

ETOO&G envisages digital having an impact in de-manning, in enabling new lower-carbon concepts, such as subsea rather than fixed platforms, and in enabling new manufacturing solutions, such as additive manufacturing. Much of DNV’s digital forecasting for oil and gas is spun into a green, lower-carbon scenario.

De-manning will involve inspections by autonomous robotics supported by advanced sensors and machine vision. Embedded memory and communications systems will report live status from remote locations to a central onshore hub. Other digital goodies to come include subsea WiFi, the subsea IoT and all-electric topsides taking power from offshore renewable sources. The autonomous supply vessel concept is ‘rapidly moving closer to reality’.

Advances in standardization (it would be good to know what these will be!), and the reuse of elements of past asset designs, will help to simplify requirements and improve the efficiency of the design process. 3D design will replace 2D and construction will become more automated and integrated. Advanced modelling tools will run scenarios to derive optimum manufacturing solutions that consume less material, possibly leveraging technology from DNV GL’s own Additive Manufacturing Technology Centre of Excellence in Singapore. Wearables will provide high-speed access to remote experts worldwide. Virtual and augmented-reality training will improve inspection and maintenance activities. Data analytics will ‘continue to optimize subsurface mapping of the optimum drilling locations, indicating how and where to steer the drill bit and suggesting the best way to stimulate the reservoir’. Greater gains will come through wider deployment of smart drill pipe that reports downhole conditions via fast, reliable telemetry. Robotic drilling systems that respond to downhole data ‘are on the way’ and will increase the safety and efficiency of drilling operations.

What key technology will underpin the oil and gas digital transformation? You guessed it, the cloud. On-site computing and storage is shifting to become fully cloud-based. Industry platforms will enable collaboration, such as sharing data and tools/apps. Breaking down traditional functional silos is vital. The most powerful impact on projects and operations comes when leading subject-matter expertise is combined with data analytics, information management and real-time control to optimize operations and maintain safety. DNV opines that there is significant potential for the industry to share data across projects and operations, using cloud-based data platforms such as its own Veracity. And of course, AI and machine learning will ‘supplement’ human interpretation. ML will focus on high-probability, low-consequence scenarios, of which there are many in the oil and gas sector. Finally, digital twins will enable communication between different types of models. Supply-chain companies will incorporate and test their designs directly into a digital twin master. Here again, DNV GL is working on a ‘virtual offshore platform’ that will provide up-to-date information throughout the asset’s lifecycle for multiple purposes including asset health monitoring, risk barrier management, spare parts and performance analytics. Elements of the digital model will be used to produce physical components, using advanced manufacturing techniques (as above) located at the physical asset.


Software, hardware short takes

Achilles’ Oil and Gas Europe. Aker Solutions’ Intelligent Subsea. Brüel & Kjær Vibro’s VC-8000. Bentley Nevada Orbit 60. Brady Inspection Timer. CGG HampsonRussell/Jason. DNV GL Sesam Insight. EasyCopy EasyCore 2.0. Elysium InfiPoints V 6.0. Emerson/Paradigm. EY Digital Energy Enablement Platform. Geophysical Insights Paradise. Halliburton Landmark DecisionSpace 365. Pegasus Vertex MudManager/cement displacement JIP. Quest Global Eagle 2.0. Siemens Teamcenter. Sphera IRM 4.0. Software AG TrendMiner. Universal mCloud AssetCare. Weatherford ForeSite.

Achilles has announced the launch of Oil and Gas Europe, a ‘comprehensive supplier network for the oil and gas industry’. The network is powered by Achilles JQS and FPAL services and brings together 6,000 suppliers and 148 buyers across the UK, Europe and the Nordics into one supply chain. A community-based dashboard provides enhanced analytics for improved procurement, based on 20 years of validated data held by Achilles.

Aker Solutions’ ‘Intelligent Subsea’ combines modular, optimized and configurable subsea equipment with automated design. Both subsea and topside systems are optimized, and concepts can be rapidly developed with the aid of ‘advanced digital tools’ that also extend field life enabled by condition monitoring, predictive maintenance and system enhancement as the field matures. Aker also offers all-electric and subsea gas compression technology.

Brüel & Kjær Vibro’s VC-8000 Setpoint machinery protection system has received SIL-2 certification for functional safety applications. The system has been tested to ensure compliance with IEC 61511/API 670 and IEC 62443.

Baker Hughes’ Bentley Nevada unit has released the Orbit 60 series machinery protection and condition monitoring system, the ‘future of machinery asset protection, condition monitoring and advanced diagnostics’. Orbit 60 is designed for use with Bentley Nevada’s System 1 diagnostics software.

Brady Corp’s innovative Brady Inspection Timer’s LED lights ‘grab the attention’ of users and maintenance professionals alike and highlight when the next planned maintenance intervention is due. Versions are available that count down 7 days, 30 days and 365 days. Each version is equipped with dark green, light green, yellow and flashing red LED lights that indicate statuses ranging from ‘recently inspected’ to ‘uncertain’.

CGG GeoSoftware will include machine learning technology in the Python ‘ecosystem’ in upcoming releases of its flagship HampsonRussell and Jason reservoir characterization solutions. The Python API will allow experts and data scientists to customize machine learning and reservoir characterization workflows with Python ML libraries such as Google TensorFlow and with proprietary code. CGG earlier added a Python interface to its PowerLog petrophysical software.

DNV GL has announced Sesam Insight, a means of sharing 3D engineering models among stakeholders, avoiding the ‘lack of transparency’ that results when asset knowledge and data remain with contractors instead of asset owners. Sesam Insight provides access to the data and models for subscribers on any device and is said to be a ‘game-changer’ for improving decision-making and reducing errors. Watch the video.

EasyCore 2.0, the latest release of EasyCopy’s digital core description solution, adds custom visualizations, enhanced data import/export and new column groups and overlay/underlay options. A freehand tool captures sketches and annotations in the field or lab. Download a trial version here.

InfiPoints V 6.0, the latest release of Elysium’s laser point cloud software now includes tools for extracting piping and structures from point clouds into Autodesk Revit CAD. A plug-in enables a direct connection from InfiPoints to Revit including geometry and non-geometrical information such as piping standards and attributes. More from Elysium’s video.

Emerson/Paradigm has ported its complete E&P software portfolio to the cloud. Emerson’s petrotechnical software can now be run locally, hosted or in a hybrid mode. Different configurations can be either managed by clients or as a service package where Emerson manages IT operations on the cloud and provides software support, maintenance and operation. The cloud-hosted software is available on Microsoft Azure and Amazon AWS.

EY’s Digital Energy Enablement Platform (DEEP) is a toolset for the digital integration of key processes across the value chain, from complex well engineering, production and maintenance optimization and supply chain management to financial modeling. DEEP embeds Microsoft Azure and Dynamics 365 and runs on a common data model. The platform can be extended across the oil and gas organization, ‘breaking down silos and reducing cycle time and costs’. A parallel solution, EY UtilityWave, performs a similar integration for utilities.

Geophysical Insights has added a scripting language to its flagship Paradise AI-powered seismic interpretation package. The Paradise scripting language and editor add custom signal and neural network analysis with over 600 geoscience specific commands. The language processor was built with Intel Fortran, Parallel Studio XE 2016 64-bit on Windows Studio Ultimate.

Halliburton Landmark has released DecisionSpace 365, a suite of cloud-native E&P applications powered by iEnergy, the Landmark hybrid cloud designed to deploy, integrate and manage customers’ E&P applications. The cloud applications include Scalable Earth Modeling, a high-fidelity, fast earth modeling solution that uses all available data, without upscaling, to produce rock property models that can be interrogated across all scales of resolution; Full-Scale Asset Simulation for multiple, fully coupled sub-surface/surface scenarios for field development planning; and the Data Foundation, ‘a holistic, multi-model application leveraging multiple cloud native data stores to ingest, manage and access sub-surface, engineering, log, seismic, and corporate data’. More from Halliburton.

Pegasus Vertex has rolled out MudManager, an online database management system for the managers and supervisors of drilling fluids companies, which links individual mud engineers’ data into a single location for data searches, well comparison, evaluation, and data correction. Earlier this year PVI also announced a joint industry project to investigate optimizing displacement efficiency for cementing operations. The JIP is developing a full computational fluid dynamics (CFD) module on top of CEMPRO+ along with 3D visualization of flow and fluid concentrations and laboratory testing. More from PVI Software.

Quest Global has announced ‘Eagle 2.0’, an end-to-end customizable open source asset management platform for the Oil & Gas and Power industries. Eagle connects assets and factories by aggregating data from disparate sub-systems, remote devices and multi-sites. The customizable framework consists of the Eagle Edge and Eagle Cloud. The Edge data aggregation hub captures data from field devices for ingestion to the Eagle cloud or other cloud platforms.

Siemens' Teamcenter software ‘weaves a digital thread of data’ through an enterprise’s portfolio of plants and other assets, from capital project delivery to operations, spanning design, construction, and operations. The digital thread, via a strategic alliance announced last year, ties into Bentley Systems iTwin cloud services to enable project and asset performance digital twins. More from Siemens.

Sphera’s new IRM 4.0 solution leverages Industry 4.0 principles to help companies manage and execute plant turnarounds. The software uses risk as a key driver in planning major turnarounds in addition to resources, materials, contractors, availability and costs. The solution provides tools to improve ‘risk-based lean scope’, holistic management and optimization of the turnaround and improved safety.

Software AG has released TrendMiner 2019.R3 with a new analytics-driven ‘production cockpit’ that compares live and historical production to display diagnostics, quality status and provide predictions to operators and management through custom dashboards. TrendMiner integrates with historians including OSIsoft PI, Yokogawa Exaquantum, AspenTech IP.21, Honeywell PHD, GE Proficy Historian and Wonderware InSQL.

Universal mCloud has expanded its AssetCare asset management solutions with new capabilities including advanced industrial IoT sensors, drone-based AI-powered aerial capabilities, and digital twins for process simulation and 3D virtual facility walk-downs. More from Universal mCloud.

Weatherford has added business intelligence powered by Microsoft Power BI to its ForeSite production optimization platform for the oilfield. The integration accelerates ForeSite users’ ability to deep dive into their production data, conduct analyses, and create simple, easy-to-digest data visualizations that illustrate virtually any production scenario from the well to asset level.


NETL’s Offshore Risk Modeling suite

New spill modeling tool leverages terabytes of Energy Data eXchange historical data to ‘evaluate and reduce’ risk of spills.

The US National Energy Technology Lab, NETL, has developed ORM, an offshore risk modeling suite to evaluate and reduce the risk of oil spill events. ORM was developed in response to the 2010 Deepwater Horizon oil spill which showed a need for improved system-wide knowledge and computational tools to predict and prevent future spills.

ORM was built by researchers from NETL’s Geo-Analysis & Monitoring Team leveraging terabytes of data from Energy Data eXchange, the Department of Energy’s Office of Fossil Energy’s virtual data library and laboratory. Data includes information about the water column and ocean currents, emergency response availability, oil particulate behavior and more. The tools can simulate 4-D oil spill and blowout scenarios, identify critical subsurface characteristics such as pressure and porosity during drilling activities, evaluate emergency response preparedness and assess the integrity of offshore infrastructure.

ORM comprises several stand-alone modules: Blosom, the blowout spill occurrence model, an open-source, comprehensive model that predicts how and where oil will travel following offshore blowout and spill events; CIAM, the climatological isolation and attraction model, which characterizes oil and particulate attraction/repulsion; CSIL, cumulative spatial impact layers, for socio-economic and environmental risk assessment; SWIM, the spatially weighted impact model, a decision support tool; STA, subsurface trend analysis, which combines petroleum geology methods with data science to improve prediction of subsurface properties; and VGM, the variable grid method, which communicates uncertainty in data and modeled results.

The NETL’s Kelly Rose said, ‘ORM is a new approach to the offshore environment. In the past, smaller-scale datasets were the primary focus in informing decisions. Using large-scale spatial and temporal data to inform local needs has the potential to increase the safety of hydrocarbon exploration and ensure responsible stewardship of the environment.’

More from NETL (dated 2016).


GBC IIoT and digital solutions in oil and gas 2019, Amsterdam

McKinsey: LEAN, Lighthouses and the digital transformation. BHGE: BP Pharos’ Post-Its. ENI: No more PoCs! Equinor: Omnia/Azure ‘deliver more than expectations’. OMV Petrom trials Teradata. Bloomberg: ‘lack of astonishing results to date’. BP: how to fix digital underperformance. Linde: remote ops and the digital cylinder. Shell: PI at heart of transformation. Schneider Electric: predictive maintenance tops digital agenda. Akselos: faster FEA. Startup pitches.

Introduction

The 3rd Global Business Conferences IIoT & Digital Solutions for Oil & Gas, held in Amsterdam earlier this year, heard from a variety of practitioners in the oil and gas digital transformation space. Many current proof-of-concept implementations have struggled to prove their worth. BloombergNEF reported a ‘lack of astonishing results’ to date emanating from the digital transformation movement. For some, this is an indication that only a full-blown push for enterprise-wide deployment will lead to transformative success.

McKinsey

McKinsey’s Anosh Thakkar opened the proceedings with a keynote on ‘maneuvering the Industry 4.0 jungle to deliver impact at scale’. McKinsey’s recipe for digital transformation starts with C-Suite/business project ownership (not IT), an approach that Thakkar enigmatically terms a ‘sheltered highlander’. Digital transformation shares a lot with the Lean approach, with ‘Lighthouse’ projects key to demonstrating impact. There are five principal levers to Industry 4.0: AI-enabled predictive maintenance, production optimization, real-time asset performance management, automation of manual processes with robotics and machine vision, and end-to-end dynamic optimization. More on McKinsey’s Lighthouses from the 2019 World Economic Forum presentation.

The big challenge however is that only 20% of industry proofs of concept have started to scale, ‘even though their impact was proved’. The technology jungle is another impediment: ‘everyone offers everything’, in a world of 100 interfaces. McKinsey advises drafting a matrix of opportunities, building the business case and following the money. Where many are eager to jump into the technology pool and like to talk technology, Thakkar says, whoa, hold off: what is the problem and what is the expected value? Only then will the solution fall into place.

BHGE BP Pharos

Picking up on the lighthouse theme, Sak Nayagam (BHGE, formerly with Accenture), presented the Pharos digital transformation project, co-created in partnership with BP, Microsoft, Accenture and BHGE. Pharos kicked off some 18 months ago in an ‘ideation’ session that set out to ‘help BP be best in class’. A team was assembled in Houston with instructions to move beyond a ‘land of 1001 pilots’. A few thousand Post-It notes later, the team had worked through predictive analytics for GE’s PowerGen LM2500 gas turbine, a subsea flow assurance use case and more. Business transformation consultants Morphix contributed to the project.

ENI – no more PoCs!

Giacomo Silvestri (ENI) agreed that there is too much focus on the proof of concept. ‘We now understand there is value in the transformation, but this is not a short-term game changer.’ ‘I am fed up with the constant PoC approach.’ ENI had 20 drone PoCs where two would have done! We need more courage. Investment in these technologies is small beer for the industry. But you need to avoid noise and distraction from the hype. Enter the ENI Digital Agenda, gathering input from tech scouting and open innovation from outside, developing business cases and partnering. Focus now is on safety and asset integrity, enhancing performance, decarbonization and the circular economy. ENI anticipates a billion euros of value from digital over three years and a 7x ROI in ten years. There are currently 165 digital projects underway. These include image recognition in seismic, AI, HMI, IoT, robotics, 3D printing and blockchain. ENI is also working to improve the audit function with help from banks and financial services companies. The scouting activity has scanned some 350 startups and selected 5.

Equinor and the innovator’s dilemma

Einar Landre believes that data is to drive Equinor’s next wave of improvements. Equinor’s data roadmap centers on ‘Omnia’, its in-house developed, Azure-based subsurface data lake and reservoir engineering platform. Omnia supports operations planning, digital twins and drilling automation. Landre cited Fraunhofer’s Matthias Naab as stating that ‘since 2010 software can deliver more than expectations’. To date digitization has followed an incremental path. It is now time for new business models and ecosystems and ‘radical new digital services and solutions’. Like others, Equinor is confronted with the innovator’s dilemma. As new tech comes along, when do you jump? This is a tough question. But it may pay to evaluate radical concepts based on their future business value. In which context, Landre presented the Shell-inspired open subsurface data universe, OSDU.

OMV Petrom trials Teradata warehouse

Jaco Fok reported on a pilot deployment of a Teradata warehouse at OMV Petrom, the largest energy company in Romania and southeastern Europe. The Teradata integrated, centralized business analytic hub provides a single version of the truth to Petrom’s refiners. The system provides real-time insight into blending performance, throughput and yields and tank farm/terminal operations. The pilot has moved through increasingly promising stages and is now credited with providing a basis for day to day decisions in planning and scheduling. Extended metrics along the hydrocarbon value chain feed into a prediction and recommendation engine. OMV’s Petrobrazi Refinery was home to the Teradata pilot.

BloombergNEF on emerging tech

Eleonore Lazat from Bloomberg New Energy Finance gave an overview of Bloomberg’s study of emerging technology and new value creation. No surprises in the list of emerging technology, from blockchain to drones, and ‘disruption’ from new players and cloud computing. Another study, of the impact of digitization in the refinery, has found a lack of astonishing results to date in terms of ROI. Perhaps the value of digital lies elsewhere, in safety?

BP on digital underperformance

Noorddin Taj (BP) provided more data on digital underperformance. While eight out of ten companies are on the digital journey, only 14% are able to demonstrate a sustainable digital project. ‘They just want to do digital because everyone is doing it!’ So why is it so important now? Previously companies would just digitize something. Now they are changing their business model to gain improvement. Taj foresees a shift to more ‘agility’ and from ‘asset-intensive’ to ‘idea-intensive’ profitability, citing GE as an ‘asset-intensive failure’*. BP is using more and more APIs, ‘We are on an API journey, all interfaces expose discoverable reusable APIs’. Taj advocates microservices as opposed to a ‘monolithic’ architecture. Also, ‘usability is more important than functionality’. Taj offered one example of a changing business model: petroleum swapping between oil companies. Depending on the distance from the terminal, product can be bought and sold between operators such that BP gas may be delivered to a Shell station. This ‘used to be done in Excel’; now it runs on a blockchain-based exchange. IoT devices manage crypto token exchange and do ML analytics on lifting to predict inventory levels, sales etc. Smart contracts and blockchain-based self-service identity allow individuals to check in without registration.

* Our tracking of GE would put it close to the forefront of ‘idea-intensiveness’. GE may not have invented the ‘ideation’ word (its first use was in 1818!) but it certainly popularized its recent use.

Linde – remote operations and the ‘digital cylinder’

Julien Brunel described Linde Engineering as a ‘digital remote plant pioneer’. Linde’s 1,000 plants are run from ten remote operations centers. A ‘Deep Cylinder’ machine vision system identifies incoming gas bottle types. The Linde PlantServ portal lets customers access information directly and supports a shift away from the old reseller model. Brunel acknowledged that there is still ‘a lot of paper P&IDs, Excel and cut and paste’. Linde is working on a tablet-based P&ID solution.

Shell – PI at heart of transformation

While others took a somewhat high-level view of digitization, Peter van den Heuvel exposed the nuts and bolts of Shell’s OSIsoft PI center of excellence. A real time architecture leverages C3IoT and the PI System across the board. Shell has 15,000 users of PI, 20 years of real time data and 7.5 million connected instruments. C3IoT adds AI and machine learning to the mix; Matlab, R, Python and Seeq also ran, although ‘PI is the standard’. The latest deployment, on the 488 m long Prelude FLNG, leverages Seeq, C3IoT and PI in the Sky (cloud). The move to the cloud means more flexibility, but also requires getting used to Docker, virtualization, etc.

Schneider Electric – predictive maintenance tops digital agenda

Vincent Jacquemet explained how Schneider Electric has integrated its Aveva and Invensys acquisitions into its EcoStruxure platform. Citing a McKinsey analysis, Jacquemet reported that predictive maintenance tops the digital agenda in terms of expected value. Analytics can be delivered in two fashions: opex-oriented via the cloud, or capex-oriented at the edge or embedded in a device. Thermal monitoring of transformers and fault detection in pumps are amenable to ML in the cloud. Models are trained on ‘normal’ behavior and detect anomalies. Predictive maintenance is a key driver of digital transformation, but it is not exactly new. France’s EDF has been using ML for predictive maintenance for 15 years. Duke Energy has some 14,000 ML models deployed. For successful deployment, data prep is key. Models need to be tested with data playback, to tune thresholds and alarms. Edge computing brings ML to the field and is key to avoiding ‘data decay’. Edge enables the use of high frequency data and just-in-time intervention. Schneider ran a pilot on five Canadian wells installed with Realift controllers and an edge gateway. Schneider’s Vijeo Designer was used to allow human intervention to check analytics and label Dynacard images. The PoC identified three patterns found to be indicative of a faulty load cell, a paraffin issue and rod centering guide problems.
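
The ‘train on normal behavior, flag the anomaly’ pattern described above can be sketched in a few lines of Python. This is a minimal illustration only, not Schneider’s implementation: the sensor tags, data and thresholds are invented, and scikit-learn’s IsolationForest stands in for whatever models EcoStruxure actually uses.

# Minimal sketch of anomaly detection trained on 'normal' behavior only.
# Tags, data and thresholds are hypothetical, not Schneider's implementation.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Pretend history: pump vibration (mm/s) and bearing temperature (deg C)
# recorded during known-good operation.
normal = np.column_stack([
    rng.normal(2.0, 0.2, 5000),   # vibration
    rng.normal(65.0, 3.0, 5000),  # temperature
])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# New readings arriving from the edge gateway: the last one is degraded.
new = np.array([[2.1, 66.0], [1.9, 63.0], [4.8, 91.0]])
print(model.predict(new))         # +1 = normal, -1 = anomaly, e.g. [ 1  1 -1]

In practice, the ‘data playback’ step mentioned above amounts to replaying historical traces through the fitted model and adjusting the contamination setting until alarm rates are acceptable.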

Akselos’ faster FEA

John Bell presented Akselos’ finite element analysis (FEA) toolset that performs structural analysis of large offshore platforms ‘1,000 x faster than previously possible’. Akselos came out of MIT and the US Department of Defense with backing from Shell Ventures. Speed-up is achieved by ‘reduced data’ FEA, demonstrated with a ‘full spectral fatigue analysis’ on Shell’s Bonga FPSO. The company is now working with Shell to prove to the regulator that some North Sea assets have a much longer life than expected.

Startup Pitching Session

Swim.ai presented its ‘distributed data fabric’ that analyzes and learns from time series data, creating a digital twin from the data. Swim.ai replaces ‘gigabytes’ of Hadoop/Spark code with a message broker, a streaming API, dashboard and storage.

Datumize is ‘unlocking’ industrial data and bridging the IT/OT gap with ‘non-intrusive’ data capture. A data appliance in the DMZ taps into the comms network with a ‘unique network sniffing protocol’ capturing data ‘without being noticed’. The system has been deployed by Cepsa to optimize refinery operations.

Precognize’s SamGuard provides predictive monitoring, spotting failures on catalyzers, heat exchangers and rotating equipment. Unsupervised ML on historical failure data builds a baseline model. This is combined with P&ID structures and plant process behavior, the cognitive bit. SamGuard ‘listens out’ for small changes to show where problems are about to occur. There is ‘no alert fatigue and no need for data scientists’.

Industrial Analytics is a specialist in acoustic/vibration fingerprints of rotating equipment from different suppliers and vintages. Sensors feed into TurboNode, a hybrid physics/ML-based model for processing with work orders delivered to SAP Maintenance Management.

White Space Energy plans to ‘navigate complex decisions’ in oil and gas through game technology-derived AI. Today, there is ‘lots of noise and hype out there’ (in the AI/ML space). White Space is moving slowly to apply ‘superhuman’ gaming to problems such as well trajectory planning, logistics and more.

Atomiton provides edge computing solutions for pipeline monitoring of vandalism/damage and leak prediction (not just detection). Acoustic sensors are processed with ML-derived pattern recognition in a continuous learning process.

* GBC is rebranding the conference which in 2020 will become GO Digital Oil & Gas with two events, one in Abu Dhabi (3-4 March 2020) and the other in Amsterdam (3-4 June 2020).


Folks, facts, orgs …

Ambyint, B. Riley Financial, Universal mCloud, Compressor Engineering, Energy Web Foundation, GSE Systems, Harbo, Hicor Technologies, Infrastructure Networks, MSi, MVP Holdings, Noble Corp., National Science Foundation, Opto 22, PAS Global, Pason Systems, Twenty20 Solutions, Prairie Field Services, ProStar Geocorp, Quintana Energy Services, Quorum Software, Petro.ai, Center for Petrochemical Energy and Technology, Society of Exploration Geophysicists, StormGeo, Total, Western Midstream Partners, Williams Industrial Services, Marcellus Shale Energy and Environment Lab, Moore Industries.

Ambyint has added David Zahn as COO. He was previously with PAS.

Jon Donnel has joined B. Riley Financial’s Great American Group advisory and valuation services as MD oilfield services in Houston. He was previously with Weatherford and Scotia Howard Weil.

Jim Christian has joined Universal mCloud as VP, emerging tech. He hails from Siemens.

Pat Hickey has joined Compressor Engineering (CECO) as consultant to the training and technical services team.

The Energy Web Foundation has named Walter Kok CEO. He was previously COO.

Chris Sorrells has resigned from the GSE Systems’ board and from his position as COO, telling the board that his resignation ‘was not the result of any disagreements with the company on any matters relating to its operations, policies or practices’. Kyle Loudermilk, president and CEO, is to assume Sorrells’ responsibilities. GSE does not plan to fill the COO position.

Brandon Buzarde has joined oil spill research outfit Harbo as chief commercial officer. He hails from Norway’s Cubility.

Hicor Technologies has changed its name to Reach Production Solutions, reflecting a transition from a compression technology company to a ‘full-service artificial lift and frac hit recovery solutions provider’.

Former CEO of Rignet and Osprey Informatics Mark Slaughter has been appointed CEO and chairman of Infrastructure Networks.

MSi has hired Don Ward as SVP Global Services. He was previously with TippingPoint.

Clay Buford and Jamie George have joined MVP Holdings at a new office in Houston. Both hail from CIMA Energy.

Adam Peakes, senior VP and CFO, has resigned from Noble Corp. The search for a replacement is on.

Margaret Martonosi now heads up the US National Science Foundation’s directorate for computer and information science and engineering. She was previously director of the Keller Center for Innovation in Engineering Education.

Opto 22 has named Josh Eastburn, formerly with Genentech, as director of technical marketing.

Matthew Selheimer is now CMO of PAS Global. He hails from Alert Logic.

Pason Systems has appointed Laura Schwinn to its board of directors.

PetroCloud has changed its name to Twenty20 Solutions, reflecting a ‘broader market appeal’.

Trent Bischoff has joined Prairie Field Services as VP business development, Texas. He hails from Gibson Energy.

ProStar Geocorp has hired Peter Srajer as chief GIS scientist.

Quintana Energy Services has named Christopher Baker president and CEO. He was previously EVP and COO.

Thoma Bravo unit Quorum Software has appointed Tom Lacy as executive VP engineering and Clay Myers as CFO. Lacy hails from Bazaarvoice, Myers from One Technologies.

Ruths Analytics and Innovation has changed its name to Petro.ai.

Jim Griffin is associate vice chancellor and senior VP of the Center for Petrochemical, Energy, and Technology at San Jacinto College, a new, 151,000-square-foot petrochemical training facility part-funded by Emerson.

The Society of Exploration Geophysicists has announced its 2019–2020 board election results. Maurice Nessim is president-elect, Scott Singleton is second vice president and Baishali Roy is VP publications. Kurt Marfurt and Tad Smith are directors-at-large.

Søren Andersen is the new CEO of StormGeo. He was previously with 2020seaways.

Jean-Pierre Sbraire is now CFO with Total and Helle Kristoffersen is president, strategy and innovation.

Western Midstream Partners has named Michael Ure as president and CEO and Craig Collins as SVP and COO. Ure hails from Occidental. Collins returns to WES after a stint at Western Gas Partners.

Williams Industrial Services Group has promoted Matthew Petrizzo to president of Energy and Industrial. Kelly Powers was also promoted to president, Power.

Data

The West Virginia-backed Marcellus Shale Energy and Environment Laboratory (MSEEL) has made available a large geoscience data set as a ‘baseline for reservoir and environmental characterization’. More from MSEEL.

Death

Moore Industries announces the death of its founder and owner, Leonard Moore, at the age of 85. Moore Industries-International was founded in 1968. In 2009, Moore was inducted into the ISA’s group of Honorary Members. Read the obituary on the Moore Industries website.


Done deals …

Aucerna/Micotan. RigData/DrillingInfo/Enverus. Eagle Automation/Texas Energy Control Products. Emerson/KnowledgeNet. Engage Mobilize. ENGlobal. Hatch/Upside Engineering. Helmerich & Payne/DrillScan. Interface Fluidics. Lasser/TGS-NOPEC. Matrox. Siemens/PSE. Quorum/OGsys. SeekOps. TechnipFMC. Teledyne/3M. Total Safety/S&S. Siemens’ negative-yield bonds.

Aucerna has acquired Micotan Software, creator of the well lifecycle management tool, Generwell.

S&P Global/Platts has sold its RigData unit to DrillingInfo. Subsequently, DrillingInfo has changed its corporate name to Enverus.

Eagle Automation has acquired Fort Worth based Texas Energy Control Products, a control panel fabricator and distributor of automation and measurement products across Texas and Oklahoma.

Emerson has acquired KnowledgeNet (KNet) software from Tunisia-based Integration Objects. KNet’s analytics application software accelerates digital transformation initiatives for process industries by applying statistical and machine-learning algorithms to diverse information technology (IT) and operational technology (OT) data.

Engage Mobilize, a cloud-based digital field management, procurement, and electronic ticketing platform for oil and gas, has announced a Series A financing round led by Cottonwood Venture Partners. The funding will allow Engage to add to its Denver-based team to enhance its current platform and accelerate product development, including advanced analytics.

ENGlobal has been notified by the Nasdaq Stock Market that the company has regained compliance with the minimum bid price requirement for continued listing on the Nasdaq capital market. At the time of writing (22 October 2019), ENGlobal’s shares are just above the $1 mark, at $1.02!

Calgary-based Hatch and Upside Engineering are to merge.

Helmerich & Payne, via its wholly owned Helmerich & Payne Technologies unit, has acquired DrillScan, a provider of proprietary drilling engineering software, well engineering services and training for the oil and gas industry.

Interface Fluidics has secured $4.5 million in Series A from Equinor Technology Ventures and ‘global accelerator’, Techstars.

Lasser has been acquired by TGS-NOPEC.

Lorne Trottier, co-founder of Matrox, has acquired 100% ownership of the Matrox group of companies, including its three divisions—Matrox Imaging, Matrox Graphics, and Matrox Video.

Siemens is to acquire Process Systems Enterprise (PSE) and its gPROMS technology. The unit will be integrated into Siemens Digital Industries’ process automation business.

Thoma Bravo portfolio company Quorum Software has acquired Fort Worth-based OGsys, a provider of cloud-based oil and gas accounting software for small and medium-sized businesses. The acquisition complements Quorum’s accounting solutions for mid-market and enterprise companies.

SeekOps has secured Series A-1 funds from the OGCI Climate Investments fund and Equinor Technology Ventures. The company’s miniature drone-borne SeekIR gas sensors were validated in Stanford University’s 2018 Mobile Monitoring Challenge as ‘top-performing technology’ for emissions localization and quantification.

Following the 2017 merger of Technip and FMC into TechnipFMC, the companies are to demerge. Possibly to avoid the embarrassment of appearing to go back to square one, the demerged companies have been dubbed ‘RemainCo’ and ‘SpinCo’.

Teledyne Technologies has acquired 3M’s gas and flame detection business in a $230 million cash deal that includes the Oldham, Simtronics, GMI, Detcon and select Scott Safety products.

Total Safety has acquired S&S Supplies and Solutions.

On 6 September 2019, Siemens issued €3.5 billion worth of bonds with maturities of two, five, ten and fifteen years. Investor demand was over four times the issue volume. Incredibly, the two and five year bonds had negative yields of respectively minus 0.315% and minus 0.207%! More from Siemens.


Upstream Intelligence Data Driven Drilling & Production Conference, Houston

Anadarko DPAT drilling automation. Total D-WIS drilling interop proposal. Anadarko diffusive time-of-flight. Repsol’s EarthSpy, the answer to ‘plummeting’ upstream profitability. LNS Research defines the digital twin. Red Hat OpenShift Energy Commons. More on OSDU. EarthPeel seamless geoscience in the cloud.

Dingzhou Cao from Anadarko’s advanced analytics and emerging technologies unit presented Anadarko’s real-time drilling (RTD) journey at the Upstream Intelligence Data-driven drilling and production conference earlier this year. Anadarko’s first generation RTD Analytics System leveraged a physics/rule-based system that ingested Witsml drilling data into a StreamBase (now a Tibco unit) real time datamart for exploitation with MapR. The system proved to be poor at recognizing drilling states in real time. The second generation RTD system was AI-based, built on a Google BigQuery database and a machine learning pipeline running on Kubernetes. Tibco StreamBase was again deployed for complex event processing with results captured to MongoDB.

The new system leverages a convolutional neural net and ‘semantic segmentation’ (pattern recognition) to distinguish between rotate and slide (directional) drilling states. A U-Net deep learning model can recognize drilling states from limited measurements (of standpipe pressure). Latterly, the system has undergone a ‘digital transformation’ and is now rolled up into the ‘DPAT’ drilling program automation tool and a Google cloud-based ML pipeline. The only drawback is that ‘the engineers love Spotfire and Excel’. On the plus side, DPAT has automated the whole process in the cloud on a StreamBase high availability architecture.
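
For readers wondering what ‘semantic segmentation’ looks like when applied to a one-dimensional pressure trace, the following toy PyTorch sketch labels every time step of a standpipe pressure signal as rotate or slide. The architecture, channel counts and synthetic data are ours for illustration; this is not Anadarko’s DPAT model, which per the presentation uses a U-Net.

# Toy 1D segmentation network: per-time-step rotate/slide classification from
# a standpipe pressure trace. Illustrative only, not Anadarko's DPAT model.
import torch
import torch.nn as nn

class TinySegNet(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=9, padding=4), nn.ReLU(),
        )
        # per-time-step class scores, as in semantic segmentation of an image
        self.head = nn.Conv1d(32, n_classes, kernel_size=1)

    def forward(self, x):                    # x: (batch, 1, time)
        return self.head(self.encode(x))     # (batch, n_classes, time)

model = TinySegNet()
pressure = torch.randn(8, 1, 1024)           # synthetic pressure traces
labels = torch.randint(0, 2, (8, 1024))      # synthetic rotate/slide labels
loss = nn.CrossEntropyLoss()(model(pressure), labels)
loss.backward()                              # one training step would follow
print(float(loss))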

Darryl Fett, from Total’s US E&P Research & Technology unit, floated a proposal for a drilling and wells interoperability standards (D-WIS) initiative. The aim is to establish recommended practices and standards that enable interoperability between all components, equipment and systems used in oil and gas well construction. Such seamless data exchange will reduce cost, increase efficiency and improve safety. Effective well data management will pave the way for high-end applications in drilling automation. Fett takes inspiration from other initiatives such as the SPE’s DSATS, Norway’s NORCE/DD-Hub and the IADC. Fett proposes a ‘systems engineering’ approach, coupled with advanced telemetry, data analytics and automation. There needs to be a strong focus on interfaces between components, systems and processes and a ‘plug and play’ capability. The objective is to provide decision makers with the best answer rather than just more data. Fett argued for a shift from proprietary systems to an ‘open’ mentality, breaking down the silos. Data ownership is not a new problem, but it is solvable. Contractual and legal issues need to be addressed as does the economic model that ensures value to all stakeholders. The good news is that the technical part is not very challenging.

Anadarko’s Sathish Sankaran sketched a spectrum of modeling styles, from the full physics, high resolution of the simulator, through upscaled, reduced-order models, approximate physical (streamline), hybrid data-driven/physics, and finally, physics-free ML-based data driven models. Anadarko has tested a random forest ML approach on a deep-water Gulf of Mexico field, using a combination of a reduced order model along with machine learning. Fine scale training simulations were used to determine an optimal production strategy with significant speedup (~30x –40x) over a full field simulation. Another test on an onshore unconventional used a ‘diffusive time of flight*’ model combined with flowing material balance and non-parametric regression to calculate true well performance and forecast based on routinely measured data.
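
The proxy-model idea, train a cheap regressor on the outputs of expensive fine-scale simulations and then screen candidate strategies against the regressor, can be illustrated as follows. Feature names, ranges and the response are invented; scikit-learn’s random forest merely stands in for Anadarko’s actual workflow.

# Hedged sketch of a data-driven proxy: fit a random forest to the results of
# fine-scale simulation runs, then screen candidate strategies against it.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n_runs = 500

# Hypothetical simulation design: choke fraction, injection rate, well spacing
X = np.column_stack([
    rng.uniform(0.2, 1.0, n_runs),
    rng.uniform(5e3, 2e4, n_runs),
    rng.uniform(300, 1500, n_runs),
])
# Cumulative oil from the 'full-physics' simulator (here a made-up response)
y = 1e6 * X[:, 0] + 30 * X[:, 1] - 200 * X[:, 2] + rng.normal(0, 5e4, n_runs)

proxy = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Evaluate thousands of candidate strategies in milliseconds instead of
# re-running the simulator for each one.
candidates = np.column_stack([
    rng.uniform(0.2, 1.0, 10_000),
    rng.uniform(5e3, 2e4, 10_000),
    rng.uniform(300, 1500, 10_000),
])
print(candidates[np.argmax(proxy.predict(candidates))])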

Sankaran concluded that although these approaches work, the industry ‘lacks champions’ for the new technology. He cited a study from Boston Consulting, that found energy to have the highest percentage of ‘digital laggards’ of all industries and one of the lowest percentages of digital champions.

* See also the Texas A&M work on diffusive time-of-flight on OnePetro.

Repsol too has drunk deeply of the machine learning digital Kool-Aid. Raul Cabrera-Garzon, presenting on behalf of Francisco Ortigosa, qualified the transformation of Repsol’s geoscience as a ‘redefinition’ of how Repsol works in exploration. Digital transformation is moreover framed as the answer to ‘plummeting profitability’ in the upstream business. The cloud, and a ‘democratization’ of technology, is enabling a shift from qualitative seismic interpretation to an analysis based on data-rich, quantitative input. Repsol’s trademarked EarthSpy illumination technology is ‘changing the seismic processor experience’ and speeding high-end offshore data processing. Google Cloud Vision also ran.

Industry observers puzzled by the arrival of the ‘digital twin’ in an industry with more simulators than you can shake a stick at should benefit from Joe Perino’s (LNS Research) attempt at a taxonomy. While the marketers promote digital twins as an easy path to industrial transformation, the challenge is understanding exactly what it is! There is no commonly accepted definition and architecture in the process industry. The DT is said to unify the three simulator families of advanced process control, simulation and operator training. The DT replicates the physical process and pinpoints ‘previously undetected or unexplained patterns, meanings, anomalies, and discrepancies’. To deploy a DT, Perino recommends setting realistic expectations, working with an enterprise architect and ‘get your data engineering right’. A platform is preferable to point solutions, but, warns Perino, ‘avoid lock-in’.

John Archer (Red Hat) reprised his PNEC presentation on open source in the upstream with shout-outs to oil and gas success stories chez BP, Equinor and ExxonMobil. Archer enumerated some data science challenges. Many efforts are stuck on the desktop as data is often of dubious, random quality, incomplete or lacking metadata standards and pedigree. Most are still maturing their AI work in a ‘very crowded and fast-moving space’. What’s more, ‘no one wants to be called a citizen data scientist!’ Red Hat’s OpenShift Kubernetes community now has an energy special interest group with BP and Saudi Aramco onboard. The OpenShift platform provides machine learning workflows for data scientists. Interested parties may like to sign up for the upcoming OpenShift commons gathering on AI/ML (October 28, 2019 San Francisco). The Red Hat OpenData Hub and Noobaa, an open source multi-cloud object gateway also ran.

Johan Krebbers (Shell) fleshed out the OSDU (open subsurface data universe) project as addressing a new generation of applications under development that will enable flexible workflow orchestration. Under the hood is a micro-services driven, Kubernetes architecture with an HTML5-driven (including 3D support) GUI. Both physics-based and data driven applications will feature, ‘exploiting machine learning wherever possible’. Current (legacy) Windows and Linux apps will also be supported in the ‘game changing, desperately needed’ platform. Data is central to the OSDU initiative. Data will not be left to one company, ‘that has never worked’. Krebbers’ presentation was made before Schlumberger announced its involvement in OSDU.

Ananya Roy and Jeshurun Hembd presented startup EarthPeel and its vision for ‘instant access to geoscience data’. EarthPeel promises a cloud-native architecture, modular components and well-defined REST APIs. Partners are sought for pilot consulting contracts.

More from Upstream Intelligence.


Cyber security round-up

Barracuda Networks and SCADAFence team. NIST CCoE on energy asset cyber security. NIST on IoT cyber sec. NIST on mitigating software vulnerabilities. Wind River, Opto 22 and the URGENT/11 scada flaw. Schneider Electric and ISA Global Cybersecurity Alliance. Siemens on cybersecurity in the cold!

California-based Barracuda Networks and Israel-based SCADAfence have announced a joint cyber security solution spanning operational technology, critical infrastructure and smart buildings. The combination of Barracuda’s CloudGen firewall with SCADAfence offers OT threat detection and automated enforcement to improve incident response. More from Barracuda.

A new, 144-page publication from NIST and the Cybersecurity Center of Excellence titled ‘Energy sector asset management for electric utilities and oil & gas’ offers advice on how energy organizations can identify and manage OT assets and detect associated cybersecurity risks. Special Publication 1800-23 is available free of charge from NIST.

NIST has also published NISTIR 8228 titled ‘Considerations for managing internet of things cybersecurity and privacy risks’. The report covers high-level goals for risk mitigation in terms of device security, data security and individual privacy. This report is the first in a planned series of publications on such topics. More from NIST.

A draft cybersecurity white paper from, you guessed it, NIST addresses ‘Mitigating the risk of software vulnerabilities with a secure software development framework (SSDF)’. The white paper recommends that a core set of high-level secure software development practices be added to the software development lifecycle. The approach addresses development in information technology, industrial control systems, cyber-physical systems and the internet of things. The white paper is a phenomenal Collection of Acronyms (CoA). Read it here.

Following the outbreak of the Urgent/11 vulnerabilities in Wind River’s VxWorks IPnet TCP/IP stack, Opto 22 has assured customers that its products, including the Groov Epic edge programmable industrial controllers and SNAP PAC Systems, are not affected by the vulnerabilities. More on the Urgent/11 vulnerabilities from Wind River and from IoT security specialist Armis.

Schneider Electric is the first ‘founding member’ of the newly formed International Society of Automation (ISA) Global Cybersecurity Alliance. The Alliance sets out to advance cybersecurity readiness and awareness in manufacturing, critical infrastructure facilities and processes. The goal is to extend the ANSI/ISA/IEC 62443 series of standards to relevant markets and to help specific verticals apply the standards. The standards define requirements and procedures for implementing electronically secure automation and control systems and security practices and for assessing electronic security performance. Other founding members include Claroty, Nozomi, Johnson Controls, Rockwell Automation and Honeywell.

No, it’s not a misprint, Siemens has announced an industrial application hosting platform for cybersecurity in the cold! The new Ruggedcom Application Processing Engine (APE) is an industrial application hosting platform designed for running third party software applications in harsh, mission-critical environments. The Ruggedcom APE server is certified for operations in temperatures down to -40° C (or °F which is the same thing!).


Sales, partnerships, deployments

Amalto and FIS Global. Alberta Machine Intelligence Institute. AspenTech. BHGE and C3.ai. Blue Marble and MangoMap. dGB and TetraSeis. FPT Corporation. KBC and AVGI. Luna Innovations. Matrox and VuWall. Projetech and SRO Solutions. Quantum and RS Energy Group. SeekOps and Impossible Aerospace. E-matica and Seeq. Siemens and Fraunhofer. Schlumberger and Microsoft. Schlumberger and IHS Markit. Schlumberger and Tibco. Schlumberger and Chevron. SkyX and Aerial Project Analytics. TechnipFMC and DNV GL. TGS and Quantico. Quorum Software. UTSI International and Sensewaves. Wellbore Integrity and OFS Portal. AFTI WatchDog.

Amalto and FIS Global have entered into a partnership whereby the Amalto e-Business Cloud is integrated with FIS Getpaid. All invoice-related responses, disputes and deductions are now displayed in Getpaid. More from Amalto.

Amii, the Alberta Machine Intelligence Institute is to collaborate with Imperial Oil on machine learning for the oil patch. The two-year agreement targets the development of Imperial’s in-house machine learning capabilities with AI projects to enhance recovery, reduce environmental impacts and improve workforce safety. More from Amii.

China HuanQiu Contracting and Engineering Corp. is implementing AspenTech software. The wholly owned China Petroleum Engineering Corp. unit has implemented Aspen HYSYS Dynamics software to maximize safety, throughput and profits at the design phase of critical systems.

BHGE and C3.ai have announced BHC3 Reliability, an AI application that provides early warning of production downtime and process risk to improve operational productivity, efficiency and safety. BHC3 Reliability is the first artificial intelligence application developed by the BakerHughesC3.ai joint venture. BHC3 Reliability leverages a ‘system-of-systems’ approach that scales to any number of assets and processes across offshore and onshore platforms, compressor stations, refineries, and petrochemical plants, reducing downtime and increasing productivity. Reliability draws on BHGE’s domain expertise by augmenting application alerts with failure prevention recommendations and prescriptive actions.

Blue Marble Geographics has teamed with web map service specialist MangoMap to allow Global Mapper users to upload map data directly to an online Mango-hosted map site, for sharing with other stakeholders. Mango’s web deployment capability provides a ‘seamless’ one-stop GIS data management, visualization and sharing tool.

dGB and TetraSeis have joined forces to develop new workflows for detecting and interpreting seismic fractures. ‘DWM’, TetraSeis’ duplex wave migration technology for imaging sub-vertical discontinuities in seismic data, is to be interfaced with dGB Earth Sciences’ OpendTect seismic interpretation platform. To create the new plugin, TetraSeis’ DWM code needs to be upgraded to commercial standards, a new GUI is to be developed and tools for post-processing, interpretation and QC need to be integrated in OpendTect workflows. Sponsors are being sought for the work and for participation in a proprietary case study. dGB has also announced Terranubis, a cloud-based portal for buying, selling and interpreting seismic data sets and interpretations. More from dGB.

RWE AG has extended its IT service agreement with Vietnam-based FPT Corporation, a leading South East Asian IT services provider. FPT will be providing emerging technology solutions in areas such as internet of things, mobile solutions and robotics process automation. FPT will also provide consulting and development services in test automation, SAP/ABAP development and SAP consulting and application maintenance. The agreement lasts until year end 2024.

Yokogawa unit KBC has partnered with AVGI, combining Petro-SIM with AVGI’s COILSIM1D software for olefin modeling. The solution will deliver capital efficient plant design and optimization through end-to-end modeling of integrated oil refining, aromatics and steam cracking complexes. More from KBC and AVGI.

Luna Innovations reports deployment of its fiber optic-based sensing technology with an unnamed Colombian pipeline transport company. The lines were outfitted with over 1,000 strain sensors. Luna’s Hyperion optical sensing interrogator now provides real time data on the mechanical behavior of the pipelines under diverse soil conditions. The information is used as a predictive measure of pipeline integrity, to establish new maintenance protocols and mitigate ruptures and remediation costs.

Matrox Graphics has partnered with VuWall on advanced video wall and visualization technology for the control room. VuWall’s VuStation is now connected to Matrox’s Extio 3 high-performance IP KVM extender, allowing for interaction with multiple sources from a single KVM station running on a standard Gigabit network switch.

Projetech, a distributor of IBM Maximo via SaaS, has partnered with UK-based SRO Solutions to connect Maximo users with SRO’s data migration and replication tools. The alliance ‘sets the stage’ for Projetech to expand Maximo into the oil and gas vertical.

Quantum Energy Partners and RS Energy Group are to jointly enhance Quantum’s data analytics capability. RSEG’s predictive analytics will support Quantum’s portfolio companies’ operations and strategic planning.

Gas sensor specialist SeekOps has teamed with Impossible Aerospace on solutions for the oil and gas vertical. SeekOps’ SeekIR gas sensors will be flown from Impossible’s US-1 drone. The new platform will allow natural gas leak inspection over larger geographical areas and complex facilities. The US-1 offers industry-leading flight times 3x greater than other all-electric multirotor drones. More from SeekOps.

E-matica is to offer Seeq’s OSIsoft-focused solutions to accelerate the digital transformation of its Italian client base. E-matica provides innovation and process improvement to clients in the oil and gas and other verticals.

Siemens reports successful tests of interactive data glasses for augmented reality applications. The Glass@Service HMI, developed by a consortium led by the Fraunhofer Institute, acts as a personal information system allowing for interactions such as eye and gesture control. The Glass@Service project was funded by the Federal Ministry for Economic Affairs and Energy (BMWi). More from Fraunhofer.

Schlumberger has announced a technology partnership with Microsoft to develop cloud-native solutions in Azure and on Azure Stack, Microsoft’s hybrid cloud. The Delfi cognitive E&P environment is now available on Azure, along with Schlumberger’s petrotechnical suite. The Eclipse and Intersect reservoir simulators are now available on-demand from the cloud. The DrillPlan well construction solution is now also available on Azure Stack. Microsoft and Schlumberger are also working on an Azure-compliant open source Delfi data ecosystem, said to be one of the first OSDU data platforms in the Azure cloud.

Schlumberger has also announced a collaboration with IHS Markit on the WesternGeco GAIA digital subsurface platform. The deal makes analytics-ready data from IHS Markit directly accessible from GAIA. Initially the deal will cover well, production and asset information. It will subsequently extend to a collaboration in petrotechnical and data science R&D to deliver new data solutions on the GAIA platform.

More Schlumberger wheeling and dealing as witnessed by a deal with Tibco for the provision of advanced analytics in the Delfi environment*. Tibco Spotfire and Tibco Data Virtualization will be available in Delfi, ‘augmenting’ Schlumberger domain science applications with new analytics capabilities, ‘intuitive’ data virtualization tools and analytical workflows. The companies plan to ‘jointly develop descriptive and advanced analytics for new insights into the ever-expanding volume of E&P data’.

Comment: We thought that Delfi was already an advanced analytics environment!

Schlumberger has announced an ‘enterprise-wide’ deployment of Delfi chez Woodside Energy delivered in a seven-year technology collaboration. Woodside is to leverage the secure cloud-based software environment to increase consistency, reduce study cycle time and foster innovation in its subsurface characterization and development activities.

Comment: Sounds like cognitive Delfi has displaced cognitive Watson (see Oil IT Journal).

Also announced was a three-way deal between Schlumberger, Chevron and Microsoft for deployment of Delfi on Microsoft Azure.

A partnership between SkyX, an aerial monitoring solutions provider, and Aerial Project Analytics (APA), a topographic survey and site monitoring solutions provider, has kicked off with a ‘multimillion-dollar’ contract for the inspection and monitoring of hundreds of miles of pipelines for a major energy corporation in Africa. SkyX’s autonomous aerial systems will track the environmental impact of the operator’s activity. APA adds a ‘thorough’ site inspection capability and incident monitoring to minimize and prevent catastrophic incidents.

TechnipFMC and DNV GL are to provide a qualification service for the integrity of digital twin technology. The methodology aims to bring a level playing field to the definitions and expectations of digital twins and is to ‘set a benchmark for oil and gas operators, supply chain partners and regulators to establish trust in digital twin-generated data for performance and safety decision-making in projects and operations’. The methodology will be built upon DNV GL’s Recommended Practice for Technology Qualification, DNVGL-RP-A203, a 20-year-old framework for the accreditation of unproven hardware in the oil and gas industry. The method is to be piloted on a subsea development project delivered by TechnipFMC starting early 2020 and published as a recommended practice during the second half of the year.

TGS is teaming with Quantico Energy Solutions to combine their respective offerings in seismic data, AI-based well logs, and AI-based seismic inversion, combining TGS’s data library and analytics ready LAS well logs with Quantico’s ‘QRes’ combination of physics and machine learning based subsurface mapping. More from Quantico.

Titan Rock E&P has selected Quorum Software’s SaaS upstream oil and gas solutions to support its operations and position the company for future growth. Quorum’s cloud-based production allocations and reporting, scada monitoring and land management solutions were included in the deal.

UTSI International and Sensewaves have formed a partnership to combine UTSI’s industrial control system consulting for pipeline operations with Sensewaves’ AI technology. The result is Adaptix Pipeline, a machine learning platform that mitigates unplanned product stoppages by validating leak alarms, improving predictive maintenance and identifying intrusive activity in pipeline rights-of-way through analysis of cathodic protection data.

Wellbore Integrity Solutions has joined OFS Portal as a supplier member.

Whitecap Resources is to extend its use of AFTI’s WatchDog across its production assets. WatchDog provides low-cost oil well and pipeline monitoring, eliminating the routine well-site visit.


Standards stuff

CFIHOS V 1.4. D-WIS drilling interoperability. IIC and OGC collaborate. GeoTIFF V1.1. IOGP guide to P6/11 seismic binning. Open Industry 4.0. PPDM progress on 3.10. XBRL OIM for JSON and CSV. UK Data Exploration License. ‘Pint’, computing with units.

V 1.4 of Cfihos, the capital facilities information handover standard, has just been released with ‘improved and aligned’ tag and equipment class names (now called ‘things’), enhanced data requirements and updated implementation documents. Cfihos is in the process of transitioning from its Dutch origin chez USPI-NL to the UK headquartered IOGP. The aim is ultimately for an ISO standard. More from LinkedIn.

Speaking at a recent IADC Drilling Automation Technology Forum, Total’s Darryl Fett floated a new standard, ‘D-WIS’, targeting drilling and wells interoperability. More in our report from the Upstream Intelligence Data Driven Drilling & Production Conference elsewhere in this issue.

The Industrial Internet Consortium and the Open Geospatial Consortium are to work together to further IoT industry market adoption. The orgs are to align efforts to maximize interoperability, portability, security, and privacy for the industrial Internet.

The Open Geospatial Consortium has approved GeoTIFF V1.1 as an OGC standard. GeoTIFF is used by the geospatial and earth science communities to share geographic image data. V1.1 formalizes the GeoTIFF specification, integrating it into OGC’s standardization process and aligning the spec with the EPSG geodetic parameter dataset. The standard is also backed by NASA for its Earth Observation products. More from the GeoTIFF standard home page.

The IOGP has issued Report 483-6u, a user guide to its P6/11 seismic bin grid data exchange format. The report provides guidance on the writing, application and use of the P6/11 data exchange format. The P6/11 format replaces both the legacy P6/98 bin grid data exchange format and the UKOOA P1/90 data exchange format.

Software AG has joined the Open Industry 4.0 Alliance, an open ecosystem and framework for interoperability between IT and OT vendors in the process and other industries. Software AG’s focus is on IoT connectivity, open edge computing and its role as a ‘hybrid integration operator’. More from OI 4.0.

PPDM reports progress on V3.10 of its eponymous upstream data model. Work on the hydraulic fracturing, water management, units of measure and coordinate reference systems is nearly complete. The Rules Team is moving forward with the enhanced application and classification work. More from PPDM.

The XBRL Standards Board has approved a new release of the Open Information Model (OIM) and xBRL-JSON specifications. OIM sets out to ‘simplify and modernize’ XBRL and to support exchange of XBRL data into other formats. An xBRL-JSON version is already available and XBRL is now working on xBRL-CSV, a translator for the ‘exceptionally efficient’ comma separated values format familiar to Excel hackers. More from XBRL.

The UK government’s Geospatial Commission has published a Data Exploration License to ‘harmonize and simplify’ access and use of geospatial data. The license gives free access to data held by the British Geological Survey, Coal Authority, HM Land Registry, Ordnance Survey and the UK Hydrographic Office, for research, development and innovation purposes. The Commission is now working on a machine-readable version of the license for open and commercial usages.

Finally, another heads-up from Agile Scientific, which brought the following programming gem to our attention. ‘Pint’ is a Python library that carries physical units along with computed quantities. The computer figures out the best units and multipliers to use for a particular calculation, handling dimensional analysis and detecting units from strings. More from Agile and Pint.
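
A quick taste of Pint, with arbitrary values:

# Quantities carry their units; conversions and dimensional checks are automatic.
import pint

ureg = pint.UnitRegistry()

rate = 2500 * ureg.barrel / ureg.day
duration = ureg('36 hour')               # units parsed from a string

volume = (rate * duration).to(ureg.meter ** 3)
print(volume)                            # the volume, expressed in cubic metres

# Mixing incompatible units raises an error rather than silently
# producing a wrong number.
try:
    rate + duration
except pint.DimensionalityError as err:
    print(err)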


World’s largest industrial ontology

International description logics workshop hears from Aibel on the massive materials master data ontology behind Johan Sverdrup drilling platform. DNV GL on obstacles to semantic deployment. NORSOK Z-T data revamp JIP.

Introduction

The 32nd International Workshop on Description Logics hosted in Oslo by Sirius* and the University of Oslo earlier this year, was billed as the major event of the DL research community. An industry panel heard from semantic practitioners in oil and gas and other verticals and served as a ‘reality check’ on the use of ontologies and reasoning in industry.

Aibel – world’s largest industrial ontology

Christian Hansen (Aibel) has spent the last five years working as a consultant in the oil and gas industry, applying semantic technologies and building ontology-based systems that use DL reasoning. The outcome is Aibel’s Material Master Data (MMD) engineering ontology that targets the management of complex requirements in new builds and modifications of oil and gas platforms. The Aibel MMD was notably deployed on Equinor’s Johan Sverdrup drilling platform. The Aibel MMD handles the complex requirements of cost estimates, material catalogs and data exchange and reuse. The result is possibly the ‘world’s largest’ industrial ontology, an OWL 2-encoded modular system that leverages public ontologies including SKOS, PAV (provenance) and others. The MMD includes some 1,840,769 axioms and 98,133 engineering data classes. The system was developed in Protégé, with an Oracle 12 master data repository. A HermiT OWL 2 reasoner controls export to SAP, AVEVA, EiS and CAD.

The system has eliminated data duplication and ensures that material catalogs contain only valid components. Incorrect component ordering has been reduced and the system is said to be the foundation for future decision support/automation efforts and for a ‘digital twin’. On the downside, reasoning performance is poor across the large database. Temporal reasoning and lifecycle considerations are work in progress.
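
For readers unfamiliar with the tooling, the following thimble-sized sketch uses the owlready2 Python library to define a few OWL 2 classes, assert an individual and run the HermiT reasoner, the same reasoner Aibel uses, albeit here on a vanishingly small fraction of the MMD’s scale. The class names are invented and sync_reasoner() needs a local Java runtime to launch HermiT.

# A toy OWL 2 ontology with owlready2, in the spirit (but not the scale) of MMD.
from owlready2 import get_ontology, Thing, sync_reasoner

onto = get_ontology("http://example.org/mmd-sketch.owl")

with onto:
    class Component(Thing): pass
    class Valve(Component): pass
    class GateValve(Valve): pass

v101 = GateValve("V-101")            # one tagged item in a material catalog

sync_reasoner()                      # classify the ontology with HermiT

print(GateValve.ancestors())         # {GateValve, Valve, Component, owl.Thing}
print(v101.is_a)                     # [mmd-sketch.GateValve]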

DNV GL automates complex but trivial tasks

Johan Klüwer (DNV GL) explained how an enterprise ontology can automate ‘complex, but trivial work’ by capturing requirements in a formal language and validating solutions as an asset is developed and operated. A reference vocabulary allows for integration across the project/plant lifecycle, between disciplines, between stakeholders and vendors, and across ‘thousands’ of specialized software applications. Obstacles to the data nirvana include a lack of ontology specialists, methods and tools and the need for standard terminology to describe industrial artefacts and processes. Current standardization efforts are ‘small-scale and limited to upper ontologies’. In which context, Klüwer’s current focus is a methodology for digital requirements for the Norsok Z-TI joint industry project.

But is it all still R&D?

We asked Hansen if Aibel’s semantic effort has expanded from the R&D sector to sustainably impact the upstream. Here is his reply…

At Aibel we have used semantic technology and ontologies in production as support to our major EPC projects since 2015. Johan Sverdrup DP was the first EPC project to use MMD for management of material catalogs for piping bulk, valves, and structural steel. Since then, we have used the MMD ontology system on 5-6 other projects. The data quality in the project material catalogs has increased immensely, and we have almost eliminated duplicate entries in the material catalogs. Further, materials are grouped into piping specs, and for each spec, the material list only contains valid materials for that spec. As a result, project material procurement is completed with better accuracy, and we are better able to utilize warehouse stock in other projects. Also, the time needed for defining specs for new projects is dramatically reduced.

Much of the MMD content is based on international engineering standards, like ASME and ASTM, and we are actively looking into sharing those parts of MMD within the oil and gas community. At the moment, there are several initiatives in oil and gas (at least in Norway) promoting ontologies and logic as ways to tackle (part of) the industry’s need for a common approach to digitalization and information exchange, including the digital twin.

Logistics 4.0

Those interested in following Norway’s persistent promotion of semantics and their extension into the field of AI might be interested in the upcoming workshop on formal methods and artificial intelligence in logistics. ‘Logistics 4.0’ advocates the universal digitalization of the supply chain to help to automate, verify and coordinate the execution of the procedures among the different stakeholders involved. The workshop will be held in Bergen on 2nd December 2019. More from Logistics 4.0.

* Centre for Scalable Data Access in the Oil and Gas Domain.

Review: The Open Group’s Digital Practitioner Body of Knowledge Standard

With The Open Group’s increasing penetration into upstream IT (OSDU, OPAF), the Digital Practitioner Body of Knowledge Standard is a timely compendium of TOG’s output. The 520-page publication addresses the interface between IT and academia and attempts to formalize the sometimes rather nebulous concepts that confront the digital practitioner.

As The Open Group is increasingly present in oil and gas, with the Open Process Automation Forum and the Open Subsurface Data Universe (and also with Shell’s reported use of TOG’s IT4IT framework, see elsewhere in this issue), the release of a new publication should be of interest to those participating or thinking of participating in such initiatives. The Open Group’s Digital Practitioner Body of Knowledge is an imposing 520-page publication* and a free download for evaluation. The DPBoK builds on other TOG work (Architecture, Open Platform 3.0 and IT4IT) and is, in part, derived from Charles Betz’ software engineering program at the University of St. Thomas in St. Paul, Minnesota.

The introduction has it that ‘applied computing’ aka ‘digital technology’, is ‘transforming economies and societies worldwide’. (University) computing programs worldwide are under pressure to produce an increasing number of qualified professionals to meet ‘voracious workforce demand’. And skill requirements have undergone a seismic shift over the past 20 years. ‘Digital Practitioners require a wide variety of skills and competencies, including cloud architecture and operations, continuous delivery and deployment, collaboration, Agile and Lean methods, product management, and more. Industry guidance has over the years become fragmented into many overlapping and sometimes conflicting bodies of knowledge, frameworks, and industry standards. The emergence of Agile and DevOps as dominant delivery forms has thrown this already fractured ecosystem of industry guidance into chaos’.

DPBoK has it that ‘In the computing and digital professions, there is currently a significant and destructive gap between academic theory and research and industrial practice. In the interest of narrowing this gap, this document shall be verifiable […] Its structure, principles, practices, and concepts must be falsifiable. It shall be open to rational skepticism and criticism’. Interestingly, DPBoK ‘must not fall into the trap of excessive semantic debate and the fruitless search for universally applicable abstract ontologies. A framework with recognized inconsistencies but well-grounded in industry domain language is preferable to a perfectly consistent framework based on conjectural concepts’.

Verifiability and falsifiability are tall orders for a field as prone to marketing hype as IT and the digital transformation. DPBoK attempts to address these issues with subsections on ‘evidence of notability’ for its definitions. What constitutes ‘evidence of notability’? DPBoK proposes the following ‘heuristics’: the existence of an organized community, […] practitioners self-identifying under its banner and […] attending local, national, or international events, [… the availability of …] books on the topic from reputable publishers, media and analyst coverage. This is all very well, but such lines of evidence neglect the impact of marketing on IT, and of the trendiness and nebulous nature of many concepts.

For ‘Agile’, evidence of notability is cited as the fact that it has a large, active, and highly visible community and is increasingly influential on non-software activities as well. There are some 289 references to ‘Agile’ in DPBoK. Agile is defined by a) what it is not (the Waterfall method of software development) and b) the touchy-feely stuff of the ‘Agile Manifesto’ which inter alia values ‘individuals and interactions over processes and tools’ and ‘customer collaboration over contract negotiation’. DPBoK has it that ‘Agile is at its strongest in the cohesive team context. It does not have the same level of consensus or clarity in larger contexts, and the topic of scaling Agile is controversial’. Notwithstanding this, DPBoK advocates a shift to a ‘more Agile style in the Enterprise Architecture capability’ with the application of other TOG standards like TOGAF and ArchiMate.

DPBoK covers topics including virtualization, containers and Kubernetes and cloud services. Despite their novelty, ‘The idea of running IT completely as a utility service goes back at least to 1965’ but it took till around 2010 to deliver the true multi-tenancy cloud. ‘The future of cloud computing appears assured, but computing and digital competencies also extend to edge devices and in-house computing. The extent to which organizations will retain in-house computing is a topic of industry debate’. It would be interesting to hear more of this debate as we have seen ‘edge’ computing covering stuff from embedded devices, a server or … a desktop!

Our experience of IT goes back quite a long way and we have always considered the Unix shell to be something of a high point in the development of computing. DPBoK agrees that ‘shell scripts can create and destroy virtual servers and containers, install and remove software, set up and delete users, check on the status of running processes, and much more.’ On the other hand, we learn that ‘the state of the art in infrastructure configuration is not to use shell scripts at all but either policy-based infrastructure management or container definition approaches’. So much for the Unix shell!

As a quick test of DPBoK, we looked up ‘microservices’. A ‘microservices-based architecture’ is presented (by Schlumberger in Delfi and in OSDU promotional material) as some kind of holy grail of IT. But what are they exactly? DPBoK starts well with the observation that ‘Other than the “branding”, there is no clear definition or a list of characteristics for “microservices”.’ There follow some defining criteria. A microservice ‘performs an atomic function’, is ‘elastic, resilient, complete, and composable’ and is not OS or programming language-dependent. Their use ‘should be well thought out and justifiable’. A bit further on, DPBoK concludes that ‘when all of these aspects are considered and solved, microservices definitely helps the organization to be nimble in responding to user expectation changes or business logic changes.’ Sounds like a return to the ‘branding’, which we take to be a polite way of saying, ‘it’s just marketing stuff’.
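
In the interest of falsifiability, here is about as ‘atomic’ as a function-per-service gets: one hypothetical HTTP endpoint, written with Flask, that converts feet to metres and does nothing else. The endpoint and units are our invention; whether wrapping it in a container earns it the microservice branding is left to the reader.

# A deliberately tiny 'atomic function' service: one endpoint, one job.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/convert/ft-to-m")
def feet_to_metres():
    feet = float(request.args.get("value", 0.0))
    return jsonify({"feet": feet, "metres": feet * 0.3048})

if __name__ == "__main__":
    app.run(port=8080)       # e.g. GET /convert/ft-to-m?value=9800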

If the 500 pages of DPBoK are not enough for you, there are over 300 ‘informative references’ for your further enjoyment. A couple of our favorites were missing, no mention of IT stalwarts like Brian Kernighan or Les Hatton. But even more telling, there are no references at all to the true drivers of digital practice – FUD and FOMO**!

* The Open Group Digital Practitioner Body of Knowledge Standard. Document Number: C196. ISBN: 1-947754-33-1. Published by The Open Group, July 2019.

** Fear uncertainty and doubt and Fear of missing out.


Digital technology in refining

France’s Evolen trade body hears from Total on its early detection center for rotating equipment monitoring. Vallourec announced Digital Pipes. Alpha Maintenance: shutdown and turnaround, the poor relation of digital operations.

Speaking at a meeting hosted by France’s Evolen organization, Total’s Jean-Christophe Courcol presented the first results from an early detection remote center that monitors rotating machinery at Total’s refining and chemical plants. Total deploys an eclectic assembly of software components. The digital road map leverages software including B&K Vibro Setpoint, GE System 1, Prognost, Aveva TrendMiner, Aveva and Avantis Prism to predict failures a few months ahead. These apps run atop a platform built on Yokogawa and ProdFlowServ technology.

Some 1.5 million data points/day from nine plants flow into Avantis Prism and the early detection remote center. Since the early detection program began in 2018, the system has caught some 100 problems. For instance, the system spotted vibration ramping up on a steam turbine when a governor coupling sheared, causing the imbalance. Early detection saved maintenance and avoided downtime. Currently the system catches around 10 issues every month. The system is now being extended to new plants and also, as an advisory, to Total’s joint venture partners. Total is now looking at monitoring static equipment for corrosion. The ‘hardest and most expensive’ application involves adding AI to the system. This is not currently in the budget.
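
A generic illustration of the kind of trend screening an early detection center might run: compute the rolling slope of a vibration tag and raise an alert when it stays above a threshold. The window, threshold and data below are invented; this is not Total’s system.

# Flag a sustained ramp-up in a vibration signal via a rolling regression slope.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
hours = pd.date_range("2019-01-01", periods=24 * 60, freq="H")
vibration = rng.normal(2.0, 0.05, len(hours))
vibration[-200:] += np.linspace(0, 2.0, 200)      # slow ramp over the last 200 h

series = pd.Series(vibration, index=hours)

window = 72          # hours of data per fit
threshold = 0.004    # mm/s per hour, arbitrary

def slope(x):
    """Least-squares slope of one window, in signal units per hour."""
    t = np.arange(len(x))
    return np.polyfit(t, x.values, 1)[0]

rolling_slope = series.rolling(window).apply(slope, raw=False)
alerts = rolling_slope[rolling_slope > threshold]
print(f"first alert at {alerts.index[0]}" if len(alerts) else "no alert")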

Comment: Total’s ED system appears to fall squarely in the camp of situational awareness as we argued back in 2016. Total’s demonstration is that if you monitor effectively, potential issues stick out. The AI story of computer intelligence spotting low level signals undetectable by the human eye is as-yet undemonstrated.

Clara Boisserand presented Vallourec’s ‘Digital Pipes’ program, a suite of digital solutions for tubular asset management marketed under the new Vallourec Smart marque. Guillaume Wolf presented the system as providing digital traceability from the tube mill to the well. Initial specs are provided in a tube databook. Tubulars then undergo inspection to detect and quantify defects with 2D/3D asset mapping, non-destructive and mechanical tests. After inspection, data is captured to a ‘digital twin’ and the databook is updated. On site, a mobile ‘FieldApp’ provides tag management, pipe end identification and data export.

Frédéric Di Marcantoni and Jean-Marie Marcinkiewicz (Alpha Maintenance) described shutdown and turnaround as the ‘poor relation’ of digital operations. Refinery shutdown and turnaround currently involves paperwork, bar-coded work orders and quality control sheets. Alpha Maintenance replaces these with a suite of digital solutions that extract data from Primavera into an asset management database. This feeds apps that provide progress reports, geolocation and lockout/tagout control. AM is now working on a business intelligence tool to exploit the data captured in turnarounds. Alpha Maintenance has Ineos as a flagship client.


On data warehouses, data lakes and tandems

Boston Consulting Group bloggers offer advice on the merits of various data platforms.

A blog post from Boston Consulting Group considers the ‘tough times’ in oil and gas following the 2014 downturn and the popularity of alternative forms of energy. Such challenges are being addressed by ‘implementing digital technologies’. However, ‘success has been limited [because of] an inability to fully leverage data’. BCG enumerates the well-studied issues of multiple legacy software applications, data formats, quality and ‘inflexible architectures’. The answer to these woes is ‘a central platform that includes a data warehouse, a data lake, or both’. Some companies are seeing the benefits of data platforms, with one unnamed international oil company reporting a ‘$7 billion cost saving over three years’ from its platform investments (hardly a ‘limited success’!).

BCG offers a platform taxonomy to help out with the transformation. A central data platform comes in three flavors: a data warehouse (a repository of structured data), a data lake (a repository of both structured and unstructured data) or a combination of warehouse and lake. Data warehouses are proven technologies, with many solutions, vendors, and experts readily available. The downside is that data must go through a lengthy structuring process before it can be stored, and a rigid structure may make it hard to incorporate new data sources. Building a data lake is easy, just load the information as-is. The downside is that ‘because the information hasn’t been structured, data lakes require more rigorous governance and management than warehouses’*. Moreover, ‘people with data lake architecture and data engineering skills are far scarcer than data warehouse experts’. Data lakes can include large, high-frequency time series production data and are amenable to the adoption of new digital technologies.

Using a data lake in tandem with a data warehouse is now a possibility with cloud-based data warehouses such as Snowflake or Amazon Redshift, which allow composite queries across structured, semi-structured and unstructured content. Ready to write the check? BCG suggests either a DIY platform running on AWS, Google or Azure, or the purchase of a data suite from an industry vendor ‘such as Schlumberger Delfi or Palantir’. For more pros and cons on the different options and advice on technology selection, read the BCG blog.

* Co-authored by Sylvain Santamarta, Peter Forbes, Rash Gandhi and Michael Bechauf.

** This sounds rather like the old ‘schema on write’ vs. ‘schema on read’ issue.


Twin digital twins

Kongsberg and Shell sign digital twin deal for Nyhamna gas facility. DNV GL proposes a ‘probabilistic digital twin’.

Shell has awarded Kongsberg Digital a contract for the digitalization of its Nyhamna facility, a gas processing and export hub. The 100 million NOK contract promises ‘agile and iterative’ deliverables starting from Q4 2019. A dynamic virtual representation of the gas plant and its behavior will be built atop Kongsberg’s Kognifai data platform, continuously updated with real-time information on the facility’s status. The platform will provide Shell with the ability to ‘simulate scenarios and uncover new options for optimization of its real-life counterpart’. Err… sounds like a simulator to us… More from Kongsberg.
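In the absence of detail on the Kognifai implementation, the generic pattern is simple enough to sketch: keep a virtual plant state in sync with incoming measurements, then run ‘what-if’ scenarios on a copy. Everything below, including the placeholder physics, is hypothetical.

    from copy import deepcopy

    class GasPlantTwin:
        def __init__(self):
            self.state = {"feed_rate": 0.0, "export_pressure": 0.0}

        def ingest(self, measurement):
            """Continuously update the virtual representation with real-time data."""
            self.state.update(measurement)

        def simulate(self, overrides):
            """Run a scenario on a copy of the current state (toy linear response)."""
            scenario = deepcopy(self.state)
            scenario.update(overrides)
            scenario["export_rate"] = 0.95 * scenario["feed_rate"]   # placeholder physics
            return scenario

    # twin = GasPlantTwin()
    # twin.ingest({"feed_rate": 120.0, "export_pressure": 180.0})
    # print(twin.simulate({"feed_rate": 140.0}))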

In a separate announcement, DNV GL has proposed a ‘Probabilistic Digital Twin’ (PDT) concept, a three-way combination of process, structural reliability and quantitative risk models. The PDT is said to ‘close the gaps’ between digital twins. According to DNV, risk models are rarely leveraged in operations; they are usually confined to desk studies of historical data, offering a static picture of potential risks. The PDT allows operators to adjust operations or take preventive actions to maintain an acceptable risk level at all times. It includes probabilistic degradation and failure models, logic and relational models that relate performance variables to failures and loss events, and surrogate models for fast queries, propagation of uncertainty and model coupling. Read DNV GL’s PDT position paper here.
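The probabilistic ingredient is the interesting part. A minimal sketch, assuming a normally distributed corrosion rate and an illustrative wall-thickness limit (neither taken from the DNV GL paper), propagates the uncertainty by Monte Carlo to a probability of exceeding the limit:

    import numpy as np

    rng = np.random.default_rng(42)

    def failure_probability(t_years, wall_mm=12.0, limit_mm=6.0,
                            rate_mean=0.15, rate_sd=0.05, n=100_000):
        """P(remaining wall < limit) after t_years, corrosion rate ~ N(mean, sd) mm/yr."""
        rates = rng.normal(rate_mean, rate_sd, n).clip(min=0.0)
        remaining = wall_mm - rates * t_years
        return float((remaining < limit_mm).mean())

    # print(failure_probability(20.0))   # update inputs as inspection data arrives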


EITI Paris (OECD) 18-19 June 2019

Extractive Industries Transparency Initiative meet hears from Oxfam on auditing oil and gas projects. Open Data and disclosure by default. Mapping and database technology for transparency. UK’s EITI candidacy. Transparency and the energy transition.

Introduction

The headline introduction to the Global Conference of the Extractive Industries Transparency Initiative (EITI) in Paris earlier this year was ‘Open Data, Build Trust’. Worldwide, trust in government is under strain. A perceived lack of progress in tackling corruption, tax evasion and illicit financial flows is contributing to the rise of populism and economic nationalism. The impact of the oil, gas and mining industries is often a focal point of public concern and a potential source of conflict. The conference was billed as a high-level forum on extractives governance and an opportunity for multi-stakeholder dialogue and openness in addressing these challenges.

EITI 2019 Progress Report

Fredrik Reinfeldt, the outgoing EITI Chair, launched the EITI 2019 Progress Report. In the ensuing discussion, Dominic Emery (VP strategic planning at BP Group) acknowledged the achievements that resulted from ‘robust discussions’ between stakeholders. Emery highlighted the work on beneficial ownership disclosure and reporting, which is now accepted in the EU and Canada as a means of disclosing revenues in countries where these companies operate. ‘What’s not to like about beneficial ownership!’ Emery sees such disclosure extending to climate change and revenue transparency, helping to solve the dual challenge of increasing energy availability while reducing the carbon footprint, a ‘significant deal for the coming decade, and frankly sooner than that!’

Oxfam on Petroleum cost auditing

In a breakout session, Daniel Mulé presented Oxfam’s 2018 report, ‘Examining the crude details: government audits of oil and gas project costs to maximize revenue collection’. Petroleum cost auditing, a new subject for EITI, is key to ‘pro-poor’ accountability and to minimizing the downside of oil and gas extraction. Petroleum royalties may have a greater impact on a country than the benefit from employment. Getting taxes right means in-country public benefits and better returns for investors. The EITI’s transparency push is a step in the right direction, as is the International Monetary Fund’s fiscal transparency code on resource revenue management (finalized in January 2019). What is now needed is more effective revenue collection and compliance. This needs sector-specific knowledge, enforcement via audits and tax adjustments, and dispute resolution. Oxfam’s own studies have found that ‘ineffective revenue administration and cost overstatement are key risks in modeling emblematic petroleum projects’. Tax avoidance is possible by manipulating transfer pricing in vertically integrated companies, shifting profits from a higher-tax host country to lower-tax jurisdictions. Significant revenues are at stake and there is little information on audits.

Open data and disclosure by default

A plenary session looked into the impact of the open data movement on ‘disclosure by default’. The 2019 EITI Standard encourages implementing countries to strengthen disclosure of data and information. It encourages government agencies to embrace open government and open data, so that citizens can access up-to-date information. Companies are also providing more detailed information in their annual reports and online. Anders Pedersen presented the Natural Resource Governance Institute’s own open data work. Information on extractive projects is scattered across different company, government and civil society websites, in varying storage formats and tabular structures. The NRGI collects and cleans these different sources of project data into a single harmonized data format. Project-level data from hundreds of reports is available via its projects and contracts portals. Pedersen also gave a shout-out to Tim Davis’ global report on the State of Open Data 2019.
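By way of illustration, harmonizing project disclosures that arrive with different column names (and languages) into one common table might look like the sketch below. The column mappings are invented for the example, not NRGI’s actual pipeline.

    from typing import List
    import pandas as pd

    # Hypothetical source-to-common column mappings
    COLUMN_MAP = {"Project Name": "project", "projet": "project",
                  "Payment (USD)": "value_usd", "montant_usd": "value_usd",
                  "Country": "country", "pays": "country"}

    def harmonize(frames: List[pd.DataFrame]) -> pd.DataFrame:
        """Rename columns to a common schema and stack the sources."""
        cleaned = [df.rename(columns=COLUMN_MAP)[["country", "project", "value_usd"]]
                   for df in frames]
        return pd.concat(cleaned, ignore_index=True)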

UK’s EITI candidacy

Matthew Ray from the UK government’s Department for Business, Energy and Industrial Strategy traced the UK’s commitment to EITI back to 2013. The UK is a ‘candidate’ country, currently going through the EITI validation process. The UK extractives sector represents some £27 billion, with 80% from oil and gas, and a £1 billion tax take. Most information is currently disclosed via government or company reporting. Revenue disclosure is currently not in sufficient detail and there are issues around ‘taxpayer confidentiality’. Companies House data is already widely used and there is a move for a wide-ranging reform of the company register. There is a strong business case for transparency, notably to combat money laundering. Civil society, investigative journalists, law enforcement and the public are encouraged to ‘report it now’. The UK is also consulting on identity verification for beneficial ownership.

In-country activity

Sierra Leone presented ‘SLEITImap’, an Esri development by Integems Group, an elegant GIS map front end with click-through to license information.

The Revenue Development Foundation has rolled out the MoMP Transparency Portal for the Afghan Ministry of Mines and Petroleum. The system contains data on mineral rights, exploration, mining, dealers’ and exporters’ licenses and related payments. Data comes directly out of the Ministry’s systems that are in use every day.

RDF has also developed a transparency portal for Liberia, leveraging its toolset for portal development and integration with other government departments and data sets. Regarding technology development, RDF considers blockchain ‘a bad example’: ‘It is better to find a challenge for technology to solve rather than the other way round.’

Total on contract disclosure

In the special session on contract disclosure (a new EITI effort), Stephen Douglas, Total’s industry representative, stated that Total’s company policy since 2017 has been to encourage and support contract publication, not just in EITI countries but everywhere. Why? Transparency encourages investment and educates civil society. Public contract disclosure is a ‘virtual waiver’ of confidentiality in joint ventures. Confidentiality in the industry has a long history, but when someone asks why, most cannot come up with a good reason. Douglas is not naïve; contracts are fundamental moving parts of corporate and host-country economics. Disclosure may or may not please, but at least it avoids the suspicion that a deal is dodgy. From 2019, the EITI standard will require disclosure of all contracts published or amended. The situation with regard to old contracts is ‘quite complex’.

Extractives and carbon transition (Chatham House/UN/OECD/YPF/Chevron)

Sian Bradley (Chatham House) chaired a session on the relationship between extractives and the low carbon transition, asking, can transparency help? There have been big changes in energy since EITI started in the mid-2000s. COP 21 has profound implications for fossil fuels. Renewables are now cheaper than fossil fuels in some markets; elsewhere, bans on internal combustion engines are part of an ‘accelerating process’.

The dilemma facing producing countries was well illustrated by Tony Addison (United Nations University), who has been working with the Bank of Mozambique. The country suffers from cyclones exacerbated by climate change. At the same time there is the promise of significant revenues from natural gas. This is ‘a really tough one for the government’. More generally there is worldwide uncertainty in the face of climate change as countries ‘walk the talk’, first on coal, then oil and eventually gas. We could see three quarters of the world’s coal and half of its oil stranded. Gas should perform a bit better as a transition fuel. Companies will ‘shrink themselves’ and return capital to shareholders. Poor countries like Mozambique should ‘strand’ later than rich ones. Of course, this may not happen! Is any of this credible? When? The next decade, or the next couple of decades?

Lahra Liberti (OECD) observed that countries are already reducing methane emissions. There is a World Bank initiative on global gas flaring reduction. Nigeria has pledged to reduce flaring and deploy gas-to-power technology. But there is little incentive for countries and companies to comply. Funding the transition with costly solutions like CCS is challenging. Liberti offered a ‘word of warning’: the latest IEA/OECD report on progress in reform of fossil fuel subsidies suggests that these are now rising again after a period of decline.

Bradley observed that low cost producers will vie to be the ‘last producer standing’. The impact will be felt sooner by others, like Canada’s tar sands.

Miguel Gutierrez (YPF and chair of the G20 working group on the energy transition) pointed out that developing countries need cash for their energy-poor populations. Such countries need to look out for their own people. If they are rich in natural gas or biofuels, ‘let them do that’. 30% of worldwide CO2 emissions come from two countries. ‘What do you want to do? Solve climate change? I leave it to you!’

Chevron’s Stuart Brooks, an EITI board member, said that this was one of the biggest challenges facing an oil company. The argument that oil and gas will elevate poor people to middle class energy consumption levels is ‘no longer sustainable for the world’. But whether such considerations should be core to the EITI’s agenda is moot. EITI has a lot on its plate with human rights and anti-corruption. ‘We have to be careful what we add to the pile, there are many other players in this space. EITI should maximize what we do.’

Another speaker from the floor disagreed: ‘Don’t underestimate the potential of EITI in putting climate risks on the agenda and into disclosure. EITI has become the place for rigorous debate on topics like this. Under the national standard, oils will be disclosing financials, with the opportunity to add in an analysis of the long-term economic risk of climate change.’

