Oil IT Journal: Volume 24 Number 6


McKinsey’s new mandate for Oil & gas CIO

Limited digital transformation success to date means CIOs must up their game, becoming a ‘peer to the CEO’. IT needs to evolve from a support function to a full ‘engineering discipline’. Call for new breed of software engineers ready to acquire domain knowledge.

An ‘Insight’ position paper* from McKinsey advocates a ‘New mandate for the oil and gas chief information officer’. Oil and gas companies have embarked on the digital transformation and technology-enabled use cases have been ‘identified, built, and verified’. However, ‘successes so far have been confined to small pockets’. Such pilot digital transformation efforts can be developed in a sandbox, but full-scale implementation requires ‘enterprise IT’. Use cases involving automation, robotics and AI ‘require vast amounts of data to be exchanged across traditional silos’.

‘Technology must be rapidly transformed across the enterprise to handle the transaction volumes, user expectations, and security requirements of the digital age. To do this, oil and gas companies need a strong CIO with a mandate not only to change the way the business uses technology but also to transform the technology estate itself from disparate systems into scalable platforms’.

Unfortunately, ‘few oil and gas CIOs are set up for success in today’s digital age’. IT is measured on stable operations at the lowest possible cost. The CIO’s mandate revolves around supporting the status quo rather than enabling growth or transformation. The oil price downturn of 2014 intensified the squeeze on IT spending, depriving CIOs of between a third and a half of their capital expenditures. Hamstrung by such cuts, CIOs are ‘doomed to disappoint an increasingly technology-hungry business’.

Oil and gas CIOs are too often ‘stripped of engineering capability’ and instead focus on infrastructure and third party-licensed software (40% of IT spend). IT personnel spend in oil and gas is low at 27% as compared with a cross-industry average of 37%. There is also more outsourcing, with a 50% hike in outsourced application development between 2014 and 2018.

Companies are moreover ‘locked into an unsuitable delivery model’ and the CIO is unable to control end-to-end data flows. Operations data from equipment and sensors is central to many digital initiatives but this data is managed by the asset, without policies for sharing operational data across the enterprise. Under the old model, CIOs procured and maintained bespoke, licensed applications to serve the functional needs of the business. But opportunity in an oil and gas company is largely data-driven, which requires a more holistic approach, connecting data sources to form an information architecture that both enables the digital transformation and provides just-in-time functionality and data for specific use cases.

So, what is the CIO’s new mandate? McKinsey argues that the CIO ‘must be an equal partner in transformation with the business’. The CIO needs to be a visionary change agent, technology architect, expert builder and provider of talent to the whole organization. CIOs need to become ‘deeply involved in driving strategy and building alignment on bold digital transformation themes that go beyond classic technology enablement to process automation, advanced analytics and robotics’. The CIO’s focus needs to shift from process and procurement management to engineering and software development ‘which will make IT look and feel less like a support function and more like an engineering discipline’.

McKinsey advocates a DevOps** approach with platforms that can respond to changing demand. This requires flexible data storage, a tool chain, and processes that enable continuous delivery and integration of new code. Additionally, a ‘new breed’ of software engineers, working side by side with the business, will gradually acquire domain knowledge as they solve pain points from day to day. The new CIO will need enhanced leadership qualities to run the new IT organization and will (somehow) have to ‘act as a peer to the CEO and chief data officer’ in driving a digitally enabled performance transformation of the business.

* A new mandate for the oil and gas CIO by Aman Dhingra, Sverre Fjeldstad, Natalya Katsap and Richard Ward.

** McKinsey states that some industries are further ahead than oil and gas in this respect; the Dutch ING Bank, for instance, invests 20% of its annual IT budget in DevOps. More on ING Bank’s DevOps here.

Comment: McKinsey’s pitch for more CIO engagement is a resounding endorsement of Johan Krebbers’ efforts in OSDU, of which, much more in this issue. The technology underlying ING’s DevOps also echoes the OSDU call for Kubernetes and cloud-based data access. In both cases, the effort is predicated on a brave new world, where ubiquitous data access trumps the benefits of the ‘monolithic’ app, with its own domain-specific data structures and limited data sharing. This begs the question as to how much domain knowledge can realistically be ‘gradually obtained’ by programmers. Whether this is really an issue depends on whether you see the machines taking over the ‘domains’ and ultimately making the behemothic app, and perhaps the knowledge worker, redundant.


MemComputing ‘quantum performance’ in software

Chevron Technology Ventures and US DARPA back HPC boutique’s ‘deep belief networks’.

High performance computing specialist MemComputing has been selected by Chevron Technology Ventures for inclusion in its Catalyst Program. Chevron’s funding will enable MemComputing to explore the applicability of the technology in oil and gas. MemComputing’s MemCPU ‘extreme performance computing’ (XPC) architecture is said to be ‘based on the logic and reasoning functions of the human brain’. The hosted software uses ‘physics principles’ to solve problems in the field of artificial intelligence.

MemComputing claims that its GPU-based software, like a quantum computer, ‘enables processing power to exist in multiple states and perform multiple tasks at the same time’. MemComputing inventor and UC San Diego physicist Max Di Ventra claims the technology offers quantum computing performance today, in software!

Chevron is not the only believer. DARPA, the US Defense Advanced Research Projects Agency, is to chip in to the program with a $500,000 grant spread over 18 months to ‘further develop MemComputing’s technology and its applications to AI’. The DARPA funds will be used to trial the technology on the unsupervised learning, or pre-training, of ‘deep belief networks’, systems of multi-layer neural networks that are used to ‘recognize, generate and classify data’, and to address the goal of the third wave of AI, machines that adapt to external stimuli in real time. More from MemComputing.
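For readers unfamiliar with the jargon, ‘pre-training’ a deep belief network means training a stack of restricted Boltzmann machines one layer at a time, each layer learning from the activations of the one below. The following is a minimal, generic sketch of that classical workflow, using scikit-learn and random stand-in data; it illustrates the technique only and says nothing about how MemComputing actually performs the step.

```python
# Generic sketch of layer-wise 'pre-training' of a deep belief network:
# each restricted Boltzmann machine (RBM) is trained, unsupervised, on the
# activations of the layer below. Random stand-in data throughout; this is
# NOT MemComputing's technology, which targets exactly this training step.
import numpy as np
from sklearn.neural_network import BernoulliRBM

rng = np.random.default_rng(0)
X = (rng.random((1000, 64)) > 0.5).astype(float)   # binary input vectors

layers = [BernoulliRBM(n_components=32, n_iter=10, random_state=0),
          BernoulliRBM(n_components=16, n_iter=10, random_state=0)]

activations = X
for rbm in layers:                 # greedy, layer-by-layer unsupervised training
    rbm.fit(activations)
    activations = rbm.transform(activations)

print("Top-layer representation shape:", activations.shape)   # (1000, 16)
```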


Déjà vu all over again?

Introducing this issue’s (the 250th) copious coverage of the Open subsurface data universe, Oil IT Journal editor Neil McNaughton traces industry’s attempts to solve the interoperability conundrum and its evolving position regarding open source software, concluding with a variation on the old ‘Stone Soup’ fable.

When we started out back in 1996, as Petroleum Data Manager, data management was, on the one hand, the topic du jour. On the other hand, there was a feeling that, mirroring Fukuyama’s ‘end of history’, data management was a ‘done deal’. Depending on your persuasion, either what was then POSC (now Energistics) or PPDM* had fixed, or would shortly fix, all of what were then the pressing issues of the day**. That is to say, the pressing issues of the data world that have remained with us ever since; awkward data formats, idiosyncratic data stores and multiple issues of interoperability. The very issues which OSDU, the Open subsurface data universe, has set out to solve.

This issue extends our coverage of OSDU with a technology update, a report from The Open Group’s Amsterdam meet (with also an update on the other TOG oil and gas initiative, O-PAS, the open process automation standard) and interviews with OSDU champion, Shell’s Johan Krebbers and TOG CEO Steve Nunn.

Will OSDU fix interoperability? To believe so, you have to believe that ‘this time is different’, which is always a big ask! On the plus side, OSDU has momentum. But momentum is not enough. There is a tendency amongst major operators to spin out putative standards work at a certain stage of development in the hope that they will see take-up or sustainable further development. Such in-house work is generally presented as non-core business. Over the years we have seen the following standards spun out (mainly from Shell) into the public domain: ISO TC/184/15926 with its Express data modeling, Business Objects with the spin-out of Open Spirit, revamped ISO 15926 with added ‘Semantic Web’ technology and lately Cfihos and its standardized Excel engineering spreadsheets***. These have all had a mixed reception in the real world for a variety of reasons which Oil IT Journal has covered extensively over the years. I think it is fair to say that they have all fallen short of their original intent, although to be fair, it is too early to pronounce on Cfihos. Today Shell is spinning out its in-house SDU infrastructure into OSDU. Will this time be different?

I really don’t know, but one interesting facet of OSDU is its focus on open source software. Johan Krebbers told us that ‘We can’t have any proprietary stuff on the platform’. Which begs the question, why are so many vendors of proprietary software interested in the initiative? Have they all undergone a damascene conversion to free? On the side of the oils, a similar conversion is required in regard to open source. Open source (apart from Linux in HPC) has had an unenthusiastic reception in oil and gas to date. We have tracked the open source movement in the upstream out of interest and I confess, from something of an activist stance, dating back to the days when it was IBM vs. Unix, or Microsoft vs. the ‘cancer’ of Linux. There was general disdain, and even fake news, with whispered warnings of legal repercussions if you were caught using open source software. Industry events targeting open source, notably chez the EAGE, failed to get much traction.

Today, the GAFAs have changed the nature of the open source game and even Microsoft has ‘embraced’ open source****. What oils need to do now is get into the open source culture. That means a shift away from exclusive in-house work, from closed R&D joint ventures, and into writing and financing a code base for the public good. The activist in me adds that the code should not be ‘open source .NET’, which would make it exclusively Windows. Neither should it be too finely-tuned to a particular vendor’s data universe. This makes for quite a tricky balancing act. But caveats apart, OSDU is the most interesting development in the upstream software world since the heydays of POSC’s Epicentre.

* At the time this was the Public Petroleum Data Management association. In 2008 PPDM changed its name to the Professional Petroleum Data Management association.

** Notably with Project Discovery, which in 1996, set out to merge the Epicentre and PPDM data models.

*** Parallel ‘open’ offerings from the major vendors have also emerged but these, from Landmark’s OpenWorks to Schlumberger’s OpenDES, are too tightly-coupled with their proprietary infrastructures to be considered truly ‘open’.

**** Although this could be interpreted as continuation of its ‘embrace-extend-extinguish’ strategy of the 2000s.

~

My apologies if you have heard this story before. A mysterious traveler arrives in a mediaeval village in the midst of a famine. The traveler promises to feed the starving villagers with a magic stone he has in his knapsack but first he needs a soup cauldron and some boiling water. This duly arrives and he pops his stone into the pot saying, ‘That’s it. The soup is cooking now and will be ready soon. But, you know, it will be nicer if any of you have some vegetables to add in for extra taste’. The headman disappears briefly and comes back with a large parsnip he has been hoarding and pops it in. One by one, the other villagers do likewise, until the cauldron is full of boiling veggies. A few minutes later, the villagers tuck into a delicious broth and thank the traveler for doing his magic.

I’m not sure that I did the stone soup story justice. Wikipedia has a few interesting versions. I like the Hungarian one which has it that, after the meal, the traveler sells the stone to the villagers. Which fits my point better. In the IT world, the stone is the ‘next big thing’. Over the 25 years or so that Oil IT Journal has been publishing, the next big things have included the relational data model, CORBA, business objects, the semantic web, and more recently, cloud computing and ... Kubernetes. The vegetables are your domain knowledge and your data.


Oil IT Journal Interviews – Johan Krebbers, Shell and OSDU and Steve Nunn, CEO The Open Group

Oil IT Journal caught up with Johan Krebbers and Steve Nunn at the 2019 EU meet of The Open Group in Amsterdam. We chatted (separately) to Krebbers about OSDU specifics and to Nunn about The Open Group’s approach to standards.

Interview with Johan Krebbers, Shell and OSDU

Oil IT Journal - We have been tracking various attempts at upstream interoperability for over 20 years. The Open subsurface data universe booklet reads like poetry. OSDU has a very ambitious scope which may not be a bad thing. But how can you solve these issues after 20, 30 years of folks trying to do the same? What’s new now?

Johan Krebbers – The cloud is what’s new, along with our implementations by Microsoft, Amazon and Google, plus a bunch of standards. PPDM has just joined the initiative and we will be using the PPDM definitions (not the data model). In the cloud there will be no RDBMS. The deal is to separate data and applications so that small companies can play in this space.

No Database? How do you persist data?

With cloud-based blob* stores. All apps will get access through the API to a metadata schema database, along with Elastic Search**. Deployable in Amazon, Google or Azure.

Why TOG? Why not Energistics?

Back in 2017 Shell made a strategic decision around its in-house SDU. We decided to consider it as ‘non-competitive’ and that its future would be better served by sharing across industry. Shell needed a legal framework such that we could talk freely and share information with Exxon and others. All the OSDU intellectual property goes to TOG. The contracts around this were signed in only 6 weeks of discussion.

The big news of late was the arrival of Schlumberger – our 800lb gorilla. Doesn’t this mean that OSDU is falling towards/into the Schlumberger camp?

No. As I said, this is all open source IP owned by TOG and, for parts of the open source stack, by the Linux Foundation.

So could I use it?

Yes – just sign up with an Amazon account.

Is there a test data set?

Yes, the current Azure demo includes 5,000 Dutch wells.

Did you consider Volve?

We looked at Volve but the data set was too small.

What’s in the blobs? SEG-Y, LAS?

Not so much. Some of these are at the proprietary layer – such as Petrel data files. Also, we prefer OpenVDS from Bluware to SEG-Y.

And GIS – will it be an open source GIS?

We can’t have any proprietary stuff on the platform.

So, it is open across-the-board?

Yes.

In our last issue we wrote up the work that Shell has been doing with IT4IT. Does this have relevance to OSDU?

No. Even though Shell does use IT4IT. All Shell SDU code has been handed over to OSDU.

And is this a parallel development to O-PAS?

Not really. O-PAS is more standards-based. OSDU is code-based (see our interview with Steve Nunn below).

And if I wanted to run the whole shebang on my workstation? Do you have to have the cloud?

Maybe – you could run it against a MySQL database. Actually, we will have a special solution for small companies – more of a software as a service solution.

And this is all in the public cloud?

Not necessarily. It could be running in your own datacenter or it can be Amazon Outpost or Microsoft Stack.

Is Halliburton out in the cold?

No. Halliburton gave a demo a few weeks ago with their apps running against OSDU.

We chatted with some folks at SPE ATCE and it seems like for some, there is no way they are going to open source their stuff!

Maybe, for an application provider.

What about the pure-play data management software vendors? Will they get burned by this?

Yes, they will lose out as we move to a non-competitive API and data store.

How is OSDU going now?

PPDM just joined and we had a face-to-face in Houston with 220 people and 75 companies. We cannot be seen to be just Schlumberger.

* Binary large objects.

** A word of warning on ‘Elastic Search’ which, according to the New York Times is suffering from unfair competition from Amazon’s homonym ‘Elasticsearch’. More on this issue from Elastic.

Interview with Steve Nunn CEO TOG

Oil IT Journal - What’s the raison d’être of The Open Group (TOG)?

Steve Nunn - Our goal is boundaryless information flow, even though information boundaries will always be there. Our member organizations solve technical and business issues, working with IT vendors and customers in government, financial services and academia. We build standards and certify their adoption in different industries.

I see you are wearing a UNIX badge.

Yes indeed, in 2019 we are celebrating UNIX at 50. There’s some interesting history there. Originally UNIX belonged to AT&T and was defined as a code base. The user community wanted standards here and TOG was chosen to standardize, not the code, but the specification. This was done with some 1,100 specs.

Such as?

Shell script behavior and the like.

We have just reviewed some other TOG output… IT4IT and the DPBoK and were struck by the absence of code – your specs are more of a touchy-feely nature.

Yes, we produce text-based standards, guides and white papers although we also do code-based test suites for conformance. The Apple iPhone for instance runs a version of Unix that we have certified. The same goes for other UNIX vendors. IBM AIX, Oracle Solaris, HPE…

I thought IBM had downplayed AIX in favor of Linux?

Actually, it seems like AIX has been reinvigorated of late. Unix is still very important to banks/financial services.

Where will OSDU be in the code vs touchy feely stakes?

OSDU will be text and code-based, along with a significant open source reference implementation of the OSDU architecture. Schlumberger has contributed code as a starter.

And is there code in O-PAS?

No not at the present. We are currently evolving the standard and certification program. We will also be developing certification programs for individuals.

What exactly is TOGAF, TOG’s architecture framework?

Back in the 1990s, the US Department of Defense had an enterprise architecture called TAFIM (Technical Architecture Framework for Information Management). The DoD wanted to share and develop a common approach to architecture and donated TAFIM (which became TOGAF) in 1995. Today some 100,000 individuals have TOGAF certification and 80% of Fortune 50 companies use it. They like it because it’s free, although consultants and educators need to be licensed. TOGAF is modular and can be used piecemeal.

Where did your oil and gas credentials originate?

In 2010 we were approached by the US Navy to develop FACE, the Future Airborne Capability Environment consortium. Building aircraft meant multiple contractors and designs and there was a push for more reuse and rationalization, so that not everything was designed from scratch. They formed the FACE organization (aka the cockpit of the future) to develop an open systems architecture. FACE includes TOG’s SOSA, the sensor open systems architecture. FACE caught the eye of ExxonMobil which modeled its OPAF/O-PAS, the open process automation standard initiative, on the FACE approach. O-PAS was a direct result of FACE. Exxon had issues similar to the Navy’s with procurement and bringing in suppliers. Exxon went with Lockheed Martin (also behind FACE) and joined up with other oils and process industries.

We just reviewed some TOG output. IT4IT and Shell, and the DPBoK – where we concluded that it was a shame that there was no mention of either FUD or FOMO*!

(Laughs) Yes – there was a lot of push back in O-PAS from folks who had ‘tried all that before’ but in the end it was the customer pull that was the key along with customers mandating certified products. Only then do folks see the benefits! Bringing other industries on board gives critical mass.

We sat in on the ‘process’ track this afternoon. One talk (from Audi) was all about virtualization and did not mention O-PAS. OSDU is all about the cloud. What does the enterprise architecture have to say about these game changers?

You can’t be an architect if you don’t know what all this means although you don’t have to have an in-depth knowledge of what the cloud means. There is a gap in the standards landscape for the cloud but not, as yet, a demand from customers. Meanwhile the vendors are feeding the flames and folks are trotting out old stuff and calling it new!

* Fear, uncertainty and doubt. Fear of missing out. Two old IT tenets!


2019 Open Group ‘Agile Architecture’ event, Amsterdam

Shell presents the Open subsurface data universe to The Open Group membership. ExxonMobil provides an update on O-PAS, the Open process automation standard.

Shell on the OSDU

Johan Krebbers (Shell) presented OSDU, the Open subsurface data universe, at the 2019 EU meeting of The Open Group. OSDU is to ‘put data at the center of the oil and gas industry’. Today, data is closely linked to applications in proprietary file formats. Data quality and lineage and other metadata are stored elsewhere, away from the data. The OSDU target is a single data platform based on Kubernetes, separating applications from data with access via APIs. This will be good for startups, third parties, in-house developers and academia and will represent a large new market based on open source standards. OSDU will be managed by TOG such that all parties have equal access. All APIs and reference implementations are to be made available to all, members and non-members alike. Members can join subcommittees and drive changes. OSDU will be released with an Apache 2.0 license. The Schlumberger (OpenDES*) data platform is now open source. This, and other ‘data type-specific’ APIs, will be merged into OSDU such that ‘no single company is in control’. The common code base will run on multiple hosting providers: Amazon, Google, Microsoft and IBM. OSDU has subcommittees working on enterprise architecture, data definitions (with PPDM), information security and the business model. The demo release (R1) came out on AWS with a well data focus. R2 (Q1 2020) will see the addition of seismic data with the incorporation of Bluware’s Open VDS. In Q2 2020, R3 will see ‘full integration’ with Schlumberger Delfi. OSDU will be a ‘game changer’ for the industry. The platform approach and public APIs will ‘introduce competition to the market’. ‘Data cannot be solved by one company – it’s too complex’. Read Krebbers’ presentation here.

* See also the Schlumberger statement viz, ‘In contribution to The Open Group Open Subsurface Data Universe (OSDU) Forum, we have open sourced our data ecosystem, which is based on core components of the DELFI environment, to accelerate the delivery of the OSDU Data Platform. We are committed to supporting the joint development of the OSDU Data Platform’.

ExxonMobil on O-PAS

Bradley Houk (ExxonMobil) presented O-PAS, the open process automation standard initiative*. First announced in 2016, O-PAS is now operating a test bed collaboration to accelerate the initiative. Houk advised interested parties to check the slides from The Open Group 2019 Denver event for background information**. Data is ‘foundational and key to process improvement’, hence the need for open access to data enabling distribution to the cloud or back to the edge, as needed. ExxonMobil’s vision for the process industry is for a standard interface, such that edge hardware is interchangeable, and for open software access. Today, ExxonMobil ‘spends too much developing for different platforms’. ‘It’s a shame when the chips you are writing code for are older than you are!’ ‘It is hard to get people to work on 20 year-old control systems.’ Enter the OPA reference architecture that will ‘get computing out on the edge and into the DCN’. All of which will operate through the O-PAS connectivity framework***. Unfortunately, connecting such a layer to the cloud is currently very costly compared with a single sensor-to-cloud link. ‘There is no reason for this!’. O-PAS can also be envisaged as a ‘virtual DCN’. Yokogawa is the system integrator, but the initiative is ‘agnostic’ with multiple suppliers involved. Over the next year or so the system will be tested chez ExxonMobil before being joined by other partners.

* See our previous reporting on O-PAS which was previously OPAF.

** See here and here where there are several presentations but it’s hard to see a canonical backgrounder!

*** The O-PAS standard, V 1.0 Part 4, Connectivity Framework (OCF) was accepted by The Open Group in September 2019. However, despite the ‘open software’ pretense, O-PAS information is currently members-only.


Open subsurface data universe R2.0

Latest OSDU release adds seismic data support with Bluware OpenVDS and Schlumberger Petrel ZGY. Data access forks into ‘domain’ and ‘generic’ API calls.

In its December 2019 newsletter the Open Subsurface Data Universe explains the evolving architecture of the Shell-backed initiative that is set to transform upstream data management, access and software development. OSDU Release 2 adds seismic data to the well data capability of R1. R2 is also a ‘brand new code base’, based on the Schlumberger OpenDES open data ecosystem that provides cloud-independence across the Microsoft, Amazon and Google clouds. The seismic capability comes from the incorporation of Bluware’s OpenVDS* code along with support for Schlumberger Petrel’s ZGY compressed seismic format. The common code base will be the ‘foundation of all new development and will maintain functionality across all cloud platforms’. Release 3, scheduled for early 2020, will be a deployment-ready system with added services, schemas and optimized storage of logs and seismic data, accessible on-demand via an API. R3 will allow non-developers to ‘deploy an instance’.

On the subject of APIs, OSDU now distinguishes between ‘generic’ and ‘domain’ APIs. The latter optimize data access for applications with ‘semantics’ that are not available in the generic API. The ‘generic’ APIs are to stay, providing data-neutral support for accessing metadata and content that are independent of the underlying structure for ‘data-oriented’ users. The next OSDU ‘face to face’ meetings are scheduled for March 23-24 (Amsterdam) and April 20-23 (Houston). Track OSDU development on the TOG OSDU Wiki.
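The distinction is easiest to see in miniature. The sketch below is entirely hypothetical, with invented record shapes and function names rather than real OSDU interfaces: a ‘generic’ call queries metadata regardless of data type, while a ‘domain’ call knows about well-log semantics such as curves and depth ranges.

```python
# Hypothetical, self-contained illustration of the 'generic' vs 'domain' API
# split. Record shapes and function names are invented for the sketch and are
# not the actual OSDU interfaces.
records = [
    {"id": "well-001", "kind": "well", "metadata": {"country": "NL"},
     "data": {"depths": [1000, 1001, 1002],
              "logs": {"GR": [85, 90, 110], "RHOB": [2.3, 2.4, 2.5]}}},
    {"id": "seis-001", "kind": "seismic", "metadata": {"country": "NL"},
     "data": {"format": "OpenVDS"}},
]

def generic_search(kind=None, **meta):
    """Generic API: data-neutral query over metadata, whatever the data type."""
    return [r["id"] for r in records
            if (kind is None or r["kind"] == kind)
            and all(r["metadata"].get(k) == v for k, v in meta.items())]

def get_log_curve(record_id, curve, min_depth, max_depth):
    """Domain API: understands well-log semantics (curves, depth ranges)."""
    rec = next(r for r in records if r["id"] == record_id and r["kind"] == "well")
    return [(d, v) for d, v in zip(rec["data"]["depths"], rec["data"]["logs"][curve])
            if min_depth <= d <= max_depth]

print(generic_search(country="NL"))                  # -> ['well-001', 'seis-001']
print(get_log_curve("well-001", "GR", 1000, 1001))   # -> [(1000, 85), (1001, 90)]
```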

* Bluware OpenVDS is an open source edition of Bluware’s commercial VDS software. The open source version exposes the Bluware APIs and lossless seismic data compression format. The full commercial version adds GPU functionality, write with compression and support.

Comment: One of OSDU’s overarching principles is for the decomposition of data into microservices to facilitate a shift from the monolithic applications of the past to lightweight apps. It is a little ironic then that one of the first deliverables (ZGY support) targets Petrel, a behemoth amongst monoliths.


2019 Houston Oil and Gas (seismic) Machine Learning Symposium

Advertas*-hosted event hears from: IGI on AI in geophysics and the SEG/SEAM AI project. Geophysical Insights – Will ML replace the interpreter? Chevron – ML in reservoir engineering. ConocoPhillips – ML in 4D seismic and ‘direct quantitative prediction’. AASPI Consortium on ML in attribute classification. Geophysical Insights - making deep learning accessible to the interpreter. Dell on digital natives vs. the ‘see the beach’ generation.

IGI AI in geophysics and upcoming SEG/SEAM AI project

Former SEG president Nancy House (now with IGI-LLC) provided a history of seismic interpretation leading to the revolution in exploration geophysics that unconventional exploration has produced. Today the watchword is 3D quantitative interpretation, allowing finely detailed rock properties to be mapped from seismics, including key indicators such as total organic content. Currently the industry is at a crossroads with conventional physics-based methods being supplemented by AI techniques. In one example, a 3D seismic attribute volume was generated in a pre-stack inversion tied to geochemical well data. A hydrocarbon indicator was obtained using an artificial neural network. The interpreter may be confronted with a very large number of seismic attributes (90+). Manual interpretation with multivariate regression analysis can usefully be replaced with a neural net/machine learning approach. House announced an upcoming AI co-operative project, ‘testing AI applications in petroleum geophysics’ that will run under the auspices of the SEG SEAM. Objectives are the evaluation of AI-derived results for accuracy and efficiency. The SEAM website does not yet reference this work but interested parties should email: seam@seg.org. There will also be an SEG-sponsored workshop, ‘Artificially intelligent earth exploration: teaching the machine how to characterize the subsurface’ in Muscat, Oman, 19-21 April 2020.

Geophysical Insights – Will ML replace the interpreter?

Rocky Roden (Geophysical Insights) asked rhetorically, ‘Will machine learning profoundly change geoscience interpretation?’ AI/ML is disruptive technology, but it will not replace geoscience interpreters. However, geoscience interpreters who do not use machine learning will be replaced by those who do. ML can improve interpretation workflows by providing the desired answers faster and more accurately. Already, ML applications provide higher resolution than conventional methods for inversion, fault delineation and facies. Semi-supervised and reinforcement learning methods ‘hold great promise to reveal previously unseen phenomena’.

Chevron on ML in reservoir engineering

Sarath Ketineni (Chevron) presented on machine learning applications in reservoir engineering with reference to AI-based reservoir characterization (deriving synthetic well logs from seismics using NN) and pay zone identification. This used a big data approach involving feeding well, seismics, completions data and more into a neural net to forecast future production. Both techniques can be combined to steer a well path through major sweet spots. For more on this see Ketineni’s paper SPE-174871-MS ‘Structuring an Integrative Approach for Field Development Planning Using Artificial Intelligence and its Application to an Offshore Oilfield’. Another use has been in unconventional reserves forecasting in the Eagle Ford shale. Conventional methods of predicting oil and gas recovery fail on unconventional reservoirs. Random Forest regression helps identify the most important parameters (25 variables from 4,000 wells were used in the analysis). Again, the results are available in SPE 196158-MS, ‘A Machine Learning Analysis Based on Big Data for Eagle Ford Shale Formation’. A third example is SPE 196089-MS, ‘The Importance of Integrating Subsurface Disciplines with Machine Learning’, a case study from the Spirit River formation (John Hirschmiller et al., GLJ Petroleum Consultants). Ketineni concluded that ML applications in oil and gas are growing rapidly and that petroleum engineers need a better understanding of data science fundamentals, their applicability and limitations. Useful resources include the SPE’s Data Science and Digital Engineering Journal, Coursera’s ML courses and the ubiquitous TensorFlow.
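As a flavor of the Random Forest screening described above, the following minimal scikit-learn sketch ranks input variables by importance on synthetic stand-in data. The column names and the synthetic relationship are invented for illustration; they are not the Eagle Ford variables used in SPE 196158-MS.

```python
# Minimal sketch of Random Forest-based parameter screening of the kind
# described above. Data and column names are synthetic stand-ins, not the
# Eagle Ford dataset of SPE 196158-MS.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_wells = 4000
X = pd.DataFrame({
    "lateral_length_ft":  rng.uniform(4000, 12000, n_wells),
    "proppant_lb_per_ft": rng.uniform(500, 3000, n_wells),
    "stage_count":        rng.integers(10, 60, n_wells),
    "tvd_ft":             rng.uniform(7000, 12000, n_wells),
    "gor_scf_bbl":        rng.uniform(500, 5000, n_wells),
})
# Synthetic recovery driven mainly by lateral length and proppant loading, plus noise.
y = (0.02 * X["lateral_length_ft"] + 0.05 * X["proppant_lb_per_ft"]
     + rng.normal(0, 30, n_wells))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("R2 on held-out wells:", round(model.score(X_test, y_test), 3))
for name, imp in sorted(zip(X.columns, model.feature_importances_), key=lambda t: -t[1]):
    print(f"{name:20s} {imp:.3f}")   # importance ranking flags the key parameters
```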

ConocoPhillips – ML in 4D seismic and ‘direct quantitative prediction’

Mike Brhlik (ConocoPhillips) reported on trials of ML in 4D time-lapse seismic interpretation for reservoir monitoring. Here the aim is to follow reservoir property changes directly using multiple attributes from successive seismic surveys. This is currently done with rock physics modeling and simulation in ‘semi-quantitative’ linear workflows. Brhlik proposes a new ‘direct quantitative prediction’ approach using a data-driven workflow that also embraces the physics. The model was trained on forward-modeled synthetic seismic data to assess various predictors (elastic/seismic attributes) and targets (pressures, saturations) at wells. The 4D synthetic project established that the time-lapse seismic inverse problem is solvable by ML regression algorithms. The Random Forest approach gives the best results along with gradient boosted trees. Neural Nets were not as successful. The approach has reduced interpretation cycle time and provides a ‘common ground’ for revisiting 4D model updating workflows that facilitates interdisciplinary integration.
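The kind of regressor shoot-out Brhlik describes (Random Forest vs. gradient boosted trees vs. a neural net) can be set up in a few lines. The sketch below uses random synthetic ‘attributes’ purely to show the workflow shape; it does not reproduce ConocoPhillips’ data or results.

```python
# Sketch of a regressor comparison for 'direct quantitative prediction':
# predicting a reservoir property change from time-lapse seismic attributes.
# The attributes and target are random synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 2000
attrs = rng.normal(size=(n, 6))     # stand-ins for e.g. dAI, dVp/Vs, dAmplitude...
d_sw = 0.3 * attrs[:, 0] - 0.2 * attrs[:, 1] + rng.normal(0, 0.05, n)  # saturation change

models = {
    "random_forest":          RandomForestRegressor(n_estimators=200, random_state=1),
    "gradient_boosted_trees": GradientBoostingRegressor(random_state=1),
    "neural_net":             MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=1),
}
for name, model in models.items():
    score = cross_val_score(model, attrs, d_sw, cv=5, scoring="r2").mean()
    print(f"{name:24s} mean CV R2 = {score:.3f}")
```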

AASPI/University of Oklahoma – ML and seismic facies classification

Kurt Marfurt (AASPI Consortium/University of Oklahoma) gave a wide-ranging presentation on finding the best attribute combinations for seismic facies classification. Marfurt has worked with a large Gulf of Mexico seismic dataset, using multi-attribute/ML classification to distinguish salt, mass transport flow and sediment accumulation. He concludes that human interpreters working with seismic attributes are good at identifying 2D spatial patterns, but that human interactive analysis is limited to about 3 volumes using different color models, transparency, and/or animation. In contrast, machine learning can analyze dozens of attributes at the same time. Moreover, ‘for normal amounts of training data using modern computers, an exhaustive search for the optimum number and combination of attributes is both desirable and feasible’. Having said that, Marfurt cautioned ‘We’ve presented three workflows for attribute selection; we do not yet know which is best for a given mapping task’.

Geophysical Insights - making deep learning accessible to the interpreter

Dustin Dewett (Geophysical Insights) wants to make deep learning accessible to the seismic interpreter and to identify facies from complex seismic waveforms. However, the most commonly used deep learning technique is the convolutional neural network, which requires large amounts of training data that may not be available. Dewett’s approach is to tune the model so that it can be trained with less data and to automate the generation of training data. Using data from the Taranaki Basin, offshore New Zealand, 31 training lines were manually labeled as a training set. This represented a 1GB stacked volume of some 500 lines with 100 CDPs/line. Dewett showed that a CNN model with very few training lines still provides a useful result. CNN fault detection can be further accelerated by training on large-scale synthetic data. Directly applying a pretrained fault detection network is extremely efficient.
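For readers wanting to see what ‘a CNN trained on a few labeled lines’ looks like in code, here is a minimal Keras sketch of a patch-based facies classifier. Shapes, layer sizes and the random ‘seismic’ patches are placeholders; this is not Geophysical Insights’ model.

```python
# Minimal sketch of a small CNN patch classifier of the kind used for seismic
# facies or fault labeling. Random placeholder data; not Dewett's actual model.
import numpy as np
import tensorflow as tf

n_classes = 5                                                 # e.g. facies classes
patches = np.random.randn(512, 64, 64, 1).astype("float32")   # amplitude patches
labels = np.random.randint(0, n_classes, 512)                 # interpreter labels

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu", padding="same"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu", padding="same"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# With only a handful of labeled lines, augmentation and/or synthetic training
# data (as suggested above) would normally supplement a training set this small.
model.fit(patches, labels, epochs=3, batch_size=32, validation_split=0.2)
```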

Dell and the digital natives

David Holmes (Dell) gave a reprise of the Agile hackathons along with another example of a hackathon event. Here an Agile/Enthought/Fugro team working on 200GB of high-res, multi-polarization thin section imagery trained a model to segregate and classify grain mineralogy. Tools of the trade included scikit-learn, Jupyter, NumPy and AWS hosting. Holmes presented the brave new world of AI/ML with what could be considered an ageist contrast between Gen Z ‘digital natives’ and Gen ‘See The Beach’, the latter characterized by a rather tired-looking group of EAGE luminaries! Agile’s Open subsurface stack got a mention as did the more recent OSDU, a ‘unique and unprecedented’ collaboration. Holmes concluded with his favorite theme of the ‘citizen data scientist’, part hacker, part statistician and part geoscientist. Whichever camp you are in, you need provenance tools and governance; for the data that is consumed, for the models that are produced and the algorithms that are generated. Watch the Dell EMC oil and gas video here.

Registration for the 2020 Oil and Gas Machine Learning Symposium is available at UpstreamML.com.

* Houston-based Advertas is a marketing and PR firm serving clients in energy and technology. In 2009 the company was retained by Geophysical Insights as its outsourced marketing and business development partner.


Future geoscience jobs

French Geological Society hosts AI in geoscience event. Total on AI’s impact on geoscience jobs. Earth Analytics applies AI to NPD dataset. Seisnetics’ ML for seismic processing.

Total – AI/NLP in geoscience, drilling

A recent French Geological Society gathering looked into the future of geoscience jobs in the light of the rise of machine learning and artificial intelligence. Total’s Yves Le Stunff presented on progress in AI and digital technologies in geoscience. Le Stunff described the two contrasting scientific approaches: on the one hand, physical theory and numerical modeling; on the other, data-driven discovery and machine learning (ML). ML is not new; Total has been using it for twenty years, with a lot of work in the 1990s on classification. Today, computers are more powerful and there is a lot more data. The work done in the 1990s is now being revisited across the upstream, in support of technology watch, through E&P to production. Total is currently investigating the application of AI in kick prediction, stuck pipe and predictive maintenance. In the geoscience area, AI is being investigated for automatic classification of micropaleontology in thin sections and for satellite image tracking of seeps and spills. In the field of unconventionals, the physics is not very well understood, but there is a lot of data, and AI is being tried to predict production decline. Total has initiated the Gaia program (with Google) to test AI in geoscience.

What will be the likely impact of all this? Will AI prove a game changer in oil and gas? Today Total has some 600 geoscience interpreters – so these technologies ‘could have a significant impact’. Today, geoscientists spend time looking for and manipulating information. The knowledge work is often constrained by a deadline. Gaia’s objective is to give time back to the geoscientist with a ‘virtual assistant’ (like Apple’s Siri) that screens data and speeds-up interpretation by proposing a solution to the user.

But the ML part is not as hard to realize as the data collection and ‘productizing’, which requires a lot of IT know-how. The Gaia results will be delivered as Sismage* plug-ins. Le Stunff cautioned that ML is not magic and doesn’t work all the time. If data is clean, results can be remarkable, saving up to 80% of the time in fault mapping. If the data is dirty, things are not so great! It may take as much time to clean the data as it does to perform the ML work. Overall, average fault pick productivity is probably around 30%. For stratigraphic interpretation (horizons and geobodies) the results are interesting but not usable without rework. Interpreter-assisted interpretation is better, but again, noise levels are critical and results are poor in areas of complex geology.

Another line of research is the ‘semantic stream’, i.e. natural language processing, where Google excels. This is intended to speed document and information retrieval from reports, from Wikipedia and other sources. Total is developing a web app that goes beyond the search engine, enabling object detection in documents with named entity recognition (wells, fields) and classification of maps, cross plots and seismics. This is all at the PoC stage currently and the outcome is uncertain. It is a harder problem than seismics. Some components already work, like image classification with Google AutoML, which can recognize, for example, a geological map. Extracting tables from a PDF document is ‘emerging’. Total is building a knowledge graph to support better natural language query. Initially, the expectation was that algorithm development would take time. In fact, Total spent more time thinking about what to do with ML, defining KPIs and building infrastructure. Integration is much more time consuming than the optimization. ML is ‘never 100%’ and some projects never stop.

* Sismage is Total’s in-house developed seismic interpretation environment.

Earth Analytics removes human interpretational bias

Steve Purves gave a compelling presentation on the technology that Norwegian Earth Analytics has developed to apply AI across the upstream workflow. Studies using Norwegian Petroleum Directorate (NPD) data have revealed ‘bias and errors’ in human-driven processes. More sophisticated ways of analyzing geoscience data are needed, using computer vision, NLP and graph databases. There is a lot of data out there, but it is not clean or convenient and is a struggle to use. The AI/ML application landscape is balkanized, with pockets of individual apps. EarthAnalytics provides a stack comprising a structured, ML-ready database (EarthBank) and EarthNet, a suite of AI-based interpretation tools. The Norwegian EarthBank is a cleansed edition of the national Diskos dataset. EA is currently working to produce an EarthBank for the UK from OGA data. The combo allows a petrophysicist to analyze hundreds of wells in an afternoon or a seismic interpreter to perform rock physics-based seismic inversion in hours instead of months. The technology was developed with a combination of manual labelling on 3D seismic and (increasingly) synthetic data from models. EA sees potential for data-driven pipelines to capture the accuracy and uncertainty of models from hundreds of realizations. ‘A new wave of ML is already disrupting subsurface workflows. With increasing automation, data availability and better data management, ML adoption will continue to accelerate’.

Seisnetics leverages database of 100s of millions of seismic waveforms

Cynthia Gomez presented Seisnetics’ ML for seismic processing and interpretation. Seisnetics uses a database of ‘100s of millions of seismic waveforms’ obtained from unsupervised ML to speed up and augment seismic interpretation and processing. The technology applies learnings ‘from the human genome project’, with the seismic waveform replacing the chromosome along with ‘natural selection and survival of the fittest’. Waveforms are grouped into a ‘geopopulation’, of genetically and spatially related types. The system is 100% data-driven, learning from the data. A test on the SEG SEAM dataset correctly extracted the SEG logo! Sub seismic resolution is claimed.

AgileDD

Meeting Chair Henri Blondelle also presented AgileDD’s work on extracting meaningful information from legacy documents and reports with Yolo which we cover in our report from ECIM also this issue.


2019 Pipeline Open Data Standard Fall Conference

New Century on PODS deployment. Kinder-Morgan on PODS business rules in 7.0. PHMSA on the national pipeline mapping system and new gas pipeline rules.

New Century on PODS deployment

Attendees to the 2019 Fall Conference of the pipeline open data standards association (PODS) learned from Kirk Cameron (New Century Software) how to set up PODS 7.0 in ‘physical’ configurations on Oracle, SQL*, and on APR, the Esri ArcGIS for pipeline data model.

* Including Microsoft SQL Server, PostGIS, SQLite.

Kinder-Morgan on PODS business rules in 7.0

Buddy Nagel’s (Kinder-Morgan) presentation delved into the new business rule functionality of the ‘next generation’ model. Business rules are pseudo-code definitions of how a row, or attribute within a row, of data is validated before data is written. A PODS business rule project sets out to develop a ‘semi-formal methodology and tool-kit’ for writing the requirements and specification of data validation rules. Rules are written in PBRDL (PODS business rule definition language) and stored in a new ‘Business Rule’ table. As an example, consider the following constraint: ‘If HISTORY.TO_DATE is populated, then HISTORY.FROM_DATE should be populated and should be earlier than HISTORY.TO_DATE’. As required, a rule might ‘fire’ and return a list of records that do not validate.
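By way of illustration, the HISTORY date rule quoted above might compile into a check along the following lines. This is a minimal Python sketch with an invented row format; PODS does not prescribe this implementation and PBRDL itself is not shown.

```python
# Sketch of what the HISTORY date rule quoted above could compile to once a
# PBRDL definition is turned into executable validation. Row format and helper
# are illustrative only.
from datetime import date
from typing import Optional

def history_dates_valid(from_date: Optional[date], to_date: Optional[date]) -> bool:
    """If TO_DATE is populated, FROM_DATE must be populated and earlier."""
    if to_date is None:
        return True                      # rule only fires when TO_DATE is set
    return from_date is not None and from_date < to_date

# 'Firing' the rule returns the list of records that do not validate.
history_rows = [
    {"id": 1, "FROM_DATE": date(2018, 1, 1), "TO_DATE": date(2019, 6, 30)},
    {"id": 2, "FROM_DATE": None,             "TO_DATE": date(2019, 6, 30)},  # invalid
    {"id": 3, "FROM_DATE": date(2020, 1, 1), "TO_DATE": date(2019, 6, 30)},  # invalid
]
violations = [r["id"] for r in history_rows
              if not history_dates_valid(r["FROM_DATE"], r["TO_DATE"])]
print("Records failing the rule:", violations)   # -> [2, 3]
```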

PHMSA on the national pipeline mapping system and new gas pipeline rules

Chris McLaren from the US Department of Transportation’s PHMSA office of pipeline safety described current concerns and challenges in assuring the safety of the nation’s pipeline network. Central to this activity is the NPMS, the national pipeline mapping system that displays GIS data from over 1500 different pipeline operating companies. Operators are required to submit geospatial data for gas transmission and hazardous liquid pipelines and LNG plants annually; about 98% submit data in Esri formats. The NPMS now holds almost 1 million pipeline records, covering over 520,000 miles. PHMSA also maps pipeline history, accidents and incidents (to track asset history), inspection boundaries, and high consequence areas. The NPMS was developed in 2009 using a ‘heavily modified’ PODS schema. A new LRS format for data submission is under consideration as part of the proposed NPMS information collection (Docket PHMSA-2014-0092). Leigha Gooding (PHMSA) then presented on the new rules covering the safety of gas transmission and gathering lines. More from the Federal Register. To ensure risk mitigation, safety KPIs are measured and trended. Performance monitoring data is reported on public websites including the PHMSA data and statistics overview, the national pipeline performance measures and the PHMSA technical resources website.

These and other PODS presentations are available on the conference website.

Software, hardware short takes …

CoreLogic lightning risk map, Petex MOVE 2019.1, Actenum DSO V 6.6 for upstream, Agilent’s 990 Micro Gas chromatograph, Brüel & Kjær Vibro DT-12x vibration monitors, Barco Canvas VR, Petrosys’ dbMap/Web 2019.4, EasyCopy EasyCore 2.0, Elysium quality framework for Asfalis, Endress+Hauser’s ATEX-rated iTEMP TMT71/72, Integrated Informatics’ Marco Commander, OspreyData’s Unified Monitor, Thermo Scientific PerGeos 2019.3, myQuorum DynamicDocs, New Excel freeware from Ryder Scott, Wood Mackenzie’s Lens, Yokogawa’s dynamic real time optimizer for OpreX. Siemens Simcenter Amesim.

CoreLogic has announced a lightning risk map for the oil and gas industry. The LRM uses lightning data from Vaisala to visualize geospatial trends in lightning strikes and help reduce downtime and expense from strikes. More from CoreLogic.

Petex, which acquired Midland Valley Exploration in 2017, has released MOVE 2019.1 with new model building and display functionality. A new API enables communication with other Petex and 3rd party applications via Resolve or the Petex OpenServer. Read the release notes here.

V 6.6 of Actenum DSO for upstream, an ‘AI-driven’ scheduling platform for, inter alia, drilling, completions and workovers, provides enhanced collaboration and a streamlined scheduler’s workflow. Integration with ARIES aligns production forecasts with Actenum production profiles; data areas and well IDs can now be imported from ARIES. More from Actenum.

Agilent’s 990 Micro GC portable gas chromatograph monitors calorific value and odorant level in natural gas. The ‘smart-connected’ digital unit delivers laboratory-quality data for natural gas distribution networks.

Spectris unit Brüel & Kjær Vibro has announced the DT-12x series of loop-powered displacement transmitters. The units provide an easy-to-deploy solution for shaft displacement and vibration monitoring for a wide range of industrial machines. The transmitter can be directly connected to a DCS or PLC. More from B&K Vibro.

Barco has announced Canvas, a virtual reality visualization platform for construction planning meeting rooms. Canvas provides 3D images coupled to business objectives to decision-makers and experts across architecture, engineering and construction.

The 2019.4 release of Petrosys’ dbMap/Web adds new risk model guidance, a series of configurable questions that lead to risk ‘sub-factors’ that can be combined into a final risk value. The new release also adds tornado plots and enhancements to well failure analysis. More from the release notes.

Release 2.0 of EasyCopy’s EasyCore core description package adds new visualizations, data management and a ‘magnifier view’, a free-hand tool for making sketches in the field or lab. Download and try EasyCore here.

Elysium is to embed QIF, the quality information framework, in its flagship Asfalis platform. QIF is an XML architecture that manages quality information across manufacturing systems. QIF was created by the Digital metrology standards consortium and has been accredited by the US ANSI standards organization. Users can now translate CAD geometry and product manufacturing information with QIF and leverage high-quality data in QIF-supported metrology hardware and software. More from Elysium.

Endress+Hauser’s new ATEX-rated iTEMP TMT71/72 Bluetooth wireless transmitters can be accessed from E+H’s SmartBlue app. The transmitters are compatible with signals from resistance or voltage sensors and thermocouples and offer monitoring functions, device troubleshooting and diagnostics in accordance with the NAMUR NE 107 recommendations for self-monitoring devices.

Integrated Informatics’ Marco Commander for ESRI ArcGIS Pro V6.0 brings high performance and parallel processing to geospatial data inventorying, identification and management. The Commander is a component of the Marco Studio suite that features spatial data discovery, file modification and a repoint/repath capability. Commander crawls enterprise GIS assets and captures key GIS metadata to the Marco Database, a collection of some 30 tables deployable in Microsoft SQL Server, Oracle or SQLite. Integrated Informatics has also announced Geodetics Toolkit for Pro for oil and gas and natural resources GIS professionals. The toolkit provides over 70 tools to read and load well survey and seismic navigation information into a Geodatabase.

OspreyData has released ‘Unified Monitor’, a component of its production optimization platform, as a stand-alone offering to enable consolidated views of scada and artificial lift feeds from producing oil and gas wells. The UM accelerates time to deployment of a digital oilfield with centralized visualization of production, regardless of scada source, lift type or pump manufacturer. The solution can be deployed stand-alone or integrated into OspreyData’s Advanced Analytics, a complete AI platform for production optimization. More in the Jetta Operating case history.

The 2019.3 release of PerGeos, Thermo Scientific’s digital rock analysis software includes new artificial intelligence-based processing tools. A new deep learning environment supports the combined use of AI with traditional algorithms. A new online portal, the Xtra library, provides Python add-ons (recipes, scripts, demos) for domain-specific workflows. More from the release notes.

Quorum Software has announced myQuorum DynamicDocs, a cloud-based document management system for oil and gas. DynamicDocs supports search for land, accounting and well files using flexible parameters based on document attributes, tags and phrases. The software is claimed to drive operational efficiency and streamline regulatory compliance. Watch the testimonial from client Camino Natural Resources. Quorum also recently released OGsys On Demand, a cloud-based edition of its oil and gas accounting software targeting small to medium-sized businesses.

Ryder Scott has posted new freeware Excel add-ins to its Reservoir Solutions software portal. The Lognormal Probability Tool performs a ‘probit plot’ of the potential of resource plays using the methodology outlined in SPEE Monograph 3, ‘Guidelines for the Practical Evaluation of Undeveloped Reserves in Resource Plays (2010)’. The Exponential Calculator helps compute parameters used in exponential decline production forecasting. Another new freeware tool from Ryder Scott is the Well Collator, a web-based application that builds a pad-branch-stem hierarchy for a well cluster from a file of surface and bottom-hole coordinates. More freeware from the Ryder Scott Software minisite.
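The arithmetic behind an exponential decline calculator is standard Arps decline-curve analysis: q(t) = qi*exp(-D*t), with cumulative production Np = (qi - q)/D. The minimal sketch below illustrates that arithmetic only; it is not Ryder Scott’s add-in.

```python
# Standard Arps exponential decline arithmetic; a sketch, not Ryder Scott's tool.
import math

def exp_decline_rate(q_initial: float, q_final: float, t_years: float) -> float:
    """Nominal annual decline D from rates at the start and end of t_years:
    q(t) = qi * exp(-D t)  =>  D = ln(qi / q) / t."""
    return math.log(q_initial / q_final) / t_years

def cumulative_production(q_initial: float, q_final: float, decline: float) -> float:
    """Cumulative volume between two rates under exponential decline:
    Np = (qi - q) / D (annualized rates give volume in consistent units)."""
    return (q_initial - q_final) / decline

qi, q = 1000.0, 600.0                        # bbl/day at start and after two years
D = exp_decline_rate(qi, q, 2.0)             # nominal decline, per year
Np = cumulative_production(qi * 365, q * 365, D)   # annualized rates -> barrels
print(f"D = {D:.3f}/yr, cumulative over 2 years = {Np:,.0f} bbl")
```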

Wood Mackenzie’s new Lens is a global upstream property valuation solution that embeds Woodmac’s global data sets and models in a cloud-based solution for opportunity screening, valuation and visualization. Watch the video here.

Yokogawa has released the Dynamic real time optimizer, a solution in the OpreX asset operations and optimization family. The DRTO uses a combination of first principles simulation and multivariable predictive control technology to optimize plant operations. Under the optimizer hood is technology from Yokogawa’s KBC Advanced Technologies unit.

Siemens Digital Industries Software is expanding its cloud-hosted portfolio to include Simcenter Amesim (simulation software for modeling and analysis of multidomain systems) and Simcenter 3D (an environment for 3D computer-aided engineering, CAE). A monthly subscription offers small and medium-sized enterprises a modular function library and a pay-per-use computer infrastructure.


2019 Standards Leadership Council

Houston get-together hears restatement of SLC aims, pitch for new mapping work groups. Cross fertilization of Energistics UoM work into PPDM. Energistics OPC-UA information models for drilling and producing.

Introduction

The recent Houston meeting of the [oil and gas] Standards leadership council (SLC), chaired by OPC Foundation CEO Tom Burke, was an opportunity to catch up on SLC progress since its last public meetings, almost five years ago. The SLC was founded in 2012 to ‘enhance collaboration on standards for the benefit of industry’. The SLC pitched a suite of eight work groups that set out to map across the different standards. The WITSML to PPDM Mapping was presented as work in progress back in 2014.

The 2019 SLC meet was more of a restatement of the SLC’s initial intent and an opportunity for the participating orgs to set out their stalls. There was little progress to report on what Oil IT Journal described as ‘the herculean task of tying together so many radically different standards’. Trudy Curtis (PPDM) mentioned the WITSML to PPDM mapping and also PPDM’s leverage of Energistics units of measure work as a ‘wonderful example’ of SLC collaboration.

Pipeline open Data standard association (PODS)

Pete Veenstra (PODS), the ‘new kids on the SLC block’*, described the new PODS data exchange format and PODS 7.0, the latest version of its pipeline data model. The various flavors of PODS and its multiple GIS models are managed in the UML-based Enterprise Architect, which makes for a tenuous connection with another SLC member, the OMG.

* Actually we have it that PODS was one of the SLC founding members.

OMG on SysML

Claude Baudoin (OMG) spoke of UML along with the OMG’s work on BPM and case modeling, and SysML, a new graphical data modeling language that could be leveraged in the SLC’s mapping initiatives*. The OMG has oil and gas credentials via the DDS automation protocol (notably commercialized by RTI). Baudoin also mentioned SensR, a proposed ‘vendor-independent metamodel for data generated by sensors’.

* Early PPDM to WITSML mapping was done in Excel.

Energistics ETP/UoM

Jay Hollingsworth retraced Energistics’ flagship standards for data transfer and (now) for analytics. He distanced ETP, the Energistics transfer protocol, from DDS. ETP is also to go beyond XML to JSON, the ‘preferred representation’ of data scientists. Energistics is the official keeper of the oil and gas UoM standard and the ISO 19115 oil and gas metadata profile. Energistics is collaborating with the OPC Foundation ‘enabling OPC UA vendors to sell into oil and gas’. Energistics is also working on OPC companion standards (to be released real soon now), a.k.a. OPC-UA information models for drilling and producing. Hollingsworth also pitched Energistics’ role in fostering open standards and community, with reference to Halliburton’s Open Earth, Schlumberger’s OpenDES and most recently The Open Group’s OSDU data lake, a ‘well-architected open source data lake for a common data environment’.

BP on IOGP ISSC vision for upstream digitalization

Ken Dunn (BP and chair of the ISSC, the information standards subcommittee of the IOGP) stated that the ISSC’s vision is for a ‘common industry framework for digitalization’. The ISSC is to select and support ‘preferred’ information standards. Where there are overlaps between standards, the ISSC will ‘seek winners and then get the industry behind the [approved] standard’. The IOGP’s standards, unlike those of other SLC members, are primarily focused on engineering and construction. These are covered by three task forces: a global equipment hub, product lifecycle management and ‘digitalization’. Dunn described the World Economic Forum’s mooted $2 trillion ‘prize’ to be won if industry fully digitalizes as ‘a totally daunting number’. Engineering interoperability is to be facilitated with a new ISO standard, ISO 18101, formerly the Open Industrial Interoperability Ecosystem (OIIE).

Mimosa and the open industrial digital ecosystem

Alan Johnston (Mimosa) described the OIIE as an ongoing effort that has been 10 years in ‘bootstrapping’. Much of the ISO landscape (TC 67, 14224, 108, 184, SC4, SC5, 62264, 18101-1) appears to have been distilled into the OIIE along with the efforts of OpenO&M, OAGI, ILAP and SPIR (the spare parts interchangeability record).

Listen in to the SLC webinar recording.

Comment: Standards selection, evaluation, elimination of overlaps and a drive to leverage commonality in the search for interoperability are, as we have said before, herculean tasks. Our 30,000-foot view of the SLC’s activity suggests that, rather than completing specific tasks like the mooted mappings of the SLC’s early days, there is a tendency to look even further afield and embrace even more standards. The invocation of ISO standards also needs qualification. While some ISO standards may be useful, hard-wired and ready for use, others are wordy entreaties or simply incomplete. One thinks of ISO 15926 and, for example, the ‘Ras Tanura ISO flop’ we reported on in 2010. Getting your protocol to ISO standard is as much politics as technology. The result can also be perverse in that ISO standards are behind a paywall, making them less open for use or scrutiny by interested parties. Despite our requests, ISO does not issue standards for review.


ECIM 2019 Haugesund

Shell – ML is happening but too slowly, SPDM launch, Teradata data science needs a scalable data system, Shell on OSDU, Okea and IO Data on big, tough data project, Pandion/Computas and the Kerogen subsurface data platform, Wintershall’s AMIE automated information extraction project, Schlumberger on the UK National Data Repository, Equinor on the ‘inadequate’ LAS well log standard, Petroware JSON Well Log Format (JSON-WLF), IHS Markit/PPDM and taxonomic clarity in the upstream, Sword/Venture - data science unlocks the value in BP’s unstructured data, Schlumberger’s damascene conversion to open source. Short takes: NorskOlje&Gass data exchange. AgileDD work for Equinor. Diskos 2.0. North Sea overlooked pay project. Geodata - ArcGIS front end to Volve dataset. Kadme Whereoil front end to Diskos. Interica on Woodside’s rule-based archiving.

Shell – ML is happening but too slowly

Marianne Oslnes (Norske Shell) cited the World Economic Forum as putting a ‘$1 trillion value*’ on the oil and gas industry over the next decade. But this bonanza is slow to materialize in the upstream. Machine learning in seismic ‘is happening, but too slowly from a business urgency perspective’. E&P is the least productive sector of the industry. Shell is therefore focusing on digital technologies that are reaching an inflection point and have impact – with blockchain at the front of the curve! The upstream value chain is based on ‘analog, sequential work processes spanning decades’ with ‘thousands of fingerprints’. This is uncompetitive and unattractive to shareholders. A step change is required to end-to-end workflows that remove barriers and siloed thinking. ‘Even before, with analog drawings, it was faster than today!’ Well-known data management principles (data as asset, data quality, ownership, metadata and so on) may seem old hat but ‘you need to do it again, maybe now you will have business leaders who will make the effort to listen and understand the value of the data’. Remember though that others may not understand your perspective, so make what you do pictorial and simple. As ‘culture eats strategy for breakfast’, think about how to undo 20 year-old ways of working. Push the boundaries, understand and communicate the problem in hand. For data and information managers, ‘this is your era, help us change an entire industry, you finally have our attention’.

* Ken Dunn (BP) quoted the WEF ‘value’ as $2 trillion! (see our SLC report in this issue).

SPDM launch

Lars Gåseby formally launched the Society for Petroleum Data Managers, which sets out to encourage ‘lifelong learning, advanced KM and career development’ for petroleum data managers through ‘community, conferences and events and the sharing of reference materials and standards’. The SPDM currently has 135 members, most in the EU.

BP’s DataWorx organization

Robbie Watson introduced BP’s new DataWorx organization that is to ‘create $10 billion of cash, not perceived, value’. Data management and data science are ‘no longer just a job’, they are now a career. DataWorx is an environment to ‘thrive, learn and fail in’. Previously data management was seen as a ‘little back office thing that no one talks about till it goes wrong’. Now BP has created the upstream data management framework in a global approach to drive consistency. Other majors are doing the same thing and there is competition for talent. One early result is improved production at BP’s non-operated Angolan subsidiary where, in under 7 weeks, BP developed an automated tool that removed the need for manual data extraction. The ‘machine learning*’ tool advises on potential production opportunities and has led to ‘$25 million in savings’. The tool was developed by Ruairi Dunne, a recent graduate who joined BP’s internship program. Dunne opined on expectations for the industry and his personal transition from a geoscience to a data/tech focus. ‘Would I enjoy a geoscience role more? Is there less prestige in data?’ So far, his journey has involved training in data science with PowerBI, Kaggle and Python, along with an intensive Coursera-based data science course. Following this came a signal processing and machine learning PoC in reservoir engineering, and work with acoustic sensors in sand management. This has been delivered as a real-time operational dashboard (with Palantir) integrating production data with predicted sand events. The tool runs every hour, detecting sand events and adapting chokes.

* Our subsequent investigation suggests that the tool is less dependent on machine learning than on situational awareness. See our editorial.

Teradata data science needs a scalable data system

Teradata’s Niall O’Doherty stated that data management matters more than ever now to drive data science. The Harvard Business Review book ‘Prediction Machines*’ shows the economic impact of AI/ML that is ‘making prediction cheaper’. The financial services industry is getting excited about the technology. AI has the potential to ‘fundamentally change or eliminate parts of your industry’. One key to data science success is a scalable data system that helps move from PoC into production. Enter the [Teradata] agile data warehouse. Another aspect is people. Today we have ‘the wrong people doing the wrong jobs’. Data architects are not data scientists and vice versa. This has led to the ‘accidental data architecture/ecosystem’ where 80% of the time is still spent moving and preparing data. O’Doherty cited Hadley Wickham whose ‘tidy data’ structures are easy to work with and free analysts from mundane chores. As data science thought leader Andrew Ng put it in his seminal talk on the ‘nuts and bolts’ of applying deep learning, the unified data warehouse is key. Teradata is also keen on OSDU and ‘will support the initiative as much as possible’.

* See also the Prediction Machines website.
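
To illustrate Wickham’s ‘tidy data’ notion mentioned above (one observation per row, one variable per column), here is a minimal pandas sketch, with made-up production figures, that reshapes a ‘wide’ table into tidy form.

    import pandas as pd

    # Hypothetical 'wide' table: one column per variable per month.
    wide = pd.DataFrame({
        "well": ["A-1", "A-2"],
        "oil_2019_01": [1200, 950],
        "oil_2019_02": [1150, 900],
    })

    # Melt into tidy form: one row per well, per period, per measurement.
    tidy = wide.melt(id_vars="well", var_name="period", value_name="oil_bbl")
    tidy[["variable", "year", "month"]] = tidy["period"].str.split("_", expand=True)
    tidy = tidy.drop(columns=["period", "variable"])
    print(tidy)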

Shell on OSDU, The Open Group

Philip Jong (Shell) provided an introduction and update on OSDU, the Shell-backed Open subsurface data universe. Jong agreed that a data platform is essential but that the full benefits will not be realized if it is kept in-house. OSDU has therefore set out to develop an industry-wide data platform. This will counter low productivity due to multiple, small data ecosystems. A meeting with Total and Equinor in March 2018 led to Shell seeding the initiative with the platform and code for its in-house developed ‘Subsurface Data Universe’. In August 2019 an OSDU demo release was made available on Azure and AWS. The demo used INT’s IVAAP well log data viewer, machine learning from CGG and NLP from Shell, all running against a 5,000 well data set from TNO. Mapping appeared to leverage Bing Maps. The overall plan is to separate data from the apps and to put all data on a single data platform à la unified data warehouse. The initial scope is exploration/wells. Source code will be made available for cloud services. OSDU will assure end-to-end data support, management and information security. In 2020, the Schlumberger OpenDES is to be ‘merged’ into OSDU following an ‘overwhelming’ vote from the members. ‘This will be a game changer. A public API will allow access for small players and academia’.

Okea and IO Data on big, tough data project

Pål Andresen from Norwegian E&P startup OKEA teamed with Johan Kink (IO Data) to present a data loading case study on the transfer of the Draugen data set from Shell. Draugen was discovered in 1984 and unpacking of the diverse data set is ‘still in progress’. Seismic data presented multiple problems: a defective tar archive in Rode format required a bespoke repair program to decode, SEG-D in Rode format was likewise badly encoded, and some 3592 nav merge tapes were unreadable. All of which required significant programming ability and seismic tape domain knowledge. Data delivery and completeness proved problematic. ‘Even with NPD reporting rules, data will be underreported’. Shell may not have reported everything and Diskos is not complete either. The NPD could and should take a more active role in data transfer, providing field data repository lists, signing off on data delivery and arbitrating data disputes. This will increasingly be an issue as majors hand over operations to smaller companies.

Pandion/Computas and the Kerogen subsurface data platform

Pandion Energy’s Kine Johanne Årdal, with help from Computas, has developed the Kerogen subsurface data platform, a cloud-based, AI-enabled upstream data platform that ‘heralds the new era of the augmented geoscientist’. Kerogen (rather like OSDU) sets out to solve the problem of technical and organizational silos which hamper collaboration across, for instance, geochemistry and geophysics.

Wintershall’s AMIE automated information extraction project

Dejan Zamurovic presented the results of Wintershall’s AMIE (automated multidisciplinary information extraction) PoC. This has leveraged Attivio natural language processing and data cataloging, Tibco data virtualization, and AgileDD iQC. Extracted data from the document repository feeds endpoints including Spotfire, OpenWorks, Petrel and ArchiveDB. AgileDD performed better than DIY with Python at extracting well metadata from scanned logs. The training dataset is problematic in that different usages have crept in over 20 years. Attivio NLP extracts terms such as ‘gas shows’ or ‘serious injury’. For data pairs like ‘vitrinite reflectance’, a value can also be extracted. Drilling depth progress can be extracted from a graph in a document. More from a Tibco blog on well drilling. Tibco data virtualization and a Modula plug and play data pipeline also ran. Costs are ‘non-negligible’.

Schlumberger on the UK National Data Repository

Michael Smith (Schlumberger) presented on the UK National Data Repository that launched in March 2019. This is ‘not exactly a new idea’, with CDA as a precursor since the mid 1990s. In 2018, CDA, Schlumberger and the Oil & Gas Authority decided to launch the NDR and decommission the CDA site. The NDR embeds Schlumberger ProSource with a web app and online/nearline storage, a GIS server and a secure FTP download manager. The system is hosted by Schlumberger outside Aberdeen in a ‘private cloud’. The NDR holds 12k wellbores online and 600k disclosed data items plus seismics, and has 200 users per day with 4,700 registered users. The NDR represents a huge change for public access with ‘thousands more wells available than before’. An API is coming real soon now. You can get all UK well data for £20.

Equinor on the ‘inadequate’ LAS well log standard

Bjarne Bøklepp (Equinor) ironically wished the ‘inadequate’ LAS well log ASCII standard a happy 30th anniversary (LAS was first published in The Log Analyst in 1989). In 2019, DLIS and LAS 2.0 remain the main exchange formats for well logs. (Actually, the Canadian Well Log Society issued an LAS 3.0 specification in 2000.)

Petroware JSON Well Log Format (JSON-WLF)

In fact, Bøklepp’s presentation was a lead-in to Jacob Dreyer who unveiled the Petroware JSON Well Log Format (JSON-WLF). Well log formats (LIS, DLIS, LAS, WITS, BIT, XTF …) are outdated, complex, and based on 1980s tape technology. They lack documentation, expertise is withering, and they are costly to maintain and use. Petroware’s business involves reading and writing logs in many legacy formats. Its LogStudio flagship uses an in-memory intermediate format offering maintainable, lossless conversion. Petroware is now proposing a persistent storage format derived from its internal protocol. JavaScript Object Notation (JSON) is a non-proprietary standard with support for UTF-8 (Unicode), a built-in ‘no value’, good type support, Energistics UoM support and ISO 8601 dates and times. Visit the web page with sample data from Volve, all converted and republished to JSON in 50 lines of code. See also the GitHub repository. JSON-WLF data can be loaded to Petrel, Geolog and Matlab. The format ‘has huge potential, the impact will be massive when we get it rolling. We need your help – some homework for you. Pressure your DMs and others to accommodate this format. Standards orgs should embrace this technology’.
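
For readers curious as to what a JSON-encoded log might look like, here is an illustrative Python sketch (our own field names, not the official JSON-WLF schema) showing a header, typed curve definitions and a data table, with an ISO 8601 timestamp and JSON null standing in for ‘no value’.

    import json

    # Illustrative well log document: header, curve definitions and data rows.
    log = {
        "header": {
            "well": "15/9-F-1 (illustrative)",
            "startIndex": 2500.0, "endIndex": 2500.3, "step": 0.1,
            "created": "2019-09-16T00:00:00Z",          # ISO 8601 date/time
        },
        "curves": [
            {"name": "MD",   "unit": "m",     "valueType": "float"},
            {"name": "GR",   "unit": "gAPI",  "valueType": "float"},
            {"name": "RHOB", "unit": "g/cm3", "valueType": "float"},
        ],
        "data": [
            [2500.0, 85.2, 2.45],
            [2500.1, 90.1, 2.47],
            [2500.2, None, 2.46],    # Python None serializes to JSON null
            [2500.3, 88.7, 2.44],
        ],
    }

    print(json.dumps(log, indent=2))           # write ...
    roundtrip = json.loads(json.dumps(log))    # ... and read back, losslessly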

Petroware’s talk sparked off some debate. Both Schlumberger and Halliburton stated that they already use JSON internally for log data. Bøklepp was queried about the role of the standards bodies and in particular of Energistics WITSML, surely a candidate for log data persistence? Energistics is waiting on a finalized JSON representation that accommodates multi-dimensional arrays. Meanwhile some are waiting for OSDU to specify a standard. For Norway at least it appears that JSON-WLF is a strong contender.

IHS Markit/PPDM and taxonomic clarity in the upstream

Elizabeth Patock (IHS Markit) addressed the issue of taxonomic clarity in the upstream with reference to PPDM’s ‘What is a completion?’ work. There are many possible interpretations of what is important to a ‘completion’ and regulatory authorities differ on what is required, with ‘consequences’. To alleviate such semantic confusion, PPDM advocates faceted taxonomies, hierarchical structures where instances of each facet can be unambiguously described to support interoperability. The PPDM WIAC taxonomy has branches for ‘business’ and ‘physical’ usage, with child facets going down to reservoir, activity, geologic and wellbore contact interval. There remain complex issues with deviated wells and awkward WBCIs*, and the mechanical interface facet (equipment) presents more possibilities for confusion. PPDM is working to tie the facets to its eponymous data model.

* Well bore contact interval.

Sword/Venture - data science unlocks the value in BP’s unstructured data

Attila Balazs (Sword/Venture) presented work performed for BP on ‘unlocking the value in unstructured data with data science’. As data volumes explode, companies either have to organize everything upfront (which is hard to do) or just ‘accumulate stuff’. Data science offers a middle way using exploratory data analysis, data wrangling, model building, prediction and action. R used to be the preferred tool but Python is winning the competition. Pandas exploratory data analysis and SciKit Learn are ‘essential for any ML project’. The BP reference solution has subsurface documents on the server, content parsed and OCR’d, and metadata (well names, field, companies) extracted. Documents are classified with ML and stored appropriately with rich metadata. Apache Tika is used for data scraping. Pandas has displaced SQL. The spaCy and NLTK natural language processors both got a mention. The solution underpins BP’s ‘Julien’ automated document management system that processes hundreds of documents and emails from Outlook/Aspera and populates NT Shares. A data harvester framework PoC was developed for the Azerbaijan unit, extracting static reservoir attributes from Office and PDF documents with Camelot and Tabula and feeding a PowerBI dashboard. Tempering the current enthusiasm for data science, Balazs observed that ‘A good data set beats algorithms. Simple regression may be enough’.
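
By way of illustration of such a pipeline (our own sketch, assuming the tika, spaCy and scikit-learn Python packages and a hypothetical, pre-labeled training set; not BP’s actual code), parsing, entity extraction and classification boil down to a few lines.

    from tika import parser                        # Apache Tika for text extraction
    import spacy                                   # NLP for named entities
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    nlp = spacy.load("en_core_web_sm")             # assumes the small English model is installed

    def parse_document(path):
        """Extract raw text from a PDF/Office document via Tika."""
        return parser.from_file(path).get("content") or ""

    def extract_metadata(text):
        """Pull organization and location-like entities as candidate metadata."""
        doc = nlp(text)
        return [(ent.text, ent.label_) for ent in doc.ents
                if ent.label_ in ("ORG", "GPE", "FAC")]

    # Hypothetical labeled training set: document text -> document class.
    train_texts = ["final well report ...", "seismic acquisition contract ..."]
    train_labels = ["well_report", "contract"]

    classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
    classifier.fit(train_texts, train_labels)

    text = parse_document("example.pdf")           # hypothetical input document
    print(extract_metadata(text), classifier.predict([text]))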

Schlumberger’s damascene conversion to open source

Jamie Cruise wound up the event with an enthusiastic presentation on the ‘tipping point’ in the upstream and on Schlumberger’s damascene conversion to an open source software company with its Open Data Ecosystem (OpenDES or, more accurately, the Delfi data ecosystem), first announced by then-CEO Paal Kibsgaard in July 2019. OpenDES is touted as the data environment that underpins Schlumberger’s Delfi ‘cognitive’ E&P environment with roadmaps for corporate stores, NDRs, and extant and next generation products. ‘All of the things that we have been building into our silos over the years will migrate into the data ecosystem’. Schlumberger noted the work that Philip Jong was doing in OSDU, realized that there was not really room for two data platforms, and decided to contribute OpenDES to OSDU to accelerate both programs. For the skeptics, Cruise insisted that ‘our conversion to and understanding of open source is very authentic, running under Linux Foundation rules. What is being shared is not skeleton code but the real thing as used by us, with core services, data flows, optimized storage and domain data management services. This is the same Delfi code as demoed to our clients. OpenDES is configured for DevOps and we made the first commit in Git at a ribbon cutting ceremony in Monaco. Now we are going to learn how to make this open source stuff work as a community. The future of data is open!’

Short takes: NorskOlje&Gass data exchange. AgileDD work for Equinor. Diskos 2.0. North Sea overlooked pay project. Geodata - ArcGIS front end to Volve dataset. Kadme Whereoil front end to Diskos. Interica on Woodside’s rule-based archiving.

NorskOlje&Gass is a regrouping of three Norwegian quangos (GeoTrade, EPIM and legacy NOG) with a history (some would say form) of developing standards for upstream data sharing. NOG is now working with ConocoPhillips, Equinor, Shell and Total on a minimum best practice data set for daily information exchange. The new platform and APIs are about to be released on Azure. The NOG data solution now uses the GraphQL query language.
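
For the record, the GraphQL style of exchange lets a client ask for exactly the fields it needs in a single POST. The following sketch uses a hypothetical endpoint, schema and field names, not the actual NOG API.

    import requests

    # Hypothetical endpoint and schema for a daily production data exchange.
    url = "https://example.com/graphql"
    query = """
    query DailyExchange($field: String!, $date: String!) {
      dailyProduction(field: $field, date: $date) {
        wellbore
        oilSm3
        gasSm3
      }
    }
    """
    variables = {"field": "EXAMPLE-FIELD", "date": "2019-09-16"}

    response = requests.post(url, json={"query": query, "variables": variables})
    print(response.json())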

Henri Blondelle (AgileDD) presented work done for Equinor on extracting the rich (and often unindexed and unused) content present in composite logs. Successive training of lithological descriptors, depth pixels, shows and geological descriptions was performed with the YOLO convolutional neural net computer vision software and AgileDD’s own iQC. In fact, the trials were carried out with the entry-level TinyYOLO app and a ‘light’ IT environment of 5x GTX 1060 GPUs. Training data was generated by ‘human and heuristic tagging’ of composites. Lithology proved a hard task; show symbols were easier to detect. TinyYOLO was OK for the trial but better results are expected using a larger GPU farm and the full YOLO implementation.

DISKOS 2.0. The current DISKOS contract expires year end 2020. A tender is out but NPD has the option of extending the current contract for an extra three years. As part of its Released Wells Initiative, the NPD, via DISKOS, is seeking to revitalize old data and to digitize cuttings and make the data shareable through Diskos. Stratum/Rockwash have been selected as vendors for the project.

Gillian White from the Aberdeen-based Oil & Gas Technology Centre floated the Northern North Sea ‘machine learning in exploration’ overlooked pay project. The idea is to use well data (not currently seismics) from some 1,200 exploration wells and some 6,000 development wells. Logs, core data and reports are available for the study. The project was first announced in 2018.

Erlend Kvinnesland (Geodata) showed how the heterogeneous released Volve data set has been mastered with Esri’s ArcGIS. The Geocap Petrel plug-in for ArcGIS allows for 3D seismic data to be viewed and manipulated in ArcGIS while the production data can be visualized with the ArcGIS operations dashboard. A compelling alternative to unpacking the Volve data with the original software used to create it.

Jesse Lord showed how Kadme’s Whereoil API was used, in conjunction with RoQC Tools, to combine data from multiple sources to enable real time data validation using the most recent NPD and Diskos data, directly from within Petrel. Troika’s Marlin seismic data trawler was used to scan, discover, QC and, if necessary, repair data. ‘Actual and correct’ metadata was extracted directly from the data and stored in the Whereoil index. All the data is now searchable with Whereoil and mappable with the Whereoil Map*.

* We were curious to know more about Whereoil’s mapping technology. Kadme kindly provided the following. ‘We store the spatial data in Elastic, and then we use the GeoTools libraries to manipulate the spatial features. The map interface itself has been built using OpenLayers. This is Kadme’s GIS system with a few man-years of work in it. The end user does not need any third-party licenses to run it. Everything comes with Whereoil’.
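
For the technically-minded, the Elastic side of such a stack might look like the following sketch (a recent elasticsearch Python client, a hypothetical index and field names, and a simple bounding-box filter; not Kadme’s code).

    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")     # hypothetical cluster

    # Filter on a geo_point field ('location') inside a bounding box over the
    # Norwegian North Sea, then list matching documents.
    geo_query = {
        "bool": {
            "filter": {
                "geo_bounding_box": {
                    "location": {
                        "top_left":     {"lat": 62.0, "lon": 1.0},
                        "bottom_right": {"lat": 56.0, "lon": 6.0},
                    }
                }
            }
        }
    }

    result = es.search(index="well_documents", query=geo_query, size=10)
    for hit in result["hits"]["hits"]:
        print(hit["_source"].get("name"), hit["_source"].get("location"))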

Chris Bearce presented Interica’s work for Woodside on automated rule-based archiving across multiple data sources, a component of Woodside’s subsurface data transformation program (SDTP). Interica uses Microsoft Azure AI to geotag its datasets, leveraging its substantial training data derived from existing connectors. The geolocation catalogue includes confidence indicators derived from file path, file name, content, well name and other indicators. Woodside’s SDTP is to declutter and prioritize its online projects and reduce disk storage taken up by old Petrel projects. The solution leveraged Interica’s PARS, ARBA, a Petrel connector and open APIs for integration with the global GIS system. The solution is integrated with AWS S3 for long-term storage.

The 25th ECIM is scheduled for September 14-16 2020.


Folks, facts, orgs …

Atwell, Avetta, Bilfinger, Borr Drilling, BP, NexTier Oilfield Solutions, Capgemini, Cfihos, CGG, Cognite, Data Gumbo, Element Materials Technology, Element Analytics, Energistics, TietoEVRY, Fugro, GHD, Infrastructure Networks, IOGP, Lloyds Register, mCloud, Mission Secure, NSI Technologies, OAGi, OspreyData, PRCI, ProStar Geocorp, Quanta Services, Schneider Electric, Siemens, Spire, T.D. Williamson, US Department of Energy, Weatherford International, Ziyen Energy, GeoDeVL, Radix, Technology Collaboration Center, XBRL International.

Atwell LLC has hired Drew Powers as oil and gas team leader. He is based in Atwell’s Pittsburgh office.

Arshad Matin is now president and CEO of Avetta, provider of cloud-based supply chain risk management solutions. Matin was previously with Emerson’s Paradigm unit.

Bilfinger has appointed Jon Rokk as president and CEO of its Middle East division. Rokk was previously with Interserve.

Pål Kibsgaard is now chairman of the Borr Drilling board. He was previously with Schlumberger.

BP CEO Bob Dudley is to retire in March 2020. He will be succeeded by Bernard Looney, currently chief executive, upstream.

NexTier Oilfield Solutions’ CEO is Robert Drummond. Ted Lafferty is senior VP and CTO.

Capgemini has named Elfije Lemaitre as head of its North American energy practice. She hails from Accenture.

Peter Townson (IOGP) is to take over Paul van Exel’s (USPI-NL) role as manager of Cfihos, the capital facilities information handover standards organization.

Bertrand Tertrais heads up CGG’s new regional geoscience center in Abu Dhabi.

Nori Tokusue heads up Cognite’s new Tokyo, Japan-based branch.

Blockchain boutique Data Gumbo has appointed Michael Matthews to construction industry principal. Matthews is chair of the upstream, midstream and mining sector committee at CII, the Construction industry institute at The University of Texas at Austin. He was previously with Enstoa.

Thomas Walsh heads-up Element Materials Technology’s expanded center of excellence for oil and gas materials testing in Houston. EMT CFO, Jo Wetz, is to succeed retiree Charles Noall as CEO.

Element Analytics has hired oil and gas sector veteran Steve Beamer as VP customer success and transformation. Beamer hails from BP.

The Energistics board has a new chair, Paul Zeppenfeldt (Shell). He replaces BP’s Elinor Doubell who, along with board members Peter Nielsen (Equinor) and Eric Toogood (NPD), is stepping down at year end 2019.

The combination of Tieto and EVRY into TietoEVRY is now complete. Leadership positions include Thomas Nordås, head of digital consulting, Johan Torstensson, head of cloud and infrastructure, Christian Segersven, head of industry software.

Fugro has appointed Erik-Jan Bijvank as group director for Europe and Africa. He hails from Stork, a Fluor company.

GHD has appointed Jim Giannopoulos as executive general manager, Canada.

Houston-based Infrastructure Networks (INET), has named John Colwell as CFO. He was previously with VanZandt Controls. INET also recently named Mark Slaughter, formerly with RigNet, to CEO.

Shell’s Olav Skår has joined the IOGP as safety director.

John O’Neill has joined LR (Lloyds Register) as head of wells delivery. He was formerly with Maersk/Total. Melvin Banford joins as head of wells assurance and Derek Harrold (ex CNOOC) as UK, Europe and Africa wells manager, alongside head of wells HSSE Steve Harris (also ex-Total).

mCloud has appointed Kent Chan as strategic growth manager to its smart process business segment. He hails from Petronas.

Kent Pope has joined Mission Secure as chief revenue officer.

Dante Guerra has joined NSI Technologies as technical services manager.

Oracle’s Michael Rowell has stepped down as chief architect at OAGi.

OspreyData has appointed Richard Wuest as VP sales and Alex Lamb as data scientist. Wuest was previously with Oracle.

PRCI, the Pipeline research council international, has named Walter Kresic (Enbridge) as chair of the board. Lisa Madden (ExxonMobil) is vice chair.

Steven Maldonado has joined ProStar Geocorp to manage sales and service at its oil and gas division in Houston.

Martha Wyrsch has joined the board of Quanta Services.

Nathalie Marcotte has been named president of Schneider Electric’s process automation business, replacing retiree Gary Freburger.

Siemens has announced the appointment of Hanna Hennig as chief information officer following the departure of Helmuth Ludwig, ‘at his own request’. Michael Sen will be CEO and Tim Holt COO of Siemens Energy, a unit that will be spun out of the Siemens AG parent in 2020.

Scott Smith has joined Spire as president of Spire Storage and Spire STL Pipeline. He hails from Midstream Energy Holdings.

T.D. Williamson has appointed Bill Rees as VP Western Hemisphere. Chuck Harris has been promoted to VP marketing and product management.

The US Senate has confirmed Dan Brouillette as secretary of energy at the DoE.

Christoph Bausch is stepping down from his position as executive VP and CFO at Weatherford International. His interim replacement is Stuart Fraser.

Ziyen Energy has announced that David Greenberg has been appointed chairman of its new ‘energy production asset tokenization and trading platform’, ZYEN. Greenberg is chairman of Greenberg Capital. The digital trading platform will launch in 2020 using ‘permission-based blockchain technology’, ‘creating a liquid market for previously illiquid assets’. Caveat emptor!

ORGS

GeoDeVL, the Australian geoscience data-enhanced virtual laboratory is a collaborative initiative, co-funded by the Australian Research Data Commons, that provides researchers with seamless access to data, tools, compute resources and related services via a single portal. The online repository of previously incompatible data from different national groups provides open access to geological, geochemical and geophysical data, expanding the AuScope virtual research environment.

Radix is now a member of the Technology Collaboration Center (TCC) a partnership between the NASA Johnson Space Center, industry and universities.

Hans Buysse has moved from vice chair to chair of the XBRL International board. Robert Tarola moves into the vice chair role.


Done deals …

Implico, Brainum. Baker Hughes Company. Repsol, Belmont Technology. NexTier Oilfield Solutions. Circor International. CSA Ocean Sciences, MMT. Fugro Seabed Geosolutions. GE Fanuc, Emerson. Seequent, Geoslope. L3Harris Technologies. McCoy Global, DrawWorks. Pason Systems, Intelligent Wellhead Systems. Petrofac, W&W Energy Services. Petrosys, GNS Science. Prometheus Group, Engica. Siemens, Atlas 3D, Pixeom. SitePro, Integrated Control Solutions. Subsea 7, 4subsea. Sword Group, DataCo. Titan Cloud Software, JMM Global. Universal mCloud, Fulcrum Automation Technologies, Autopro Automation Consultants. WolfePak Software, DocVue

Implico has acquired Brainum, developer of the QINO hosted tank terminal management solution.

Baker Hughes, a GE company has changed its name to Baker Hughes Company. Its stock now trades under the ‘BKR’ symbol.

Houston-based AI startup Belmont Technology has received an equity investment from Repsol’s corporate venture arm. Belmont’s Sandy platform uses graph technology and artificial intelligence across geoscience and reservoir engineering domains.

C&J Energy Services and Keane Group have completed their ‘merger of equals,’ establishing a new company, NexTier Oilfield Solutions.

Circor International is to sell its distributed valves business in a ‘strategic shift’ away from upstream oil and gas. The company plans to ‘focus on more attractive end markets with enhanced growth and earnings potential’.

CSA Ocean Sciences is to join forces with geophysical survey boutique MMT.

Fugro Seabed Geosolutions has sold its shallow water ocean bottom cable recording equipment for approx. $10 million. The sale completes Seabed’s transformation into a ‘pure ocean bottom node company’ focused on its Manta technologies.

GE’s Intelligent Platforms (a.k.a. GE Fanuc or Automation & Controls) are now to be marketed by Emerson.

Seequent has acquired Geoslope’s GeoStudio geotechnical analysis software.

Harris Corporation and L3 Technologies have merged, creating L3Harris Technologies.

McCoy Global has acquired DrawWorks, advancing its digital technology roadmap. Total consideration for the acquisition was $6.0 million, of which $1.5 million was financed through a vendor take-back consideration. McCoy borrowed $2.4 million to finance the acquisition.

Pason Systems has invested C$25 million to acquire a minority interest in Intelligent Wellhead Systems, a privately-owned specialist provider of surface control systems for shale, subsea intervention, critical well intervention and offshore operations. IWS’ core technology, inVision gives operators a ‘digital window’ into the pressure control stack via ‘patented, non-invasive technology’. Werklund Growth Fund also acquired an additional C$10 million of IWS equity.

Petrofac is to acquire W&W Energy Services and gain an ‘entry-level’ position in the US onshore operations and maintenance market. Petrofac is to pay an initial cash consideration of $22 million with more monies due based on W&W’s financial performance over the three-year period ended 31 December 2021.

Petrosys has acquired the Globe Claritas seismic processing software from New Zealand’s GNS Science.

Prometheus Group has acquired maintenance and safety management software house Engica.

Siemens has acquired Atlas 3D and its GPU-accelerated Sunata 3D printing software. Atlas will join Siemens Digital Industries unit and the Xcelerator software portfolio.

Siemens also announced the planned acquisition of Pixeom’s Edge technology, a Docker-based runtime and device management solution for edge devices.

SitePro has acquired Integrated Control Solutions. The combined company will address the logistical challenges of fluid management operations in the oil & gas industry. The transaction was financed through a combination of a debt facility provided by J.P. Morgan and SitePro’s existing shareholder group, including Cottonwood Venture Partners and several family offices.

Subsea 7 has acquired 4subsea in a 100% stock buyout. 4Subsea is to provide Subsea 7 with advanced digital solutions for life of field and field development contracts and will contribute to the ongoing digital transformation of Subsea 7’s business delivery.

Sword Group has acquired DataCo. The deal augments the Sword Venture business unit’s upstream data and information management offering.

Titan Cloud Software has acquired JMM Global. The combined company will become the ‘undisputed leader’ in compliance software and services for the downstream petroleum industry. The deal was led by Titan’s equity investor M33 Growth. Titan’s software currently monitors 50% of all US consumer gasoline throughput across some 65,000 facilities.

Universal mCloud has acquired Fulcrum Automation Technologies and Autopro Automation Consultants, signaling its entrance into process industries including oil and gas, petrochemical, and pipeline management.

Charlesbank Capital Partners unit WolfePak Software has acquired DocVue bringing a ‘comprehensive’ document and data management solution to the digital oilfield.


ECN 2019 5th IoT in Oil and Gas

The Energy Conference Network’s 2019 IoT in Oil & Gas, Houston event offered an introduction to the internet of things* in terms of market potential and technology. Real-world early-adopter deployments from Chevron and Hess. Hitachi Vantara on the promise of digital/IoT. Detechtion’s IoT primer. Vantiq on IoT at Total’s TADI project. Roxar’s MPFM and the cloud. Anadarko/Oxy on AI in drilling. Apergy Spotlight. Chevron’s IoT cybersecurity. Other vendors - Ametek, Appian, Hawkiii, Neudesic, Onica, Swim. Clariant on oil and gas in … 2035.

Hitachi Vantara on the potential of the IoT

David Smethurst’s (Hitachi Vantara) presentation encapsulated the promise and the hype of the IoT. Seemingly, ‘Capital markets now … routinely ask energy companies what they’re doing to prepare for digital’. The 2014 collapse in the oil price meant that ‘many oil and gas and service companies were unable to invest in business improvement’ and the industry is ‘well behind in figuring out how to leverage digital innovations’. The next five years will see the cloud storing the new flood of data, enabling new disruptive business models and providing the foundation for other digital innovations. ERP systems will continue to provide the commercial underpinnings for the industry, while artificial intelligence will ‘read and interpret all the data’, supporting key human decision-making functions. Sensor technologies and the internet of things will unlock remote asset monitoring and maintenance and process efficiencies, while generating ‘vast quantities of data to store and analyze’.

Detechtion - The IoT, what is it exactly? And why 74% of PoCs are unsuccessful.

But what exactly is the IoT? Is it new? Is it real? Eric Neason (Detechtion) presented on ‘Accelerating asset performance management (APM) with the IoT’. Citing various sources, Neason has it that the IoT involves ‘machines, computers and people enabling intelligent industrial operations using advanced data analytics for transformational business outcomes.’ But no, it’s not exactly new. The IoT has evolved out of prior art including the PLC, M2M devices, Ethernet, the Internet and the cloud. The IoT can be thought of as ‘the next generation of SCADA’. IoT connectivity benefits from the expansion of cellular networks including private LTE networks and emerging protocols like 5G. The oil and gas flavor of the IoT is different from other industries as it often operates in remote, sparsely populated areas with multiple participants across the value chain. The ubiquitous connectivity of the IoT allows for situational awareness of remote assets and enables optimal servicing of ‘underserved’ assets such as gas compressors. Notwithstanding the potential benefits, Neason warns that ‘74% of surveyed companies report that IoT initiatives were not successful’ while 60% reported that they ‘looked good on paper but proved to be more complex than expected’. APM success means ‘starting with a business problem, defining the finish line and creating a roadmap’.

Total’s TADI and Vantiq’s IoT RAD

Blaine Mathieu presented Vantiq’s development environment for IoT connected applications. Vantiq claims a significant speedup in development and reduction in code using its technology over ‘vanilla’ IoT/cloud app development. One satisfied user is Thierry Baron who presented Total’s TADI (Total anomaly detection initiative) at the 2019 Vantiq GPS user group. TADI uses Vantiq to perform early detection of equipment failure or gas leaks using next generation sensors and real-time data analysis. TADI was developed at Total’s decommissioned Lacq gas field, an EU Seveso 3-regulated facility which can reproduce large gas leak flow rates, from 0.5 g/s to 300 g/s. The technology is now deployed in Total’s ‘Operations Center of the Future’ testbed. A demonstrator unit is planned for delivery in 2021. Watch Baron’s GPS talk on YouTube.

Emerson/Roxar/Microsoft, the MPFM and the cloud

Mike Boudreaux from Emerson’s Roxar unit explained how its 2600 MPFM (multiphase flowmeter), touted as a replacement for the test separator, has been connected to the Microsoft Azure cloud via a ‘secure first mile’ using the Azure IoT Edge gateway and Modbus connectivity. Microsoft’s Bobby Lee also presented a use case involving edge-deployed pattern recognition to determine pump condition in remote locations. Lee observed that ‘continuously inspecting thousands of dyno cards individually can be costly’. The IoT solution can detect pump issues at scale and in real time. If necessary, a pump can be stopped, and field technicians alerted. More from the project minisite.
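
To give a flavor of the ‘first mile’ plumbing at the protocol level, the following sketch (assuming the pymodbus 3.x Python package, with a made-up device address and register map; not Emerson’s or Microsoft’s implementation) polls a meter over Modbus TCP, ready for an edge module to forward the payload to the cloud.

    from pymodbus.client import ModbusTcpClient

    # Hypothetical flowmeter address and register map.
    client = ModbusTcpClient("192.168.1.50", port=502)
    client.connect()

    # Read four holding registers starting at a made-up address 100:
    # e.g. oil, gas and water rates plus a status word.
    result = client.read_holding_registers(address=100, count=4, slave=1)
    if not result.isError():
        oil, gas, water, status = result.registers
        payload = {"oil": oil, "gas": gas, "water": water, "status": status}
        print(payload)       # an edge gateway module would forward this upstream

    client.close()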

More on Oxy/Anadarko’s use of ML/AI in drilling

We already reported on what was then Anadarko’s ML/AI project. Since then, Anadarko has been acquired by Oxy whose Dingzhou Cao provided more chapter and verse on the flagship project, carried out with help from IPCOS and Apex Systems. As we reported previously, Cao’s team is using a spectrum of ML/AI tools to derive real time drilling information from WITS0 and WITSML data streaming from the wellsite. One key function was to correctly identify drilling states and change points in directional drilling. A dataset of 10 rotary steerable system wells and 21 mud motor wells was used to build the change point detection algorithm. This was developed by converting time series data to an image and using pattern recognition technology (UNet, ResNet and transfer learning). The system proved ‘highly accurate’ with a 99.93% success rate.
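
A minimal sketch of the time-series-to-image idea follows (our own illustration with random data and a toy Keras network, not Oxy’s model): normalized channels over a sliding window are stacked into a 2D array and a small CNN classifies whether the window contains a change point.

    import numpy as np
    from tensorflow import keras

    WINDOW, CHANNELS = 64, 8        # samples per window, WITS channels (illustrative)

    def window_to_image(window):
        """Normalize each channel to [0, 1] so the window becomes a grey-scale 'image'."""
        lo, hi = window.min(axis=0), window.max(axis=0)
        return (window - lo) / (hi - lo + 1e-9)

    # Toy training data: random windows labeled change-point (1) or not (0).
    x = np.random.rand(200, WINDOW, CHANNELS)
    y = np.random.randint(0, 2, size=200)
    x = np.stack([window_to_image(w) for w in x])[..., np.newaxis]

    model = keras.Sequential([
        keras.layers.Input(shape=(WINDOW, CHANNELS, 1)),
        keras.layers.Conv2D(16, (3, 3), activation="relu", padding="same"),
        keras.layers.MaxPooling2D((2, 2)),
        keras.layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
        keras.layers.GlobalAveragePooling2D(),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(x, y, epochs=2, verbose=0)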

Apergy Spotlight for real time event driven app development

Paul Bonner presented Apergy Spotlight, a real-time, event-driven application for oil and gas. Spotlight combines IoT, edge computing and automated analysis and machine learning. One use case is continuous monitoring of high-speed engines and reciprocating compressors to predict the onset of failures and enable timely maintenance. Spotlight is a Class I Div. II/IP67 add-on monitor for industrial hardware along with an edge controller and wireless gateway. Successful analytics requires domain expertise in compressors and good monitoring with the right data points at the right frequency. In this application Spotlight provides pressures, temperatures, crankshaft position and crosshead vibration for every degree of crankshaft rotation. Spotlight analytics predicts valve leaks, piston ring leakage, loss of rod reversal and other issues. The system has been trained on many compressors with a variety of features. Some 14 features were used to build the valve leak model. ‘Feature engineering’, leveraging domain knowledge, composes features into explainable models.
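
As an illustration of the feature engineering approach (a hypothetical sketch, not Apergy’s actual feature set), a crank-angle-resolved compressor cycle can be condensed into a handful of per-cycle features that feed a conventional, explainable classifier.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def cycle_features(pressure, vibration):
        """Condense one compressor cycle (360 samples, one per degree) into a few features."""
        return [
            pressure.max(),                      # peak cylinder pressure
            pressure.argmax(),                   # crank angle of the peak
            np.sqrt((vibration ** 2).mean()),    # RMS crosshead vibration
            np.trapz(pressure),                  # area under the pressure curve
        ]

    # Toy data set: 100 cycles labeled healthy (0) or leaking valve (1).
    rng = np.random.default_rng(0)
    X = np.array([cycle_features(rng.random(360), rng.random(360)) for _ in range(100)])
    y = rng.integers(0, 2, size=100)

    model = RandomForestClassifier(n_estimators=50).fit(X, y)
    print(dict(zip(["peak_p", "peak_angle", "rms_vib", "p_area"],
                   model.feature_importances_.round(2))))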

Chevron on cybersecurity on the IoT

Michael Lewis cited IoT use cases in Chevron as monitoring process control, information gathering from connected assets and data analytics. IoT opportunities include predictive maintenance to help prevent unplanned outages and reduce the number of scheduled repairs, optimizing transportation schedules and improving safety by spotting worker fatigue. IoT data is not used directly within process control networks because of the susceptibility of IoT devices to attack. IoT devices may lack standard cybersecurity solutions, they may be insecurely designed or expose a complex architecture that is hard to secure. Such vulnerabilities are magnified by the sheer number of devices. Protecting Chevron’s extensive networks, from wells to gas stations, is an exercise in risk management, with protection that is appropriate to the intended use. Preventing a catastrophic event is key. IoT sensors can only read process control equipment data sources, preventing denial of service attacks. The IoT is a peer-to-peer network where everything, from computers, cell phones and tablets to monitors, windows, light bulbs, cars and watches, is networked and capable of communicating with each other. Threats, either malicious or accidental, may exploit vulnerabilities or other aspects to cause loss events. Lewis gave a pointer to the NIST Cybersecurity Framework and work by the NCCoE on IoT control selection. For Chevron, segregation of the process data network is the principal control as it precludes a compromised IoT device from affecting the process network.

Other vendor presentations – Ametek, Appian, Hawkiii, Neudesic, Onica, Swim.

Ametek’s SkyBitz commercial telematics unit provides real-time information on the location and status of assets. SkyBitz delivers end-to-end solutions for enterprise and local fleets, tank monitoring and petroleum logistics. Truck monitoring optimizes truck visits and provides visibility across assets.

Appian’s ‘low code’ platform for modernizing enterprise and operational applications connects field and device data to the front office for preventative maintenance, incident management, pipeline inspection and safety systems. More from Appian.

Hawkiii offers low-cost, low-energy wireless solutions for rod-pumped wells.

Neudesic Insights analytic framework for merging AI and IoT. A unified data platform enabling load, store, analyze and retain knowledge. Clients include BP and Hess.

Onica’s ‘IoTanium’ rapid prototype board offers multiple pre-integrated connectivity options, including Wi-Fi, BLE and LTE, and exposed contacts for easy prototyping. Data can link to AI/ML analytics in the AWS cloud. One application showed a bespoke downhole sensor feeding MQTT data to a Bluetooth surface device and on into the AWS Greengrass gateway.
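
The sensor-to-gateway leg of such a setup typically looks like the following sketch (assuming the paho-mqtt 1.x Python client and a hypothetical broker and topic; not Onica’s firmware).

    import json
    import paho.mqtt.client as mqtt

    # Hypothetical local gateway (e.g. an AWS IoT Greengrass core) and topic.
    BROKER, TOPIC = "192.168.0.10", "wellsite/pad-7/downhole/pressure"

    client = mqtt.Client(client_id="downhole-sensor-01")   # paho-mqtt 1.x constructor
    client.connect(BROKER, port=1883)

    reading = {"timestamp": "2019-09-16T12:00:00Z", "pressure_psi": 3250.4}
    client.publish(TOPIC, json.dumps(reading), qos=1)      # deliver at least once
    client.disconnect()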

Swim DataFabric is an open source platform for building data-driven applications. DataFabric replaces a bare bones Hadoop/Spark programming environment and is used ‘by supermajors’ for OT cyber monitoring and refinery production optimization.

Clariant on oil and gas in … 2035

Most futurologists only look a shortish time into the future and see stuff that is rather similar to what we already have! Clariant’s Paul Gould was more adventurous, imagining the oil and gas business in 2035 while acknowledging that ‘predictions are almost always wrong!’ As an oil country chemicals provider, Clariant sees oil operations of the future as centered on unconventionals, with intensive development of the factory drilling (and fracking) paradigm. The future will see autonomous drilling rigs, fracking operations and completions. New ‘micro fracking’ will target micro formations with precise lifecycle recovery plans ‘delivered with AI’. Giant well clusters of hundreds of wells drilled in a circular pattern will extend with 4 to 5 mile laterals into multiple production zones. ‘Super depots’ will service the clusters with collaborating robots operating 24 x 7. This means that the graduates of 2025 will need to be 40% robotics engineers. The workforce will be under one third of today’s. There will be ‘many, many more small, reliable low-cost sensors’ replacing drones. Scada suppliers will need to transform into AI and Robotics software companies. Due to the high density of wells, well life will be very short, but will yield greater production. Infill wells will no longer be needed. There will be only a few ‘mega sized’ operators with very few mid-sized and small operators. Lifting costs will be down ‘40%-60%’ compared to 2019.

The 2020 ECN IoT in oil and gas will be held at the Hilton Americas in Houston on the 28th and 29th of September 2020. More from the Energy Conference Network.

* The IoT, the internet of things, is referred to by some authors as the IIoT, the industrial internet of things. We use IoT throughout this report as a synonym for the IIoT.


AI platforms shrink at the edge

Foghorn runs video analytics on Jetson Nano. 40Geo demos TensorFlow computer vision on Raspberry Pi. OpenMV, DIY machine vision for the Arduino.

Foghorn runs video analytics on Jetson Nano

Ramya Ravichandar (Foghorn), speaking at the 2019 Nvidia developer meet in San Francisco, presented on some video analytics use cases in the industrial internet of things. Foghorn claims to have the ‘world’s smallest and fastest inference engine’. Its EdgeAI platform comes with out-of-the-box solutions. Edge computing bests a long round trip via the cloud as AI/streaming analytics runs locally. Real-time, millisecond reaction powers time-critical actions. Foghorn Lightning edge software includes VEL complex event processing and EdgeML AI. An oil and gas use case is flare monitoring on the Jetson Nano, showing how deep learning models can be run on minimal hardware. Foghorn is backed, inter alia, by DellEMC, GE, Honeywell, Saudi Aramco Energy Ventures and Yokogawa. More from Foghorn.

40Geo demos TensorFlow computer vision on Raspberry Pi

In a different context we saw a similar AI-in-a-box on display at the 2019 Esri EU Petroleum User Group (full report in our next issue). Keith Fraley was demonstrating both 40GEO’s Raptor geo-located internet of things technology and his ‘maker’ skills. A Raspberry Pi running a TensorFlow model and a video camera was capable of identifying a range of objects held in front of the lens. It’s not 100%. The system thought my clementine was a ping pong ball ... a good try. What was interesting is that, according to Fraley, the compute power needed to run computer vision in real time on a small hardware footprint is not all that great.
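
The recipe for this kind of demo is now fairly standard. The following sketch assumes a pre-trained TensorFlow Lite detection model (such as an SSD MobileNet) and a captured frame; paths, labels and the output ordering are placeholders, not 40GEO’s code.

    import numpy as np
    import tensorflow as tf
    from PIL import Image

    # Load a pre-trained TFLite object detection model (placeholder path).
    interpreter = tf.lite.Interpreter(model_path="detect.tflite")
    interpreter.allocate_tensors()
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Grab a frame (here, a still image stands in for the camera feed).
    height, width = input_details[0]["shape"][1:3]
    frame = Image.open("frame.jpg").resize((width, height))
    input_data = np.expand_dims(np.asarray(frame, dtype=np.uint8), axis=0)

    interpreter.set_tensor(input_details[0]["index"], input_data)
    interpreter.invoke()

    # Typical SSD outputs: boxes, class indices and scores for each detection.
    boxes = interpreter.get_tensor(output_details[0]["index"])
    classes = interpreter.get_tensor(output_details[1]["index"])
    scores = interpreter.get_tensor(output_details[2]["index"])
    for cls, score in zip(classes[0], scores[0]):
        if score > 0.5:
            print(f"detected class {int(cls)} with confidence {score:.2f}")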

OpenMV, DIY machine vision for the Arduino

Our own googling also located OpenMV, an Arduino-based, Python-powered machine vision bundle for makers and hobbyists. Check out the neat IDE.


ECCMA ISO TC 184/SC 4 industry day

CT 184 Consulting reports on the new ISO 18101-1 ‘Oil & Gas Interoperability Standard’ for engineering data exchange.

Speaking at the ECCMA ISO TC 184/SC 4 industry day in Marina del Rey, California*, Dan Carnahan (CT 184 Consulting) presented the Oil & Gas Interoperability Standard (published in June 2019). Carnahan is technical advisor to ISO/TC 184, the automation and systems integration standards body. The OGI standard establishes how ISO 8000 is used across the life cycle of an oil and gas project (platform or refinery) to assure data quality as multiple suppliers at different stages in the project exchange large amounts of information.

The data exchange standard is said to provide a neutral, tool and application-independent exchange mechanism for use in EPC (engineering procurement contractor) or owner/operators’ internal systems. The concept is for a ‘virtual network’ that extends beyond construction and into operations and maintenance. The standard provides context-sensitive semantics and syntax. For example, a manufacturer can provide make and model information for the equipment supplied, which an EPC can leverage in process flow and engineering design requirements before providing the final design documentation to the owner/operator. The intent is that all stakeholders provide their information in the standard so as to interoperate within the OGI.

Previously (today?), each stakeholder would use its own corporate dictionary to describe its products and services, often with confusing differences in meaning and syntax. Much information is exchanged in spreadsheets with little contextual information. The OGI provides a framework for a common data dictionary/catalog for use by all. ‘As more companies adopt the basic requirements for the data architecture, this will become the enabling factor for the digital business ecosystem’.

* As reported in the October 2019 ECCMA Newsletter.

Comment: The standardization of plant and process data has a long and checkered history, from ISO 15926 to Cfihos, both of which have provided ‘frameworks’ for data exchange. These have proven hard and expensive to populate with real data and have thus had limited take-up in real-world projects.


Sales, partnerships …

Kongsberg Digital, Shell. Hexagon, PDO. Shell, Baker Hughes, C3.ai, Microsoft. Beacon, Shell New Energies. Xpansiv, Bluefield. Data Gumbo, BBL Ventures. Ecolog, TST-Turbo. Eni, eDrilling. Emerson, Petrobras. Equinor, Saipem. Repsol, Halliburton. Halliburton, PTTEP. Honeywell, ADNOC, KIPIC, BP. ExxonMobil, IBM. IT Mation, Ignition. Intertek, NEPSI. Mesa, Crusoe Energy Systems. Propell, Stimline. Recon Technology, Ping An. RS Energy Group, Shell Western E&P. Shell, SparkCognition. Siemens, Tenaris. ADNOC, Total. Kasi Group, Wärtsilä. Petrobras, Weatherford International. Woodside, IBM.

Kongsberg Digital is to build a digital twin of Shell’s Nyhamna gas processing and export hub for Ormen Lange and other fields connected to the Polarled pipeline. Kongsberg will deploy its Kognifai ‘dynamic digital twin’ in a contract worth approx. 100 million NOK.

Petroleum Development Oman has commissioned a document management system from Hexagon. Hexagon’s SmartPlant Enterprise for owner operators will provide ‘smooth data and document handover and seamless, workflow-driven submissions and feedback for document and tag completeness’.

Baker Hughes, C3.ai and Microsoft have announced an alliance to ‘accelerate the digital transformation of the energy industry’ with enterprise artificial intelligence solutions running in the Microsoft Azure cloud. The grouping received an endorsement from Shell Group CIO Jay Crotts who commented: ‘Shell supports the aim of this strategic alliance to improve efficiencies, increase safety, and reduce environmental impact through digital transformation. Baker Hughes is a long-standing partner in oilfield services and software development, and we use the C3.ai platform on Azure to accelerate digital transformation across our business, helping to improve overall operations’.

Shell’s New Energies unit is to leverage the Beacon Platform to develop ‘full-stack web applications’ using Beacon’s cloud-based developer platform and dependency graph technology.

Xpansiv, a BP-backed specialist in ‘responsible’ natural gas sourcing, is to leverage remote sensing data from Bluefield to develop its encrypted digital representations of natural gas.

Data Gumbo has announced an implementation partnership with BBL Ventures and its Grid Innovation Ventures unit, both described as ‘business strategy and implementation consulting companies’.

Ecolog International and TST-Turbo Service & Trading are to partner on digitized and integrated life cycle solutions for rotating equipment in the hydrocarbons and energy industries.

Eni has chosen eDrilling’s well simulation digital twin and artificial intelligence technologies to support its drilling activities, from well planning and training through to real-time operational follow-up. eDrilling is also to supply its software solutions to Pertamina in a collaboration with partner Navita Origo, a real-time monitoring specialist.

Emerson has signed a ‘multi-year, multi-million dollar’ agreement with Petrobras for managed hosting services in support of deepwater operations. The deal includes Emerson’s exploration and development software served through high-performance, remote desktops for office and field locations. Emerson is to provide and manage dedicated servers and associated hardware with remote data management, configuration and optimization.

Equinor, on behalf of the Njord licence, has awarded a wireless subsea drone contract to Saipem, making Equinor ‘the first user of technology expected to be completed in 2020’. The ten-year contract for the Hydrone R/W is valued at approx. €40 million.

Repsol has awarded Halliburton a multi-year agreement for the provision of a cloud-based master data management solution for exploration and production activities. Halliburton/Landmark’s DecisionSpace 365 Data Foundation will provide simultaneous access and management across ten global locations during the contract’s first year with additional locations to follow.

Halliburton also announced that Thailand’s PTTEP has selected the DecisionSpace 365 Digital Well Program application to automate its drilling, completions and engineering processes across the well lifecycle.

The Abu Dhabi National Oil Company has selected Honeywell’s asset monitoring and predictive analytics solution for deployment across its upstream and downstream operations. The 10-year contract is claimed to be one of the largest predictive maintenance projects in the oil and gas industry and is part of ADNOC’s flagship Centralized Predictive Analytics and Diagnostics (CPAD) program. Honeywell’s Forge Asset Monitor and Predictive Analytics solutions will be deployed at ADNOC’s Panorama command center in its Abu Dhabi HQ. Panorama aggregates real-time information across ADNOC’s businesses and applies smart analytical models, AI and big data to generate operational insights. The addition of Honeywell’s solutions will enable the central monitoring of up to 2,500 critical rotating equipment items across the group.

Honeywell also announced a deal with Kuwait Integrated Petroleum Industries Company (KIPIC) which has selected its Honeywell Process Solutions unit as the main automation contractor for its new Petrochemicals and Refinery Integration Al Zour Project (PRIZe). HPS will provide KIPIC with front-end engineering design and advanced process control technology for the complex.

Honeywell has also been awarded a contract to remotely support BP’s Trinidad Cassia compression platform with an integrated control and safety system (ICSS). Honeywell is the project’s main automation contractor. The solution will be based on Honeywell’s flagship control system, the Experion process knowledge system. Honeywell has served as MAC for all three BP offshore platforms in the Cassia Complex.

ExxonMobil’s ‘all-in-one’ loyalty and payment app, Exxon Mobil Rewards+ is hosted in the IBM public cloud. The app replaces the ExxonMobil Speedpass mobile payment app and was designed and developed in partnership with IBM iX.

IT Mation has been named an authorized Ignition distributor for France.

Quality assurance specialist Intertek has signed an agreement with China’s National Supervision and Inspection Center for Explosion Protection and Safety of Instrumentation (NEPSI) for the provision of assurance, testing, inspection and certification solutions (IECEx, ATEX, ECAS, ETL and CCC) for imports and exports to China.

Mesa Natural Gas Solutions and Crusoe Energy Systems are to target 50 megawatts of flare-to-computing projects within the next two years. Mesa’s ‘digital flare mitigation’ generators turn flare gas into electrical power for on-site usage by Crusoe’s data centers.

Propell and Stimline are to team on a ‘next generation’ well-site. Initial focus is on integrated completion and intervention operations through Stimline’s IDEX architecture, a digital twin of the well linking pumps and coiled tubing equipment with automated control systems and digital infrastructure.

Recon Technology unit Future Gas Station and Ping An Property & Casualty Insurance are to team on the digital transformation of gas stations in China’s Zhejiang province. The deal involves Ping An’s Auto Owner app and FGS’ DT Refuel app’s API. Auto Owner has over 80 million registered users.

RS Energy Group is to supply Shell Western E&P Inc. (SWEPI) with its ‘advanced intelligence, analytics and data science solutions’ for the Permian basin.

Shell has also teamed with SparkCognition via the Shell GameChanger program to apply artificial intelligence to well pore pressure prediction. SparkCognition recently raised some $100 million in a Series C funding round.

Siemens Digital Industries Software is expanding its cloud-hosted portfolio to include Simcenter Amesim (simulation software for modeling and analysis of multidomain systems) and Simcenter 3D (an environment for 3D computer-aided engineering, CAE). A monthly subscription offers small and medium-sized enterprises a modular function library and pay-per-use compute infrastructure.

Tenaris, a major manufacturer of tubular goods for the energy industry, has deployed Siemens’ TIA (Totally Integrated Automation) engineering framework across its Texas plant. The bill of materials for the deal included 124 programmable logic controllers, 4,024 drives, 1,620 network nodes, 4,000 HMIs and 5,000 I/O modules. The plant is said to be the ‘largest and most comprehensively automated pipe manufacturing facility in the world’.

Abu Dhabi National Oil Company and Total are to trial a ‘world’s first’ automated seismic acquisition system. The project will trial Total’s METIS (multiphysics exploration technology integrated system) over a 36 sq. km desert area. The seismic sensors will be dropped by six autonomous aerial drones and later retrieved by an unmanned ground vehicle.

Kasi Group (Malaysia) has ordered an LNG bunkering vessel simulator from Wärtsilä. The simulator will provide hands-on operator training. The project involves a TechSim LCHS network class simulator and five workstations, one for the instructor and four for the trainees. The number of LNG bunkering vessel simulators has risen from one in 2017 to nine by year end 2019. 30 more are forecast to be delivered during the coming five years.

Petrobras has ordered 24 Optimax deep-set safety valves from Weatherford International, to be delivered over the next four years. The Optimax protects against catastrophic loss of well control by providing fail-safe closure at pressures of up to 10,000 psi and at depths of up to 12,000 ft.

Woodside Energy and IBM are to leverage current and emerging technologies like AI and quantum computing to realize the vision of an ‘intelligent plant’. The deal sees Woodside becoming a member of the MIT-IBM Watson AI Lab, an industry/academia laboratory focused on advancing fundamental AI research. Woodside is also to join the IBM Q Network to use quantum computing to ‘conduct deep computational simulations across the value chain of Woodside’s business’. Five years ago, Woodside engaged with IBM Watson to implement cognitive solutions, ‘now used by 80% of the workforce’.


Standards stuff …

PIDX Field Ticket Guideline. EU Financial Transparency Gateway. CAPE-OPEN 2.0. Energistics Energy Transfer Protocol V1.2. IOGP Report 373-26, GoM coordinate transformations. OGC API Features Part 1. ISO/IEC JTC1, ETSI, oneM2M, W3C AIOTI. NASA SWEET. XBRL. NIST’s new foot.

PIDX has released a Field Ticket Business Process Guideline document, a 12-page explainer covering the use of the PIDX XML field ticket in a typical oil country transaction.

The European Commission has announced the ‘EU Financial Transparency Gateway’ (EFTG), a blockchain-based pilot project for sharing financial data. The EFTG is claimed to give citizens and investors access to public regulated information.

Speaking at CAPE-OPEN 2019, Shell’s Mark Stijnman presented his views on an upcoming CAPE-OPEN 2.0 edition of the chemical process modelling standard. CAPE-OPEN needs to evolve to support multi-core computing and cloud deployment, possibly with a standardized web API. The thermodynamic package and the unit operation packages both need a redesign. More from CAPE-OPEN.

Energistics’ Energy Transfer Protocol (ETP) V1.2 will be out for public review ‘real soon now’. In 2020, Energistics is to ‘explore an optimal approach for WITSML data accessibility for data analytics platforms and applications’. Energistics also received an endorsement from OSDU lead Johan Krebbers (Shell), who stated, ‘The development and deployment of the OSDU Platform with the embedded Energistics standards will be an important step change for this industry’.

The Geomatics Committee of the IOGP has released Report 373-26, Coordinate transformations in the US Gulf of Mexico OCS, a.k.a. Guidance Note 26, describing the use of coordinate reference systems and transformations. Key issues dealt with in the report are the replacement of NADCON with NADCON5 and the use of GNSS measurements made in a dynamic CRS (WGS 84).

The Open Geospatial Consortium (OGC) has published its first API standard, OGC API - Features - Part 1: Core. The API building blocks allow for the creation, modification and query of geographical features in web maps. The core spec covers geometries in WGS 84 and fine-grained access to data. The standard was developed in coordination with ISO/TC 211 and is currently under consideration by ISO for approval and publication.
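
By way of illustration, a features query boils down to a couple of core query parameters over plain HTTP, returning GeoJSON. A minimal Python sketch follows; the server URL and collection id are hypothetical, and response format negotiation may vary between server implementations.

    # Sketch of an OGC API - Features (Part 1: Core) query.
    # Endpoint and collection id are hypothetical placeholders.
    import requests

    BASE = "https://example.com/ogcapi"   # hypothetical server
    COLL = "wells"                        # hypothetical feature collection

    # 'bbox' (WGS 84 lon/lat) and 'limit' are core-defined query parameters
    r = requests.get(
        f"{BASE}/collections/{COLL}/items",
        params={"bbox": "-95.8,29.5,-95.0,30.1", "limit": 10},
        headers={"Accept": "application/geo+json"},
    )
    for feature in r.json()["features"]:  # GeoJSON FeatureCollection
        print(feature["id"], feature["geometry"]["type"])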

A cross-organization expert group involving ISO/IEC JTC1, ETSI, oneM2M and W3C is collaborating with AIOTI on accelerating adoption of semantic technologies in the internet of things. The group has published two white papers on semantic interoperability, viz. ‘Semantic IoT Solutions - A Developer Perspective’ and ‘Towards semantic interoperability standards based on ontologies’.

The Semantic Technologies Committee of ESIP, NASA’s Earth sciences information partners, has released V 3.4.0 of SWEET, the Semantic web for earth and environmental terminology, a suite of over 6,900 concepts in 225 ontologies covering earth system science. SWEET ontologies are written in W3C Turtle and are available under the Apache license. More from the SWEET homepage.
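
For the curious, the Turtle files can be consumed with standard semantic web tooling. Here is a minimal sketch using the open source rdflib package; the file URL is indicative only, check the SWEET repository for current file locations.

    # Sketch: load one SWEET ontology (Turtle) and list its labelled concepts.
    # The download URL below is indicative only.
    from rdflib import Graph
    from rdflib.namespace import RDFS

    g = Graph()
    g.parse("https://raw.githubusercontent.com/ESIPFed/sweet/master/src/matrRock.ttl",
            format="turtle")

    for subject, _, label in g.triples((None, RDFS.label, None)):
        print(subject, "->", label)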

The World Wide Web Consortium (W3C) has just celebrated its 25th anniversary.

The XBRL Standards Board has a candidate recommendation for an XBRL-CSV specification, combining the ‘exceptionally efficient’ CSV format with the taxonomy-backed structured data of XBRL. More from XBRL.

The US National Institute of Standards and Technology has announced that the US survey foot is to retire in 2022, along with the modernization of the National Spatial Reference System (NSRS). The US foot will be replaced by the ‘foot’ (formerly known as the international foot) equal to 0.3048 meter exactly for all applications. The two definitions differ by around 0.01 foot* per mile. More from NIST.

* NIST did not actually specify which foot it used for this delta! So we cheekily pinged NOAA and got this rather informative reply…

Neil,

I assume your question is tongue-in-cheek, but for fun, let’s take a look at this. There are actually four permutations, because it can be evaluated in international feet and miles, and in U.S. survey feet and miles. For each case, the absolute value of the difference, to 14 decimal places, is:

• 0.01055999999944 international foot per international mile

• 0.01056004223957 international foot per U.S. survey mile

• 0.01055997887944 U.S. survey foot per international mile

• 0.01056002111949 U.S. survey foot per U.S. survey mile

Although offered in jest, this does illustrate why having two nearly identical versions of the foot in current use creates confusion. There are many actual examples that do cause real problems.

Cheers,

Michael Dennis
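
For readers who would rather check the arithmetic themselves, a few lines of Python reproduce the roughly 0.01 foot per mile discrepancy. (The 14-decimal figures above depend on exactly which foot and which mile the delta is expressed in, hence the four permutations.)

    # Back-of-envelope check on the survey foot vs. international foot delta.
    SURVEY_FT_M = 1200 / 3937     # US survey foot in metres (exact by definition)
    INTL_FT_M = 0.3048            # international foot in metres (exact by definition)

    # Length of a 5280-foot mile under each definition
    survey_mile_m = 5280 * SURVEY_FT_M
    intl_mile_m = 5280 * INTL_FT_M
    delta_m = survey_mile_m - intl_mile_m

    print(delta_m / INTL_FT_M)    # ~0.01056 international feet per mile
    print(delta_m / SURVEY_FT_M)  # ~0.01056 US survey feet per mile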


Safety first …

Accenture’s CraneTagZ. US Chemical Safety Board on Philadelphia refinery blaze. DevonWay reports major safety management system deployment. DNV GL’s safety-focused digital twin concept, updates ISRS sustainability rating system. Ideagen Q-Pulse Law compliance management. IOGP Report 453, Safety Leadership in Practice. Intelex ISO 45001 checklist for HSE. CPWR/Washington State release S-CAT, a safety climate assessment tool. Restrata’s eponymous safety platform. Rockwell Journal Oil & Gas eBook on safe operations.

Accenture has developed CraneTagZ, a safety solution to mitigate the risk of crane incidents, for Chevron. The system uses ultrawideband wireless technology and wearables to improve worker safety in crane operations. CraneTagZ received the top honor in the oil and gas category at the Verdantix EHS Innovation Awards.

The US Chemical Safety Board has released a video animation covering its investigation into the explosion and fire at the Philadelphia Energy Solutions (PES) Refinery in Philadelphia. The explosion occurred when a corroded pipe elbow ruptured, releasing hydrofluoric acid that ignited, causing a massive fire and explosions. In a secondary event, a 38,000-pound fragment of a surge drum was blown across the Schuylkill River. The CSB has also released a safety digest on the value of worker participation in preventing chemical incidents. The digest notes that lack of worker participation was a factor in several major incidents investigated by the CSB, where workers and their representatives were not engaged to help identify hazards and reduce risks.

DevonWay’s safety management system has been deployed by a ‘large US utility’ with over 10,000 users. The system supports the NTSB/API Recommended Practice 1173 for pipeline operators which, according to PHMSA, the US Pipeline and Hazardous Materials Safety Administration, is ‘one bad accident away’ from being made an official regulation by Congress.

DNV GL has proposed a ‘digital twin concept’ to show the real-time status of safety risk and operations, adding a risk-analysis layer to the digital twin. The ‘probabilistic digital twin’ couples data-driven digital twins with risk-analysis modelling. More from DNV GL.

DNV GL has also updated ISRS, its International Sustainability Rating System to ‘create transparency on how business processes impact on key operational criteria’. ISRS9 heralds a broader view of loss categories including cyber security. ISRS has been endorsed by Indonesian NOC Pertamina which has used the system to improve its sustainability culture and performance in its Algerian operations.

UK-based developer Ideagen has released Q-Pulse Law to help organizations manage compliance and legislation requirements, reducing the time, effort and cost of finding, translating and interpreting global compliance obligations and local laws. Q-Pulse Law results from Ideagen’s £3.5 million acquisition of Scannell Solutions earlier this year.

IOGP Report 453, Safety Leadership in Practice: A Guide for Managers, updates the 2013 IOGP Report 452, Shaping safety culture through safety leadership, with insights and experience gathered in the interim. The report helps management apply the safety leadership characteristics described in Report 452 to create a workplace culture that values safety.

The IOGP has also published an overview of its Risk Assessment Data Directory (RADD) project, a summary of the history and intended uses of the RADD and an overview of the RADD components and their use in risk assessment.

Intelex has published Evaluating safety program performance: An ISO 45001:2018 checklist for EHS professionals. The Insight Report includes assessments for monitoring and measurement, performance evaluation, audits and management review.

The Center for Construction Research and Training and researchers at Washington State University have partnered to develop the Safety Climate Assessment Tool (S-CAT), a free online tool that helps construction companies assess their site safety climate. A questionnaire assesses safety climate factors that were identified by construction industry subject matter experts participating in a Safety Climate/Culture workshop sponsored by CPWR and NIOSH.

Security, safety and emergency response software house Restrata has rolled out its eponymous Enterprise software platform. Restrata provides real-time monitoring and control of people, environments, assets and reputation. Serica Energy has already deployed the solution across its North Sea operations and other companies are ‘on the verge’ of adoption.

Shell has awarded a five-year global enterprise framework agreement to Mangan Software (http://www.ManganSoftware.com) for the deployment of its Safety Lifecycle Manager (SLM) platform. SLM will manage Shell’s safety instrumented systems lifecycle for new facilities and across its upstream, downstream and midstream businesses. Pilots in Shell have demonstrated improved hand-over of process safety information from construction to operations, a 74% reduction in man-hours spent on functional safety engineering and optimization of safety critical trip function testing.

A special issue of The Rockwell Journal, the 2019 Oil & Gas eBook, focuses on reliable and safe operations. The issue covers the new IEC gas safety standards, safer chemical operations, well pad optimization and using analytics in the digital oilfield.


2019 OSIsoft EU user meeting, Gothenburg

DROP, MOL’s Danube refinery online program. EDEA, ENI’s digital energy analytics solution.

DROP, MOL’s Danube refinery online program.

Tibor Komróczki and Károly Ott presented MOL’s DROP (Danube refinery online) program, comprising OSIsoft PI Asset Framework (AF) and a data science stack built around Hadoop and Cloudera. MOL’s team of process information and automation specialists operates with IT in a minimal, supportive role. DROP reflects the need for an operations technology (OT) data infrastructure enabling rapid development of scalable applications and reinforcing data and analytics-based decision making. The aim is to be more confident in refinery decision making by ‘capitalizing on data science to statistically predict productivity’. The system is said to increase productivity and efficiency through best practices for data harmonization and to provide a ‘deeper understanding of technological processes’.

A PI AF architecture is the backbone of digital transformation and advanced analytics in MOL Downstream. Data from the PI Integrator connects into a Kafka/Spark data preparation environment. RStudio also ran. Conditioned streaming data then passes off premises into the Azure cloud for real-time analysis. Tools of the trade in the cloud include the Kafka event hub, Stream Analytics, ElasticSearch, Kibana, Grafana and (for visualization) Power BI. The system is to integrate with the NICE inventory management system and Opralog (in 2020).
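
MOL did not share code, but the on-premises leg of such a pipeline is a familiar pattern. A minimal sketch, assuming a kafka-python producer pushing conditioned PI tag readings into a topic (broker address, topic and tag names below are hypothetical, not MOL’s), might look like this.

    # Sketch of the on-premises leg of a PI-to-Kafka pipeline (not MOL's actual code).
    # Broker address, topic and tag names are hypothetical placeholders.
    import json
    import time
    from kafka import KafkaProducer   # pip install kafka-python

    producer = KafkaProducer(
        bootstrap_servers="broker.refinery.local:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    def publish_reading(tag: str, value: float):
        """Publish one conditioned time-series reading to the 'refinery-pi' topic."""
        producer.send("refinery-pi", {"tag": tag, "value": value, "ts": time.time()})

    publish_reading("CDU1.FEED_FLOW", 412.7)   # hypothetical tag and value
    producer.flush()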

EDEA, ENI’s digital energy analytics solution

Lorenzo Lancia and Gianmarco Rossi presented ‘EDEA’, ENI’s digital energy analytics solution, an analytics dashboard that leverages machine learning models to monitor and forecast the energy efficiency of an upstream production facility. EDEA sets out to help technicians detect anomalies and suggest corrective actions. The system embeds a PI Data Archive, PI AF and a big data infrastructure built around bespoke Python programs and PI Vision. The forecasting model leverages a gradient boosting regression algorithm to predict a CO2 emission index KPI for energy-intensive equipment over the next 3 hours. The computation takes account of operational parameters, seasonal features and ‘exogenous’ constraints like temperature or humidity. Site operators receive a notification when real-time data diverges from predicted values, indicating an anomalous situation. The dashboard is then used to drill down to pinpoint the root cause of the anomaly.

Data science development in ENI uses open source tools from the Python environment including Anaconda, Jupyter and Spark, leveraging ENI’s EDOF digital oilfield platform. EDOF’s standardized architecture provides ‘zero configuration’, secure, authenticated access to the PI data archive’s time series data and PI functionality. EDEA ingests only ‘relevant’ time series data; there is no need to perform a ‘utopic’ ingestion of all PI data. After training, a serialized Pickle object is exported for deployment. Other tools used include Cloudera and Qlik. The authors report that to date, some 15 energy efficiency actions have resulted from EDEA monitoring, leading to a ‘significant reduction in CO2 emissions from a giant oil field’.
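
ENI’s models are bespoke, but the basic recipe (fit a gradient boosting regressor on historical operating and ambient data, pickle it for deployment, then flag divergence between prediction and live readings) can be sketched in a few lines of scikit-learn. Feature names, file names and the anomaly threshold below are illustrative, not EDEA’s.

    # Illustrative sketch of an EDEA-style recipe, not ENI's actual implementation.
    import pickle
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor

    FEATURES = ["fuel_gas_rate", "compressor_load", "ambient_temp", "humidity"]  # illustrative

    # Train on historical operating data (placeholder file)
    hist = pd.read_csv("historical_operating_data.csv")
    model = GradientBoostingRegressor().fit(hist[FEATURES], hist["co2_index"])

    # Serialize the trained model for deployment, as described above
    with open("co2_index_model.pkl", "wb") as f:
        pickle.dump(model, f)

    # At run time: compare predictions with live readings and flag divergence
    live = pd.read_csv("live_snapshot.csv")                 # placeholder file
    predicted = model.predict(live[FEATURES])
    anomalous = abs(live["co2_index"] - predicted) > 0.05   # illustrative threshold
    print(live.loc[anomalous, ["timestamp"]])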

Read MOL’s, ENI’s and other OSIsoft Gothenburg presentations here.


2019 LBCG Onshore well site facilities summit

American Business Conferences event hears from Oxy’s digital oilfield, Agar Corp on multi-phase metering on every well and Hy-Bon EDI on vapor recovery and eliminating flaring.

Oxy’s digital oilfield in the Denver-Julesburg Basin

Oxy presented its digital oilfield work in the Denver-Julesburg Basin, Colorado. Here Oxy operates some 1,800 producers, gathering 70 million data points per day from 140K real-time tags. The team has developed digital oilfield workflows around Petex’ Integrated Visualization Manager (IVM), now used across the company. One use case addresses pressure-swing surveillance at lease automatic custody transfer units, where paraffin plugs cause unmeasurable production deferment. The solution involved ‘breaking down silos’ and combining workflows such that everyone gets to see the same data. The system provides an in-depth understanding of production losses, analyzing impacts from perforations to sales. Another digital oilfield development automates pigging schedules, previously managed in multiple spreadsheets. Now data is captured automatically, linking dynamic and static data and providing roll-up summaries. Operators can record pigging operations in real time from their phones with increased efficiency and fewer errors. While the DoF program has produced few single large wins, the incremental 5 to 50 bopd gains ‘soon add up’.

Agar Corp multi-phase metering on every well

David Farchy presented Agar Corp.’s multi-phase flow metering (MPFM) solutions. Today, the separator is still the standard equipment for assessing gas/oil/water composition. Agar wants to change this such that every well has a low-cost MPFM. Agar’s systems have been deployed in the Eagle Ford shale, the Permian basin and in Venezuela. MPFMs are networked via the cloud, where ‘semi empirical self-learning AI algorithms’ correct and calibrate MPFM data continuously. Agar’s digital platform provides a visualization dashboard to web and mobile endpoints, enabling surveillance at the well, well-pad, field or reservoir level.

Hy-Bon EDI on vapor recovery and eliminating flaring

Jeff Voorhis, from Cimarron Energy unit Hy-Bon EDI, advocated the use of vapor recovery units/towers (VRU/VRT) to eliminate flaring, mitigate volatile emissions and provide an additional revenue source. A site currently flaring 55 MSCFD could be earning $132,000 per year (at a possibly optimistic $4/MSCF), providing a nine-month payback on a $100,000 capex. VRTs need to be engineered for proper retention time, to allow gas to separate and to avoid liquid traps in gas vapor piping to the VRU. Voorhis warned that the Texas regulator’s stance on flaring is shifting against the practice*. The Texas Commission on Environmental Quality is also upping its inspection regime, with flyovers of oil and gas production sites with FLIR cameras to spot fugitive emissions. Back in 2015, Noble Energy’s DJ Basin unit was fined some $13.5 million in civil penalties and another $60 million ‘to support environmental mitigation projects’. Noble agreed to upgrade its equipment to reduce emissions, with the work expected to be complete in 2019.

* See also the New York Times article ‘Despite their promises, giant energy companies burn away vast amounts of natural gas’.

Next year’s Wellsite Facilities event is scheduled for 15-17 September 2020 in Houston.


2019 OilComm Conference and Exposition Houston

AccessIntel/OilComm conference hears from the Houston office of the FBI, from Invatare on AI/ML ‘not delivering as expected’ and from DataRobot on automated AI in oil and gas.

A word from the Houston office of the FBI…

James Morrison of the FBI’s Houston office cited former FBI chief James Comey as saying that the private sector is both a key player in cyber security and a likely victim: ‘the private sector possesses the knowledge, expertise and information to address cyber intrusions and cyber crime in general’. The problem is that, as ‘a survey’ has found, ‘60% of oils lack adequate cyber defense’, which leaves them open to exploits such as NightDragon, ShapeShift, the Havex ICS trojan, CrashOverride (which caused the Ukrainian power outage) and the Trisis ICS malware. Other exploits target wireless systems such as Zigbee and LTE cellular networks. Morrison sees three ways forward: 1) blockchain-based systems that prevent data manipulation and fraud, 2) a ‘zero trust’ model of ‘adaptive security and visibility’ and 3) instant messaging to replace email, which ‘will be obsolete by 2020, replaced by Slack’.

Invatare – AI/ML not delivering as expected

Trond Ellefsen, CEO of digital transformation specialist Invatare, described digital transformation as both a ‘business risk and opportunity for oil and gas’. Landmark and Equinor started their ‘painful’ transformation journeys early and today, both companies have reached an ‘impressive maturity point’ that will continue to accelerate their advances and distance them from their competitors. ‘We are experiencing a head-spinning and profound moment in time where everything is affected by everything else’. This is a ‘cross-industry self-fueling process’ which will, over the next five years, ‘expand, accelerate and create new ripples in the fabric of the world we live in’.

Ellefsen provided a digital transformation status report for year-end 2019. Despite the promise, current investments in digital technologies, AI, machine learning etc. ‘do not seem to be delivering as expected’. Projects are not gaining traction, not generating the expected ROI and not changing behavior. The silos are not being broken down, the expected better answers are not coming and speed to delivery has not changed significantly. A 2019 Harvard Business Review study found that of a ‘staggering’ $1.3 trillion spent on digital transformation journeys in 2018 across industries, close to $850 billion ‘went to waste’ and ‘80% of all digital projects are considered a failure’.

In an analysis that echoed the lead story in this issue*, Ellefsen puts these failures down to the application of ‘small adjustments on top of a legacy architecture that was never meant for digital hyper connections and massive interrogation of connected historical and real-time data’. To succeed, the oil and gas industry needs to take ‘deliberate, differentiated and foundational approaches to its digital effort, incrementalism is no longer adequate’. Industry is in a deep transition which, for the past few years, has been driven by consultancies with more focus on their own revenue growth agenda than on better industry solutions.

What is needed is a cross-functional, integrated, open platform. Ellefsen sees a ‘need for common concepts and cross-industry efforts’ to solve the pressing issues the industry is facing. In this context Ellefsen cited Halliburton’s Open Earth Community as an example of ‘a future open architecture where data can be interrogated across functions and systems’. Cross-company, community-style collaboration suits modules like blockchain, security and other efforts that should not be handled by any one company alone. The OEC is an open industry collaborative concept available for ‘effective development of digital muscles’.

* On McKinsey’s advice for oil and gas CIOs.

DataRobot – automated AI in oil and gas

Rajiv Shah (DataRobot) started from another digital fail: as Rexer Analytics has found, ‘only 13% of data science projects reach production’ and, worse, even fewer generate real business value. There are many reasons for this. Many AI projects are poorly defined from the start, with unclear goals and no measure of current results to benchmark against. Reliance on data scientists makes for long model build times, long review cycles and the creation of model documentation for review. Currently, model building involves tedious, manual work that requires deep experience of data science and coding. This leads to shortcuts, especially with junior data scientists! Finally, most data science projects fail when the models are integrated with business processes. DataRobot offers an alternative approach, automating the model building, validation and deployment process, with data science best practices and guardrails incorporated directly into the platform. One example is geological facies classification from well logs. In the DataRobot environment, well logs are compared with labeled core samples to set up a supervised machine learning model. The model is then generalized to predict facies types from log data alone. More on AI in geoscience from the DataRobot blog.
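
DataRobot automates the model search, but the underlying supervised set-up can be illustrated with open source tools: train a classifier on log curves over cored intervals where facies are known, then predict facies in uncored wells. The sketch below is not DataRobot’s API; log mnemonics and file names are illustrative.

    # Open-source illustration of the supervised facies workflow described above.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    LOGS = ["GR", "RHOB", "NPHI", "DT"]          # gamma ray, density, neutron, sonic

    cored = pd.read_csv("cored_intervals.csv")   # depths with core-derived facies labels
    X_train, X_test, y_train, y_test = train_test_split(
        cored[LOGS], cored["facies"], test_size=0.25, random_state=0)

    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print("hold-out accuracy:", clf.score(X_test, y_test))

    # Generalize to uncored wells: predict facies from log data alone
    uncored = pd.read_csv("uncored_well_logs.csv")
    uncored["predicted_facies"] = clf.predict(uncored[LOGS])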

Next year’s OilComm Conference will be held from October 14-15, 2020 in Houston.

