Oil IT Journal: Volume 23 Number 5


US Geological Survey moots National Data Repository

Energy Resources Program report highlights lack of ‘single source of consistent, national-scale geologic data’.

Future directions for the US Geological Survey’s Energy Resources Program (ERP) are described in a new 168-page publication*. The ERP portfolio covers both domestic and international estimates of technically recoverable oil, natural gas and other geologically-based resources.

According to the USGS, ‘oil and gas will remain important parts of the US energy mix over the next 10-15 years’ and the US will continue to need ‘independent, unbiased assessments’ of domestic and international oil and gas inventories to inform policymakers. Looking forward, the ERP is to increase the transparency of its assessments by releasing more raw and intermediate data products. In addition, new technologies will be deployed to ensure that assessments can be updated as new information and analyses become available. These include data analytics to speed evaluation in the face of changing geologic and financial data. In particular, the pace of development of unconventional oil and gas means that the ERP needs the capacity to quickly publish updates to its assessments. Current ERP reporting strategies make this difficult.

Data management and custodianship are to grow in importance as more data are collected and the impacts of complex energy-related decisions need to be evaluated. The ERP maintains geologic energy resource-related databases of rock and fluid geochemistry analyses from sedimentary basins and geologic provinces within the US and the world. These are the only publicly accessible, national-scale archives for such data. However, the ERP product line could better inform decision making by including more resource assessment input data and data related to the environmental impacts of resource development. Another target is improving access to existing data, for instance by digitizing core and log data. Additional challenges arise from database design, data migration and loading, and the incorporation of proprietary data sets.

The ERP Future Directions report moots the establishment of a US national geoscience data repository. While recognizing that this would be a ‘formidable task’ requiring substantial investments, the authors believe that advanced data science and analytics are ‘beginning to transform the ways in which organizations collect, analyze, and manage their data’. New tools for managing and evaluating data would enhance the ability of the ERP to address these challenges. Exploring these possibilities will be undertaken with input from product consumers and external experts in data and information management.

Currently, the US lacks a single source of consistent, national-scale, publicly available geologic data to support resource development, policy and regulation. Data is inconsistent and scattered across federal and state agencies and the private sector. While the ERP currently maintains some databases, stakeholders do not consider these as meeting their needs. Expanding ERP efforts to become the recognized public source of comprehensive information would be consistent with the ERP mission to provide reliable and impartial scientific information on geologic energy resources. The long-term goal (10-15 years) is to expand current data compilation, archiving, and dissemination practices and establish the ERP as the national custodian and disseminator of energy-related geoscience data for the United States.

* Future Directions for the U.S. Geological Survey’s Energy Resources Program. A ‘consensus study’ from the National Academies of Sciences, Engineering and Medicine/National Academies Press.


The Digital Twin in Oil & Gas, an investigation

Oil IT Journal looks behind the hype at ‘twins’ from ABB, BHGE, BP, eDrilling, Halliburton, Kongsberg, Lloyd’s Register, Maplesoft and Siemens.

In a recent Google groups discussion, Jon Awbrey opined (in a different context) that ‘in fields … where fundamental progress is rare if not non-existent, one way researchers can stave off a sense of stagnation is by playing musical chairs with terminology.’ This could be a characterization of the current use of the term ‘digital twin’ which, on the face of it, is a new buzzword for the ‘simulator’, one of the very earliest applications of information technology. Imagine if Exxon’s researchers back in 1955 had coined the ‘digital twin’ term, marketing would have progressed 60 years overnight! Digital models of plants and reservoirs have been around for a very long time and have been widely used to optimize operations. Hitherto these have been mostly forward-modeling, physics-based models. The digital twin notion frequently extends this paradigm with ‘data-driven’ models derived from machine learning on large sets of historical data. Combining physical models with the data-driven approach is a common theme in digital twin literature, but exactly how this squaring of the circle is achieved is usually left to the imagination.
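How that combination might work in practice is rarely spelled out. One common pattern is to keep the physics-based forward model and train a machine learning model on the residual between its predictions and measured data. The following is a minimal, hypothetical sketch of that pattern in Python; the toy physics model, the synthetic ‘historian’ data and the variable names are all our own inventions for illustration, not any vendor’s implementation.

```python
# Hypothetical sketch: hybrid 'digital twin' = physics model + data-driven residual.
# All data and the toy physics model are invented for illustration.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def physics_model(choke_opening, reservoir_pressure):
    """Toy forward model: predicted flow rate from first principles."""
    return 0.8 * choke_opening * np.sqrt(reservoir_pressure)

# Historical operating data (synthetic stand-in for a plant historian)
rng = np.random.default_rng(0)
choke = rng.uniform(0.2, 1.0, 1000)
pressure = rng.uniform(100.0, 250.0, 1000)
measured_flow = physics_model(choke, pressure) * (1 + 0.1 * np.sin(5 * choke)) \
                + rng.normal(0, 0.5, 1000)   # un-modeled physics plus noise

# Train a data-driven model on the residual (measured minus physics prediction)
features = np.column_stack([choke, pressure])
residual = measured_flow - physics_model(choke, pressure)
correction = GradientBoostingRegressor().fit(features, residual)

def twin_predict(choke_opening, reservoir_pressure):
    """The 'twin' prediction combines the physics model and the learned correction."""
    base = physics_model(choke_opening, reservoir_pressure)
    x = np.column_stack([np.atleast_1d(choke_opening), np.atleast_1d(reservoir_pressure)])
    return base + correction.predict(x)

print(twin_predict(0.7, 180.0))
```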

In our investigation we find rather a lot of marketing-style music, but also insights into the complexities of tying models together, flanging them up with reality, adding in some AI and making the twin do something useful. Definitions and implementations differ but there is commonality. Before the ‘digital twin’, multiple simulators were used to model various parts of a plant or process. Process parameters were changed in different control ‘loops’ from high frequency automatic local loops to longer term ‘big loop’ updates to plant parameters. The digital twin sets out to consolidate multiple models in a single environment. This environment additionally collects real time process data which is claimed to enable real time update of the model(s). Prediction of future performance and feedback to optimize the process, are further ambitious claims.

Whether or not this is feasible is moot. But the idea of consolidating multiple models into a single environment capable of ingesting real time data is clearly of interest. In our examination of the digital twin, we see two approaches to achieving the ideal. One is the ‘platform’ i.e. a more-or-less proprietary solution from a single vendor. The other is a standards-based approach using, perhaps, FMI, the functional mockup interface (see below).

A twin is often presented behind a front-end display similar to that of a control room. But there is a critical difference here. In a traditional control room, the intent is to show an operator what is happening in the real world. A control room is an exercise in situational awareness. In a digital twin however, it may be hard to distinguish between what is real and what is simulated. The digital twin is intended to stay in step with reality. What happens when reality and the model drift apart is a potential source of both insight and confusion.
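Keeping the twin ‘in step with reality’ implies continuously comparing simulated and measured values and flagging divergence. A minimal sketch of such a drift check follows; the rolling window and threshold are arbitrary assumptions.

```python
# Hypothetical drift monitor: flag when model and measurement diverge.
import numpy as np

def drift_alarm(measured, simulated, window=50, threshold=2.0):
    """Return True where the rolling mean absolute residual exceeds a threshold."""
    residual = np.abs(np.asarray(measured) - np.asarray(simulated))
    kernel = np.ones(window) / window
    rolling = np.convolve(residual, kernel, mode="same")
    return rolling > threshold

# Example with synthetic data: the 'plant' slowly degrades after sample 500
t = np.arange(1000)
simulated = np.sin(t / 50.0) * 10
measured = simulated + np.where(t > 500, 0.01 * (t - 500), 0.0)
print(np.argmax(drift_alarm(measured, simulated)))  # index of the first flagged sample
```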

Halliburton - The Voice of the Oilfield

The digital twin was elegantly described by Michael Grieves at the 2018 Halliburton/Landmark iLife event as a system of systems. This is not exactly new. Grieves himself introduced the term back in 2002, as recounted in his review of the origins of the digital twin concept. Grieves is executive director of the Center for Advanced Manufacturing and Innovative Design at the Florida Institute of Technology, which was spun out of NASA, itself an early proponent of the digital twin notion. Simulators date back to the 1970s. Around the year 2000, 3D digital representations were becoming available. Since then, the digital twin concept has connected the virtual and physical worlds with bidirectional data and information flows. The digital twin is used to test and build equipment in a virtual environment until it performs. Front-running simulation, i.e. design, also ran.

In another webcast, Dale McMullin and Ed Marotta presented Halliburton’s Voice of the Oilfield and well construction initiatives that leverage a digital twin and the ‘system of systems’ approach. Here, models are connected via an ‘open co-operative infrastructure’. This supports well construction/completions from subsurface to topsides. Present day digital twin thinking adds AI, ML and analytics to models such that the twin ‘learns and updates itself’. The system of systems potentially spans reservoir to refining, as described in a 2017 Deloitte Center for Energy Studies publication, ‘Bytes to bbls’. The Voice of the Oilfield, aka Well Construction 4.0, adds prescriptive analytics, combining physics-based and data-derived modeling to offer a ‘scientific foundation along with data-driven adaptability’.

ABB’s twin provides a ‘formidable’ digital data trail

For ABB, the digital twin is a ‘complete and operational’ digital representation combining PLM* data, design models and manufacturing data with real-time information from operations and maintenance. ABB envisages a common digital twin directory that points to data stored in different places to enable simulation, diagnostics, prediction and other use cases. ABB’s twin tracks the ‘formidable digital data-trail’ of CAD drawings, design and build information and equipment and configuration data. In addition to actual observations, the twin offers algorithms to calculate ‘non-observable parameters’. As an example, ABB cites its electromagnetic flowmeter. Previously this might have been referred to as a ‘virtual gauge’ but the terminological musical chairs now have it as a digital twin. To summarize, ABB’s DT is a ‘directory’ containing a digital image of physical equipment. This is claimed to go beyond a ‘static description’ of the plant but how the DT is kept in sync with reality over time and the use to which its simulations are put could do with some more explanation.

* Product lifecycle management.

Maplesoft - MapleSim and the Functional Mock-up Interface

A publication from Maplesoft describes the ‘virtual commissioning’ (VC) of manufacturing systems leveraging a ‘virtual plant model’. Maplesoft traces VC back to 1999, when ‘soft-commissioning’ was used to debug parts of a future physical system, aka the digital twin. Today, model-driven digital twins are widely used in design and optimization, with tools such as MapleSim allowing their creation at design time. CAD import technology has been key in making digital twins more accessible. Different models address different aspects of manufacturing. One may model the complex physics of a plant, another the computer control systems. Since 2010, the Functional Mock-up Interface* (FMI) has been used as a standard interface for a variety of model-based processes. The FMI standard organizes model data such that it can be shared across software tools. FMI format data is shared as a single file containing variable definitions, system equations and other parameters. As of 2017, FMI is supported by over 40 common engineering tools. MapleSim 2018 is presented as a system-level modeling tool for designing digital twins for virtual commissioning and/or system-level models for complex engineering design projects. MapleSim 2018 provides greater toolchain connectivity with the ability to import models from more software tools, and adds support for FMI 2.0 fixed-step co-simulation and model exchange.

* Other similar functions are achieved with protocols such as the High Level Architecture (defense) and CoLan (petrochemicals). Simulation interoperability is a domain unto itself - see for instance SISO, the Simulation Interoperability Standards Organization.
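For readers who want to experiment with FMI, the open source FMPy package can load and run a functional mock-up unit (FMU) from Python. A minimal sketch follows, assuming a hypothetical ‘pump_model.fmu’ exported from a tool such as MapleSim, with an equally hypothetical ‘flow_rate’ output variable.

```python
# Minimal FMI simulation sketch using the open source FMPy package.
# 'pump_model.fmu' and its 'flow_rate' output are hypothetical placeholders.
from fmpy import read_model_description, simulate_fmu

fmu = "pump_model.fmu"
md = read_model_description(fmu)
print([v.name for v in md.modelVariables])      # list the variables the FMU exposes

result = simulate_fmu(fmu, start_time=0.0, stop_time=60.0, output=["flow_rate"])
print(result["time"][-1], result["flow_rate"][-1])   # final time step and output value
```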

BP - APEX, a ‘welcome new member of the family’

BP’s APEX simulation and surveillance system creates a virtual copy of all BP’s production systems throughout the world. APEX is a production optimization tool that combines asset models. It doubles as a surveillance tool used in the field to spot issues before they affect production. In a rather unhelpful analogy, BP compares APEX with ‘a digital twin of the human body’ but where, ‘instead of arteries, veins and organs, APEX is programmed with data about each of BP’s wells, their flow regimes and pressures, underpinned by physics-based hydraulic models.’

APEX is claimed to have sped up BP’s simulations, which previously took hours, making continuous optimization possible. APEX is further reported as having delivered 30,000 barrels of additional oil and gas production a day in 2017, across BP’s global portfolio. More is expected in 2018. ‘The digital twin is more than just a virtual phenomenon, but a very welcome new member of the BP family.’

Kongsberg - Kognifai and the maritime Tesla

Kongsberg rolled-out its ‘innovative new digital twin concept’ at the 2018 Offshore North Sea conference. The ‘groundbreaking technology’ that connects the digital and physical worlds has undergone a successful feasibility study with Equinor. The twin is a virtual model of unmanned oil and gas production facilities that leverages Kongsberg’s Kognifai AI platform. The twin integrates disparate data in a single, secure and user-friendly cloud-based platform.

Another take on Kongsberg’s twin came from Lars Meloe’s presentation at the 2018 GBC IIoT in oil and gas conference. Meloe considers the twin as encompassing AI/ML, static, historical and real-time data and coverage of technical to business usage ... all delivered through a context-dependent user interface. Kongsberg’s flagship Yara Birkeland autonomous container ship, aka the ‘maritime Tesla’, embeds Kongsberg’s digital twin and Kognifai. Under the scupper decks is a digital twin comprising 3D models and K-Sim simulator models. Another example is Equinor’s Oseberg-H 30/6/H platform, which deploys Kongsberg’s digital twin along with Autonomy, SAS on an Edge Gateway and a Kognifai data platform. A version of Meloe’s presentation is available online.

Siemens

Speaking at the 2018 GBC IIoT conference, Elgonda LaGrange from Siemens’ Dresser-Rand unit described Siemens’ MindSphere platform and Comos engineering data model as together comprising its digital twin, aka SimCenter. Luc Goosens added that the digital twin concept evolved out of system engineering and product lifecycle management (PLM). Siemens has gathered multiple disconnected simulators and applications (STAR-CCM+, NX CAE, NX NASTRAN, LMS and more) into its twin which is claimed to allow complex cyber physical systems to be designed and compliance-tested before manufacture. SimCenter ‘combines physics-based simulations with insights gained from data analytics’. Goosens concluded that ‘a digital twin without a closed-loop reality check is an illusion’.

Baker Hughes GE

Baker Hughes, currently a GE company, has deployed digital twin technology to track and optimize its supply chain. The twin considers factors such as part delays and weather disruptions to continuously update and share information across multiple organizations, ‘improving delivery times, reducing inventory costs and creating efficiencies’. The twin was developed by an interdisciplinary team from GE Global Research and BHGE. GE claims to have ‘more than a million’ digital twins already deployed across its business but believes that the application of digital twin technology to supply chain is ‘breaking new ground’. Curiously however, the BHGE announcement makes no mention of Predix, previously presented as GE’s solution to all software and cloud integration problems.

Lloyd’s Register/GE

In a curious announcement, Lloyd’s Register has granted ‘approval in principle’ (AiP) to GE’s Predix asset performance management (APM) system under LR’s new ‘digital compliance framework.’ The AiP means that GE is recognized as ‘providing assurance that Predix APM meets the data, technology and software requirements for a predictive system’. The AiP to ‘digital twin ready’ is the first level of approval on LR’s digital compliance framework. Further approval levels can be applied to the digital twin throughout its creation and deployment with the ‘approved’, ‘commissioned’ and ‘live’ notations. ‘Digital compliance’ was validated in a joint LR/GE ‘co-creation project’. It is unclear whether LR now considers itself to be a certification body or a marketing organization. A bit of both perhaps?

eDrilling

Norwegian eDrilling’s digital twin is used across well planning, drilling monitoring and real-time optimization. The tool supports real-time forward simulations to avoid drilling problems. In our quick spin through the digital twin landscape, eDrilling’s control room has to be one of the sexiest interfaces to ... either reality, or a simulation, or again, a bit of both!

In conclusion ...

The digital twin buzzword has more substance to it than the ‘digital transformation’ that we reviewed last month. It is an interesting notion, but is it new? Industry has got used to handling multiple simulators for many specific tasks. Computer control operates at many levels – from local loops controlling how valves open and shut, to ‘big loop’ adjustments made with reference to an overarching model of a facility. What’s difficult to achieve is to bring these different models, with their own scope, granularity and time frames, together. And what is perhaps even more difficult is to match model performance to real time data and act on the results. Calling the whole shebang a ‘digital twin’ does not progress these issues much.


Editorial - Computer security in the cloud

Those dashing headlong to the cloud get a reality check as Google reveals, months after the fact, a breach in its Google+ social network. Oil IT Journal Editor Neil McNaughton quizzes Google about the breach and contrasts the ‘cloud is secure’ approach with the plethora of bespoke oil and gas cyber security solutions we report on in this issue’s Cyber Security Round-Up.

At the EAGE earlier this year, geophysicists were bombarded with the notion that the cloud is where the smart folks are going. We were left with the impression that the move to the cloud is the very first, essential, step in the journey to digital nirvana-cum-transformation. Concerns about security in the cloud were brushed aside. Schlumberger’s Ashok Belani stated that ‘it is [Google, Amazon…] their profession to do this. Data is safer in the cloud than inside an IOC, let alone smaller companies. Google’s Gmail is one of the safest platforms around.’

Of course, what Belani and nobody else knew was that some months earlier, Google had indeed been hacked but had decided not to tell anyone about it! OK, it was Google+, the company’s flagging attempt at a social network, that was breached, but users of the ‘safest platform’ Gmail were also invited to use the ‘Security Check-up’ to see what other apps are linked to their account and revise their security parameters accordingly. The Guardian’s report on the incident has it that 438 different third-party applications may have had access to private information due to the bug. Google apparently has ‘no way of knowing’ whether they did because it only maintains logs of API use for two weeks. Ouch! That does not sound like a cyber security best practice to me. It doesn’t even sound like Google’s regular data retention policy, which has been described as a ‘backup of the internet’. I guess there is one policy for stuff that Google plans to make money with, another for regulatory-sensitive data!

Even nearer to the geophysical bone, as we reported, is the fact that Schlumberger’s Delfi uses Google’s Apigee API management platform to provide ‘openness and extensibility’, allowing clients and partners to add their intellectual property and workflows into Delfi. We asked Google if the flaw was in Apigee itself. We were assured not and pointed to the official release. This merits a good read through. It lets you know just what you are signing up for when you ‘accept’ the default T&Cs – chez Google and indeed with a labyrinthine network of unseen third parties.

Speaking at the 2018 CERA Week, Rice University’s Charles McConnell opined, re cyber security, that ‘No one has really got a great pathway or program, with everyone hunting in the dark. Everyone is searching for comfort, hoping that they are doing the right thing, with the right technologies and with support of the right companies and partners. Regulations do not exist and need to exist, and the leadership needs to be in place’. McConnell called for an ‘ISO-like’ cyber security standard for high performance industries. Well, good luck with that! McConnell also gave a gentle push to steer oil companies away from their traditional posture of ‘keeping data within their gates’ which ‘limits the usefulness of the digital ecosystem’. Maybe it does. These are oil companies after all.

Currently the world seems to be in denial about cloud insecurity. You may buy into the idea that the cloud providers are better than you are at cybersecurity. But on the other hand, there are billions of users of cloud data centers and even more anonymous IoT endpoints ready to be exploited. I guess it is easier for a CIO to buy into the ‘cloud is secure’ notion. The alternative is to work your way through the multiple cyber security offerings as exemplified in the latest Cyber Security Round-Up in this issue. BTW, Oil IT Journal has been tracking cyber security in oil and gas for over 20 years. From ‘deperimeterization’ to ‘re-perimeterization’ and now, well, what is the cloud? A cyber fortress or the next Maginot Line?

@neilmcn


Text Analytics APIs, a consumer guide

Oil IT Journal takes a spin through Robert Dale’s authoritative analysis of commercial NLP programming tools.

The Language Technology Group has just produced a report titled ‘Text Analytics APIs (TAAPI), a consumer guide’, an analysis of commercially-available hosted programming environments. TAAPI is authored by LTG’s natural language processing guru Robert Dale. The 275-page guide includes an exhaustive test of 26 APIs including Amazon Comprehend, Bitext, Google NL, Reuters’ Open Calais and TextRazor, which vary considerably in terms of breadth of functionality. Amongst the more fully-featured, standout performers are Aylien, MeaningCloud, Rosette and Lexalytics’ Semantria.

Text analytics (TA) is concerned with extracting information from documents and in automated document classification. The commercial offerings are evaluated according to ten key TA capabilities. These include the ability to extract entities (people, places...) from text, classification into categories (sport, business...), sentiment analysis (for or against...), summarization to a short text, and tagging documents according to concepts that may not be specifically mentioned in the text.

Almost all of the APIs studied are accessed through an HTTP interface supporting cURL, with output in JSON. Data can be supplied as a text string, a document or a URL. The report summarizes API capabilities and pricing models along with a short ‘impressions’ paragraph that summarizes typical use cases and limitations. Some APIs come with industry packs (but not for oil and gas!) and some include access to structured databases of company information. Tests on the different APIs show how keywords and phrases can be extracted from text automatically. The success or otherwise depends on how valuable the keywords and phrases are to a particular domain. Results vary quite widely. Some tools fail to return the position of a keyword in a text or do not assess keyword relevancy, limiting their usefulness. Other capabilities investigated include linguistic analysis (sentence splitting, syntactic analysis…) and relationship extraction (with pre-defined relationships such as ‘has acquired’, or ‘open’ relation extraction). Dale warns here that ‘relationship extraction is the bleeding edge of text analytics [..] requiring a degree of sophistication that is beyond the current state of the art’. This function is considered ‘aspirational’ for most vendors. However, ‘if a vendor provides targeted relationship extraction that is relevant for your business, this should be a major factor in your API choice’.
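For the curious, the calling pattern is much the same across vendors: an HTTP POST with a JSON payload and an API key, returning JSON. The sketch below shows the generic shape in Python; the endpoint URL, authentication header and response fields are placeholders, not any particular vendor’s API.

```python
# Generic pattern for calling a hosted text analytics API over HTTP.
# The endpoint, auth header and response fields below are hypothetical;
# each vendor documents its own URL, authentication and JSON schema.
import requests

API_URL = "https://api.example-text-analytics.com/v1/entities"   # placeholder
API_KEY = "YOUR_API_KEY"

text = "Equinor has awarded Schlumberger a contract for work on the Johan Sverdrup field."

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"text": text, "language": "en"},
    timeout=30,
)
response.raise_for_status()

# Print extracted entities, assuming a typical 'entities' list in the JSON reply
for entity in response.json().get("entities", []):
    print(entity.get("text"), entity.get("type"), entity.get("relevance"))
```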

In conclusion, the decision to deploy a text analytics API clearly implies a significant commitment in developer resources, and selecting the best API for a particular use is a major decision. TAAPI provides a wealth of material to guide this choice and avoid the pitfall of a poorly suited toolset. TAAPI costs $895 and is available from the Language Technology Group.


Review - Demystifying OWL for the Enterprise

Michael Uschold’s book does a good job of demystification. But the claim of ‘growing take-up’ after a 15-year incubation period is questionable.

The big question that surrounds Demystifying OWL for the Enterprise* (DOE), a new book by Semantic Arts’ Michael Uschold, is, is it timely or too late**? The publishing blurb has it that the semantic web technology stack has seen a ‘slow incubation period of nearly 15 years’ but that today, ‘a large and growing number of organizations’ now have one or more semweb projects underway. Oil IT Journal has been tracking semantic technology in oil and gas since 2003 and has reported, in over 200 articles, on progress that has been faltering to say the least. Semantic technologies still attract enough R&D cash to keep students on either side of the pond in pot noodles, but industry at large would appear to be turning more to commercial ‘graph database’ tools on the (so far few) occasions where there is a pressing need to use something other than a relational database.

Notwithstanding these reservations, ontologies are key to information systems of all sorts and those embarking on any kind of natural language processing venture are likely to come across the arcane terminology of the field. Perhaps a better question to ask of DOE is, does it do a good job of demystification? For this non-specialist but curious reviewer the answer is yes. DOE focuses on ‘the 30% of OWL that gets used 90% of the time’. Moreover, this 30% can be ingested in Part 1 (74 pages). So, what is an ontology anyway? It is a model that represents some subject matter. And what is OWL? The claim is that OWL provides a schema for a particular domain that is more informative and flexible than an RDBMS schema. OWL sees the world as triples: subjects and objects connected by relationships. Google is an InstanceOf a Corporation. And a Corporation is AKindOf Legal Entity. From which a computer can figure that Google is a Legal Entity.
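For the curious, the Google/Corporation/Legal Entity example can be written as triples and queried, inference included, in a few lines of Python with the open source rdflib package. The namespace and class names below are ours, purely for illustration.

```python
# Triples and a simple subclass inference with rdflib (illustrative namespace).
from rdflib import Graph, Namespace, RDF, RDFS

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.Google, RDF.type, EX.Corporation))               # Google is an InstanceOf Corporation
g.add((EX.Corporation, RDFS.subClassOf, EX.LegalEntity))   # Corporation is AKindOf Legal Entity

# Ask for all legal entities, following subClassOf transitively (SPARQL 1.1 property path)
query = """
SELECT ?thing WHERE { ?thing rdf:type/rdfs:subClassOf* ex:LegalEntity . }
"""
for row in g.query(query, initNs={"rdf": RDF, "rdfs": RDFS, "ex": EX}):
    print(row.thing)   # http://example.org/Google
```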

DOE goes on to explain namespaces, which allow different ontologies to co-habit, and resource identifiers, which pin down objects and relationships in their namespace. The term ‘uniform resource identifier’ (URI) is confusingly close to URL, especially as they can be one and the same. But DOE does a good job of untangling these. We dug deeper into DOE to seek enlightenment on another term that has always puzzled, ‘reification’. This is explained as a kind of work-around to allow for the mapping of one-to-many relationships in OWL. While this is clear enough, this intrinsic awkwardness does little to convince the reader that OWL has the upper hand over the RDBMS.

Where OWL has a good claim to primacy over the relational model is in the field of machine reasoning. This is achieved by applying logic to syllogisms – if all men are mortal and Socrates is a man then Socrates is mortal. That’s simple enough, but what follows is where the going gets hard. As hard, in fact, as formal logic. The assumption is that the computer will be able to ‘reason’ across a large set of data such that, to take a healthcare example, the reasoner will be able to figure ‘who are all the patients that Jane has given care to’. And this without any such direct relationship being built into the model. It is not hard to see how this could be extended to other fields such as geology with multiple overlapping categories.

Part 2 concerns the remainder, the OWL you don’t use all the time. One field that we have covered in the past is the Upper Ontology (UO), an overarching ontology that can tie sub-ontologies together. DOE recommends Gist, Semantic Arts’ own ‘minimalist’ UO, with a nod to BFO, the basic formal ontology that is said to be ‘widely used by scientists’. The short section on the UO fails to capture the heated debate surrounding these issues although in his conclusion, Uschold observes that ‘if there are two ontologists in the room there will be at least three opinions on how to model a given thing’.

DOE does a good job at demystifying OWL. But is the technology really all it is cracked up to be? Chapter 7 addresses this question by enumerating OWL’s limitations. This chapter alone merits the book’s purchase by anyone contemplating an OWL project. The discussion of properties as property values shows how hard it is to model engineering units in semantic triples, suggesting some workarounds for such awkwardness.
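One widely used workaround, which may or may not be the one the book recommends, is to promote the measurement itself to a node that carries both a numeric value and a unit reference. A minimal rdflib sketch, with invented names:

```python
# One common workaround for units (not necessarily the book's): make the
# measurement a node of its own, carrying a numeric value and a unit reference.
from rdflib import Graph, Namespace, Literal, BNode
from rdflib.namespace import XSD

EX = Namespace("http://example.org/")
g = Graph()

pressure = BNode()                      # an anonymous 'quantity value' node
g.add((EX.Well_A, EX.hasTubingHeadPressure, pressure))
g.add((pressure, EX.numericValue, Literal(153.2, datatype=XSD.double)))
g.add((pressure, EX.unit, EX.bar))

print(g.serialize(format="turtle"))
```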

* Demystifying OWL for the Enterprise, Michael Uschold, Semantic Arts, Inc. Morgan & Claypool series on the Semantic Web: Theory and Technology. May 2018, 264 pages. Paperback ISBN 9781681731278, eBook ISBN 9781681731285, hardcover ISBN 9781681732831.

** Tim Berners-Lee unveiled the embryonic semantic web at the first meeting of the W3C in 1994. OWL was introduced in 2004.


OVS Optimization Matters Conference, OMC 2018

One Virtual Source’s user group hears from EP Energy, Parallel Petroleum, and Southwestern on deployment of the popular oil and gas production management and surveillance suite. McKinsey, Emerson and Hitachi chip in with OVS joint offerings.

OMC 2018 was hosted by Southwestern Energy earlier this year in The Woodlands, TX. OVS provides supplier-agnostic technology that connects to oilfield data sources along with packaged workflows targeting efficiency, data analysis and surveillance actions that lead to production improvements.

EP Energy changes operations in seven days

Following a reorganization early in 2017, EP Energy saw significant personnel reduction and rationalization in the form of a shift from distributed asset management teams and tools to centralized surveillance and operations. A demonstration by OVS flagship client ConocoPhillips convinced EP Energy’s Chris Josefy that it would be possible to ‘change how your company operates in 7 days’. OVS has been adapted and rebadged as EP’s Well360 asset surveillance and optimization platform. Prior to the upheaval, OVS was already deployed and its data was managed.

The residual issue was less a data problem than one of managing expectations. EP needed integration across its assets and subject matters along with better reporting concepts. Business Objects and Spotfire were considered before settling on OVS. OVS was selected because of its live data functionality, PDF/email automation and rapid implementation (one week!). Drilling metrics that were previously spread across OpenWells, Excel and Spotfire are now all in OVS. Questions raised in management meetings can be investigated in real time on live data, along with active tracking of capital costs and ‘financial visibility for all!’ Josefy is now working on HSE metrics, capital management in more complex operations and rig scheduling.

Emerson - PlantWeb IoT and well surveillance joint venture with OVS

Jose Jimenez presented Emerson’s approach to oilfield operations. Operational business improvement heralds a potential $7 billion value-add for Lower 48 operators from both an 8% hike in production and a 40% reduction in opex. The industrial IoT is a key component of the shift but ‘only 5% of companies have an in-depth IIoT strategy’. Enter the digital transformation, with Emerson’s PlantWeb IoT*, which will bring ‘best in class behaviors’. Among these is a shift from inappropriate time-based maintenance to an ‘ideal mix’ of predictive and preventive procedures. Automation likewise has a major impact on worker safety and environmental compliance. OVS and Emerson are currently working on joint well surveillance workflows that combine Emerson’s sensor data with other client data.

* PlantWeb long predates the current IoT. Our earliest reference goes back to 2006.

Hitachi Vantara’s ‘Hudson’ AI for the oilfield

OVS is also looking at a joint venture with Hitachi’s Vantara unit to apply the latter’s Hudson big data/AI solution to the oilfield, starting with production optimization with improved forecasting and failure prediction modules. David Smethurst and Derek Beckford described ongoing proof of concept trials using sample data provided by OVS. Hudson is already in use in the manufacturing industry and Hitachi sees similar potential for intelligent asset management and optimization through a technology collaboration. Hudson applies data modeling and active monitoring for continuous digitization of business operations and process improvements. The solution is claimed to address ‘problems with complex and at-scale data, which human or existing technology cannot resolve easily’. Oil and gas targets include unplanned downtime, information overload (un-analyzed sensor data) and operator overload (managing thousands of process alerts per day).

McKinsey - Digital ’18, SAAK and the ‘smartphone dividend’

McKinsey’s Richard Ward presented on digital trends in oil and gas, speculating on requirements for the next generation platform. The present decade has seen a shift from the Halliburton/Schlumberger duopoly to the current ‘single dominant player’ that is Schlumberger’s Petrel. Looking ahead to the 2020s, the big uncertainties concern the cloud’s suitability as a scientific computing platform, real time workflow integration, the impact of advanced analytics and smartphone app-based solutions. The picture for the Lower 48 is also driven by unconventionals’ special needs.

Ward described the ‘smartphone dividend’ that is benefitting heavy industry including oil and gas. McKinsey’s ‘Digital ’18’ blends IoT/robotics, advanced analytics and mobile. Digital ’18 sees ‘the end of cut and paste’ as reporting tasks are increasingly automated. Operations and maintenance are seen as a sweet spot for (offshore) digitization and workflow automation. The IoT is enabled by low cost measurement systems; Ward cited the AeroVironment PetaSense vibration monitor, a $25 device, and the replacement of dangerous physical inspection by drones. Ward also referred to McKinsey’s own ‘SAAK*’ solution, based on OVS technology. Ward envisages a shift from traditional engineers working with Excel to ‘hoodie’ engineers using R, Hadoop and Spark. An ‘interesting story’ is being told as independents (like EOG and Oxy) frame ‘post cost cutting’ competitiveness around digital technology.

* "Simple Access to Data Automated Knowledge" - McKinsey’s wrapping/offering around an OVS core.

Parallel Petroleum - OVS-based workflow automation

A presentation from Parallel Petroleum’s Robert Archer fitted into the post-downturn optimization-through-digitization paradigm. Parallel’s new requirements included transparent integration of production surveillance across corporate and field offices. Parallel integrated OVS’ new SQL database with in-house systems (OFM, WellView, Excalibur and ClearSCADA). The new solution has brought ‘dramatic’ transparency between field and corporate offices and, for the first time, lease operating expense (LOE) reporting at the well level. All for the cost of a ’single production tech for a year’. Archer concluded by enumerating the huge time savings that were made necessary by staff reductions in the downturn, and which were enabled by workflow automation in OVS.

Southwestern Energy - OVS displaces legacy toolset

Jim Vick from meeting host Southwestern Energy (SWN) reported on SWN’s partnership with OVS, which has seen the replacement of both an in-house developed tool and a commercial package by OVS, now integrating information from WellView, production data and geosteering. The solution is now deployed at all SWN assets across production, reservoir and completions teams.

OVS news - Workflow Libraries, new web client

OVS’s own Tim Loser outlined current and future developments including OVS Workflow Libraries (OWLs) and a new OVS Web client. OWLs are packaged workflow solutions addressing specific topics such as asset management and field data management. The new OVS web client is a completely re-architected user experience supporting dynamic dashboarding and data discovery. For more on OVS 18 see the recap.


ECIM 2018 E&P Data Management Conference, Haugesund

Introduction - Equinor’s Omnia and Digital Subsurface Laboratory. Data quality and the cloud. Agile plugs open source for upstream. New ‘Society for Petroleum Data Managers’ announced!

Highlight of the 2018 ECIM data management conference was Equinor’s announcement of a Digital Subsurface Laboratory (DSL) which is to drive digitalization in Equinor’s subsurface domain by ‘applying AI and data science technology to create value from key subsurface datasets.’ At the heart of the DSL is Omnia, Equinor’s cloud-based data platform that is to run as a ‘subsurface data lake’ in the Microsoft Azure cloud.

Along with the push to the cloud, Equinor, Teradata and others signaled a renewed focus on data quality as a prerequisite for AI/ML. Another apparent trend is the consecration of open source software for the corporate environment, with a keynote from Agile Scientific’s Matt Hall and endorsement of open source tools by Teradata (a technology partner in the DSL) and others. Shell’s Lars Gaseby announced the creation of a new ‘Society for Petroleum Data Managers’, based in Stavanger. The SPDM has support from ECIM, CDA and EU major oils. Finally, ECIM is a dense, information-packed venue with several parallel sessions. We will be reporting more from the 2018 event in our next issue.

Equinor’s Omnia and Digital Subsurface Lab

Tina Todnem (Equinor) and Duncan Irving (Teradata) presented Equinor’s subsurface digital transformation. Digitalization is said to have a $2 billion potential contribution to Equinor through drilling automation, subsurface illumination, production optimization and more. Usually, such a transformation takes ‘5 to 10 years’ to show value; Equinor plans to speed this up by running its Omnia data platform as a Microsoft Azure subsurface data lake*. Microsoft’s machine learning and analytical toolset will be deployed across the E&P data spectrum. To date, prospects and fields are studied with ‘large, complex grid-based models’. Equinor wants to add more empirical data to the mix, ‘not losing the model but adding data-driven, computer assisted optimization’, a ‘hugely multi-disciplinary task’. The idea is for an integrated decision-making framework that embeds the carbon footprint of oil and gas development.

The tricky part in building the ‘digital subsurface superhighway’ is data prep and obtaining good training data sets. This relies on combining open source tools, data standards, APIs and tacit information. The initiative involves collaboration between data managers, data engineers and data scientists. Equinor is a ‘huge fan’ of open source software, as witnessed by the ‘OPM’ reservoir simulator.

Teradata’s Duncan Irving provided some more details on the Digital Subsurface Lab, a ‘virtual’ joint venture between Teradata, Equinor and others. The DSL is to apply data science to optimization. But what to optimize? Current methods focus on an individual well or field by optimizing a few KPIs. But ideally, we should look at a bigger picture that embraces incentivizing personnel, safety and maximizing production. Enter the DSL’s ‘system of systems’ spanning subsurface, HSE, facilities and portfolio management.

One current issue is the fact that data management in the upstream does not yet have the right skill sets and fails to provide data in a usable format and of suitable quality. Irving advocates once and for all ‘proper’ management of data and metadata that will support ‘analytics on data we can trust’. Areas in need of attention include standardized vocabulary, file-level data validation and log edits and corrections. All of which must be done before machine learning can work. Quality comes when data management meets domain expertise. The DSL operates in 3-6 month sprints to minimum viable products. The DSL is also working with Energistics, Halliburton’s OpenEarth and the Open Group’s recently announced Open Subsurface Data Universe.

* But as we observed in our last issue, data residency issues have meant that the data lake is to be constructed in Norway.

Troika - Tempering the enthusiasm

Troika’s Jill Lewis picked up on the data quality issue with reference to the SEG’s quality-targeted standards for seismic data. Automation, machine learning and data science may be sexy, but ‘nothing can be achieved without standards’. For too long, seismic workflows have involved editing and futzing with seismic data. It is time to change and standardize. Now all are on board in principle. However, contractors still lay down the law with their proprietary formats which hamper the application of groundbreaking developments in data science. At the workstation, similar proprietary environments (Petrel, OpenWorks, Kingdom) make the cost of integration huge. The latest SEG formats are machine readable. Users should be knocking on the SEG’s door but in fact, SEG-Y R1 has seen ‘almost zero implementation’. Lewis and others (notably Stanford’s Stuart Levin) have provided machine readable SEG-D and an upgraded SEG-Y R2 to accommodate output from SEG-D. But yet again, people did not adopt them, making Lewis ‘very upset’. Even users of SEG-Y R2 tweak the header’s metadata in idiosyncratic fashion. ‘So now I get very angry!’ Seismics is at the heart of most oil and gas big data machine learning activity, from finding patterns in data to automating analysis of well logs, production data and life of field passive seismics. Lewis wound up citing a recent IEEE article on ‘Why the future of data storage is still magnetic tape.’ Mag tape, by the way, is used extensively in the cloud.
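For those wondering what is actually in their trace headers, the open source segyio library reads SEG-Y metadata from Python. A minimal sketch, with a placeholder file name; real-world header contents will of course reflect each contractor’s idiosyncratic tweaks.

```python
# Inspect SEG-Y headers with the open source segyio library.
# 'survey.sgy' is a placeholder file name.
import segyio

with segyio.open("survey.sgy", "r", ignore_geometry=True) as f:
    print(segyio.tools.wrap(f.text[0]))            # the textual (EBCDIC) header
    print(f.bin[segyio.BinField.Interval])         # sample interval from the binary header
    print(f.header[0][segyio.TraceField.CDP_X])    # a field from the first trace header
```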

Shell – data quality standards

Henk Tijhof reviewed Shell’s 10 principles of data management, first published in 2003. Today, only the ‘legal compliance’ principle is showing green. This led to a change of direction in Shell’s technical data management and a renewed attempt to ensure that next generation systems work as advertised. Here, standardization is the ‘next innovation’. Meanwhile Shell has disbanded its central data team and combined data management with operations.

In the big data arena, Tijhof reported that Shell’s experience to date shows that the focus needs to be on data streams not on the data lake. Today, the aims of Shell’s data managers are to ‘eliminate, simplify, standardize, automate’ (ESSA). This is done through data-driven workflows, eliminating silos and using workflow consultants to improve business processes. Shell’s LEAN-inspired ESSA was presented back in 2011 at APQC.

A data scientist’s take on the Digital Subsurface Lab

Laura Frolich and Larissa Glass (Teradata) outlined some of the projects they have been working on in the Equinor Digital Subsurface Laboratory (DSL). One direction is to eliminate or at least minimize data prep. Typically, this involves leveraging data from real time MWD/LWD systems alongside applications such as Oilfield Manager (OFM) and data from the Norwegian Petroleum Directorate (NPD). Other generic data sources also ran … Excel, ASCII and more proprietary formats. The data science tenet here is to consider raw data as ‘immutable’: it must be recorded as is and not touched. It can be stored along with process workflows that generate reproducible results. Problems still arise with data quality – missing/wrong values, naming conventions and poor documentation.

Current ‘smart’ workflows in Excel may be popular, but quickly become unmanageable: one error is fixed and now there are two spreadsheets. There is a big difference between what can be done in Excel and what will be done in Excel. Data science can be used to figure out what’s happening in Excel and replicate it in Python. Ten lines of Pandas dataframe Python code can combine OFM data with NPD metadata for ingestion into a predictive model. The LAS-IO utility also ran. The workflow adds standard column mapping and naming in a data cleansing pipeline prior to output in a ‘configured ADS*.’ For those still keen on Excel, data export in CSV remains an option. The DSL aims to stop reinventing wheels through agile techniques, parsing, templates, QC, profiling, analytics and visualization. The aim is for reproducible results and a reusable, trusted code base. Unit tests, documented APIs and inline comments also ran. Moreover, agile supports remote work; there is ‘no need to co-locate’. The DSL team is spread around the EU. Finally, yet another take on the data lake metaphor: the authors see the DSL as ‘a water conduit as opposed to a leaky bucket.’

* An analytical dataset, a Python data frame for analytics, a half-way-house between raw CSV files and the repository.
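The ‘ten lines of Pandas’ claim is plausible. The sketch below, with invented file and column names, joins an OFM-style production export with NPD-style well metadata and loads a log with lasio (the LAS-IO utility mentioned above); it illustrates the pattern, not the DSL’s actual code.

```python
# Hypothetical data-prep sketch: join production data with NPD-style well
# metadata and load a LAS log. File and column names are invented.
import pandas as pd
import lasio

production = pd.read_csv("ofm_export.csv")            # e.g. columns: WELL_NAME, DATE, OIL_SM3
npd_wells = pd.read_csv("npd_wellbores.csv")          # e.g. columns: wlbWellboreName, wlbField

ads = (production
       .rename(columns={"WELL_NAME": "wellbore"})
       .merge(npd_wells.rename(columns={"wlbWellboreName": "wellbore"}),
              on="wellbore", how="left"))

logs = lasio.read("15_9-F-1.las").df()                # LAS curves as a pandas DataFrame
print(ads.head(), logs.describe())
```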

Comment – It seems as though data science is a return to the scripting environment of the late 20th Century, with Python stepping in for the Unix shell.

Schlumberger Delfi and the polycloud

Paddy Dineen provided more details on Schlumberger’s new data ecosystem for its Delfi cognitive environment. Dineen observed that contrary to the received view, management never bought into the ‘data has value’ story. This is (hopefully) set to change with greatly improved access to relevant data, workflow automation and a ‘unified understanding of risk including HSE’. Currently, key decisions are based on and/or captured in email and PowerPoint rather than in a geoscience application. This is changing with ‘digitalization’, machine learning and cloud-based workflows. Dineen stated confidently, ‘the cloud(s) are secure’, particularly with added security from specialist providers such as Carbon Black and Palo Alto Networks. Dineen cited ThoughtWorks’ position paper on ‘polycloud’ solutions that pick and mix services and solutions across Amazon and Google.

Delfi, Schlumberger’s ‘cognitive’ E&P environment, will leverage Google for its data ecosystem while its ‘cognitive’ environment spans Google and Microsoft Azure. The Apigee API for polycloud data flow got a mention. Dineen further opined that ‘inter-cloud replication might be a way to optimize’ data use. The data lake mandates immutability. This is hard to achieve on premise, which is where the extensibility of the cloud comes in. So far, little big data/AI is done in industry because the environments are not adapted. Dineen showed Delfi slideware combining just about every imaginable data type.

From a Petrel endpoint, access is possible to various processing services running in the background. Data relationships leverage the ‘well known entity’ concept to blend corporate data with other (IHS) datasets. A WKE widget pulls up well and related information. ‘This is not rocket science and should have been automated 20 years ago’. Delfi is currently at the stage of agile development of minimum viable products. These are already in proof of concept trials with several oils.

In the Q&A, Dineen was asked if SeisDB, ProSource and so on run inside Delfi. They do not, but they can be used to ingest data. ‘In a few years such services will likely be embedded in Delfi’. Dineen also clarified that ‘polycloud’ does not mean cloud agnostic. ‘We currently consider Google as best for analytics and Amazon for data storage’.

Arundo analytics on false positives and video analytics

Simon Francis (Arundo Analytics) reported that some data scientists have left the industry because ‘you haven’t got anything for me to do’. But as compute and storage costs fall, doing analytics on hundreds of thousands of pumps worldwide is getting feasible. Deep learning is taking over from Excel. Rather than starting out with a data lake, Francis recommends starting small and focused. Many data science projects end at the PowerPoint stage as they prove hard to scale to production. One example is temperature and vibration data analysis of rotating equipment.

Current AI systems tend to flood operators with zillions of reports that ‘something is breaking’. Static alarms do not capture failures adequately. It is better to roll in multiple data sources – P&ID, sensor, maintenance – and to do this quickly! ‘No 18 month projects!’ Arundo deploys a Dell Edge gateway and OPC-UA communications. One non oil and gas use case has been using ML on video footage of a weaving loom to spot when the thread gets to the end of the reel. Similarly, a video cam can be trained on a flowmeter dial! Another use case involved predicting compressor failure. This was successful in that it predicted a breakdown three weeks ahead of time, but, unfortunately, ‘no one was looking at the screen!’

Dell EMC - a roadmap from ‘democratic’ to enterprise AI

David Holmes began by tempering the general enthusiasm for AI as a panacea for data managers’ problems. Currently, as reported in Forbes, ‘80% of data science has little to do with data science’. This recalls Shell’s year 2000 finding that geoscientists spent ‘75% of their time wrestling with data’. In today’s hackathons, getting up and running with datasets is the hardest part. For example, at the Dell/Agile AAPG hackathon, it took participants a couple of hours futzing with data in Excel before AI could be tried. Even when you have dealt with such issues, there are multiple, more or less incompatible AI/ML frameworks.

All of which has not stopped the growing underground community inside corporates using home-brew Linux workstations with GPUs. This is great, but begs the question as to how such work can evolve into a scalable enterprise system. For Dell EMC the answer is the cloud, possibly Dell’s own Virtustream Enterprise cloud, where Jupyter notebooks can be managed and models tracked and controlled. IT also has a role providing reviews from formally-trained computer scientists. But combining the whole open source based toolset is not something that adds value to the business. Enter the recently-launched Dell EMC AI platform, a bundle of Domino Data, Cognizant, H2O.ai and more.

The new/citizen data scientist represents an important new community in our industry. These folks can solve problems in a way that would previously have been unimaginable. In any case, ‘folks no longer want to sit in front of Petrel all day!’ Better to examine and test data in new and interesting ways. Holmes called out Enthought, Earth Science Analytics and (again) the Agile hackathons (seven this year) for which there is huge demand. Senior execs show up with laptops and hack Python code. Dell EMC’s Ready Solutions for Big Data add data provenance and algorithm management and governance to the underground movement.

Loxodrome – adding open source, graph technology to Esri

Grahame Blakey and Nick Fosbery’s Loxodrome startup is to offer ‘insight through integration’, adding open source technology to corporate clients’ stacks. Loxodrome is an Esri partner and also uses Logi Info, Neo4J, Apache Solr and lots more stuff. The idea is to bring the combined power of geospatial and IM to the business. This could, for instance, address a new ventures problem when a farm-in opportunity arises that requires accessing data across Petrel, IHS and more. Loxodrome has used the community/developer edition of Neo4J and its Cypher query language. This has led to an easy to deploy, rich, flexible data model of seismic surveys, which is claimed to be better than an Oracle RDBMS. Loxodrome is now building an E&P data portal with Geocortex and Logi Info, running atop Solr, SAFE FME and Esri. For more on Neo4J see our report from the 2017 Graph Connect event.
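To give a flavor of the graph approach, the sketch below creates and queries a toy seismic survey model in Neo4j’s Cypher via the official Python driver. The connection details, node labels and relationships are illustrative assumptions, not Loxodrome’s actual schema.

```python
# Illustrative Cypher sketch of a seismic survey graph (not Loxodrome's actual model).
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Build a tiny graph: a survey covers a licence in which a well was drilled
    session.run("""
        MERGE (s:SeismicSurvey {name: 'NVG-2018'})
        MERGE (l:Licence {name: 'PL001'})
        MERGE (w:Well {name: '35/11-1'})
        MERGE (s)-[:COVERS]->(l)
        MERGE (w)-[:DRILLED_IN]->(l)
    """)
    # Which wells sit inside the footprint of which surveys?
    result = session.run(
        "MATCH (s:SeismicSurvey)-[:COVERS]->(:Licence)<-[:DRILLED_IN]-(w:Well) "
        "RETURN s.name AS survey, w.name AS well")
    for record in result:
        print(record["survey"], record["well"])

driver.close()
```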

Kzetta - and the answer (to everything) is blockchain!

Khalid Abbas, of Lean methodology consultants Kzetta.com, sees blockchain as the panacea for, well, just about everything to do with the digital transformation. Blockchain is to ‘transform the information asset’ with decentralized, anonymous, time stamped data and information. Use cases are seismic reports, cyber intrusion detection and more. One blockchain flagship is the Ponton Enerchain*, a putative EU energy trading platform trialing with Total and (many) others. Abbas ventured that seismic data, including prestack, could be ‘put on blockchain’. Questioned on the energy sustainability of blockchain, Abbas opined that he was ‘against bitcoin’ and that there was no answer to this question. In oil and gas there will be a private blockchain, ‘not Ethereum’.

* We tried contacting the Ponton site through the online form and received a warning about an insecure, badly configured website and then a 404.

Teradata – Back to the future with one big DB

Jane McConnell observed that current initiatives (like Omnia) are moving ‘back to the future’ with one big database blending subsurface and topside data from engineering documents. This mandates joining-up different parts of the business. But here, current data management organization does not help as folks still work in different silos, perhaps each with its own IT. Buy (instead of build) caused the problem and you ‘can’t put a sticky plaster over the top!’

So, back to building your own stuff. As an example, McConnell cites the use of open source parsers such as the DLIS parser from the Agile-maintained SubSurf Wiki. McConnell also opined that doing data quality, fixing CRS/UoM issues up front is a good idea! And, as we have heard before (from Teradata in fact) you (still) do need a data model, or, at least, some agreement on terminology. It can be hard for subsurface and topside people to talk to each other. What is an ‘asset’ anyway! Operators need to plan for turnover and change – and ‘don’t outsource’.

Halliburton - new business model for co-innovation

Lapi Dixit, Halliburton’s Chief Digital Officer, is a believer in Industry 4.0 and cyber physical systems. Automation and AI will form the bedrock of open interoperable systems. Dixit reprised Halliburton’s OpenEarth initiative as a ‘new business model for co-innovation’. One example is in drilling automation where competing metrics and objectives can be reconciled with predictive models. The trend is for more sensors, edge computing, machine learning and an E&P digital twin running on a cloud agnostic open data platform that is ‘not limited to a single vendor.’

Schlumberger – AI to underpin everything in the industry

Steve Freeman, Schlumberger director of AI, believes that the disruptive technology will ‘underpin everything in this industry’. ‘You all need to be inside this space’. The traditional view of the upstream involves taking lots of data, doing complex stuff and giving oneself a pat on the back – surely not amenable to AI? Wrong! AI applies closely here. Enablers (all are free) are classification and deep learning, but still with the human in the loop. Seismic interpretation currently takes too long. So, train a convolutional neural net to pick faults. The data specialist does the bulk of the work and hands over to the interpreter for the final detailed pick. Top salt pick time in the Gulf of Mexico is down from 4 weeks to 4 hours for 95% of the area of interest.
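At its simplest, such a fault picker is a patch classifier trained on labeled seismic image tiles. The sketch below, using Keras on random stand-in data, shows the shape of such a model; it is illustrative only and bears no relation to Schlumberger’s actual implementation.

```python
# Minimal sketch of a CNN patch classifier for 'fault / no fault', using Keras.
# The random arrays stand in for labeled seismic image patches.
import numpy as np
from tensorflow import keras

patches = np.random.rand(512, 64, 64, 1).astype("float32")   # synthetic seismic tiles
labels = np.random.randint(0, 2, size=(512,))                # 1 = fault present

model = keras.Sequential([
    keras.layers.Conv2D(16, 3, activation="relu", input_shape=(64, 64, 1)),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(patches, labels, epochs=2, batch_size=32, verbose=0)
print(model.predict(patches[:1]))   # probability of a fault in the first patch
```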

Petrophysics is also an obvious target, except that data is ‘over constrained’. Freeman cited Woodside’s 2018 SPWLA presentation that sped log interpretation ‘from weeks to hours’ with 90% accuracy. Better still, the machine ‘knew’ when it was not confident of its work. First pass interpretation is handed over to a petrotech for final QC.

NLP on reports is another promising area. Freeman cited the recent work on the UK O&G Authority’s scanned data set, now searchable from Petrel and capable of showing a heat map of lost circulation as detected from well reports. Again, the Apigee API got a plug. Schlumberger is reporting ‘proven value’ from AI, with a 3X speedup on simulation times and 90% off petrophysical interpretation time. ‘If you don’t believe, think about a new occupation.’

Earth Science Analytics - ML for geoscience

Eirik Larsen presented Earth Science Analytics’ machine learning offering for geoscientists. Large industry datasets (such as the NPD’s) are underused because expert interpretation is costly and slow. Machine learning can help with interpretation and in performing inversion and rock physics in one step. ESA’s user-friendly ML software GUI offers an ‘Alexa’-style assistant capable of answering questions like ‘show me the porosity distribution of wells in quad A.’
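Property-from-logs prediction of this kind can be prototyped in a few lines with scikit-learn. The sketch below uses synthetic stand-ins for GR, RHOB and NPHI curves and a porosity label; it illustrates the workflow, not ESA’s product.

```python
# Sketch: predict porosity from well log curves with a random forest.
# Synthetic data stands in for real GR/RHOB/NPHI curves and porosity labels.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
logs = pd.DataFrame({
    "GR": rng.uniform(20, 150, 2000),        # gamma ray
    "RHOB": rng.uniform(2.0, 2.8, 2000),     # bulk density
    "NPHI": rng.uniform(0.05, 0.4, 2000),    # neutron porosity
})
porosity = 0.45 - 0.12 * logs["RHOB"] + 0.3 * logs["NPHI"] + rng.normal(0, 0.01, 2000)

X_train, X_test, y_train, y_test = train_test_split(logs, porosity, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
print("R^2 on held-out samples:", round(model.score(X_test, y_test), 3))
```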

The next ECIM will be held from 16-18 Sept 2019.

Folks, facts, orgs ...

Information Coalition, ARMA, Implico, American Petroleum Institute, Atwell, Badger Meter, Brown & Root, Clean Energy Fuels, Fugro, Gravity Oilfield Services, Harris Corp., Hunting PLC, Kymeta, Milestone Environmental Services, MIT, Premier Oilfield Group, CSA Ocean Sciences, Carnegie Mellon Software Engineering Institute, Shell Silicon Ranch, Simmons & Company, Swagelok, Railroad Commission of Texas, ValTek Industries, ConocoPhillips, Neptune Energy, Indegy.

Following its merger with the Information Coalition, Jocelyn Gunter remains ARMA CEO while IC president Nick Inglis onboards as executive director of content and programming.

Tim Hoffmeister is now CFO and co-MD at Implico. He hails from Cognizant Technology Solutions.

The American Petroleum Institute has appointed Debra Phillips as VP of its Global Industry Services division. She was formerly with the American Chemistry Council. Bill Koetzle joins as VP of Federal Affairs. He was previously with Chevron.

Matt Rosser is VP of Oil and Gas operations at Atwell. He hails from TRC Solutions.

President Kenneth Bockhorst is now CEO at Badger Meter succeeding retiree Richard Meeusen who remains Chairman of the board through 2019.

Brown & Root has appointed Eric Balkom as Executive Director Engineering and Kole Ambeau as Executive Director Construction in the Western Region.

Total Marketing Services president, Momar Nguer and SVP Philippe Montantême have joined Clean Energy Fuels’ board of directors. Co-founder T. Boone Pickens is to retire as director of the board and becomes director emeritus.

Mark Heine has been promoted to Fugro CEO and Chairman of the Board of Management. He will succeed Øystein Løseth who is stepping down for personal reasons. Løseth will act as an advisor until the end of 2018.

Gravity Oilfield Services has appointed Rob Rice as CEO and President. He was previously with Archrock.

Dana Mehnert is now president of Harris Corp.’s Communication Systems business. He succeeds retiree Chris Young.

Roger Findlay is to lead the newly launched Hunting PLC TEK-HUB at the company’s Badentoy, Aberdeenshire facility.

Kymeta has named Rash Jhanjee to VP of Sales EMEA & Indian subcontinent, Paul Mattear to VP of Sales and Business Development, and Scott Glass to Sales Director.

Milestone Environmental Services has hired Glenna Pierpont as VP of HR and Richard Leaper as VP of Sales and Marketing.

Noelle Selin is now director for the Technology and Policy Program (TPP) at MIT.

Founder of Rybkarock, Sau-Wai Wong has joined Premier Oilfield Group as VP of Technical Software.

Bob Erickson is now Northeast Region Manager and Senior Scientist at CSA Ocean Sciences. He hails from ESS Group.

The Carnegie Mellon Software Engineering Institute has named Heidi Magnelia to CFO, replacing retiree Peter Menniti.

Shell’s Silicon Ranch solar unit has appointed Jim Bausell as its Executive VP for Business Development.

Simmons & Company has hired Ryan Todd as senior research analyst and head of E&P research, and Blake Fernandez as senior analyst for energy equity research.

Swagelok has selected Solon (Ohio) for its new Global Headquarters and Innovation Center.

Danny Sorrells is now assistant Executive Director and Oil and Gas Division Director at the Railroad Commission of Texas.

ValTek Industries has appointed current CEO Shane Hamilton as Chairman and Christopher Hibbert as President.

Al Hirshberg is to retire as ConocoPhillips’ executive VP, Production, Drilling and Projects. Following Hirshberg’s retirement Matt Fox has been named executive VP and COO and Don Wallette as executive VP and CFO.

Armand Lumens has joined Neptune Energy as CFO. He hails from Louis Dreyfus Company.

Cyber security specialist Indegy has announced the appointment of Joe Scotto to CMO and Todd Warwick as VP of Sales, Americas.

Xtreme Drilling chairman Douglas Dafoe is now on the AKITA Drilling board of directors.

Deaths

ARMA International has announced the death of its former Executive Director Jim Souders at the age of 80.


Done Deals

Akita Drilling, Xtreme Drilling, Akselos, Innogy Ventures, Shell Ventures, Altair, Simsolid, Applied-Cleveland, STS Consulting, ARMA, Information Coalition, BHGE, ADNOC Drilling, Vela Software Group, Coreworx, Drillinginfo, Oildex, EMAS Offshore, Halliburton, Barree Software, Indegy, Innovex Downhole Solutions, Buckhorn Casing Equipment, NetApp, StackPointCloud, Pacific Drilling, RS Energy Group, Petroleum Policy Intelligence.

AKITA Drilling has acquired Xtreme Drilling in a paper and $45 million cash deal, financed in part from a credit facility from ATB Financial.

Akselos has completed a $10 million financing round led by Innogy Ventures with Shell Ventures as co-investor. The monies will be used to develop structural analysis software, big data analytics and machine learning. Akselos’ digital technology has been ‘emerging’ for some 15 years and is marketed through an exclusive license with MIT. The ‘digital twin’ technology was further developed at the Swiss Federal Institute of Technology.

Altair has acquired Simsolid’s ‘almost magical’ technology that combines design and structural simulation geometries.

Applied-Cleveland has acquired STS Consulting’s inspection and integrity management unit that services North American midstream customers.

ARMA, the American records management association, has taken over the Information Coalition, organizer of the annual Information Governance Conference and purveyor of best practices in information management.

BHGE has acquired a 5% stake in ADNOC Drilling, in a deal valuing the driller at some $11 billion.

Constellation Software’s Vela Software Group has acquired Coreworx. Constellation boasts consolidated revenues exceeding US$2.7 billion and some 14,000 employees worldwide.

Drillinginfo has acquired Oildex, an Accel-KKR portfolio company.

EMAS Offshore has appealed the Financial Supervisory Authority’s delisting of its shares from the Oslo Børs (Norway’s stock exchange).

Halliburton announced the acquisition of Barree Software’s Gohfer fracture modeling software for conventional and unconventional well completion design, analysis and optimization.

Indegy has raised $18 million in new finance to accelerate ‘go-to-market’ for its industrial cyber security portfolio. The Series B round was led by Liberty Technology Venture Capital, a subsidiary of Liberty Media, with participation from international energy and services firm Centrica plc, O.G. Tech Ventures and existing investors.

Innovex Downhole Solutions has acquired the assets of Buckhorn Casing Equipment. Innovex is backed by Intervale Capital, a private equity firm that invests in oilfield manufacturing and service companies.

NetApp has acquired StackPointCloud, a provider of multi-cloud Kubernetes as-a-service. The combination, aka the NetApp Kubernetes Service is described as a ‘complete, cloud-based stack for Azure, Google Cloud, AWS and NetApp HCI’.

Pacific Drilling announced progress in its Chapter 11 proceedings as the bankruptcy court approves the Company’s $700 million deal with a third-party financial institution and a ‘backstop commitment’ from creditors. The court also approved implementation of the 2018 key employee incentive plan.

RS Energy Group has acquired UK-based Petroleum Policy Intelligence, a provider of expertise in Middle Eastern and North African energy policy and of key players in the global oil market.


Going, going, green

Methane, emissions control, CCS, climate, politics ... from Stanford, Bluefield, Bluesource, Office of Fossil Energy, NETL, IOGP, ISO, IFPen, Aerovia, NanoVapor, NAP, Oxy, White Energy, WellDog, Virginia Tech, Carbon GeoCycle, ExxonMobil, IEA, Integrated Informatics, Equinor, Opus 12, Texas Alliance of Energy Producers, Texas Railroad Commission.

A new study from Stanford University, published in the journal Science, finds that US oil and gas methane emissions are 60% higher than the EPA reports. The study was led by Environmental Defense Fund (EDF) researchers, with co-authors from 15 other institutions including Stanford’s School of Earth, Energy & Environmental Sciences. Fifty oil and gas companies provided site access and technical advice. The researchers measured emissions at over 400 well pads in six basins and at scores of midstream facilities, and conducted aerial surveys covering large swaths of oil and gas infrastructure. EDF recently announced plans to launch MethaneSAT, a purpose-built satellite designed to measure and map human-caused methane emissions almost anywhere on earth. Due for launch in 2021, MethaneSAT will help both countries and companies track problem areas, find solutions and monitor their progress.

Bluefield’s satellite-based methane detection technology is claimed to detect natural gas leaks by measurement of ‘distortions’ in sunlight reflecting from the earth surface. Bluefield recently got VC backing from Village Global, a VC backed by Jeff Bezos, Bill Gates and Mark Zuckerberg.

Quebec-based Inlandsis is to support Bluesource’s methane reduction program (MRP) in Alberta. The program reduces methane emissions from upstream oil and gas operations and will generate offset credits under Alberta’s Carbon Competitiveness Incentive regulations. Bluesource is to replace thousands of pneumatic devices across Alberta with low-emissions valves. The MRP is projected to reduce emissions by some 3 million tonnes of carbon dioxide equivalent by 2022. The first offset credits are expected to be delivered this fall.

The US Department of Energy’s (DOE) Office of Fossil Energy (FE) has selected five new projects to receive $11.3 million in federal funding for cost-shared R&D. The projects are supported through the funding opportunity DE-FOA-0001792, ‘Novel and enabling carbon capture transformational technologies’. The National Energy Technology Laboratory (NETL) is to oversee the projects, whose focus is mainly on capture of CO2 from electricity generation (as opposed to sequestration).

Earlier this year, NETL announced a joint venture with the industry-backed ‘Our nation’s energy future’ (ONE Future) to research methane emissions from natural gas production and delivery. The initiative is to enable the development of US shale gas and ‘move the nation closer to a broader policy goal of energy dominance’. NETL is using its expertise in methane life-cycle analysis (LCA) to benchmark member companies’ methane performance against the industry average. LCA encompasses the environmental, economic and social attributes of energy systems, from extraction through transport and use. The LCA model is claimed to be ‘one of the most sophisticated energy models that, in combination with other analytical data, enables policymakers and the public to discern the impact of technology-policy choices’. A report, ‘Industry partnerships and their role in reducing natural gas supply chain greenhouse gas emissions’, is available from NETL.

The UK-headquartered IOGP has signed-on to the Climate and Clean Air Coalition’s ‘methane guiding principles’ (MGP). The MGP emanates from a coalition of international bodies including the Environmental Defense Fund, the IEA and the UN. The principles focus on reducing methane emissions, improving the accuracy of methane emissions data and increasing transparency. The IOGP’s commitment does not as yet extend to its members, although BP, Chevron, Eni, ExxonMobil, Qatar Petroleum, Repsol, Shell, Statoil, Total, Wintershall and Woodside have signed independently.

A new ISO standard is claimed as a ‘powerful new weapon’ in the fight against climate change. ISO 14080:2018, Greenhouse gas management and related activities, provides a framework and methodologies for GHG reduction. The standard itself is not freely available: it costs CHF 158.

IFPen and partner Aerovia report deployment of their jointly-developed GasMap methane detection service at an underground gas storage site. GasMap uses Aerovia’s QCNose, on board a vehicle, to sample methane concentrations at ground level and in the atmosphere. Data is integrated with real-time weather feeds and modeled to distinguish between industrial and biogenic gas sources.

NanoVapor has announced a ‘disruptive’ technology to reduce hazardous fuel vapors (VOCs) in the aviation, oil and gas and industrial markets. NanoVapor’s proprietary vapor suppressant and scrubber is optimized for hydrocarbons such as gasoline, diesel and jet fuels, and crude oil. More from NanoVapor.

Carbon capture and sequestration

The US National Academies Press has just published the 13 page proceedings of a 2017 workshop on carbon mineralization and the subsurface storage of CO2. There are four storage/use options for CO2: (1) injection into depleted oil and gas reservoirs, (2) use of CO2 as part of enhanced oil and gas recovery, (3) injection into deep saline formations (onshore and offshore) and (4) use of CO2 in enhanced coal bed methane recovery (and possibly shale). Stanford researcher Sally Benson put the potential global storage capacity at ‘between 5,000 and 25,000 giga tonnes*’ of CO2 and reported that ‘large-scale projects for capturing CO2 from anthropogenic sources and storing it underground are expanding worldwide**.’ The proceedings include state-of-the-art reports on CCS projects and computer modeling exercises from around the world.

* The upper end number is around half of worldwide CO2 emissions from fossil fuels.

** We have previously reported on a marked slow-down in CCS trials – see our 2017 editorial, ‘COP23 BECCS, FECCS and the future of fossil fuel.’

A Stanford University analysis published in the journal Joule proposes a model for how relatively small government payments could encourage carbon capture and storage (CCS). The proposal is for payments to companies performing EOR with CO2 captured from refineries, power plants and other sources contributing to climate change. A tenfold increase in the amount drawn from these sources could shrink the nation’s climate emissions by 10%, ‘even when accounting for the additional oil extraction made possible by injecting all that carbon’. More from Stanford.

Occidental Petroleum and White Energy have embarked on a feasibility study of CO2 use in enhanced oil recovery. CO2 from White’s ethanol facilities will be transported to the Permian Basin for EOR. The project results from the passage of the US Future Act and is designed to be eligible for 45Q tax credits and California’s Low carbon fuel standard CCS protocol. The Future Act, which became law in February 2018, addresses the conversion of CO2 emissions from industrial sources to a commodity product ‘that can be stored in a secure geological formation through EOR’.

The US Department of Energy’s (DOE) Office of Fossil Energy (FE) has selected two projects to receive approximately $7 million in federal funding for R&D into underground CO2 sequestration. Beneficiaries of funds from DE-FOA-0001829, aka ‘developing technologies for advancement of associated geologic storage in basinal geo-laboratories’ are U Illinois (stacked greenfield and brownfield ROZ fairways) and U North Dakota (Williston Basin CO2 field lab).

WellDog, Virginia Tech and Carbon GeoCycle have announced verification of carbon dioxide sequestered in underground rock formations. WellDog’s downhole geochemical Reservoir Raman System showed that CO2 injected over the last two years successfully flowed into all of the targeted coal seams. The project, located in Buchanan County, Virginia, involves injecting over 13,000 tons of CO2 into unmineable coal seams at depths of 900 to 2,000 feet with the goal of ‘storing carbon dioxide while simultaneously enhancing natural gas recovery’. More from Virginia Tech.

Fluor Corp. has signed an agreement for the use of the test facility at Gassnova’s technology center at Mongstad, Norway. The companies are to trial a new chemical solvent to separate carbon dioxide from industrial flue gases.

Climate

ExxonMobil has joined the Oil and Gas Climate Initiative, a global initiative to provide practical solutions to climate change mitigation. The 13 member-strong OGCI targets carbon capture and storage, methane reductions and energy efficiencies. ExxonMobil is to expand its R&D into long-term solutions to reduce greenhouse gas emissions. These include a 15% reduction in methane emissions by 2020 and 25% reduction of flaring. OGCI members are BP, Chevron, CNPC, Eni, Equinor, ExxonMobil, Occidental Petroleum, Pemex, Petrobras, Repsol, Royal Dutch Shell, Saudi Aramco and Total.

According to the International Energy Agency, the world is currently not on track to meet the main energy-related goals of the 2015 COP21 sustainable development scenario (SDS). The IEA has therefore tweaked the model to propose a new SDS involving a ‘major transformation’ of the global energy system, that delivers on the three main energy-related goals. The IEA’s scenario includes some implausibly ambitious projections for electricity access in the third world.

Negawatts and other greenish stuff

ISO has updated its energy management standard. ISO 50001:2018 features updated terms and definitions and clarification of energy performance concepts.

A NETL-led team has released Gogi, the global oil and gas infrastructure database. Gogi sets out to mitigate natural gas infrastructure failures with an inventory of oil and natural gas infrastructure information. Gogi identifies over 4.8 million features (wells, pipelines and ports) across 194 countries with data on the age, status, and owner/operator. Gogi is available from the NETL Energy Data eXchange (EDX), along with other resources.

The Newfoundland and Labrador Innovation Council has awarded funding to Integrated Informatics for the development of a personnel and asset tracking data management system for marine emergency response. The system will target marine oil spill response with real-time tracking and GIS technology.

Among the ten companies selected for Equinor’s Techstars Energy Accelerator program is Opus 12, a spin-off from Stanford University. Opus 12 combines CO₂ and water to produce ‘higher-energy carbon-based products’. Thermodynamics means that the process is ‘energetically uphill’, so electricity is required to ‘drive the reaction’. It’s not clear what problem Opus 12 is trying to solve as the output from the process is … another hydrocarbon. Along with Equinor, Techstars is backed by Kongsberg and McKinsey.

Politics

The Texas Alliance of Energy Producers, commenting on the EPA’s proposed changes to the regulation of methane leaks, seeks to ‘repair’ what it sees as over-reaching federal regulation in the oil patch. TAEP president John Tintera observed, ‘methane is natural gas, virtually the same gas that we cook and heat our homes with every day. It is a very common gas emitted by wetlands and livestock. It is a by-product of life itself.’ While oil field methane leaks ‘should be detected and fixed’, heavy-handed federal regulations are considered ‘punitive, costly and unfair to the small independent producer’. At issue is the inclusion of stripper wells with marginal gas production under the new regulatory régime.

In an address earlier this year, reported in the Texas Tribune, Texas Railroad Commissioner Wayne Christian opined that ‘the science on climate change is not settled’. Christian railed against folks who see oil and gas as an antiquated energy source that will soon be replaced with so-called ‘green’ alternatives. ‘Fossil fuels are going to remain our primary source of energy for the foreseeable future. In my opinion, that is not a bad thing.’

Whatever your opinion on these issues, recent events in France have shown how hard even a modest move towards a carbon tax (along with an equally modest reduction in the speed limit on country roads) can be to get past the masses. A few euro cents per liter increase in gasoline and diesel taxes and a reduction from 90 to 80 kph brought thousands of ‘gilets jaunes’ into the streets to complain that a) gas was too expensive and b) they were being denied the right to use more gas by driving faster.


Machine Learning and Artificial Intelligence Congress 2018, Houston

LBCG event hears from early adopters of ML/AI in oil and gas.

London Business Conferences Group’s Machine Learning and Artificial Intelligence Congress 2018, held in Houston earlier this year, heard from some early adopters of AI/ML in oil and gas. Along with the growing interest in these compute-intensive technologies comes a desire to revolutionize onshore Scada systems, replacing incumbents’ expensive proprietary infrastructure with DIY systems that leverage consumer/hacker-style technology (think Raspberry Pi) and pump data straight to the cloud, again avoiding legacy (and expensive) proprietary wireless and satellite links. So, what ‘disruptive’ startups are engaging in this data revolution? Well, let’s start with BP.

BP Lower 48 - failure analysis on the Raspberry Pi

Eric Penner presented an ‘intelligent operations’ initiative underway at BP’s Lower 48 Onshore unit. Intelligent operations (IO) is a portmanteau term encompassing new ways of working, using technology to automate routine activities. Analytics-backed logistics maximizes time available for on-site problem solving and ‘makes us a safer, more environmentally responsible operator.’ Here the focus is root cause failure analysis (RCFA) which, Penner boldly states, can be used to ‘understand our vendors’ businesses better than they do*’. RCFA mandates more data from operations, which is where the Raspberry Pi, a $50 DIY computer, comes in as a ‘disrupting alternative’ to traditional RTU/Scada systems. The Raspberry Pi underpins an ‘open platform’ that commoditizes endpoint devices and promises the economical automation of all of BP’s wells. The Pi is set to ‘shave millions’ from BP’s automation budget by replacing costly legacy scada systems, technologists and wireless LANs with ‘consumer-style technology hooked into the cloud!’ Other components of BP’s IO include the FieldBit Hero visual collaboration platform for field services. BP was the first onshore oil and gas company to deploy the augmented-reality smart glasses that empower technicians to solve complex problems on the spot. More aligned with the ML/AI theme of the conference is BP’s ‘Arrow’ progressive logistics model and work management solution that uses a BP-developed algorithm to provide a value-optimized route for pumpers, reducing site visits by up to 50%. Arrow automates the task of efficiently deploying resources and eliminates wasted effort.

* This claim merits examination. While an operator may know more about the operating environment of an equipment item, the manufacturer has access to potentially many more data points from multiple clients. The issue of failure data sharing is somewhat fraught.
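
As a purely illustrative sketch of the ‘Pi plus cloud’ pattern, the Python snippet below polls a hypothetical wellsite sensor and publishes readings to an MQTT broker. The broker address, topic and sensor read function are invented; BP’s actual platform is not public.

```
import json, time
import paho.mqtt.client as mqtt

def read_tubing_pressure():
    # Placeholder for a real ADC / Modbus read on the Pi.
    return 412.7

client = mqtt.Client()
client.connect("broker.example.com", 1883)      # hypothetical cloud MQTT endpoint
client.loop_start()

while True:
    payload = json.dumps({"well": "W-001",
                          "tubing_psi": read_tubing_pressure(),
                          "ts": time.time()})
    client.publish("lower48/wells/W-001/pressure", payload, qos=1)
    time.sleep(60)
```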

Exco Resources - Node-RED, the Pi and Hackster!

Johnathan Hottell (Exco Resources) is a Raspberry Pi enthusiast, although he nuanced his enthusiasm for the new low-cost solutions: traditional Scada ‘is not going away for real-time control’, the technologies complement each other. Hottell recommends teaming a subject matter expert with a data scientist. One use case is an investigation into well liquid loading. Node-RED’s flow-based programming for the IIoT* runs on a Pi and polls legacy devices, converting and forwarding messages over many different protocols. Exco’s well liquid loading ML experiment is described on the Microsoft Azure ML Gallery, and a full write-up with more on Node-RED and the Pi is available on the Hackster hobbyist website. Ignition Scada also got a mention, as did Microsoft Azure ML Studio and the Cortana AI gallery. Exco’s ML-derived liquid loading classifier is now deployed as a web service. Exco has also used machine vision to check, for instance, whether a flare is still burning or whether there is unauthorized vehicular access to a remote well site.

* Industrial internet of things.
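
Deploying a classifier ‘as a web service’ typically boils down to an HTTP scoring call. The sketch below is hypothetical: the endpoint URL, feature names and response format are invented for illustration and are not Exco’s.

```
import requests

url = "https://liquid-loading.example.com/score"     # hypothetical endpoint
payload = {"casing_psi": 310, "tubing_psi": 145,
           "gas_rate_mcfd": 220, "line_psi": 95}
resp = requests.post(url, json=payload,
                     headers={"Authorization": "Bearer <api-key>"}, timeout=10)
print(resp.json())   # e.g. {"liquid_loading": true, "confidence": 0.87}
```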

Digital Transformation - eScience and ‘rock star’ software engineering

Mark Reynolds’ Digital Transformation startup is working on optimizing artificial lift with ML. Artificial lift has come a long way from the days of data logging and manual control of pumps to today’s situation where engineering means ‘watching everything all the time’. Reynolds cited Microsoft researcher Tony Hey’s presentation on ‘eScience and the Fourth Paradigm’, aka data-intensive scientific discovery and digital preservation. ‘eScience is the set of tools and technologies that support data federation and collaboration’. Driving the eScience transformation is the enterprise architect, an experienced, interdisciplinary oil and gas engineer and a ‘rock star’ software engineer with a decent understanding of statistics.

Williams - CodeExpert and Waze optimize logistics

Ryan Stalker (Williams) observed that material movement logistics workflows have a tendency to inefficiency. Some are poorly engineered from the start, others become inefficient over time as entropy sets in. ML and AI offer opportunities to optimize workflows by reducing human intervention and automating inventory management, cutting both stockpiling and starvation. Williams has worked with Calgary-based Code Expert to add ML smarts to its Maximo materials management system. The solution also leverages the Waze smartphone driving app to optimize equipment and material movement. Waze’s ML-based navigation software adds crowd-sourced information on road hazards, traffic patterns and police alerts.

The main limitation to the application of ML/AI in workflow optimization is the effort involved in digitizing the legacy hard copy records required for algorithm training, validation and testing. Stalker warns that ‘broad organizational problems won’t be solved by ML and AI’, although these tools can excel when pointed at specific workflow problems. ML’s usefulness decreases with problem dimensionality. Another potential problem arises from the regulator. In the EU, for instance, the new General Data Protection Regulation extends to the ‘explainability’ of deep learning. Witness GDPR Article 13, Paragraph 2(f), requiring that people be informed of ‘the existence of automated decision making and [be provided with] … meaningful information about the logic involved’.

California Resources - XSPOC data, k-means classifier and rod pump optimization

Mohammad Evazi (California Resources) had a stab at estimating the ‘size of the prize’ of optimizing rod pump operations. There are around 1,000,000 oil wells worldwide on sucker rod pump with an annual failure rate of 0.2-0.6 per well. The average cost of failure is about $30,000. Do the math! [We did, that is around $12bn/year!]. It makes sense to invest in collecting and understanding your pump-off controller (POC) data.

Today, CRC collects information on load, strokes per minute and casing pressure in Theta’s XSPOC database. But the bulk of the historical data required for AI/ML is held in analog-format dynamometer cards. Dynocard data is a health indicator for rod pump wells. CRC stores over 100,000 cards per day. These are analyzed with a k-means classifier and translated into time-series visualizations that enable well diagnostics and failure prediction. CRC has built its dynocard classification model, which is now used for well optimization, and is now working to more accurately determine failure intervals and label root causes. Card data is now permanently captured in a ‘failure data lake.’
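
For illustration, clustering fixed-length dynamometer card vectors with k-means takes only a few lines of Python. The data below is synthetic and the cluster count arbitrary; CRC’s actual feature engineering and model are not public.

```
import numpy as np
from sklearn.cluster import KMeans

# Stand-in data: each dynamometer card resampled to a fixed-length
# load-vs-position vector.
rng = np.random.default_rng(1)
cards = rng.normal(size=(10_000, 200))

km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(cards)
labels = km.labels_     # cluster id per card (e.g. 'normal', 'gas interference', ...)

# Per well, the sequence of cluster labels over time becomes the time series
# used for diagnostics and failure prediction described above.
```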

More from London Business Conferences.


Sales, deployments, partnerships ...

National Oceanic and Atmospheric Administration, Google, Seeq, TechnipFMC, Sirius, Accenture, SAP, Aker Solutions, Equinor, Turbulent Flux, AspenTech, Advantech B+B, CUI Global, Samson, Drillinginfo, Landdox, Dassault Systèmes, Fluor, IBM Watson, Emerson, Fiber Optic Sensing Association, Weisz Bolivia, Maverick NextGen Energies, PwC, Perigon, Seequent, SAP, Accenture, Capgemini, Deloitte, Shell, Microsoft, C3 IoT, Texas Advanced Computer Center, iRODS Consortium, Teradata, Sasol, Saudi Aramco, Total, Google, Woodside, ENN Group, Aker BP, BHGE, ADNOC, Pemex, Rock Flow Dynamics.

The US National Oceanic and Atmospheric Administration is an early adopter of Google Dataset Search, adding some 35 petabytes of ocean records to the open data archive.

Shell’s work with Seeq’s cloud-based advanced analytics applications is now available to all customers. Seeq’s Microsoft Azure-based solution enables engineers and scientists in process manufacturing organizations to analyze, predict and share insights to improve production outcomes. Shell’s process engineers use Seeq to work with OSIsoft PI data.

TechnipFMC has joined Norway’s Sirius R&D organization, housed at the University of Oslo. Sirius is the Center for scalable data access in oil and gas. TechnipFMC and Equinor have extended their partnership encompassing the full scope of TechnipFMC products and services – leveraging ‘next generation technology and digitalization’.

Accenture and SAP have teamed to develop a SAP S/4HANA Cloud solution to ‘digitally transform’ upstream oil and gas operations.

Aker Solutions has signed a global agreement with Equinor for current and future subsea projects across safety, quality, technology, execution and cost domains.

Turbulent Flux has secured an R&D agreement with Aker BP to develop a transient multiphase flow simulator aka a ‘stable production advisor’.

AspenTech and Advantech B+B SmartWorx are to team on robust, ‘lower-cost’ Industrial Internet of Things connectivity solutions. Separately, AspenTech’s Aspen Capital Cost Estimator software has been selected and deployed by SK Engineering & Construction.

CUI Global’s Orbital unit has signed an agreement with Samson to distribute its GasPT, VE Technology and GasPTi analyser globally, outside North America and the UK.

Drillinginfo has partnered with Landdox to streamline customer workflows. Users can now access DI Basic and DI Plus data sets as layers inside their Landdox maps.

ExxonMobil is to deploy Dassault Systèmes’ 3DExperience-based Capital Facilities Information Excellence platform.

Fluor is to use IBM Watson to predict, monitor and measure the status of engineering megaprojects from inception to completion.

Emerson is now a member of the Fiber Optic Sensing Association (FOSA).

Weisz Bolivia has fixed a ‘loss of communication’ problem at five ENAP Argentina offshore platforms in the Magellan Straits with an Ignition Scada deployment.

Maverick NextGen Energies, PwC and Shell have partnered to create a model of the energy company of the future, built around a digital core founded on SAP’s HANA public cloud.

Perigon has partnered with Seequent, developer of the ‘Leapfrog’ 3D geothermal modeler, to market Perigon’s iPoint wellbore data management and visualization software to the geothermal Industry.

SAP has teamed with Accenture, Capgemini and Deloitte to accelerate customer adoption of SAP S/4HANA Cloud and to co-develop innovative solutions for target industries.

Shell has expanded its strategic collaboration with Microsoft Azure by selecting C3 IoT as its artificial intelligence (AI) platform to enable and accelerate its digital transformation globally.

The Texas Advanced Computer Center has joined the iRODS Consortium.

Teradata has partnered with Sasol to create a ‘robust analytical ecosystem’ that supports the company’s vision for creating a data-driven business strategy.

Saudi Aramco and Total have signed a joint development agreement for the front-end engineering and design of a giant petrochemical complex in Jubail, on Saudi Arabia’s eastern coast.

Total is working with Google on AI for oil and gas subsurface data analysis. The research will focus on computer vision-based seismic interpretation and automated analysis of technical documents with natural language processing technology. Total geoscientists are to co-locate with Google’s machine learning experts at Google’s ‘advanced solutions lab’ in California.

Woodside has signed a cooperation agreement with ENN Group to explore a broad range of potential business opportunities.

BP and Aker BP have formed a strategic technology venture alliance to explore joint innovation and technology opportunities and seek out potential venture capital investments.

BHGE and ADNOC have signed a strategic agreement to expand ADNOC Drilling’s capabilities into the integrated drilling and well construction segment and to extend BHGE’s presence in the UAE.

Pemex has made Rock Flow Dynamics’ tNavigator reservoir simulator available throughout the company. Oman Oil Co. E&P has also acquired tNavigator licenses for use in development planning of its oil and tight gas fields.


Standards stuff ...

ECCMA, Authoritative Legal Entity, Association of National Numbering Agencies, Global Legal Entity Identifier Foundation, Energistics, EPIM, World Wide Web Consortium, FIDO Alliance, IFRS, XBRL, INSPIRE Geoportal, Sustainability Accounting Standards Board, Global Reporting Initiative, International File Exchange, PIDX, SAP Ariba.

ECCMA has announced a shared materials master data (eSMD) registry tied in to the Authoritative Legal Entity registry of trading partners. ECCMA has also announced eMDV, a web-based cataloging tool to manage material, service and product master data to ISO 8000 and ISO 22745 standards.

The Association of National Numbering Agencies (ANNA) and the Global Legal Entity Identifier Foundation (GLEIF) are to map between their systems to improve transparency by linking the issuer and issuance of securities. ISINs are issued in more than 200 jurisdictions, while over a million LEI records have been issued. The linkage is said to help with ‘know your customer’ (KYC) due diligence processes.

Energistics has announced the 2019 Executive teams for its upstream standards portfolio.

The Norwegian EPIM standards organization has announced that its SAM-X web map service for seismic survey planners and other stakeholders is now live. Sign-on has been upgraded to EPIM-ID, a common authentication method across all EPIM services.

The World Wide Web Consortium (W3C) has announced a ’standards milestone’ in its push for simpler, stronger authentication on the web. W3C and the FIDO Alliance have advanced the Web Authentication (WebAuthn) standard to ‘candidate recommendation’. WebAuthn is a ‘phishing-resistant’ web API for secure authentication across sites and devices.

The IFRS has released its 2018 Taxonomy Business Rules for use with software tools supporting XBRL Formula. The business rules are used to validate IFRS reports.

The XBRL financial reporting standards organization has released ‘XF’, the XBRL Formula specifications. XF is an XLink-based general purpose rules language for XBRL. An XF tutorial is also available.

The EU has launched a new INSPIRE Geoportal for discovery and use of EU environmental geospatial data sets. Inspire data covers ground water, transport networks, population, land use and air temperature. The Geoportal ingests metadata from the (currently 36) officially registered national data catalogues of EU Member States and EFTA countries.

The Sustainability Accounting Standards Board (SASB) and the Global Reporting Initiative (GRI) are to align their non-financial reporting standards with the recommendations of the Task Force on Climate-related Financial Disclosures. The latter is a disclosure framework that provides financial market participants with information as to how climate-change issues may affect investment. More from GreenBiz.

A recent meeting of the IFLEXX (International File Exchange XML) petroleum standards body discussed the alignment of the IFLEXX standard, mainly used in German-speaking countries, with the international PIDX standard. IFLEXX is an open source format for data exchange between oil and gas companies.

SAP Ariba has joined the Petroleum Industry Data Exchange standards body, PIDX.


Cyber security round-up

Studies, standards, warnings and entreaties from Carbon Black, ASUG, Darktrace, NIST, ISO/IEC, Honeywell, CERT, Rockwell, Marlink, DNV GL, EU, IBM Watson.

An action-packed cyber news section with multiple ‘standards-based’ recommendations from NIST, CERT, ISO and NCCoE. Reports and surveys from Carbon Black, ASUG and DNV GL. A new EU MISP alert sharing project announced. Commercial news and novelties from Honeywell, Rockwell, Marlink and IBM.

Carbon Black quarterly incident response report

The latest quarterly incident response report from Carbon Black sees nation-state cyber attackers becoming more sophisticated and increasingly destructive. Attacks now frequently wipe log files from compromised machines to avoid forensic analysis and detection. Attacks are ‘industry agnostic’ and affect a wide range of industries; some 11% are reported as directed at the oil and gas sector. Attackers direct their best capabilities at industrial bases and tech service providers, especially high-tech researchers in aerospace, power generation, oil and gas, and nuclear. Moreover, half of all attacks leverage ‘island hopping’, putting customers’ and partners’ systems at risk.

SAP Survey shows ‘concern’ for SAP system security

A survey carried out by the Americas’ SAP Users’ Group (ASUG) found 80% of IT/security practitioners to be ‘very or extremely concerned’ about the internal security of their SAP systems. Executives were much more sanguine, with only 25% concerned. The survey concludes that ‘many companies using SAP may overestimate the security of their SAP-based workload’. The survey also pinpointed difficulties in the consistent review of access security and governance, with current manual review tending towards ‘rubber stamping’. The ASUG report suggested replacing manual access review with, for instance, ERP Maestro’s automated solution.

Darktrace’s self-learning IIoT/Scada security app

At the 2018 GBC IIoT in oil and gas conference, Sam Alderman-Miller presented Darktrace’s ‘self-learning’ cyber defense application for IIoT/Scada environments. Today’s scada systems typically expose ‘massively outdated’, unpatched protocols. Cyber, as experienced by the IT community, is a novelty in the scada world, where attacks have risen sharply. Making AI work across ‘wonderfully unique’ control systems required more than a cookie-cutter approach. There are ‘no training sets of data’ and there is ‘no time for a three-month proof of concept’. Darktrace’s industrial immune system uses AI to crunch massive data sets, define ‘normal’ and watch for anomalies in real time. Darktrace also deploys Bayesian estimation and unsupervised machine learning on network traffic.
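
For readers new to the ‘define normal, watch for anomalies’ idea, here is a generic unsupervised anomaly detection sketch on made-up network flow features. It uses an isolation forest for simplicity and is not Darktrace’s proprietary Bayesian approach.

```
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical per-flow features: bytes, packets, duration (s), distinct ports.
rng = np.random.default_rng(2)
normal = rng.normal(loc=[5e4, 40, 2.0, 3], scale=[1e4, 10, 0.5, 1], size=(5000, 4))
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

new = rng.normal(loc=[5e4, 40, 2.0, 3], scale=[1e4, 10, 0.5, 1], size=(100, 4))
new[0] = [5e6, 2000, 0.1, 200]                 # an obviously odd flow
print(np.where(model.predict(new) == -1)[0])   # -1 flags anomalies
```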

NIST publishes Glossary of Key Information Security Terms

NCCoE, the US National Cybersecurity Center of Excellence, has released an online repository of some 6,700 key information security terms and definitions extracted from its publications and interagency reports. NIST has also published SP 1800-5, ‘IT Asset Management’, an ‘example solution’ that allows an organization to centrally monitor its IT asset portfolio and to determine, for example, which devices are vulnerable to the latest threat. NCCoE has also kicked off a project to study cyber security in utilities and oil and gas. A 13 page project description is available. Initial project partners are ForeScout Technologies, Tripwire, Dragos, Splunk, KORE Wireless, TDi Technologies, FoxGuard Solutions and Veracity Industrial Networks.

ISO/IEC 27005:2018 cyber standard update

The newly revised ISO/IEC 27005:2018, IT security techniques provides a framework for managing cyber risk in compliance with the earlier ISO/IEC 27001 recommendations.

NIST releases criticality analysis model for IT/OT

NIST Internal Report (NISTIR 8179) describes a criticality analysis model to prioritize programs and systems according to their importance to the organization and the impact of failure to implement. The methodology is said to apply to organizations that rely on third-party products and services from IT/OT suppliers.

Honeywell’s secure media exchange and ICS Shield

Speaking at the 2018 Honeywell User Group meeting, Eric Knapp showed how to protect against USB-borne cyber-attacks with secure media exchange (SMX). USB devices are the entry point for almost 40% of control system penetrations. Most malware comes as a Trojan inside a PDF or Office document. Despite widespread awareness of the issue, real world tests have shown that most ‘found’ drives are indeed connected to the system. Enter Honeywell’s Secure Media Exchange, which monitors systems for ‘rubber ducky’, ‘bash bunny’ and other USB-vectored exploits. Download Knapp’s presentation here.

In a separate announcement, Honeywell has released a multi-site industrial cybersecurity solution leveraging its ICS Shield solution for industrial control system cybersecurity. The new managed security services protect connected sites from evolving cyberthreats. Honeywell acquired ICS Shield developer Nextnine in 2017 and now claims over a million industrial nodes globally.

SEI CERT releases SCALe source code flaw analyzer

The CERT Division of the Software Engineering Institute (SEI) at Carnegie Mellon University has released SCALe, a source code analysis application, as open source software for the first time. SCALe audits software in any source code language to alert programmers to flaws that may lead to vulnerabilities. CERT also provides guidance for secure development in C, C++, Java and Perl.

Rockwell Journal argues for network segmentation

A Rockwell Journal article describes open, unsegmented networks as a ‘gift to cyber attackers’. Author Josh Kass paints network segmentation as a damage limitation exercise that avoids a possible ‘pivot’ from a vulnerable point of entry to access more sensitive data or devices. Segmentation also limits damage from internal threats such as a disgruntled employee or human error, such as an incorrect system change. Network segmentation should be part of every company’s industrial security strategy.

Marlink’s Cyber Detection solution for offshore

Marlink has announced a real-time cyber threat detection solution, ‘Cyber Detection’, for the maritime industry. Cyber Detection monitors outbound and inbound network traffic to display threats via a web-based dashboard. Compromised assets may be remedied using Marlink’s Cyber Guard solution with optional assistance from Marlink’s Security Operations Centre (SOC).

DNV GL’s 2017 (we missed it before!) cyber security in oil and gas how-to guide

For the record, a publication from DNV GL that escaped our notice when published in 2017. DNVGL-RP-G108 is a 53 page instruction manual for the implementation of cyber security in the oil and gas industry, based on IEC 62443.

EU MISP project shares threat intel

The EU-backed MISP project is an open source platform for sharing threat intelligence. MISP creates software, develops taxonomies, warning-lists and galaxies and releases practical standards to solve information sharing challenges. MISP is funded under the Connecting Europe program.

In conclusion ... IBM’s 8000 cyber security patents, on governments and hackers.

If all of the above has you a little worried, reflect on this additional pitfall. At the 2018 IBM ‘Think’ event held in Paris, a short and dramatic presentation had a ‘hacker’ boast that with a ‘small black box’ he could take over every smartphone in the room. The ‘hacker’ then whipped off his hoodie to reveal that he was IBM’s EU head of computer security. All very amusing. Less so was another boast, that IBM holds 8,000 patents on computer security! Quite a change of tack from the days when IBM was a goodie-goodie in the open source community with a ‘$1 billion’ investment in open source. One wonders how much of this investment has ended up in the proprietary, patent-protected offering that is Watson for Cybersecurity.

Finally, an observation. Have you noticed that whenever a serious hack occurs, or a new piece of nasty malware is discovered, it is invariably referred to as being ‘almost certainly’ produced by an unfriendly government – North Korea, Iran, Russia. This respect for the power of governments in the field of cyber insecurity contrasts with the poor light in which government is held more generally in (often) the same circles.


Software, hardware short takes ...

ABB Ability, HPE, Amphora, Dell EMC, eLynx Technologies, Dynamic Graphics, Exprodat, GeoLogic Systems, Getech, Honeywell, HydroZonix, INT, Larson, eDrilling, OSGeoLive, Quorum Software, Recon Technology, Rextag, TIBCO Software.

ABB Ability EdgeInsight, a new service offering running on HPE Edgeline hardware, is being piloted at ‘a number of oil, gas and chemical customers’. EdgeInsight collects data from disparate field devices and sensors, converts field protocols into ‘one common protocol’ and serves standard output to the IT infrastructure. EdgeInsight does not allow access into the field network, allowing data to be shared safely beyond individual sites. Data is merged in the field, ensuring the same timestamp and context across vendors and protocols.
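
The ‘one common protocol’ idea amounts to normalizing every field reading to a single record shape with a consistent timestamp before it leaves the site. A hypothetical sketch follows (field names and sources invented, not ABB’s schema).

```
from datetime import datetime, timezone

def normalize(source, tag, value, unit):
    """Map a reading from any field protocol to one common record shape."""
    return {
        "ts": datetime.now(timezone.utc).isoformat(),  # common timestamp applied at the edge
        "source": source,                              # e.g. 'modbus', 'hart', 'opc-ua'
        "tag": tag,
        "value": float(value),
        "unit": unit,
    }

records = [normalize("modbus", "PT-101", 412.7, "psi"),
           normalize("hart", "FT-202", 3.2, "m3/h")]
print(records)
```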

Amphora’s Symphony commodities trading and risk management software is now Amazon AWS compliant. Users can now run Symphony in the cloud on a pay-per-use basis. Amphora also announced a ‘certified native’ integration between Symphony and CubeLogic’s RiskCubed counterparty ‘know your customer’ onboarding workflow and credit risk management solution.

Dell EMC’s Elastic Data Platform is a ‘cost-effective solution’ that extends an organization’s existing big data investments with workload-specific infrastructure, intelligent software and end-to-end automation. The EDP is delivered by Dell EMC consulting.

eLynx Technologies (elynxtech.com) has announced ‘Predictive Analytics as a Service’, a suite of predictive maintenance software products to forecast oilfield problems. eLynx is the ‘fortuitous beneficiary’ of 20 years of industry knowledge gained from monitoring over 40,000 wells across all major US basins. ‘Potent’ data analytics was developed through partnerships with the University of Tulsa Tandy School of Computer Science, Tulsa University Artificial Lift Project and Microsoft.

Dynamic Graphics has released EarthVision 10 with updated modules that eliminate the need for the MKS Toolkit. EarthVision 10 provides an enhanced interface for generating well curve displays, shapefile import enhancements and other visualization tweaks. Also new is a Python-based toolkit for custom development.

Exprodat has released an ArcGIS Pro version of its Data Assistant for the import, export and manipulation of E&P interpretation system data. Data Assistant for ArcGIS Pro also enables geoprocessing and integration with Esri’s ModelBuilder/Toolbox.

Version 8.8 of GeoLogic Systems’ GeoScout adds support for planning and visualizing horizontal well surveys. A new Well Profile Viewer module supports well surveys in the context of formations, contours, downhole events, completions and logs. Users can compare multiple surveys to choose the best option before drilling, pick formation tops, import grid files and create reference surveys to compare well bores.

Getech’s Globe 2018 includes new palaeo-surface geology layers to aid understanding of the character, quality and distribution of potential reservoirs in an area of interest. New thermal mapping content enables temperature and hydrocarbon maturation studies of a basin. New climate profile analytic tools streamline basin modelling workflows.

Honeywell’s vCenter Server Appliance is migrating its operating system from Windows to a ‘new default’ purpose-built Linux OS, optimized for virtualization. The new OS is reported to bring a ‘significant reduction’ in boot and application startup times and a 3x performance improvement over vCenter Server 6.5 on Windows.

HydroZonix’ hardware and software solution covers the whole oilfield water management lifecycle. Data streams into the cloud from tank batteries, separators and the wellhead for real-time produced water reuse, bacteria mitigation and more. A RESTful API allows for integration with third party systems and mobile endpoints.

INT has announced INTViewer 2018 with new features including a ‘beachball’ display of microseismic foci.

Larson is working on a TECH SVG project to offer scalable vector graphics (SVG) from its VizEx Viewer. SVG brings vector graphics to the web without the need for a plug-in. Download the draft TECH SVG paper here.

eDrilling’s V 10.0 release of WellPlanner adds dynamic simulations and continuous model update with current operational parameters. The tool provides accurate lift, fluid, cutting and temperature distribution estimates. Other benefits include reduced safety margins and reduced risk when drilling wells with tight margins. WellPlanner is available as a standalone product or as a microservice.

Version 12.0 of the OSGeoLive GIS software collection adds support for Lubuntu 18.04 Long Term Support and now leverages the Transifex Translation tool for documentation generation. OSGeoLive is a Lubuntu-based distribution of open source geospatial software, available via a virtual machine, USB or DVD.

Quorum Software has launched the myQuorum Data Hub, which pulls and cleanses data from myQuorum business applications and stores it in the cloud for access by business intelligence applications such as Microsoft Power BI. The Data Hub is claimed to avoid the challenges of maintaining databases, reports, connections and data models. Decoupled from the core application infrastructure, Data Hub makes critical operational and financial data available to business users, decision makers and analysts.

Beijing-headquartered Recon Technology reports 41.1% year over year revenue growth, in part from its Future Gas Station service company that supplies applications and data operations to, inter alia, PetroChina.

Hart Energy unit Rextag is enhancing its Global Energy GIS database with more parameters, pipeline diameters and financial information along with a streamlined user experience. As of 2019, the database will link through to SEC filings, profit and loss statements and other public filings.

TIBCO Software has announced A(X) Experience for TIBCO Spotfire, an AI-driven analytics offering that combines data exploration, natural language processing and machine learning with real-time support for streaming data. A(X) recommends data relationships, uncovers patterns in data and makes ‘prescriptive suggestions’. It plugs into real-time data streams such as OSIsoft PI, Witsml, Bloomberg, Reuters and FIX, and also supports Apache Kafka, Eclipse Mosquitto, Apache Spark streaming and more.


Neo4J extends graph database with NLP and AI functionality

User Caterpillar reports 27 million parsed and tagged phrases in document repository.

A new release of the Neo4j graph database embeds natural language processing functionality used by flagship customer Caterpillar. Neo4j 3.5 now offers artificial intelligence (AI) and machine learning (ML) functionality. A talk by Caterpillar’s Ryan Chandler at the 2017 Neo4J GraphConnect event in New York showed how the graph database is a foundation for enterprise AI applications, capturing facts and relationships among people, processes, applications, data and machines.

Chandler, Caterpillar’s chief data scientist, has applied natural language processing to a 100,000-document data repository of Caterpillar’s supply, maintenance and repair operations. The new functionality in Neo4j enables large-scale analysis of text for meaning representation and automatic reading at scale. According to The Data Warehouse Institute, around half of all enterprise data is unstructured. This is the knowledge that Cat wants to tap into. There are two schools in language processing, leveraging ‘dependency’ and ‘constituency’ structures respectively. Both are graphs, so the overriding principle is ‘parse your text into a graph’.
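
To make ‘parse your text into a graph’ concrete, the sketch below runs a spaCy dependency parse and loads the resulting edges into Neo4j with the official Python driver. Connection details are placeholders and the Cypher model is deliberately naive; this is not Caterpillar’s pipeline.

```
import spacy
from neo4j import GraphDatabase

nlp = spacy.load("en_core_web_sm")
doc = nlp("The bucket adapter was broken and the part was shipped to Asia.")

driver = GraphDatabase.driver("bolt://localhost:7687",      # placeholder connection
                              auth=("neo4j", "password"))
with driver.session() as session:
    for token in doc:
        # One node per word, one edge per dependency relation.
        session.run(
            "MERGE (h:Token {text: $head}) "
            "MERGE (t:Token {text: $tok}) "
            "MERGE (h)-[:DEP {label: $dep}]->(t)",
            head=token.head.text, tok=token.text, dep=token.dep_,
        )
driver.close()
```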

Document repositories grow constantly; in business intelligence systems there is always that ‘next report’. Caterpillar ties different documents together by linking part number, facility identifier and so on. One use case is the development of a natural language dialog system that allows queries such as ‘how many of this particular part were shipped to Asia?’ Semantic analysis splits text into components: numerical counts, nouns (truck), verbs (sold) and regex-extracted dates. Queries can expand to ‘how many trucks were manufactured around this date and shipped to Asia?’ This requires a dictionary of synonyms - build, produce, manufacture … all built into the graph. The Google speech-to-text API allows queries in natural language.
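
A synonym dictionary of the ‘build, produce, manufacture’ kind can be seeded from WordNet, as in this small, hedged example (Caterpillar’s actual dictionary construction is not public).

```
from nltk.corpus import wordnet as wn   # one-off setup: nltk.download('wordnet')

# Collect verb synonyms of 'manufacture' from WordNet synsets.
synonyms = {lemma.name().replace("_", " ")
            for synset in wn.synsets("manufacture", pos=wn.VERB)
            for lemma in synset.lemmas()}
print(sorted(synonyms))   # includes terms such as 'construct' and 'fabricate'
```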

The next step is to ‘read at scale’ to extract more meaning, especially from warranty documents, an ‘excellent primary source’. So if a document reports ‘engine knocking’, an oil test can be initiated, and the root cause and recommended solution identified.

Caterpillar has parsed and tagged some 27 million phrases in its repository. A pipeline comprising a Python NLP toolkit, an ML classifier and ‘R’ leverages the WordNet lexical database and Stanford’s ‘S NLP’ dependency analyzer. Half of Cat’s items were already tagged and used to train the other half. S NLP was found to be a great improvement over a naive ‘broken is bad’, ‘bucket is equipment’ approach. For example, proximity may reveal that it was the bucket adapter that was broken.

The approach is computationally expensive, especially with 27 million documents. But even a shallow parsing can extract meaning at scale. The open source WordNet was a ‘free’ bonus for fine tuning and constraining definitions. The next step is to add in some VR with the Oculus (and Unity game engine) and filter on time for diachronic document search. Cat is now working with the NCSA on a theoretical model for graph/text/analysis and more ‘semantic analysis at scale’.

For its part, Neo4j has added full-text search into the graph, enabling text-intensive graph applications such as knowledge graphs, metadata management and bill of materials, along with AI extensions and support for the ‘Go’ programming language.

More from Neo4J. Watch Chandler on YouTube.


© 1996-2021 The Data Room SARL. All rights reserved.