French IT consultancy Octo Technology (an Accenture partner company) held a one-day event in Paris recently that set out to ‘break the spell’ of translating AI initiatives into production. Octo’s Karim Sayadi recommends not building a data lake; data science does not need one to start, and projects ‘will fail many times before succeeding’. ‘Exploration, uncertainty and failure is the norm’. AI has given rise to several myths. It is not a solution to everything and one size does not fit all. Such myths are a ‘bad pattern’, making it harder to promote AI and leading to projects that end at proof-of-concept stage or, worse, as the ‘never-ending PoC’.
Actually, the solution is simple: build, observe and evaluate (with the business), then start over. Start with a data lab and a couple of terabytes of information. A proper data lake comes later and is a choice not to be taken lightly. Use simple, ‘explainable’ algorithms and evaluate with good metrics, including false positives and false negatives. Finally, do not force AI where a simpler approach may be all that is needed.
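To illustrate the kind of evaluation Octo has in mind, here is a minimal Python sketch of a confusion matrix with false positive and false negative counts. The labels and predictions are made up for illustration; this is not Octo’s tooling.

```python
# Sketch: evaluating a binary classifier on false positives/negatives.
# y_true/y_pred are illustrative, not from any real project.
def confusion(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
tp, tn, fp, fn = confusion(y_true, y_pred)
precision = tp / (tp + fp)   # of the cases flagged, how many were real
recall = tp / (tp + fn)      # of the real cases, how many were caught
print(tp, tn, fp, fn, precision, recall)
```

Accuracy alone hides the false positive/false negative trade-off; precision and recall surface it, which is presumably Octo’s point about ‘good metrics’.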
Yacine Benabderrahmane added that data science may initially represent a simple workflow but that the passage into production is much harder. Enter the Octo Academy, a methodology for producing maintainable and reusable data science solutions and (perhaps) a ‘data-driven’ enterprise. However, ‘hype is in the air’ and much ML is rolled out on a shaky foundation. Octo’s approach is to deploy minimal ML (just what is needed) on a solid data foundation. In the end, it is ‘craftsmanship’ that is required. This implies standard coding techniques, adapted for AI, and avoiding building a software ecosystem ‘that is so complicated that you may as well just shoot yourself now!’ Other insights included the need for good quality data (AI is not a data quality tool). Using big IT systems is not an objective. Real time should not be a priority because it is very hard to implement.
There is often a ‘War of Thrones’ between IT, data science and the business, with each blind to the others’ capabilities and needs. This ‘leads to failure and recriminations’. Often data science and the business get along, but IT is more problematic, as it can be distant from the business and peripheral to data science. Octo’s claim is to bring all three together. All need to be involved collectively. One final warning was ‘beware of the super-polyvalent data scientist’ although this was not expanded upon. Octo’s approach is to build collaborative ‘feature teams’ to drive harmonious development and deployment.
Total’s Arnaud de Almeida and Sophânara de Lopez presented on predictive maintenance of industrial equipment. Total is beginning to integrate digital and AI in its production operations. The initial use case revolved around predicting failure on subsea electrical submersible pumps, widely deployed on Total’s Angolan developments in up to 2000 meter water depths. Early detection of abnormal pump behavior (drift) and knowledge of failure modes can maximize pump life and optimize logistics.
Total started with a PoC using historical data on pump pressure, motor speed and logistics, plus operational and failure post mortem reports. An in-house PoC using data from 10 wells proved successful. This has now been scaled up to run in a hosted data lake. Today, some 100 wells are monitored with between 30 and 100 sensors per pump. All sensors are modeled and compared with reality; even weak signals may indicate an issue. Random forest supervised learning was the main AI deployed. The Apache Airflow workflow manager is key, running in Docker containers for scalability. ESP models are provisioned in a continuous delivery process, with weekly training and daily predictions. Data is time stamped with Python Arrow and saved in the Azure data lake. Grafana, PyCharm, Docker/Jenkins and Octo’s Data Driver also figure in the toolset. Total recommends that data scientists brush up on software craftsmanship by adopting devops. The solution is now deployed in Total’s local well operations centers and smart rooms for remote monitoring by experts. More from the Octo blog (in French).
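For readers curious what random forest supervised learning on pump sensor data might look like, here is a hedged Python sketch using scikit-learn on synthetic data. The feature names, thresholds and labeling rule are invented for illustration; Total’s actual features and pipeline are not public.

```python
# Hedged sketch: random forest failure prediction on synthetic pump data.
# Feature names and the labeling rule are invented for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 200
# Two invented features: intake pressure drift and motor vibration level.
pressure_drift = rng.normal(0.0, 1.0, n)
vibration = rng.normal(0.0, 1.0, n)
X = np.column_stack([pressure_drift, vibration])
# Synthetic label: a pump is 'failing' when both signals are elevated.
y = ((pressure_drift > 0.5) & (vibration > 0.5)).astype(int)

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
# Score a clearly abnormal reading and a clearly normal one.
print(model.predict([[2.0, 2.0], [0.0, 0.0]]))
```

In production, of course, the labels would come from the failure post mortem reports and the features from the pump sensor history, with retraining scheduled via Airflow.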
A group of Irish R&D establishments, with help from Tullow Oil, has released ExSeisDat (Extreme-Scale Seismic Data), a set of open source parallel I/O and workflow libraries for petroleum seismology. The libraries address the data handling requirements of modern terabyte-scale seismic surveys. ExSeisDat addresses the problems associated with standard SEG-Y files with easy-to-use, object-oriented libraries that are portable, open source and have bindings available in multiple languages. A low level parallel I/O library, ExSeisPIOL, bridges the gap between the SEG-Y file structure and file system organization.
A higher level seismic workflow library further simplifies operations by implicitly handling all I/O parameters. ExSeisFlow targets prestack processing of single traces, individual gathers or entire surveys. Functions include out-of-core sorting, binning, filtering, and transforming. The libraries are optimized to handle spatial and temporal locality and are said to fit well with burst buffer technologies such as DDN’s Infinite Memory Engine (IME) system. ExSeisDat on IME delivers a significant improvement to I/O performance over standalone parallel file systems like Lustre. ExSeisDat was developed by researchers from the Irish center for high-end computing, Lero, DDN Storage and Tullow Oil. The work is published as an open access paper in OGST, the IFP Energies Nouvelles’ research publication.
A popular notion that is often used to justify the ‘disruption’ of digitalization is that digits will enable new business models. If you consider that using blockchain to settle transactions in oil and gas is a new business model well, good luck with that. But in general, as I have said before, it is hard to see quite how more digitization can truly disrupt an industry whose business is to produce, transport and refine oil and gas. Seeking a digital technology that is going to change the way oil and gas does business is putting the cart before the horse. Much better to sit back and think about what new business models (or even better, new businesses) there are out there and then, if needed, tailor a digital solution to the problem.
The difficulty is that oil and gas has for a long time been seeking new business models, increasingly so as it is attacked from many sides as a polluting, CO2-generating twilight industry. These new businesses usually involve something like buying a battery manufacturer, a windfarm or a photovoltaic operation. The problem here is that being an oil and gas company does not really put you in a much better position than, say, venture capitalists or other investors, and it likely puts you in a worse position than the incumbent utilities already working these new spaces. Oil and gas really needs new businesses that dovetail, in a profitable way, with oil and gas production and that, if possible, are green. And it so happens that I have one such business teed-up and waiting to go.
Speaking at the 2018 ‘Meet the Projects’ meeting of CATO* chez TNO in Utrecht, Circular Energy CEO Arnold de Groot presented an elegant solution to a three-way problem, of simultaneously producing stranded offshore gas, sequestering CO2 and load balancing the intermittent output from offshore wind farms. Oh, and, by the way, making money while doing it. The Dutch North Sea makes for an excellent testing ground for Circular Energy’s approach. The sector has a significant number of stranded gas fields, ones that are too small or awkwardly placed in regard to infrastructure to justify being put into production at current gas prices. There are also a large and growing number of offshore wind farms. And the Netherlands government’s climate legislation is strongly supportive, with CO2 reduction targets of 50% by 2030 and 95% by 2050 compared to a 1990 baseline.
Circular Energy’s grand plan is to burn gas on an offshore production platform above a stranded gas field, to produce electricity on-the-spot, to capture flue gas CO2 from the generators and to pump it back into the reservoir. Circular Energy has modeled the propagation of CO2 in producing reservoirs and deemed it problem-free, and has patented some aspects of the simultaneous injection of CO2 and production of natural gas from the same reservoir. The electricity produced is exported to nearby offshore windfarms (BorWin, Gemini, DolWin, etc.), feeding into the existing electrical grid and providing scalable load-balancing to the windfarm. When all the gas has been produced and the reservoir filled with CO2, the whole caboodle jacks down, moves over to another field and repeats the process.
There is an ongoing debate in the press over the cost of carbon capture and storage (CCS) which some believe to be prohibitively high. de Groot will have none of this. Circular is economic at current EU carbon prices and the activity also receives revenue from selling its electricity. A further advantage over other CCS approaches is that the stand-alone platform needs no long CO2 flow line, unlike other ‘end-of-pipe’ solutions. Circular expects to benefit further from volatility in power prices, ‘we can supply our electrons to the grid at the best time’. Overall a 12% rate of return is expected, assuming a modest rise in electricity prices.
We have been skeptical about the very high capex involved in purpose-built CCS installations. de Groot puts the cost of the US PetraNova plant at over $1 billion for 25 years at 1.4 megatonnes/year. Circular’s is far cheaper at €83 million for 25 years at 0.6 megatonnes/year CO2 sequestered.
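A quick back-of-envelope calculation makes the comparison concrete (noting that the two figures are quoted in different currencies, so this is indicative only):

```python
# Cost per tonne of CO2 sequestered, from the figures quoted above.
# Currencies differ (USD vs EUR), so the comparison is indicative only.
def cost_per_tonne(capex, years, tonnes_per_year):
    return capex / (years * tonnes_per_year)

petra_nova = cost_per_tonne(1_000_000_000, 25, 1_400_000)   # USD/tonne
circular = cost_per_tonne(83_000_000, 25, 600_000)          # EUR/tonne
print(round(petra_nova, 2), round(circular, 2))
```

On these numbers, PetraNova works out at around $29 per tonne against roughly €5.50 per tonne for Circular, a five-fold difference even before exchange rates.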
A €5 million initial fundraising in September 2018 was undersubscribed, in part because a number of prospective investors advised that a larger float would get a better reception. Consequently, Circular is about to announce a scaled-up, €100 million fundraising to kick-off an initial pilot codenamed ‘Cranberry.’ Cranberry is to cost around €300 million and Circular already has a 40% joint venture partner in EBN, the Netherlands state oil company. Other JV partners have been identified, leaving Circular with a current stake of around 30%, along with operatorship of the facility. The company is open to other partners joining the consortium.
Does digitization have anything to do with this? Of course it does, and there is plenty of new stuff for oil and gas and process control engineers to get their heads around as oil and gas gets electrified. We offer you a quick start guide to digital transformation of the electrical industry-at-large elsewhere in this issue and, in our first issue of 2019, we will be reporting more from …
* CO2 Afvang, Transport en Opslag (CO2 capture, transport and storage), the Dutch national R&D programme for CO2 capture, transport and storage, in which a consortium of nearly 40 partners, including Gasunie, NAM (a Shell/Exxon JV), Schlumberger, Shell and Wintershall, cooperates.
Dmytro Perepølkin (Equinor) asked ‘is spatial special?’ and proposed to make it less so by democratizing geospatial analytics with R. R now provides a geo capability, notably with R-Spatial and Simple Features. Tidy Data got a plug as an open source data munging environment. More from the project page.
Landmark’s David Seymour and Lars Vatne presented work in progress on a well planning decision support system. This extends Landmark’s DecisionSpace/Compass/EDM toolset with a configurable business automation engine. The tool performs workflow management and pings people who have not done their bit on time. In general, it is better to resolve inconsistent data at source, preferably by contract. Since most (if not all) companies use SAP, it would be a great idea to get contractors to deliver price schedules in SAP format and get composite quotes direct from SAP. The BPM tools have worked with Landmark data, on Petrel, SAP and ‘anything in a JSON object’. The tool is under trial to automate data loading and well planning in a major project on a mature field, with the aim of cutting well planning time to under a week.
Interica presented its updated PARS project archival solution, now certified for public and private cloud storage. The company reports ‘first significant take up’ of data written into the public cloud. This leverages new functionality for data verification, encryption and segmentation to avoid size constraints and optimize transfer rates. Interica is cloud-agnostic and has been tested in multiple hybrid scenarios including running the app in the cloud searching for data locally and archiving to remote object/blob storage. Data management automation has also been enhanced in response to the corporate de-manning. The first automation app is ARBA (automated rule-based archiving), used to generate archiving rules from project metadata and archival policies. Users can archive and/or delete thousands of projects and datasets at petabyte scale automatically. A new API makes Interica PRM and PARS more ‘open’. Also new is the Interica OneView platform, an HTML 5 map window into multi-cloud live and archived projects and data. The company is also working on a new graph database technology stack with GQL and Apache Tinkerpop that promises enhanced performance and flexibility for metadata management.
RoQC provides an elegant solution for Petrel (and OpenWorks) data audit and management with mathematical, logical and geoscience-based checks of, inter alia, cartographic reference system, units of measure and naming standards. All are rolled-up into data quality surveillance KPIs. RoQC automatically interrogates project data to uncover and report anomalies and data quality over time. For Norwegian users, projects can be synced and matched with NPD data.
Kadme presented its Whereoil data management platform, said to be a hybrid of a data lake and business intelligence solution. Whereoil is a key technology component of Petronas’ PiriGIS solution. A data pipeline feeds Whereoil and PiriGIS queries this via Kadme’s RESTful API. A similar configuration connects Whereoil to the NPD’s online data stores. Kadme also got an enthusiastic endorsement from Russian TatNeft that has built a petabyte scale archive with help from Whereoil.
Cognite has set out to conquer a world of ‘siloed data’ where service providers rebuild hundreds of data connections every year and, in an average large oil and gas company, some 40% of sensors are ‘orphaned’ and can’t be mapped to equipment tags. Cognite builds a complete digital representation of current and historical industrial reality with a horizontal data platform across systems and assets. Cognite understands systems from Arundo, GE, Sintef, Honeywell, Siemens and more. In August 2018, Aker BP signed a ‘data liberation contract’ with Cognite covering its Framo pump portfolio. Framo’s maintenance is now tuned to actual pump performance and better aligned with Aker BP’s requirements.
IO Data has developed a comprehensive GIS-based seismic data management solution that provides access and overview of offline and online data. IO MapClip provides access to all seismic data through a GIS interface while IO SubSee manages video data from underwater installations. IO dares to leverage non-Esri technology in its GIS, specifically a combo of Angular, Golang, HTML 5, Postgres and PostGIS. Connectivity is secured with HTTPS encryption and a JSON Web Token/OAuth combo. MapClip’s open source GIS technology is claimed to provide value for money while IO adds its seismic and data management know-how to secure data storage with Cegal.
At the 2018 Annual meeting of the Society of Exploration Geophysicists, Jay Hollingsworth compered a real-world collaborative test of Energistics’ Resqml as a vehicle for cross vendor reservoir model data sharing. The Kepler project is a multi-vendor pilot that demonstrated the value of the Resqml standard in earth model data transfer. Partners in the BP/Shell-backed project included CMG, Dynamic Graphics, Emerson (Paradigm and Roxar), IFP/Beicip and Schlumberger.
Reservoir modeling typically involves data sharing between different disciplines, each with their own preferred software tools. Prior to Resqml, data was easily corrupted or lost during transfer due to different model footprints. Resqml addresses these issues by enabling different model components (objects) to be moved safely between software tools. The Kepler project used data from BP’s Na Kika Gulf of Mexico development and involved an operator sending a geological model to a joint venture partner who generates an alternative facies scenario and pore volume estimate. This was then passed back to the original model for additional fluid flow studies and visualization of the time-stepped results. Kepler demonstrated the exchange of real field data across six different vendors using several applications. The demo was enabled with help from technology providers Amazon and RedHat along with the Society of HPC Professionals.
Amphora’s AssayLocker is a data warehouse and analytics tool for multi-commodity traders to analyze historical data, identify trends and optimize decision-making across quality attributes.
Arria NLG has announced release 2.0 of its Studio for BI with new functionality to transform business graphics into a natural language narrative. Arria also recently announced that its NLG Artificial Intelligence engine now powers Ernst & Young’s natural language generation global portal that allows EY employees ‘to make better and faster decisions’. On the downside, a report in the New Zealand National Business Review had it that mid-November 2018, Arria was ‘burning cash’ as it targeted an NZX listing, and that its late accounts showed ‘balance sheet strife’ at the software company.
Badley Geoscience has released ‘ORC’, its optimized resource calculator. ORC provides a rapid workflow for calculating volumetrics of structural and stratigraphic traps using mapped surface data and/or digital images of maps. ORC can be downloaded for free until May 2019.
Barco’s new ‘Ideation Wall’ is a hardware and software combo that offers an easy-to-install collaboration tool to support ideation sessions*. The Wall embeds Barco’s ClickShare CSE-800 wireless collaboration solution.
* Aka meetings!
BHGE has announced ‘EngageSubsea’, an asset lifecycle management solution that optimizes planning, execution and remote asset management. EngageSubsea is claimed to offer ‘up to’ a 20% reduction in maintenance costs and a 5% reduction in downtime through predictive analytics.
Billington Petroleum Technologies has announced DOSS, a digital operator support system that offers a real time data management platform in which first-principle thermodynamic tools such as Petro-Sim, LedaFlow, Olga, HySys and K-Spice can interoperate. The platform connects to existing asset infrastructure via OPC UA, Modbus, SQL and data historians to enable co-visualization of real time and simulator data. DOSS-generated data is written to the plant historian and made available to any endpoint. BPT DOSS promises adaptable, open data integration with ‘no future limitations’.
Ikon Science’s RokDoc 6.6 release adds new 4D reservoir monitoring functionality with support for corner point grid data that enables integration of static (geomodel) and dynamic (flow simulation) property models into 3D/4D shared earth models. These can then be used for 4D time lapse feasibility studies and ‘close the loop’ workflows. RokDoc can estimate reservoir elastic properties and their dependency on stress, temperature and fluid saturation. Ikon is also working to capture and aggregate rock property information and generate high-quality training data for machine learning.
The 8.3 release of Emerson/Roxar’s Tempest enhances automation and user productivity through greater compatibility with industry-standard reservoir simulators and easy embedding of simulation workflows and models within Emerson’s ‘big loop’ workflow.
Halliburton has announced ‘Cerebro’, a drilling bit with built-in sensors that captures key drilling performance data at the bit. Cerebro records vibration and motion data while drilling, to pinpoint where bit damage occurs and/or where optimal performance is not being achieved. The system identifies several common drill bit factors, including lateral and axial vibration, torsional resistance, whirl and stick-slip, which can negatively impact drilling speed and reliability.
Interica has announced epShare, a new Microsoft SharePoint platform to address subsurface data challenges. epShare leverages data lakes, virtual taxonomies, GIS and a PPDM database in a ‘fast, intuitive and flexible platform’ for subsurface data management.
Petrotechnical Data Systems has announced the 2018.10 release of Ava Clastics sedimentology and analogue database software, which now includes access to the Deep-Marine Architecture Knowledge Store (DMAKS). Developed by researchers at the University of Leeds, DMAKS is now available to Ava Clastics subscribers.
Intel has announced the Neural Compute Stick 2 (Intel NCS 2), a USB thumb drive-styled component for designing AI algorithms and prototyping computer vision ‘at the network edge’. The NCS 2 includes Intel’s Movidius Myriad X vision processing unit and the OpenVINO toolkit.
Laser Technology has partnered with ProStar Geocorp to provide a digital mapping and data collection solution integrating ProStar’s PointMan with LTI’s TruPulse 360 professional-grade laser rangefinder.
IntOp is now offering its analytics for text-based data from the Microsoft Azure Marketplace with connectors to SharePoint and OneDrive. Flagship client DNO Norge uses IntOp’s tools on E&P data from the Norwegian Continental Shelf.
Arundo Analytics and Dell Technologies have released an IoT Bundle for Oil & Gas, a hardened, maritime class-certified unit, purpose-built for advanced computing in remote, rugged environments. The Dell Edge Gateway 5100 computers come pre-loaded with Arundo Edge Agent software.
Drillinginfo’s OptiFlo Gas is a map-based front end, forecasting and modeling tool for the US natural gas market.
SPEE, the Society for Petroleum Evaluation Engineers, recently conducted a survey of software tools used in oil and gas property evaluation and reporting. The raw survey data is available along with a presentation on the survey results.
Schlumberger’s Concert technology brings real-time surface and downhole measurements, data analysis and collaboration capabilities to well testing. Concert integrates well test data via wearable technology, wireless sensors and video cameras. Real-time data collection and communication is said to accelerate testing operations while significantly reducing the personnel required.
Spectro Scientific has released a new generation of its FluidScan handheld oil analyzer. FluidScan uses transmitted infrared light to analyze a fluid sample. Results are compared with a built-in fluid reference library for rapid, on-site analysis of in-service lubricants. The new analyzers have upgraded digital electronics and faster embedded processors, with new software and calculation algorithms that speed the analysis process. FluidScan’s library includes almost 800 oils and greases across a wide range of chemistries and brands. A new water index parameter helps users track dissolved water trends in used grease. FluidScan can be deployed as a standalone unit or as part of a Spectro MiniLab configuration.
Tatsoft’s FactoryStudio 8.1 is a new version of the .NET-based rapid application development platform for scada, IIoT and other real-time systems. The company has also announced EdgeHMI for local displays and an IIoT gateway for remote data collection. FactoryStudio also supports mobile endpoints with an HTML5 client and an iOS native app.
Weatherford’s ‘Vero’ uses artificial intelligence to minimize tubular running safety risks and validate well integrity ‘with absolute certainty’. The solution employs two new proprietary features: AutoMakeup technology, which precisely controls the makeup of tubular connections automatically and AutoEvaluate software, which continually assesses torque against original equipment manufacturer specifications.
Wood Mackenzie has used a ‘cloud-native, open-source route’ to develop its new Lens platform. Lens exposes both WoodMac’s own data and clients’ in-house and third party data spanning oil, gas, chemicals, metals, mining, power and renewables. Lens leverages Amazon’s AWS services to provide a scalable and secure platform for the delivery of large industry datasets and analytics capabilities.
IP.com has added the Society of Petroleum Engineers’ OnePetro library of scientific papers to its ‘InnovationQ’ discovery platform. OnePetro is an online library of technical literature for upstream oil and gas exploration holding some 200,000 papers from 20 publishing partners. IP.com enables full-text indexing and search across OnePetro and other content, notably from the Institute of Electrical and Electronics Engineers (IEEE) and the Institution of Engineering and Technology (IET). IP.com claims a database of over 100 million patent and related documents. Its InnovationQ tools are said to offer ‘robust’ prior art search of global patent authorities and IBM’s technical disclosure bulletin. IP.com uses patented neural network machine learning technology in its ‘semantic search’ platform. The system now includes corporate tree data from S&P Global Market.
The SPE is also working with i2k Connect to deliver ‘a new experience’ for finding and analyzing information on SPE.org, PetroWiki, and OnePetro. New AI-based technology ‘combines machine learning with subject matter expert knowledge to automatically tag documents with accurate and consistent metadata’.
Gregg Le Blanc (OSIsoft) opined that OSIsoft knew, ‘long before it was popular’, that being data-centric is key. But sensors may have odd naming conventions, units of measure issues and data gaps. The new data lakes can quickly become data silos. The answer: ‘select your IoT platform, your cloud of choice and bring them together with OSIsoft Cloud Services’. Elsewhere OCS is presented as ‘built on Microsoft Azure’, although we imagine this is not an exclusive relationship.
While OCS is OSIsoft’s cloud management solution, at the IoT edge, as we reported last year from London, a parallel ‘pervasive data collection’ (PDC) offering, along with OMF, the OSIsoft message format, provides connectivity and edge processing for stand-alone kit such as vibration sensors. Chris Nelson (OSIsoft) reported that OMF is now supported by third party libraries (Open FogLAMP from Dianomic). OMF’s small footprint enables a persistent data store for the IoT edge asset.
Javier Valdes (Iberdrola) provided insights into the complex energy market in Spain (and in Europe). A huge transition is underway with more electricity generated from clean sources (renewables and nuclear). In electricity, the first big change happened in 1988 with EU liberalization. Now the change is around decarbonization, the ‘empowerment’ of the customer and integration of electrical systems. In Spain, renewables have risen but plateaued since 2013. And they are cannibalizing non-renewables as demand is not growing. Today 20% of electricity is renewable and 36% of this is wind. But wind is poorly connected and needs backup power. Nuclear is not great as a backup as it needs to run constantly. Coal is not considered an energy of the future and Iberdrola has asked for permission to close its coal plants. Combined cycle gas plants are not designed for low/intermittent use* and are loss-making. They may be closed too unless a fixed payment for combined cycle operations (as in the UK) can be negotiated. ‘Efficiency, electric vehicles, heat pumps (!) and combined cycle gas are the keys to the future’. OSIsoft is a partner with Iberdrola on data analytics at combined cycle gas plants, enabling them to start quickly when the wind drops.
* Although see below for TransCanada’s ‘peaker plants’
The discussion forum on digital transformation revealed what we already surmised: although vendors and IT departments are enthusiastic about the movement, management remains cautiously skeptical. Digital transformation has proved harder than was thought five years ago. For AstraZeneca, digital transformation is an interesting term that can terrify senior management. Does it translate into ‘let’s give IT loads of money’? It is a great buzzword but rather dangerous. For management, phone and email are more important and IT has a hard time just doing the basics. Some attempts to push digital, whether management liked it or not, have been poorly received. What are the basics? Capture data and enable its use. Which translates into avoiding data hoarding or non-capture. ‘These basics are a positive thing for us’.
OSIsoft recommends taking small steps in the transformation. Some attempts fail because of the focus on the ‘shiny objects’ of artificial intelligence and machine learning. It is better to start with the data you already have and avoid starting big data initiatives without a business question to answer!
For Cargill the transformation is ‘absolutely about changing how you do your business’, towards a situation with no operators on the shop floor and no truck drivers. Digital transformation is a ‘North Star’ for the journey. But ‘many can’t cope with the big journey’. Some have been doing this for 20 years and ask ‘what more can I do now?’ But today, people understand technology change better.
Accenture’s global clients are on the transformation journey to improve operational effectiveness and also to find new sources of revenue. Some are striving to do both, trying new ideas and failing fast. While this can be problematical in risk averse industries, ‘you have to try’.
Brandon Perry (OSIsoft) stressed the importance of quality as a prerequisite for data science. Averages in PI and Excel may be ‘quite different’, apparently a common pitfall. Delving deeper into the nature of PI data streams, ‘you may have heard the term time series’. Data may not be evenly spaced, there may be gaps that need interpolation. PI captures quality tags such as ‘no value’, ‘modified’ and ‘annotated’. Other issues come from using PI’s option to compress data. Perry cited Nina Thornhill’s 2004 paper that found that ‘compression kills data’. So what should you do to enhance data quality? Check and understand the impact of data compression, filtering and sample rate. Add sensor metadata to PI assets, cleanse raw data and tag ‘no data’ correctly.
When using PI DataLink to Excel, ‘interpolation may not be the best way to go’; it may be better to use time-weighted aggregates. Also, use PI Event Frames to delimit process states and derive aggregates inside frames. While all this is undoubtedly important, we were thinking ‘what about Nyquist?’ We were not alone; others were muttering ‘what about Shannon?’ There is quite a lot of ‘prior art’ here worthy of consideration. Another potential issue we spotted was how PI adds units of measure to its data points. This appears to be by overloading an ascii data field viz: ‘270°F’, a common, but surely not a best, practice!
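To see why time-weighted aggregates matter on unevenly spaced data, consider this small Python sketch. The timestamps and values are invented; this is not PI code, just the underlying arithmetic.

```python
# Sketch: time-weighted average of an irregularly sampled series, the
# kind of aggregate recommended over a naive mean. Times are seconds;
# values are illustrative, not real PI data.
def time_weighted_avg(times, values):
    """Stepped (previous-value) time weighting over [times[0], times[-1]]."""
    total = 0.0
    for i in range(len(times) - 1):
        total += values[i] * (times[i + 1] - times[i])
    return total / (times[-1] - times[0])

times = [0, 10, 40, 60]            # uneven spacing, as in a compressed tag
values = [100.0, 110.0, 100.0, 105.0]
print(time_weighted_avg(times, values))
```

Here the time-weighted average is 105.0, while the naive arithmetic mean of the same four samples is 103.75: closely spaced samples get over-weighted by the naive mean, which is presumably why PI and Excel averages can be ‘quite different’.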
Keary Rogers and Ionut Buse presented TransCanada’s massive gas transport system that carries some 25% of US natural gas. Infrastructure includes 200 compressor stations and 800 other units. Gas is now flowing south to the Gulf Coast for petrochemical plants and LNG export. One particular issue is the need for quick gas to ‘peaker plants’ that compensate for windfarm output drops in no-wind situations. Behind all this is quality real time data, enabled by meticulous attention to bad values, stale data and flat-lining sensors. For TransCanada, data QC is (or should be?) amenable to a machine learning approach, although this is ‘work in progress’. This is not the first time we have heard of ML being used as a corrective to data quality issues. While there are undoubtedly cases where this makes sense, the temptation to allow bad data to enter the system, on the supposition that it can be fixed later on, is clearly to be avoided!
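As an illustration of the simplest of these checks, a flat-lining sensor can be flagged with a few lines of Python. The window length and tolerance are invented; TransCanada’s actual QC logic is not public.

```python
# Hedged sketch: flagging a flat-lining (stuck) sensor, one of the data
# quality checks mentioned above. Window and tolerance are invented.
def is_flatlining(readings, window=5, tol=1e-6):
    """True if the last `window` readings are (near) identical."""
    if len(readings) < window:
        return False
    tail = readings[-window:]
    return max(tail) - min(tail) <= tol

print(is_flatlining([50.1, 50.3, 50.2, 50.2, 50.2, 50.2, 50.2, 50.2]))  # stuck
print(is_flatlining([50.1, 50.3, 50.2, 50.4, 50.1, 50.3, 50.2, 50.5]))  # healthy
```

Stale data and bad values lend themselves to similarly simple rules; the ML angle presumably comes in when setting thresholds per sensor, or catching subtler drift.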
David Cameron, from Norway’s Sirius R&D establishment, Evgeny Kharlamov (University of Oxford) and Brandon Perry (OSIsoft) proposed a joint venture/consortium to investigate a semantic digital twin. Sirius’ goal is ‘scalable data access in the oil and gas domain’, which, in principle should be a prerequisite for a digital twin. But what is the digital twin? There are many definitions and applications. The process industry has been doing multi-physics probabilistic simulations to mirror and predict its plants for over twenty years. Now consultants and marketing departments have discovered the digital twin and ‘they are everywhere’. They exist in automotive and aerospace (and in oil and gas already) but they are in reality, ‘systems held together by tape’ and are ‘too large and unmanageable’. There is a need for a scientific basis for these systems of systems.
Evgeny Kharlamov observed that today, PI Asset Framework is used to describe a plant and could be used as the basis of a digital twin. But Kharlamov believes that the digital twin would be better supported with a semantic model which would allow for wider open-ended use across machine learning, data science and analytics. Enter the semantic web and a graph database of process models combining ‘physical, digital and cognitive’. Now ‘there is no need for PI AF, just use a semantic model’. Tools of the semantic trade include RDF, RDFa, SKOS, SPARQL, OWL (and more). ‘Semantification is a trend’. Semantic equipment models have already been created, notably with Siemens in the EU Optique semantic project.
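For readers new to the semantic approach, the idea in miniature: equipment is described as subject-predicate-object triples and queried by pattern, rather than navigated through a fixed asset hierarchy. The toy below uses invented names and a plain Python pattern matcher; real deployments would use RDF and SPARQL via a triple store:

```python
# Toy semantic model (hypothetical identifiers): plant equipment as
# subject-predicate-object triples, queried SPARQL-style by pattern.

triples = {
    ("pump:P101", "rdf:type",  "eq:CentrifugalPump"),
    ("pump:P101", "eq:partOf", "unit:CrudeUnit"),
    ("pump:P101", "eq:hasTag", "pi:FI-101"),
    ("comp:C201", "rdf:type",  "eq:Compressor"),
    ("comp:C201", "eq:partOf", "unit:CrudeUnit"),
}

def match(s=None, p=None, o=None):
    """Pattern match over the triple set: None acts as a wildcard."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

# 'What equipment is part of the crude unit?'
print([s for s, _, _ in match(p="eq:partOf", o="unit:CrudeUnit")])
# ['comp:C201', 'pump:P101']
```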
Brandon Perry floated the idea of an R&D consortium to develop a ‘cognitive understanding of our equipment’ to ‘receive predictions and warnings’, drop new apps into a twin or allow self-organizing applications. Perry acknowledged that industrial ontologies are tricky and have met with a mixed reception to date. This consortium should make them practical and ‘augment the physical world by mimicking the physical asset such that it might pass the Turing test’. The Big Data Value Association was cited in this context.
Cameron summarized that ontologies can be incredibly complex, some good, some not so good. They need to integrate with corporate knowledge structures. The consortium plans pilots with EPCs and vendors in oil and gas, building a semantic backbone, faceted user interfaces and standards. There are ‘lots of standards out there’ to enable ‘better data science and hybrid analytics’. Sirius’ focus is the upstream, field management, topside facilities and lifecycle modeling but with new EU funding this may extend to process control.
Comment: It is surprising that semantics and RDF are presented to this community without reference to the huge amount of somewhat unsuccessful prior art in the field – as Oil IT Journal has diligently reported in over 200 articles since 2003.
John Wingate said not to believe the hype around machine learning. Industry has been doing this for years. Wingate’s company Toumetis is a machine learning boutique and a practitioner of applied data science that has been using neural nets for 20 years or so. Today these run quicker and there are new algorithms, but it is still ‘just ML’. Toumetis’ Cascadence is applied ML for planning, ranking and forecasting, all fueled by data services. Industrial ML workloads require comprehensive storage, a Python API and modern web technologies. OSIsoft’s Laurent Garrigues showed how OSIsoft Cloud Services connect multiple customer sites to technology partners including Toumetis without duplicating data. OCS is a new cloud platform, built (on Microsoft Azure) from the ground up, with Toumetis inside.
Wingate added that the tricky part of the ML equation is getting good labelled training data which is expensive and requires humans in the loop. The human attention span bottleneck makes it preferable to use unsupervised ML and a modern user interface to reduce the labelling workload. Enter the Cascadence Asset Modeler, a cognitive computing solution that accelerates the transition from a flat PI tag structure to a robust and dynamic PI AF with ‘a much more detailed description of equipment, its hierarchical relationships and associated metadata’.
Ali Hamza and Peter van den Heuvel presented a business perspective of real-time operations at Shell, updating the 2017 presentation. Hamza is global head of Shell’s wells reservoir and facilities management WRFM unit. Shell’s strategic IT themes are ‘everything to the cloud’, data and analytics, collaboration and mobile, legal/regulatory/cyber. The business guides IT investment decisions and digitalization. Shell already has 5 petabytes (out of 8) in the cloud and the PI System is ‘at the heart of our digitalization roadmap’. PI reads are of the order of 3.5 trillion/month, a steep increase over last year following the introduction of advanced analytics. Shell has begun an analytics proof of concept on its 500,000-strong portfolio of valves, migrating its legacy data to Microsoft Azure for analytics. PI data quality is key here. Shell is tracking the PI System roadmap to the cloud closely. BG integration was also a major undertaking. The whole BG landscape is now in the cloud. Also, everything is now done on thin clients. ‘We don’t want any more desktop because of the overhead of updating multiple endpoints’. 2018 saw Shell’s PI Vision flagship deployment on the Prelude FLNG vessel.
Hamza observed that the upstream and integrated gas businesses produce huge volumes of data, far too much for humans to analyze. A lot is expected from digitalization but this is ‘neither new, nor a one-off thing, nor only in the future’. Digitalization is driven by the decreasing cost of sensors and data storage in the cloud, better AI and more compute power. Along with PI, Petex and Energy Components got a call-out. Petex Gap was used to model and understand failing wells and mitigate a 500 bopd production loss by tuning separator pressure and adjusting anti-foulant rates. The idea is simple, the technology available, it was just a matter of using it! Analytics and ML represent a new era in our industry. Shell’s work on control valve incidents (a $6 million loss in 2015) involves a ‘deep dive’ into performance data and collaboration across IT, engineering and maintenance. Shell has built a PI-based data quality system that now underpins its digital oilfield and analytics initiatives. Shell is now working with OSIsoft on a data governance solution embedding ISO 8000 standards for data quality, KPIs and PDCA remediation*. Van den Heuvel added that Shell has learned from other initiatives that addressed devops, training and device management to avoid folks asking, ‘why did you develop this? we did not ask for it’. The way forward involves a closer relationship with the business, global roll-outs and fit-for-purpose, stable software. In the Q&A, Hamza opined that having units of measure in PI tags is ‘simple but really important’.
* Deming’s plan-do-check-act.
Samantha Ross and Rob Sutton explained how, following an enterprise agreement with OSIsoft, BP has been working on a solid foundation for PI AF. In general, technology has matured and hardware infrastructure is in place such that there are no more server throttles or scan rate limits on PI. A new generation of digital users and a social shift to tech adoption mean more use of Excel and PI DataLink. Today PI is a ‘regular conversation point’ in BP. A no-trips policy also means more use of software to reduce deferrals and abnormalities. Business analytics ‘reduce cognitive load’ while automated surveillance identifies weak signals and generates insights. The new technology needs new visualizations and a ‘break from the legacy of the last 30 years’. BP’s new data team has already built several minimum viable products (MVPs) for analytics using PI Element templates and PI Vision.
HMI development has been challenging for the last 20 years. The objective is a 90% reduction in legacy visualizations, less text, and deadbanding*-configured data hiding. BP uses a ‘task-based design’ and ‘human factors’ approach (but the screens shown were not very pretty – see below!) One success was reported from the North Sea when weak signals in analytics triggered new insights into failure modes, creating a ‘clear air of excitement’. BP is ‘turning dark data into leading indicators’. AF and PI Vision are key and are breaking legacy design and challenging resistance. In the Q&A BP acknowledged that its new generation visualizations could be improved. ‘They still look like grey scada images’. Seemingly PI Vision offers a ‘limited toolbox’ and BP plans to deploy more sophisticated visualizations real soon now. The MVP approach has allowed BP to deploy sandbox solutions in just a few days. But BP is not going to let operators do their own PI Vision screens. Development is centrally controlled. BP does encourage users to come forward with ideas. False positives from analytics can be reduced by deadbanding on time or percentage values.
* Actuator thresholds which trigger data transmission.
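How deadbanding suppresses false positives can be sketched in a few lines. The semantics below are our own assumption (not BP's implementation): an alert fires only when the deviation exceeds a percentage band and persists for a minimum time, so brief excursions are ignored:

```python
# Assumed deadband semantics: alert only if deviation from setpoint
# exceeds `pct` percent AND the breach persists for `min_secs`.

def deadband_alerts(samples, setpoint, pct=5.0, min_secs=60):
    """samples: (timestamp_seconds, value) pairs. Returns alert timestamps."""
    alerts, breach_start = [], None
    for t, v in samples:
        breached = abs(v - setpoint) > setpoint * pct / 100.0
        if breached and breach_start is None:
            breach_start = t                 # breach begins
        elif not breached:
            breach_start = None              # breach cleared: reset timer
        if breach_start is not None and t - breach_start >= min_secs:
            alerts.append(t)
            breach_start = None              # re-arm after firing
    return alerts

# A 15-second excursion is suppressed; a sustained one fires at t=130.
data = [(0, 100), (30, 108), (45, 100), (60, 107), (90, 108), (130, 109), (150, 100)]
print(deadband_alerts(data, setpoint=100))  # [130]
```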
Tauna Rignall presented DCP Midstream’s business transformation with PI (already a highlight of the 2018 PI World in San Francisco). DCP was confronted with familiar operator problems. Its data architecture was focused on process control and operations; analytics and reporting were afterthoughts. There was no centralized and normalized set of operational data in the company and copies of data were shared, in spreadsheets, with multiple parties. Rignall’s presentation focused on a company-wide PI System, deployed under an enterprise agreement. The thorough deployment included standard naming conventions, a rigorous PI AF structure and a governance capability, all delivered by a team of subject matter experts and the PI product team, making heavy use of PI AF/PI Vision templates. DCP has now created an ‘Energy Lab’ unit for rapid development of digital solutions using the PI System. These are deployed in the ICC, the integrated collaboration center. One interesting facet of DCP’s infrastructure is the ongoing use of Windrock’s Spotlight, an ‘IIoT-enabled’ application for advanced machinery analytics. What we find interesting about the way Spotlight has been embedded into DCP’s PI infrastructure is that it shows how a business solution can be successfully used in such a ‘foreign’ framework. We came across a similar edge deployment last year with Setpoint (see below).
Those concerned with PI’s challenged eye candy (BP?) may want to check out Seeq’s ‘fantastic’ visualization tool. Brian Parsonnet showed how Seeq can be used alongside PI AF to improve data quality and avoid the ‘garbage in, garbage out’ syndrome. Seeq claims to handle data gaps, flyers, noise, interference, sensor drift, timestamp alignment, incorrect interpolation, round-off and bad units. Seeq offers a ‘once…and done’ data stream approach as opposed to data movement with ETL*. Seeq is trained by a subject matter expert before deployment.
* Extract, transform, load.
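Of the cleansing steps listed, flyer removal is the easiest to illustrate. This is our own illustrative sketch (not Seeq's algorithm): points that deviate too far from a rolling median are replaced by that median:

```python
import statistics

# Illustrative flyer removal: replace points that deviate from the
# local (windowed) median by more than `threshold`. Not Seeq's method.

def despike(values, window=3, threshold=5.0):
    out = list(values)
    half = window // 2
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        med = statistics.median(values[lo:hi])
        if abs(values[i] - med) > threshold:
            out[i] = med                 # flyer: snap back to local median
    return out

raw = [20.0, 20.5, 99.0, 21.0, 20.8, 20.7]
print(despike(raw))  # the 99.0 flyer is replaced by its local median, 21.0
```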
Dominique Florack (Dassault Systèmes) outlined a collaboration between OSIsoft and Dassault Systèmes to maximize the ‘virtual plus real advantage’ by combining 3DExperience and PI. There is a huge untapped market for the digital twin across city planning, mining and massive cruise ships. Today, ‘few understand the connection from real to virtual and the data platform’. This is an ‘unstoppable revolution’.
A chat with a Caterpillar engineer offered us another data point in our ‘how big is your data’ quest. Caterpillar records 1 Hz data from its largest marine diesels from some 200 sensors. That would likely amount to some tens of megabytes per day. Big, but not so big as GE has claimed for its jet engines. Our Caterpillar contact was also circumspect about the new big data/analytics hype. Both Cat and GE have been recording sensor data for quite a while, using somewhat straightforward noise thresholds to indicate excessive component wear. In another cryptic note, we recorded that ADNOC’s ‘Panorama’ triple 50m screen ‘AI and big data center’ is ‘now only used for training’.
Finally, the Windrock deployment made us think back to the 2017 OSIsoft London conference. Much talk of ‘big’ data is ‘forward looking’ in so far as the future deployment of a large number of low cost (?) sensors will enable as yet unproven algorithmic techniques to perform better than… well, better than what exactly? Last year this was B&K’s Setpoint, a monolithic application that has been doing the run-life predictive thing for quite some time and integrates the OSIsoft ecosystem in a rather elegant way, bridging the gap between low frequency PI data and HF vibration data at the edge. It looks like Windrock plays a similar role and likewise can integrate the big data/cloud environment with help from PI.
Read these and other presentations on the conference minisite.
Anadarko has appointed Bob Gwin as president. Benjamin Fink succeeds him as EVP, finance and CFO. Robin Fielder is now SVP midstream.
The American Petroleum Institute has appointed Rolf Hanson as VP State Government Relations.
Paul Mansky has joined Arria NLG as its US-based CFO.
Amy Heintz has been promoted to the newly created Technical Fellow position at Battelle.
João Araujo, COO of Braemar, is to lead the operations of the newly-signed strategic partnership for marine services in Latin America between Unique Group and Braemar.
Steven Green is now president of Chevron North America E&P, succeeding retiree Jeff Shellebarger.
Clariant has announced the opening of a new state-of-art laboratory at its facility in Midland, Texas.
Colony Capital and HB2 have formed Colony HB2, a new energy-focused investment management platform led by president and CEO Michael Bertuccio.
Roy McNiven is now CSI Compressco’s VP operations. He was previously with Nabors.
Kelly Raper is president of Desert Downhole Tools. He was previously co-founder and president of Priority Artificial Lift.
Brian Anderson is to head the US DOE’s National Energy Technology Laboratory. He was previously director of the West Virginia University Energy Institute.
DNV GL has commissioned the ‘world’s largest’ industrial explosion test chamber at its Spadeadam research and testing unit in the UK.
Larry Culp is the new chairman and CEO of GE.
Paul Cleverley, previously with Flare Solutions, is founder of startup Infoscience Technologies.
Mark Peterson has been promoted to Oceaneering’s VP corporate development and investor relations.
Natural Resources Canada is now a ‘strategic member’ of the Open Geospatial Consortium.
Perma-Pipe International Holdings has appointed Bryan Norwood as VP and CFO following Karl Schmidt’s retirement. Norwood hails from Key Energy Services.
Petrofac and its partner Takatuf Oman have inaugurated the TPO Oman training center to train the next generation of the Sultanate’s oil and gas workforce.
Dallas Smith is now CEO with Petrotranz.
Piper Jaffray has rebranded its Simmons & Co. unit as Simmons Energy.
PRCI has formed a technical committee to work on the design, construction, and integrity management of subsea risers, flowlines and umbilicals. The Subsea TC is chaired by Jamey Fenske of ExxonMobil, with Vice-Chair support from Ludovic Assier of Total and Farzan Parsinejad of Chevron.
Darrell Williamson has joined ProStar Geocorp as chief of sales. He hails from FyrSoft.
Charles Goodman is now executive chairman of Quorum Software’s board. Perry Turbes is to retire as CEO, remaining as a member of the board.
Thomas Driscoll has joined Rose & Company as MD of its New York office. He was previously with Barclays.
The Railroad Commission of Texas has named Jeremy Mazur as director of government relations.
Chad Robinson heads-up SCF Partners’ Canadian office. He hails from Resource Merchant Capital.
David Lucarelli has been promoted to VP human resources at Swagelok.
Bart Thielbar is CEO and president at SAP specialist Utegration. He succeeds Bin Yu, now chairman of the board. Thielbar hails from Capgemini.
Kevin Fletcher has been elected as President of WEC Energy Group.
Bryan Ellis is president at Wild Well Control succeeding Freddy Gebhardt.
Janet Yang is now EVP and CFO at W&T Offshore. David Bump is EVP, Drilling, Completions and Facilities and William Williford is EVP and General Manager of Gulf of Mexico. Bump and Williford now jointly assume the duties of former SVP and COO Tom Murphy, who left the company.
Jean-Christophe Flèche has been named director of communications and institutional relations at IFP Energies Nouvelles. He takes over the position from Marco De Michelis, now special advisor to IFPen president Didier Houssin.
Teradata has appointed COO Oliver Ratzesberger to its board.
Paul Smith is CFO at Wintershall DEA.
Lance Loeffler has been promoted to CFO at Halliburton following the departure of Chris Weber.
Aveva has appointed Lisa Johnston as CMO. She hails from Vista Consulting Group, a Vista Equity Partners unit.
CDA has confirmed a deal with the Oil and Gas Authority to launch the UK’s first National Data Repository for offshore geoscience data.
Ryder Scott has announced the onboarding of Mark Nieberding (senior petroleum engineer), Inty Cerezo (senior petroleum geoscientist), Cecilia Flores (senior PE), Mariella Infante (senior PE) and Sara Tirado (senior petroleum geophysicist).
Roddy Urquhart is MD of Industrial Internet of Things boutique Flicq’s first EU office in Aberdeen.
NOW Inc. reports the death of the company’s board member, Michael Frazier.
Ametek has acquired Spectro Scientific in an approx. $190 million deal. Spectro will join Ametek’s Electronic Instruments Group.
Amongst various long-term agreements made recently, GE has granted its BHGE unit access to GE’s ‘digital software’ and technology along with various operations and pricing arrangements. The agreements are a prelude to the intended ‘orderly separation’ of BHGE from GE. The companies have agreed on a sale by GE of part of its BHGE stake which will however maintain GE’s stake above 50%.
Fracking software boutique Cold Bore Technology has closed an investment by Rice Investment Group. Cold Bore’s IIoT-based electronic completions recorder and remote frac operating system are delivered as a ‘SmartPad’ service.
Geospace Technologies has acquired the intellectual property and related assets of OptoSeis from PGS Americas. OptoSeis is a fiber optic sensing technology used in marine permanent reservoir monitoring and as a ‘viable technology’ for large-scale, cabled land seismic data acquisition. The deal involves an initial $1.8 million cash payment with up to $23.2 million to follow over a five-and-a-half year earn-out period, paid from revenues generated from the OptoSeis business.
Ikon Science has received an undisclosed investment from Great Hill Partners, a Boston-based private equity company with over $5bn under management. The deal is said to help Ikon achieve its next level of strategic growth in both products and services.
Chevron Technology Ventures and Energy Innovation Capital led an $8 million funding round in favor of control system cybersecurity specialist Mission Secure. Funds will be used to accelerate growth and expansion of Mission Secure throughout the energy, defense and transportation sectors.
Nine Energy Service has acquired Magnum Oil Tools.
PDI Software, a provider of enterprise software solutions to the convenience retail, wholesale petroleum and logistics industries, has acquired Outsite Networks, a loyalty company that has served the US convenience retail sector for over 18 years, carrying out over 6 billion transactions.
SCF Ventures has made a ‘growth equity investment’ in Ruths Analytics and Innovation (Ruths.ai), developer of Petro.ai, a ‘seamless combination of big data, data science and chat’ that enables the analytics-driven digital oilfield.
SAP has acquired Contextor, a provider of ‘robotic’ process automation solutions that will help SAP accelerate deployment of its ‘Leonardo’ machine learning portfolio.
Seequent has acquired Geosoft in a move which will see the merger of the Leapfrog and Oasis Montaj geological software tools.
Siemens digital factory unit has acquired Mendix, a developer of a cloud-native, low-code visual application development environment that will ‘increase growth and accelerate’ adoption of MindSphere.
Validere has raised $7 million in seed funding in an investment round led by Sallyport Investments. The monies will fuel Validere’s US expansion and further the development of its AI-powered blending, logistics, and trading optimization platform.
Versa Integrity Group has acquired the assets of Maintenance & Turnaround Resources, a provider of asset integrity management and non-destructive testing services to the oil and gas industry.
Weatherford has sold its laboratory services and geological analysis business to a group led by CSL Capital Management for $205 million in cash.
The 2018 Oil and Gas Machine Learning Symposium, held in Houston, was primarily a vehicle for disseminating Geophysical Insights’ (GI) Paradise use cases, but the meet extended beyond the GI brief to encompass ML from third parties including IBM and Microsoft, along with presentations from Anadarko, Shell and Repsol, although the latter three were not available for this write-up. GI was founded by Tom Smith back in 2010* and was pretty well first out of the blocks with commercial machine learning in seismic interpretation. In 2017 GI added technology from the Attribute-Assisted Seismic Processing & Interpretation (AASPI) consortium at the University of Oklahoma to its portfolio.
Fabian Rada (Petroleum Oil and Gas Services*) gave an introductory run-through of the use of Paradise, involving selection, with principal component analysis, of attributes of interest and using neural net-based self-organizing maps to relate these to seismic information that is below traditional spatial resolution. Paradise relates reservoir geobodies to a ‘neuron number’ obtained from the classifier. Particular neurons are related to significant facies. The resulting classification volume can be interpreted with conventional 3D interpretation tools and further calibration with logs turns SOM maps into net reservoir estimates.
* GI rep in Mexico.
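For the curious, the classify-by-neuron-number idea can be sketched in miniature. This is a toy (made-up data, our own code, PCA step omitted for brevity — emphatically not GI's implementation): a tiny one-dimensional self-organizing map is trained on two well-separated 'facies' in attribute space, and each sample's winning neuron number becomes its label:

```python
import math, random

# Toy SOM classification in attribute space (invented data, not GI code).
random.seed(42)
facies_a = [(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(50)]
facies_b = [(random.gauss(3, 0.3), random.gauss(3, 0.3)) for _ in range(50)]
samples = facies_a + facies_b

def bmu(w, x):
    """Index of the best-matching unit (closest neuron) for sample x."""
    return min(range(len(w)), key=lambda i: (w[i][0]-x[0])**2 + (w[i][1]-x[1])**2)

# Train a line of 4 neurons: pull the winner and grid neighbors toward samples.
neurons = [[random.random(), random.random()] for _ in range(4)]
for epoch in range(30):
    order = samples[:]
    random.shuffle(order)
    for x in order:
        win = bmu(neurons, x)
        for i, w in enumerate(neurons):
            h = math.exp(-((i - win) ** 2) / 2.0)   # neighborhood weight
            w[0] += 0.2 * h * (x[0] - w[0])
            w[1] += 0.2 * h * (x[1] - w[1])

# The winning 'neuron number' is the facies label for each sample.
labels = [bmu(neurons, x) for x in samples]
print(sorted(set(labels[:50])), sorted(set(labels[50:])))
```

After training, samples from the same cluster gravitate to the same neuron(s), the toy analogue of relating ‘particular neurons to significant facies’.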
GI’s Ivan Marroquin introduced Paradise ‘ThoughtFlow’, a new platform to provide a user-friendly GUI around its ML solutions. ThoughtFlow enables unsupervised ML, combining attributes to reveal natural clusters in attribute space and performing geobody extraction from a seismic facies fingerprint. The technique also performs fault picking while SOM identifies the ‘most performant’ predictive model.
Dania Kodeih presented Microsoft’s Azure ML used across oil and gas. Of the many applications presented, we noted document library analysis with ‘coordinate-bound search’, OCR-based document classification and machine reading of text to visualization with INT’s IVAAP. Microsoft also presented its ersatz Paradise-like DNN ‘featurizer’ running ResNet on labelled image data in the Seismic Atlas*. The system has also been tested on the Dutch F3 block seismic cube.
* An industry-academia partnership led by Aberdeen Uni, Leeds and NERC. Last updated in 2014.
Geophysical Insights president and CEO Tom Smith gave an ML seminar, showing the new tools and ways of thinking that are now available to geoscientists. The Paradise ML classifier identifies natural clusters in attribute space, providing noisy images of geologic features. A natural cluster is the image of a geobody produced by the selected attributes which has stacked with other geobody images in the same location in attribute space. Some 30 or so attributes are available for classification and analysis. Smith related the technique to seismic sequence stratigraphy, which subdivides the seismic section into reflection packages. These are used to interpret environmental settings and lithofacies from seismic data. Simple geobodies are autopicked by machine learning in attribute space. Interpreted geobodies are constructed from these by hand editing and construction rules; ‘all seismologists are in the model building business’. Smith recommends that instead of (subjectively) discarding ‘bad’ models, it is preferable to measure the probability of successfully fitting the data. Interpreters can then present the model and discuss its probability of success.
Sumit Gupta showed how IBM’s Corpus Conversion Service was used to build a ‘cognitive petrology’ from various bibliographic sources (Elsevier, AAPG, CC Reservoir ...). This leverages semantic extraction (with IBM’s SPSS SmartReader) from PDFs, such that the system ‘understands’ geological relationships. A complete ‘PowerAI’, ‘open source-based’ ML stack was developed. Really large workloads can use IBM’s GPU-based SnapML.
Bill Abriel (Orinda Geophysical) discussed ML in seismics and suggested that this would be a good subject for a cross-industry project, perhaps along the lines of the SEG’s SEAM consortia. He also announced an upcoming joint SEG, SPE, AAPG ‘Digital Transformation’ conference to be held in Austin in June 2019. More from the symposium home page.
For more on the founding of GI read our 2010 interview with Tom Smith.
Speaking at the 2018 ‘Re:Invent’ Amazon Web Services user group meeting in Las Vegas, Mehdi Far outlined how Amazon’s cloud is ‘transforming’ upstream oil and gas with hosted machine learning. One customer use case compared on-site seismic pattern recognition model training, which took 8 hours on a fairly high-spec PC with CUDA graphics, with the same job running in a mere 30 minutes on a single EC2 instance. Amazon’s SageMaker ML platform’s ‘out-of-the-box’ templates were used to train and deploy a customized deep learning model for seismic applications. More generally, AWS compute resources are claimed to cut vanilla seismic processing times ‘from months to days’. Far’s salt pick demo ran on a public domain seismic data challenge set by TGS and available on Kaggle. Another application revolves around the ingestion and management of logging operations, where multi-terabyte data sets are analyzed in real time for MWD and geosteering applications. These leverage AWS storage and ML solutions which allow for millisecond model updates. AWS provides data management and analytics to clients without major in-house resources.
BP’s Paul Schuster and Alaa Nasser with help from AWS’ Paul Burne outlined BP’s ‘quantum leap’ towards a cloud-first network. To support its thousands of remote sites, BP has re-architected its operating model for delivering network services. Achieving high-bandwidth low-latency connectivity between BP and AWS has involved a major revision of security segmentation, access policies, trust boundaries and connectivity to untrusted external networks. BP has developed a modular backbone providing data center-independent and carrier-neutral services proxied centrally. Granular segmentation, malware protection and fine-tuned access policies have led to trust in the cloud. AWS has enabled speedy deployment of a performant cloud. Key learnings include the benefits of distributed services, ‘centralized by exception’ and similarly the use of cloud-native products, ‘augmented by exception’. The merits of a ‘Devops-aligned’ operating and sourcing model and an ‘enterprise-wide and continuous’ architecture were also vaunted. BP is now working to extend its solution with support for cloud federation. Another Amazon presentation covered ML predictive quality management in the downstream. More on the AWS Re:Invent portal.
OmniTrax Energy Solutions and 1845 Oil Field Services have formed ShaleTech Transport to provide cost effective, safe and reliable mine-to-well-head sand supply chain logistics solutions.
Software AG and Dell Technologies have extended their partnership to bundle Software AG’s Cumulocity IoT Edge with a selection of Dell’s servers to provide a plug-and-play solution and simple configuration for an ‘instant IoT’.
Implico has combined its SAP consulting expertise in the oil and gas business and cloud solution iGOS (Implico Global Operation Services) to integrate 88 service stations into Certas Energy’s existing system landscape in France.
Shell has selected Bluware’s Headwave platform to help drive innovation and accelerate its digital transformation initiatives.
Bluefield has signed an agreement with Flux Lab to supervise and support its sensor demos for clients.
Canonical has signed a technology partnership with Eurotech for internet of things enablement. Ubuntu is now available on Eurotech’s liquid-cooled high performance embedded computers.
Kongsberg Digital has been awarded a contract by GasLog to deliver a comprehensive LNG simulator package designed to enable new levels of safety and operational training for GasLog’s professional crews.
eDrilling’s digital twin is being deployed on Belimbing Deep (South Sumatra), for automated monitoring and real time optimization of exploration drilling.
Geofacets, Elsevier’s GIS-based information solution for exploration and development, is now a silver partner with ArcGIS provider Esri.
IBS Software is to implement iLogistics, its cloud-based, next-generation oil and gas logistics management solution, for China’s Anton Oilfield Services in Majnoon, Iraq.
KBC (a Yokogawa company) has teamed with HTRI to provide the Oil and Gas industry with advanced capabilities in heat exchanger simulation and rating. Combining HTRI XSimOp and KBC Petro-SIM will enable more efficient delivery of high fidelity operational and design studies for process units.
Kongsberg Maritime has been awarded a contract by Stena Drilling for a complete upgrade of the Stena Carron drillship using Kongsberg’s new Kognifai digital platform.
Stress Engineering Services is to deliver a state-of-the-art real-time drilling riser and wellhead monitoring system, in conjunction with its condition based maintenance process to Noble’s Tom Madden drillship.
Kongsberg Maritime has received three separate orders for a total of five Hugin autonomous underwater vehicle systems from Ocean Infinity, increasing its inventory to 15.
Oracle Oil and Gas has engaged Blade Energy Partners to design its planned Eagle Ford wells.
Orbital Gas Systems is working with partner Samson AG to sell its GasPTi technology to Chinese midstream companies.
Rockwell Automation and PTC have signed a strategic partnership to help companies transform operations with digital technology by providing an integrated IIoT platform.
SPE and IOGP have signed a memorandum of understanding to share technical knowledge.
Superior Drilling Products is to trial Odfjell’s Drill-N-Ream well bore conditioning tool as it enters the Middle East market.
TechnipFMC has signed a 5-year surface technologies frame agreement with Chevron that covers the exclusive supply of surface wellhead equipment and service in the United States and Canada.
INT has partnered with Total to build a new giga-cell reservoir grid renderer. The Octree-based technology allows for large reservoir rendering on regular workstations or laptops.
Yokogawa Electric Corporation has announced a ‘go-to-market’ agreement with Sphera Solutions to leverage Sphera’s risk management software and information services with Yokogawa’s ProSafe-RS SIS across the plant lifecycle, from design through construction or renovation and on to operations and maintenance of safety instrumented systems.
Those embarking on an Internet of Things project need to touch base first with Greasebook’s Rachael Van Horn aka the ‘Wench with a Wrench’. Her latest entertaining post highlights the sometimes huge differences between what is on the screen and what is happening at the tank farm.
Independent Data Services has been tasked by the International Association of Drilling Contractors to design, build and support the digital schema for the new drillers’ tour sheet ‘DDR Plus’, a WITSML 2.0-compliant solution for the secure sharing and storage of daily drilling report data.
PODS has released an informative graphic titled ‘constructing the pipeline open data standard next generation data model’ along with an online explainer of the next gen (aka v7.0) model.
The Open Geospatial Consortium (OGC) has issued a request for information to help shape its marine spatial data infrastructure concept. OGC has also teamed with the Khronos consortium of hardware and software vendors to work on geospatial standards for virtual reality, simulation and 3D content services. Thirdly, OGC is seeking public comment on the candidate Hierarchical Data Format V 5 (HDF5) core standard for complex, time/space variant multidimensional datasets such as point clouds.
The international OneGeology mapping consortium has upgraded to Open Layers V 3.0. OneGeology data can be accessed and manipulated in desktop GIS clients like QGIS and Esri ArcMap.
The W3C’s Spatial Data on the Web Interest Group has published a working draft of ‘extensions to the semantic sensor network ontology.’ SSNO allows unambiguous referencing of an observational ‘feature-of-interest’, sample or data collection, for interoperability.
World-wide-web inventor Tim Berners-Lee has launched a campaign to temper some of the negative effects of his creation (read Facebook?). Berners-Lee is calling on governments and companies to pledge to make connectivity more affordable while protecting privacy, democracy and mental health. Oh, sorry, Facebook and Google are among the early signatories of TBL’s ‘Contract for the Web’ so that’s OK then!
It may seem strange that a company with 50 years’ experience of high performance computing is outsourcing its data sharing functionality but that is, in effect, what French geophysical specialist CGG is doing as it ‘reinvents’ its way of working with Dropbox Business. CGG’s Frederick Himmer, speaking at a recent meeting of France’s CRIP business association, explained how, back in 2015, CGG noticed widespread ‘unofficial’ use of Dropbox for collaboration. CGG surveyed its users and found that Dropbox was enabling new ways of working that were not provided by CGG’s existing IT services. Having established the use cases, CGG worked with Dropbox Business to put governance, security and administration in place. CGG’s ‘official’ Dropbox is now up and running with around 1,000 licenses ‘used constantly’, some 6.4TB stored and 8.6 million connections/year. 120 GB files are shared on the system, which is supported in a breakthrough collaboration between IT and the business. A typical use case is a land seismic survey where plans are shared with the client, third parties and surveyors in a ‘collaborative and intuitive’ solution for a distributed workforce. In another use case, CGG has sped up its auditing function, saving one week per audit per auditor. Dropbox enables single file working, with ‘no more versioning’ and a drastic reduction in email. Contractual documentation can be discussed and signed online. All development is driven by user needs and the system meets the expectations of new hires, to whom email is, apparently, ‘old fashioned’.
In another (non oil and gas) CRIP presentation Sylvie Charissoux explained how Exide Battery has moved from Microsoft Office to Google’s G-Suite. The rationale for the change is cost saving, simplicity, collaboration functionality and ‘secure data protection and individual privacy(!)’ The move was not without challenges. It represented a big change for users who had been on Office for 30 years. In Germany, where ‘Google bashing’ is prevalent, the move was badly perceived. The Great Chinese Firewall is also problematic. And the CIO left the company!
Stockholm IT Ventures has announced a blockchain-based ‘tokenization’ deal with Netoil, Inc., a Cayman Islands-based diversified oil, gas and banking group. Netoil founder Roger Tamraz, who is also chairman of SITV, said, ‘We've been looking at blockchain technology to solve some of the business challenges we face at Netoil. We feel this is a great first step towards disrupting the oil business and bringing the kind of positive change that is much needed in this marketplace.’ The ‘tokens’ represent a corporate bond with a face value of €300 million, ‘secured’ by Netoil’s oil and gas production business. SITV describes tokenization as a blockchain ‘killer app’ with ‘€1 trillion of assets’ to be tokenized in the next 12-18 months. More from SITV.
NIST has released NISTIR 8202, a Blockchain Technology Overview, providing a high-level technical overview of the technology. The 68-page report suggests that the novel technology should be investigated with the mindset of ‘how could blockchain technology potentially benefit us?’ rather than ‘how can we make our problem fit into the blockchain technology paradigm?’ NIST explains the confusing interplay between identity management and transactions, which are in general anonymous. The (very large) amount of energy used is also problematical. More from NIST, although at the time of writing, NIST and many other US government websites are shut down pending resolution of the wall issue!
The Carnegie Mellon Software Engineering Institute has just published Obsidian, a secure programming language for blockchain applications. Obsidian is designed to fix defects in current blockchain programming that pose a risk to the adoption of cryptocurrencies and other blockchain applications.
We asked the SEI/CMU’s Eliezer Kanal what he thought of our recent contribution to the blockchain debate where we raised the issue of connecting a blockchain transaction to a physical object. Kanal essentially agreed with our analysis, ‘You are correct, the supply chain use case has a significant problem with respect to representing physical goods digitally ... blockchain enters the picture after the digitization happens and can (potentially) prevent subsequent malicious manipulation of the database, associating an identity with each transaction. Once something has a digital representation, we have blockchain benefits. It doesn’t solve the first part, though. Your points are completely correct. There are other problems with blockchain. Implementing a blockchain is really, really difficult to do correctly. Mining is mind-bogglingly wasteful and other techniques which seem to do the same thing actually allow for a lot of maliciousness. This removes some of the most significant benefits of a blockchain entirely, turning it into a really complicated distributed database. We’ve had these for decades! Other frameworks under development involve a lot of handwaving and don’t really acknowledge the difficulty. In general, moving things to the blockchain turns out to be incredibly expensive. Hence the weird concepts like ‘half-blockchain’ applications, which have none of the benefits of blockchain and all the complication.’
The Maritime Blockchain Lab (MBL), a unit of BLOC (Blockchain labs for open collaboration), has won support from MIT’s ‘Solver’ program. MBL was selected for its work on shipping emissions monitoring, reporting and verification solutions. These have been built upon the blockchain-based Marine Fuel Assurance prototype developed with funding from the Lloyd’s Register Foundation. Solve is an initiative of the Massachusetts Institute of Technology that ‘advances lasting solutions from tech entrepreneurs to address the world’s most pressing problems’.
Hangzhou, China-based VeChain has partnered with Chinese energy and gas companies ENN Energy and Shanghai Gas to pilot a ‘blockchain-enabled’ liquefied natural gas solution. The VeChain ‘Thor’ blockchain will be deployed at the online Greatgas.cn LNG trading exchange and support qualification certificates and SKU inspection reports. VeChain claims to connect blockchain technology to the real world with a ‘comprehensive governance structure, a robust economic model, and IoT integration’.
Siemens has joined the Energy Web Foundation (EWF), an alliance that sets out to develop blockchain applications for the energy industry. The ‘non-profit’ organization’s mission is to accelerate the commercial deployment of blockchain technology in the (electrical) energy sector. With a growing membership of corporate affiliates, technology partners and strategic investors active in the energy industry, the EWF claims to be a leading alliance for blockchain developments specific to the energy industry’s needs. As part of the EWF organization, Siemens aims to shape the future of blockchain-based, transactive energy applications, new prosumer-centric use cases and business models around the operation of distributed energy systems, microgrids and financing.
Comment: MBL, EWF and other blockchain-based solutions for energy are often conflated with enabling the ‘transition’ to a greener world. This is a curious association in view of the unconscionable amount of energy used in mining. We stand by our analysis of blockchain as BS. Judging by the 50% drop in the Nvidia share price since September 2018, widely reported as due to a decline in purchases of GPGPU hardware for bitcoin mining, it could be that we have already passed peak blockchain.
Those interested in following-up on the new business model for oil and gas outlined in this month’s editorial, or indeed in devising their own ‘next big thing’, need to check out electricity. We have done just that, visiting the prestigious CIGRE trade show in Paris. The 2018 CIGRE was subtitled ‘Digitalisation, big data and the future of the power industry’. New power generation technologies (some digitally-enabled) are already impacting oil and gas, as electric power may flow offshore to provide power to a large production platform. Or in the opposite direction, if the generation is offshore, perhaps associated with CCS as per this issue’s editorial. Another interesting facet of electrification is coming from the ‘microgrid’ movement. This is usually presented as an agglomeration of energy sources (grid, windfarm, photovoltaic) and sinks (residential, industrial, vehicular) whose use is managed and optimized with ‘smart’ technology. The microgrid concept could equally apply to a large offshore platform, or to a refinery or large-scale shale development where energy costs are a significant factor in profitability.
Cigre is the French acronym for the International Council on Large Electric Systems, an almost 100-year-old international professional body founded in 1921. Not all that long ago, CIGRE was perceived as a somewhat staid body in a slow-moving business but its brief has been revolutionized with the advent of green energy. Cigre foresees a similar revolution with smart grids, microgrids and increasing ‘digitalization’. We chatted with a rep from Schneider Electric whose EcoStruxure microgrid offering has application in offshore operations where a combination of diesel, photovoltaic, wind energy and battery storage can optimize energy use. The EcoStruxure microgrid provides advanced power control and management functionality from a simple system of a microgrid controller and scada system. Microgrids can be deployed on brown and greenfield sites and as connected microgrids with a switchable ‘islandable’ capability. A microgrid ‘advisor’ performs dynamic electrical topology computations and measurements in real time to overcome challenges in the microgrid’s changing electrical topology. Energy management functionality includes load sharing, load shedding, black start, load restoration and battery energy storage (BESS) for photovoltaic production.
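Of the energy management functions listed above, load shedding is the easiest to picture in code. The sketch below is purely illustrative (it is not Schneider's EcoStruxure logic): when generation cannot cover demand, the least critical loads are dropped first.

```python
# Illustrative priority-based load shedding for a microgrid controller.
# Load names, priorities and figures are invented for the example.

def shed_loads(loads, available_kw):
    """loads: (name, kw, priority) tuples, higher priority = more critical.
    Sheds the least critical loads until demand fits available generation."""
    demand = sum(kw for _, kw, _ in loads)
    kept, shed = [], []
    for name, kw, priority in sorted(loads, key=lambda load: load[2]):
        if demand > available_kw:    # still over-committed: drop this load
            shed.append(name)
            demand -= kw
        else:
            kept.append(name)
    return kept, shed

# Offshore-flavoured example: 6 kW of generation against 10 kW of demand.
kept, shed = shed_loads(
    [("deck crane", 4, 1), ("hvac", 3, 2), ("drilling controls", 3, 9)], 6)
```

A real controller would re-run this continuously as generation (wind, photovoltaic, battery state of charge) fluctuates, and would handle load restoration as the inverse operation.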
There is an increasing interplay between electrical and oil and gas. In October 2018, Norway’s Minister of Petroleum and Energy, Kjell-Børge Freiberg, officially opened Equinor’s (formerly Statoil) power-from-shore solution which will provide 100 MW of electric power to the North Sea Johan Sverdrup field for an estimated 50 years. Powering the platform from shore without using fossil fuels means that Johan Sverdrup will be ‘one of the most carbon-efficient’ fields in the world when it comes onstream later in 2019, with a CO2 footprint of 0.67 kg per barrel, saving some 460,000 tonnes of CO2 per year. ABB delivered the HVDC equipment for converter stations onshore and offshore. The system uses direct current for electricity transmission over the 200 km long cables. Equinor’s release did not say how the electricity was generated, but Norway is fortunate in that some 95% of its electricity comes from hydroelectric power.
A release from ABB provides another slant on offshore electric systems. ABB’s long step-out systems for subsea pumps and compressors enable ‘economic hydrocarbon recovery under extreme conditions’. Electrical power for pumps, booster stations or compressors is supplied from an offshore platform or an onshore facility. Variable-speed pumps designed to accommodate pressure decline across a field’s lifetime require sophisticated variable frequency control. Qualification of such systems is critical in view of their long term deployment. For the Asgard project, a three phase 20 MW cable simulator was built to represent field conditions of a 47 km subsea power cable. This enabled analysis and mitigation of resonance in the system.
Kongsberg Maritime has chosen a battery storage system from Swiss Leclanché to power its new fleet of electricity-powered vessels. First off the slips will be the Yara Birkeland, the world’s first autonomous and electric container vessel with a 5 MWh storage system.
One problem with shale production is the vast amount of natural gas that is flared along with the oil. All that energy going up in smoke! PW Power Systems and US Well Services have a neat solution to this problem: use the gas to generate electricity and ‘use the electricity to frack the wells’. Electricity comes from PWPS’ 30-megawatt FT8 MobilePac aero-derivative gas turbine generators. The generators are a component of USWS’ patented Clean Fleet system. Actual fracking is performed by conventional hydraulic pumps, powered by electric, rather than diesel, motors. USWS also recently announced its third ‘electric frac’ contract with Apache Corporation.
Speaking at a recent meeting of the Object Management Group-hosted IIoT in Energy forum in Seattle (more in our next issue), Gerardo Pardo-Castellote, (RTI) introduced the Industrial Internet Consortium’s microgrid communication and control testbed for distributed energy resources. The system provides a real-time, secure databus to facilitate machine-to-machine, machine-to-control center and machine-to-cloud data communications. The Microgrid testbed leverages the OMG’s DDS-TSN (time-sensitive network) protocol. Co-author, Wipro’s Manjari Asawa told Oil IT Journal ‘the Microgrid Testbed design based on TSN could be used for off-grid/offshore oil and gas platform deployment. We are now evolving the design to focus more on the connected design with distributed energy resources to optimize demand response capabilities of connected grid’. The system is now installed as a permanent testbed at National Instruments’ IIoT Lab.
French utility RTE recently announced Power System Blocks, an ‘open source high performance computing framework’ for grid planning and monitoring. Powsybl is part of the LF Energy Foundation, a Linux Foundation project that supports open source innovation in the energy and electricity sectors.
The Siemens-backed Internet of Energy, a specific IoT for the energy sector, is ‘gaining ground’. The IoE uses data generated by today’s smart assets to intelligently network them and improve efficiency, reliability and profitability throughout their life cycle. Digitalization and data networking on MindSphere take grid operation to ‘an entirely new level’. The benefits in terms of predictive planning, enhanced network operation and value creation beyond the provision of grid transmission capacity are ‘tremendous’.
Switzerland-based Marmot Passive Monitoring Technologies is inviting participation in a consortium to investigate continuous monitoring of geodynamic phenomena for operations control and risk assessment. The consortium leverages Marmot’s acoustic monitoring concept, the 5D Quantum Monitor (5DQM), in various fields including oil and gas production from conventional and shale reservoirs. 5DQM is a directional seismic data acquisition system coupled with a ‘cognitive’ data management system leveraging an artificial intelligence-based forensic database. More from Marmot.
ExproSoft has initiated a joint industry project to assess the reliability of BOPs used for subsea drilling in Norway. Currently, BOP failures and testing are major contributors to rig downtime. Previous studies have shown some 6%-8% of rig time is used for repairing and testing the BOP, potentially exposing operations to unnecessary risk when the BOP is pulled. The JIP will analyze data from Norwegian wells spudded during the period 2016–2017 that experienced BOP failures, maintenance and well kicks. Data from daily drilling reports will be structured and analyzed in ExproSoft’s systems. The Norwegian Petroleum Safety Authority is supporting the project, providing drilling reports from participating operators. Current partners include Equinor, AkerBP, Lundin, VNG Norge, Faroe Petroleum, and Wellesley Petroleum. Late entrants to the JIP are still accepted. More from ExproSoft.
The UK-based International Oil & Gas Producers association (IOGP) has issued a Recommended Practice report* on the ‘selection of system and security architectures for remote control, engineering, maintenance, and monitoring’. The report was prepared by the IOGP’s instrumentation and automation standards subcommittee’s remote operating center task force. The 44-page document provides common definitions and guidelines on system architectures and security controls for remote functions such as remote operation, remote monitoring, remote engineering and remote maintenance of industrial control and automation systems, basic process control systems, safety instrumented systems and monitoring-only systems.
Remote operations limit the number of personnel on a hazardous production site and can facilitate access to vendor or operator subject matter experts. However, remote operations bring risks, such as cyber security, communications and compliance. The roles and responsibilities of remote and local operations are set out. The IOGP recommended practice offers high-level guidance on remote functions implemented at oil and gas facilities and a suite of system architectures for process control systems, safety instrumented systems and other engineering packages. The report is pitched at design considerations for pre-FEED (front end engineering design) efforts.
* IOGP Report 627 (2018).
Speaking in the condition-based maintenance track at the 2018 National Instruments NI Days event in Paris, Lodovico Menozzi described a ‘big shift’ in maintenance from manual to automated online data collection. Although this is not without its disadvantages, ‘walking around a plant can provide awareness of future issues’, the new approach allows 24x7 analytics and remote diagnosis of captured waveforms by SMEs from ‘anywhere with network access’. NI provides connectivity with OSIsoft PI, OPC UA, Asset 360 (Black & Veatch), Avantis Prism and more. NI tools are used to view trends and ‘displace’ other software for vibration, temperature and motor current signature analysis from the voltage bus. IR thermography and EM signal analysis detect hot spots and insulation problems in transformers. All this is now delivered from a single software tool, CompactRIO, deployed at the network edge. One oil and gas use case is the flagship Texmark Chemicals refinery of the future.
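The signal-processing idea behind motor current signature analysis is to transform a sampled waveform to the frequency domain and inspect its spectral lines. A toy sketch (pure-Python DFT; real deployments use FFTs and far more elaborate fault-frequency logic than simply finding the dominant line):

```python
# Toy motor current signature analysis: sample a waveform, compute its
# discrete Fourier transform, and locate the dominant spectral line.

import cmath
import math

def dft_magnitudes(samples):
    """Magnitude spectrum for the first n/2 bins of a real signal."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2)]

FS = 1000   # sampling rate in Hz (illustrative)
N = 200     # samples, so each bin is FS/N = 5 Hz wide

# Synthetic supply current: a clean 50 Hz line.
signal = [math.sin(2 * math.pi * 50 * t / FS) for t in range(N)]
spectrum = dft_magnitudes(signal)
peak_hz = max(range(len(spectrum)), key=spectrum.__getitem__) * FS / N
```

In practice the diagnostic information is in sidebands around the supply frequency (broken rotor bars, bearing faults), not the peak itself, but the pipeline — sample, transform, inspect — is the same.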
Thierry Romanet presented a rejuvenated ThreatScan, BHGE’s real time pipeline impact detection system. Hydrophones along the line use local real-time spectral analysis to detect signatures from, e.g., digging. It is not practical to send all the data to the cloud because of bandwidth limits. Real-world tests with diggers breaking into (empty) pipes give source spectra types. These are convolved with attenuation data on pipe type, fluid and burial conditions. ThreatScan has been around for a while. Back in the day, the system was drowned out with nuisance alarms. Today, improved firmware and neural net algorithms can distinguish noise types from normal operations. The system was completely redesigned in 2017 with Step Automation and the software ported to an NI sbRIO-based computer. Google Map and Beacon also ran. In April 2017 the system detected its first real impact on an offshore pipe. Subsequent ROV and pig surveys found damage 18 meters from the estimated location.
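The classification step described above can be sketched as template matching: compare a detected band-energy signature against reference source spectra that have been adjusted for attenuation along the pipe. Everything below — the band layout, reference values and attenuation factors — is invented for illustration and is not ThreatScan's actual data or algorithm (which uses neural nets):

```python
# Hypothetical spectral template matching for pipeline event classification.

import math

REFERENCE_SPECTRA = {           # per-band energy: low / mid / high frequency
    "digger impact": [0.1, 0.7, 0.2],
    "pump noise":    [0.6, 0.2, 0.2],
}
ATTENUATION = [1.0, 0.8, 0.5]   # high frequencies die out faster with distance

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def classify(signature):
    """Best-matching source type after applying the attenuation model."""
    def score(name):
        expected = [s * att for s, att in zip(REFERENCE_SPECTRA[name], ATTENUATION)]
        return cosine_similarity(signature, expected)
    return max(REFERENCE_SPECTRA, key=score)
```

The nuisance-alarm problem mentioned above is visible even in this toy: a signature that sits between the two templates classifies unreliably, which is why the production system moved to learned classifiers.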
Less digital but of interest is Optel Thevon’s specialized optical targets and cameras for very high speed (1.5MHz) surveillance of rotating equipment. The tools provide real-time torsional vibration monitoring at the edge.
Finally, a spectacular video of a future (or futuristic?) railway link from Beijing to London that had champagne-drinking passengers travelling on the upper deck with automated cargo handling systems beneath them. China’s CRRC railroad operator plans to do the whole 8,000 km trip at 400kph for a 20 hour journey time. CompactRIO will be doing the onboard edge computing, looking for small signals from the wheels with machine learning applied to pattern recognition and prediction of failure. Watch video.
Speaking at the Energy Conference Network’s 2018 internet of things in oil and gas (the 4th annual event), Steve Sponseller from PTC’s Kepware oil and gas unit and Doug Smith from refiner Texmark presented on digital twins in oil and gas, with a special mention of Texmark’s ‘refinery of the future’ (RotF). The Internet of Things-based program combines sensor data aggregation, real time operations intelligence and analytics at the edge to support automated feedstock replenishment, predictive and condition-based maintenance and HSE.
Sponseller’s talk presented the contributions of PTC/Kepware and Rockwell, although many other companies are involved in the effort*. The Rockwell/PTC contribution to the RotF is a ‘comprehensive and flexible’ IoT offering leveraging Rockwell’s industrial control technology along with Kepware’s ThingWorx multi-vendor connectivity.
The RotF comprises various ‘IoT’ projects including Lidar point-cloud-based ‘as is’ video mapping, an equipment health dashboard, ATEX-certified tablets and wearable situational awareness technology in the form of RealWear’s HMT-1. The hands-free Android tablet-class wearable computer for industrial workers blends the real and virtual worlds, showing for example a P&ID view of equipment to a mobile worker.
Sponseller also presented work done with Australian company Nova on a digital twin proof of concept for the upstream that combines data analytics, augmented reality, artificial intelligence and machine learning to optimize operation and maintenance and perform monitoring and forecasting.
Another PTC-backed IoT solution comes from Tory Technologies, in the form of ‘Smada’, supervisory measurement acquisition and data analytics, a ‘boosted’ scada for the IoT. More from Kepware and from ECN.
* RotF is a joint venture between Texmark, HPE, Aruba, ENIOS Technologies, National Instruments, OSIsoft, DXC Technology and Intel. More from Texmark.
AspenTech blogger Mike Brooks has provided a critique of ‘remaining useful life’ (RUL) as a metric for predictive maintenance applications. While RUL is a commonly accepted KPI, Brooks claims that it is a ‘flawed concept’. Traditional maintenance is organized around statistical assumptions of failure mechanisms, mostly related to normal wear and tear. However, studies by ARC Advisory Group have shown that over 80% of asset failures are caused by ‘errant process’ rather than age-related issues, making conventional preventive maintenance strategies ineffective. Examples of such errant processes include operator error, lightning strikes and voltage spikes or episodic cavitation in a pump. Most of these random events make the reliable prediction of RUL impossible. For more on AspenTech’s approach to prescriptive analytics and equipment failure read the white paper on ‘Low-touch’ machine learning for asset management.
In a MathWorks article, Aditya Baru has no qualms about using RUL as a KPI and suggests three ways to estimate RUL for predictive maintenance. MathWorks’ Predictive maintenance toolbox computes RUL depending on what data is available - lifetime machine data, run-to-failure histories or threshold values of known failure condition indicators. Computing RUL may leverage proportional hazard models and probability distributions of component failure times. For example, a battery’s discharge time may be estimated from past discharge rates and covariates, such as operating temperature and load. The MathWorks approach involves physics-based forward modeling and Kalman filtering. More RUL examples here.
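The simplest flavour of RUL estimation from lifetime data can be sketched in a few lines: fit a degradation trend and extrapolate to a failure threshold. This is purely illustrative — the MathWorks toolbox uses much richer models (proportional hazards, probability distributions, Kalman filters) than this least-squares line, and the data below is invented:

```python
# Minimal RUL sketch: fit a linear degradation trend (battery capacity
# vs charge cycle) and extrapolate to the failure threshold.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

def rul_cycles(cycles, capacities, failure_threshold):
    """Cycles remaining until the fitted trend crosses the threshold,
    or None if no downward trend is visible yet."""
    slope, intercept = fit_line(cycles, capacities)
    if slope >= 0:
        return None
    crossing = (failure_threshold - intercept) / slope
    return max(0.0, crossing - cycles[-1])

# Five observed cycles of a steadily fading battery; failure at 80% capacity.
remaining = rul_cycles([0, 1, 2, 3, 4], [100, 98, 96, 94, 92], 80)
```

Brooks' critique applies directly to this sketch: a lightning strike or episodic cavitation event does not sit on any fitted trend line, which is why RUL from wear-based models says nothing about random 'errant process' failures.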