Oil IT Journal: Volume 22 Number 3


Meme of the month!

The old ‘digital twin’ product lifecycle management concept gets new lease of life in big data/AI offerings from GE, IBM, Grundfos, Ansys, Siemens, Schneider Electric, Seeq, Amec, ABB, Emerson.

The term ‘digital twin’ (aka a ‘model’) originated in the product lifecycle management community over a decade ago before falling out of favor. The arrival of big data/AI/the Internet of Things has given the DT a new lease of life (see Google Trends). We already covered the recently announced GE/Maana DT last month. Now, the DT has become a huge marketing meme.

For IBM, this means mashing up its Watson artificial intelligence solution and the internet of things into a shiny new DT, as described by Chris O’Connor in a video from IBM’s recent ‘Genius of Things Summit’ at the new Watson IoT headquarters in Munich. O’Connor mentioned, en passant, an oil pump as a candidate for a digital twin solution that will roll in Watson IoT, machine-to-machine communications, natural language processing and more.

Speaking of pumps, Grundfos, a pump manufacturer, is to use Ansys’ simulation software to ‘harness the power’ of the Internet of Things and to create ‘complete digital twins.’ DTs will ‘improve product quality and performance, enhance productivity and maintenance and reduce unplanned downtime.’

Siemens also popped up with a DT recently with the announcement of DT functionality in release 12.02 of its STAR-CCM+ software. STAR-CCM+’s new photo-realistic renderings are said to help engineers ‘promote understanding and inspire confidence’ in full physics models.

Schneider Electric and partner Seeq have announced ‘Profit Advisor,’ another DT-style big data/analytics solution, one that evaluates the financial performance of operations in real time. Seeq rolled out V17 of its IIoT/visual data analytics toolset at the 2017 OSIsoft Users Conference this month, citing Devon Energy as oil country poster child.

Amec likewise has rebaptized its Asset information hub as a DT offering, as presented at a recent Cfihos meeting chez BP, from which we will be reporting in our next issue.

While not a DT per se, ABB’s announcement of the ‘commercial launch’ of ABB Ability fits the new meme. Ability, unveiled at the ABB Customer World in Houston this month, includes ‘Smart Sensor,’ a DT-style solution that mirrors real time sensor data to the Microsoft Azure cloud. Shell Oil is a user.

Emerson’s ‘iOps,’ an integrated operations and augmented reality offering rolled out at the 2017 CERAWeek, likewise mirrors the DT concept.

More on GE’s interesting take on the DT in our report from the 2017 GE Oil & Gas Annual Meeting in Florence below.


AMEC sold

Wood Group buys engineer Amec Foster Wheeler in ‘opportunistic’ £2.2 billion deal.

Aberdeen-headquartered Wood Group is to purchase engineer Amec Foster Wheeler in an all share, £2.2 billion deal. Wood Group chairman Ian Marchant described the combination as ‘transformational, accelerating our strategy and creating a global leader in project, engineering and technical services.’ The deal will create an ‘asset-light, largely reimbursable’ business of greater scale and capability across the oil and gas and other segments.

Amec Foster Wheeler’s shareholders will become shareholders in the combined group and will, ‘gain from the enhanced operating capabilities, and synergies, a stronger balance sheet and from Wood Group’s progressive dividend policy.’ Pre-tax cost synergies from the deal are put at £150 million per annum by the end of the third year. A report in the Financial Times described the deal as ‘particularly opportunistic’ in view of Amec’s £1 billion plus net debt.

Those interested in Wood Group’s transition from a small Aberdonian fishing business to an international oil and gas behemoth should listen to the 2013 BBC podcast interview in which Ian Wood describes pitching the North Sea to his skeptical fisherman dad. Download the 80 page proposal here.


New website sponsors and a reflection on our ‘sustainability’

OilIT.com welcomes new sponsors Dell-EMC, Halliburton/Landmark, Teradata and PNEC Conferences. Editor Neil McNaughton reports progress in enhancing the Journal’s online resource while recognizing there is much more mileage to be had from 20 plus years and over 2 million words of focused reporting on oil and gas IT and data. Change is in the offing...

First a big thanks to our new website sponsors Dell-EMC, Halliburton/Landmark, Teradata and PNEC Conferences, whose flagship E&P data management event takes place in Houston in May. Sponsorship support is important to us, especially in the downturn. Fortunately for us these companies are taking the long term view, as indeed we are.

~

On which note, I thought it might be a good thing to reflect on our own future in the ‘lower for longer’ industry scenario. Writing and publishing a journal involves an information pipeline. As you probably work in oil and gas, I expect you know what a pipeline involves. If I were in marketing I would explain the obvious with, say, the example of the newly OK’d Keystone pipeline which is to bring Canada’s oil sands production down to refineries on the US Gulf coast. That assumes the Canadian companies buying up oil sands production can still afford to cook it up, and that the Gulf refineries are not already at full capacity with revitalized Permian production, in which case the stuff may end up in the tanks at Cushing.

This of course does not help you understand how the information processing process works, but right now, writing at the start of the current issue, it feels more like a brim-full tank of information than a freely flowing pipeline.

With over 20 years in the business, the process has become standardized, if not terribly efficient. We visit hundreds of websites every month and receive hundreds of emails per day, and almost as many press releases. Atop all this is our own content, generated from conferences which we attend, either in person or virtually.

Several thousand ‘items’ of various degrees of interest from ‘zero’ to ‘scoop’ flow in prior to every issue. I say this in case you think that we are just sitting here waiting for the next press release to pop into our inbox ready for an arduous cut and paste.

So that’s what goes in. What comes out is what you are looking at now, either on screen or in print: a careful selection of what we believe to be relevant to the oil and gas IT and data community. Well, actually rather more than a selection, since we do a massive and growing amount of information processing, editing and (mostly) deleting along the way.

In our early years the process produced something like 25 articles per issue, a number that has not changed significantly, although the word count has risen somewhat, from around 7,000 per issue in 1996 to 10,000 today. What has changed, and what continues to evolve in a way that we believe better serves our readers, is the information intensity.

In 1996, 25 articles was just that. Today, the same number of ‘articles’ in our last issue actually represented almost 100 distinct information items, sourced as above and edited for relevance and length. Comparing our early numbers with today’s publication, it is clear that the value-add of the editorial process has increased significantly.

The process is very time-consuming and we are increasingly pushed to fit in all of the conference attendances, reporting and information gathering between each edition. We are confronted by, to use a much abused word, the ‘sustainability’ issue. Things are going to have to change, but how?

Back in 2014, for our 200th issue, we carried out a survey of our readership and found that pretty well all of our coverage was appreciated; almost all item categories got a pretty good rating. So we do not plan to change our scope significantly. The survey also produced some constructive criticism. Some asked for a ‘major website re-vamp,’ others for an iPad edition, more interviews and more on operational and implementation experiences. There were also calls for more graphics, better article tagging and a blog for reader feedback.

This inspired me in December of the same year to make a rash promise to ‘bring search in-house and make it smarter,’ to ‘get something along the lines of IBM Watson running on our information asset’ and to create an ‘ecosystem of websites-of-relevance.’

Three years on, I regret to report that none of these noble goals has been met in full. While we have not achieved a ‘major’ website revamp, we have made a lot of incremental improvements, including the online PDF edition, which doubles as an ‘iPad edition.’ There are more interviews now, but not so much on implementation experiences. The ecosystem remains a great (IMHO) idea that is yet to be implemented.

We do need to do more. Our tweaks so far are more concerned with making the website function as you would expect, rather than adding major functionality. On which subject you may have noticed that this is the first issue where links to past articles work ‘right,’ at least for subscribers.

We need to do more to leverage the two million (at the last count) word resource into something more useful for operators, suppliers, researchers, marketers and students. To do this we are going, as of 1/01/2018, to go down to six issues per year. This will let us do the stuff that time and resources (this is the downturn for us too, after all!) have prevented us from doing. Over time we will have a better website, taxonomy-based search, educational tracks and generally a more comprehensive resource. We will also be increasingly differentiating the public (free) website from our corporate subscriber edition, to encourage the lurker/headlines-only community to switch, thereby helping out with our ‘sustainability’ issue! Feedback to info@oilit.com welcome.

@neilmcn


Oil IT interview - Mike Skeffington, EnergyIQ

EnergyIQ’s head of business development on data management in the downturn, EnergyIQ’s JSON/XML data objects (announced at last year’s PNEC) and on the excitement of Elasticsearch.

Data management must be feeling a bit fragile in the downturn...

It is always a challenge to make data management sexy! But we are doing well right now with our PPDM-based IQlifecycle well data management system, which is popular as it provides clients with visibility of their well data. Many companies do not have this today, which is pretty shocking really. Our IQexchange adds ETL and data integration via the upstream data objects we unveiled at last year’s PNEC.

The data objects JIP is now done?

Yes. The technology is now real and various components are deployed. We are also working with the major data providers and are committed to exposing their data sources in transactional form using the data objects.

What technology is under the hood?

We define our objects to be technology agnostic such that they can be used in .NET (Microsoft) or Java/XML environments. We provide a RESTful API to our database, with JSON/XML objects exposed as web services. Data objects include well origin, completion event and so on. These group relevant attributes along with measures of data quality. The objects are not technology centric; we don’t force Java or C# on the developer. In fact we deploy 4-5 different technologies internally.
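For the curious, here is a minimal sketch of what consuming such a RESTful JSON data object might look like. The endpoint URL and field names are our own invention for illustration; EnergyIQ’s actual API and object schema are not published here.

```python
# Hypothetical client for a RESTful well data object service. The base URL,
# path and JSON field names are illustrative, not EnergyIQ's published API.
import requests

BASE_URL = "https://example.com/api/v1"  # hypothetical endpoint

def get_well_origin(uwi: str) -> dict:
    """Fetch a 'well origin' data object, as JSON, for a well identifier."""
    resp = requests.get(f"{BASE_URL}/wells/{uwi}/origin",
                        headers={"Accept": "application/json"})
    resp.raise_for_status()
    return resp.json()

# Each object groups related attributes along with data quality measures.
origin = get_well_origin("42-501-20130")
print(origin.get("surfaceLocation"), origin.get("quality"))
```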

How do you quantify quality?

With object and rule attributes that can be used to evaluate an object according to a weighted score of all its quality measures. An object may for instance get a 95% quality rating.
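As a back-of-envelope illustration of such weighted scoring (rule names, weights and scores are invented):

```python
# Weighted quality rating: each rule scores an attribute 0-100 and carries a
# weight; the object's overall rating is the weighted average. Illustrative only.
def quality_rating(scores: dict, weights: dict) -> float:
    total_weight = sum(weights.values())
    return sum(scores[rule] * weights[rule] for rule in scores) / total_weight

scores = {"location_valid": 100.0, "datum_present": 85.0, "depth_consistent": 95.0}
weights = {"location_valid": 0.4, "datum_present": 0.2, "depth_consistent": 0.4}
print(f"{quality_rating(scores, weights):.0f}%")  # -> 95%
```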

At PNEC there was a suggestion that the objects might be open-sourced...

I wish I could provide a time line for this. We would like to move forward and transfer the technology to an association. PPDM is interested.

How do you handle search?

We are very excited about this. We use the open source, Lucene-based Elasticsearch in IQinsights. This runs on top of PPDM, increasing performance greatly. Elasticsearch indexes everything and provides sub-millisecond response times on a 100 million record set. We also leverage Elasticsearch to interface with other data sources: Aries, Quorum Land and clients’ in-house data.
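By way of illustration, a query against an Elasticsearch index built over PPDM-style well records might look like the sketch below, using the elasticsearch Python client (8.x style). The index name, field names and cluster address are assumptions, not the actual IQinsights schema.

```python
# Hedged sketch: full-text search over an Elasticsearch index of well records.
# Index/field names and the cluster address are invented for illustration.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # hypothetical cluster

resp = es.search(
    index="ppdm-wells",                      # hypothetical index name
    query={"bool": {
        "must": {"match": {"well_name": "eagle ford"}},
        "filter": {"term": {"operator": "devon"}},
    }},
    size=10,
)
for hit in resp["hits"]["hits"]:
    print(hit["_id"], hit["_source"].get("well_name"))
```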

What is in your database?

Our database is populated with whatever data the client is entitled to from the data vendors. We have also integrated with SAP and with Hadoop. We are not data providers, just data managers.

Do you build interpretation projects?

Sure. IQexchange has DecisionSpace and Geographix interfaces. We are constantly developing this kind of functionality.


Review - SPEE Monograph 4

Society of Petroleum Evaluation Engineers publication covers reserves estimation in low permeability (unconventional) reservoirs. Treatment covers state-of-the-art decline curve analysis and different approaches to numerical modeling. ‘Commercial’ considerations out-of-scope.

Published in 2016, Monograph 4 (M4) from the Society of petroleum evaluation engineers, ‘Estimating ultimate recovery of developed wells in low permeability reservoirs,’ is a 300 page follow-up to the 2011 Monograph 3 on reserve evaluation in resource plays and reflects the intervening shift from exploration to development.

Introductory chapters cover geology, drilling and completions but curiously, M4 ‘largely ignores’ the commercial aspects of unconventional wells and stops short of ‘fully addressing [the] assignment of developed [...] reserves.’ M4’s focus is on technical factors of evaluation.

M4’s ‘technical factors’ are concerned less with the complex process by which shale gives up its gas or oil than with the range of empirical methods that have become standard practice. ‘Classical’ decline curve analysis (DCA), practiced for decades on conventional reservoirs, is ‘still the most widely used method’ for unconventionals.
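For readers unfamiliar with DCA, the workhorse is Arps’ hyperbolic relation (standard industry usage, not necessarily M4’s own notation), fit to observed rates and extrapolated to an economic limit to estimate ultimate recovery:

```latex
% Arps hyperbolic decline: q_i = initial rate, D_i = initial decline,
% 0 < b <= 1 the hyperbolic exponent.
q(t) = \frac{q_i}{\left(1 + b\,D_i\,t\right)^{1/b}}
% The b -> 0 limit recovers the exponential case:
q(t) = q_i\,e^{-D_i\,t}
```

Much of the debate turns on the b factor, which in low permeability reservoirs often appears to exceed unity early in well life, making naive extrapolation over-optimistic.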

Numerical models are presented as alternatives to DCA. These fall into two camps, full scale modeling à la reservoir simulator and rate-transient approaches à la well test. The meat of M4 therefore lies in an in-depth look at each of these approaches, authored by recognized specialists in each field.

The popularity of the empirical approach is undoubtedly largely due to a key distinction between shale gas reservoirs and tight oil plays. To quote M4, ‘While some tight oil plays produce directly from shales, similar to gas plays, most tight oil is produced from low permeability siltstones, sandstones […] that are associated with the shales from which the oil has been generated.’ The shale/tight distinction is important as is the fact that while drilling and completions may target one formation, ‘adjacent lithofacies may be supplying hydrocarbons.’ Further complexity comes from considerations of the completion method, said by some to ‘define the reservoir.’

M4 does touch on commercial aspects of tight oil development, citing Southwestern Energy’s super-low F&D costs. M4 includes sections on ‘data considerations’ that enumerate the plethora of data types that could usefully be gathered and analyzed by the evaluator. Whether this is done in practice in the context of factory drilling is moot.

M4 concludes with chapters on quantifying uncertainty in evaluations and worked example problems for three US plays. M4 is a thorough treatise on the intricacies of tight reservoir evaluation. In view of the complexity of the subject and the lack of a ‘commercial’ dimension, it is not a cook-book for reserves reporting.

See also our 2012 editorial on modeling shale gas.


dGB announces GeoDataCloud

JIP to deliver Netherlands geo-data from novel object store.

Enschede, Netherlands-headquartered dGB has initiated the GeoDataCloud, a joint industry project to make publicly-released sub-surface data in the Netherlands available in a cloud-based, interactive computing environment. The project, a partnership between dGB, SGS Subsurface Consultancy and Z-Terra, is supported by TNO with data contributions from the DINO/NLOG database.

The project involves the development of an internet portal that supports data search and streaming download of subsets of public data and value-added products. User companies will be able to rent private space to perform interpretations in the GeoDataCloud and to rent hardware and dGB’s OpendTect Pro+ seismic interpretation software for processing and interpreting data.

All datasets will undergo a QC process prior to being transferred to the environment as OpendTect projects. dGB is working on a new object storage capability in OpendTect Pro to speed cloud-based data access, along with a new seismic viewer that runs in a standard internet browser. When the initial project is completed, the expectation is that the GeoDataCloud will maintain itself as a commercial entity. Anchor tenant of the GeoDataCloud is Oranje Nassau.


SEG-Y Release 2.0

Society of Exploration Geophysicists rolls out 150 page update to venerable seismic data standard.

The Society of Exploration Geophysicists has just released a major update to the venerable SEG-Y workhorse for seismic data exchange. SEG-Y Release 2.0 brings enhanced flexibility while maintaining a high degree of backward compatibility. Notable new features include up to 65,535 additional 240 byte trace headers, unambiguous mapping of trace header contents and greatly increased sample and trace counts with arbitrarily large or small sample intervals.

SEG-Y R2 also adds support for little-endian and pair-wise byte swapping, microsecond time stamp accuracy, higher precision for coordinates, depths and elevations and more options for coordinate reference system specification. An optional XML-based extended text file header is available for ‘improved machine processing.’
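To make the header layout concrete, here is a minimal Python sketch that reads the 400 byte binary file header and uses the new rev 2.0 byte-order constant (bytes 3297-3300) to detect little-endian files. Offsets follow the published specification; error handling is omitted.

```python
# Minimal SEG-Y rev 2.0 binary header reader with byte-order detection.
# Pre-2.0 files leave the constant at zero; we then assume traditional big-endian.
import struct

def read_binary_header(path: str) -> dict:
    with open(path, "rb") as f:
        f.seek(3200)          # skip the 3200 byte textual header
        binhdr = f.read(400)  # the 400 byte binary header

    # Rev 2.0 endianness constant at file bytes 3297-3300 (header offset 96).
    (const,) = struct.unpack(">I", binhdr[96:100])
    endian = "<" if const == 0x04030201 else ">"

    interval, = struct.unpack(endian + "H", binhdr[16:18])  # sample interval, µs
    nsamples, = struct.unpack(endian + "H", binhdr[20:22])  # samples per trace
    fmt,      = struct.unpack(endian + "H", binhdr[24:26])  # data sample format code
    major, minor = binhdr[300], binhdr[301]                 # SEG-Y revision, e.g. 2.0
    return {"endian": endian, "interval_us": interval, "samples": nsamples,
            "format": fmt, "revision": f"{major}.{minor}"}
```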


An appendix describes writing SEG-Y to a byte stream without a record structure, making it possible to write SEG-Y data to disk or across the network. Other appendices cover compatibility between SEG-Y and SEG D Rev 3.0 via an RP66 storage unit tape label. Download the 150 page PDF specification from the SEG Technical Standards webpage.


PARS in the cloud

Interica offers cloud-based project archival, leveraging the open source WildFly application server.

The 4.4 release of Interica’s PARS upstream interpretation project archival system brings enhancements to security and cloud-enablement. A new ‘smart verify’ function is designed to support the migration of oil and gas data into cloud storage, adding data integrity checks and fine-grained control over the archival and retrieval process and costs.

PARS 4.4 generates an MD5 hash for each dataset prior to transfer that is included in the message sent to a cloud-based object storage solution such as Amazon S3.

The MD5 hash acts as a data integrity check, obviating the need for a byte-for-byte verification. For even greater security, a configurable amount of data can be read back from the object storage for additional verification, ‘further mitigating an already low corruption risk.’
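A minimal sketch of the hash-then-verify pattern (file names invented; PARS’ internal implementation is not shown):

```python
# Compute an MD5 digest in chunks (constant memory on multi-GB archives),
# store it with the object, then recompute after retrieval instead of a
# byte-for-byte comparison.
import hashlib

def md5_of(path: str, chunk_size: int = 8 * 1024 * 1024) -> str:
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

expected = md5_of("project_archive.pars")           # before upload
restored = md5_of("project_archive_restored.pars")  # after retrieval
assert restored == expected, "integrity check failed"
```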

PARS now also leverages the latest version of an ‘industry standard’ enterprise application server, WildFly 10. WildFly implements the Java enterprise edition 7 full and web profile standards. These minimize the number of ports required to run the application server and provide a platform for improved integration with other enterprise products, in particular, future versions of Interica’s IDS. Interica has certified its PARS product with most of the leading object storage providers.


OGA commissions Lloyds Register on ... decommissioning

UK Oil & Gas Authority launches multi-operator P&A optimization program with help from LR.

The UK government’s Oil and gas authority (OGA) is asking operators to voluntarily participate in a multi-operator, well plug and abandonment (P&A) optimization program. The aim is to ‘demonstrate the cost savings which can be achieved through collaborative working, stimulate work-sharing campaigns and adopt improved execution and contracting models.’

OGA has commissioned Lloyd’s Register (LR) to act as project manager for the initial selection phase.

OGA head of decommissioning Jim Christie said, ‘Cost efficiency, knowledge sharing and best practice adoption are key to our decommissioning strategy. While estimates of cost, scale and scope vary, there is no doubt that the decommissioning effort facing our basin is significant. We must act now to capitalize on the opportunity it presents for innovation, cost reduction and further development of our skilled supply chain.’ A ‘collaborative well P&A program’ is planned for 2018/19. More from OGA.


Software, hardware short takes ...

Caesar Systems, Estimages, Eliis, Petrolink, GeoLogic Systems, FracGeo, P2, WellMaster, KAPPA, Paradigm, Troika, PetroWeb, Exigo, NeoFirma, Phusion, New Century, Coreworx, IT Vizion, Wand.

Caesar Systems’ PetroVR 12.1 claims ‘breakthrough’ multi-core computing speed, new audit/validation functions and multi-well simulations for shale projects.

Estimages and Eliis have announced ‘IF,’ interpretative seismic filtering, that combines fine scale interpretation and non-stationary factorial kriging.

Petrolink’s RigMetrix computes KPIs for benchmarking, monitoring and reporting on single and multi-rig drilling programs.

GeoLogic Systems adds a new Frac Analysis module to geoSCOUT 8.4, providing detailed fracturing data from ‘WCFD,’ the recently-acquired well completions and frac database.

FracGeo has announced DrillPredictor for Shale 2.0, a cloud-based web service for geosteering unconventional wells and optimizing completions.

The R5.0 release of P2 Land streamlines user workflows with a new file dashboard, dual monitor capability and improved query functionality.

A new simulator add-on to WellMaster RMS predicts well interventions, downtime and OPEX costs for maintenance programs.

Kappa Workstation v5.12 includes a pre-release version of the Citrine field performance analysis module. An Azurite interactive explainer for formation testing is also available.

Paradigm’s E&P interpretation suite is now available from the Microsoft Azure cloud. The software is delivered either for in-house management or as a Paradigm-managed service.

The 2.7.1 release of Troika’s Midi seismic data analysis tool now displays RODE metadata in a table-of-contents output file.

PetroWeb has announced a web-based data manager for EDB, its enterprise well lifecycle database.

Startup Exigo has launched a collaboration app for oil and gas teams to ‘securely communicate and capture knowledge.’ A ‘knowledge network’ incorporates production volumes, pressure data and alarms, viewable through a map interface and asset activity feed.

NeoFirma has announced an oil and gas tailored business intelligence solution, bringing cloud-based data visualization and analytics to smaller independents without large IT resources.

Phusion IM has deployed new ‘buyer personas’ to make its process engineering and facilities management software available to small and medium-sized clients. The company also announced Phusion Onsite, a new data collection app.

New Century Software’s Pipeline Portal leverages out-of-the-box Esri tools and pipeline GIS data management best practices to address common operator needs. Implementation of Esri ArcGIS for Pipelines is available as a service option.

Coreworx 7.1 introduces an integrated packaging of engineering and construction work processes, bringing a complete set of documents and data to all stakeholders and a link between interface management and contract management.

Hifi Engineering has successfully demonstrated its HDS fiber optic based leak detection technology. The tests were performed in collaboration with C-Core and the LookNorth Canadian national center of excellence for commercialization and research.

The 2.0 release of IT Vizion’s Operational Excellence gathers KPIs, KOPs, indicators and limits that are ‘scattered around too many systems’ into a single application. Users can now manage indicators regardless of their source data.

Wand’s Oil and gas taxonomy now features over 2,000 terms and almost 700 synonyms covering the upstream, midstream and downstream sectors. The taxonomy targets oil and gas document management initiatives.


Review - DNV GL’s Data Quality Assessment Framework

New introduction to DNV GL’s data quality consulting services exhibits formalistic approach.

Publications from consultants range from lightweight ‘teasers’ designed to whet the appetite to informative teaching material. DNV GL’s RP 497, ‘Data quality assessment framework’ (DQAF), a 40 plus page free download, falls in the middle of the spectrum. DQAF’s target audience is customers, consultants and the data quality community at large.

The basis for DNV GL’s approach is the ‘stringent’ ISO 8000-8 data quality standard that requires ‘complete definitions for both metadata and the conceptual model.’ The principle behind the ISO standard is to ‘evaluate data as correct (good) or incorrect (bad).’ This obvious principle is illustrated with a singularly uninformative graphic.

DNV GL is primarily a classification and technical assurance company and its view of data quality veers towards ‘the impact of data quality on operations.’ This is assessed using tools for risk analysis ‘such as bowtie models, risk matrices, and fault tree analysis.’ Some may consider this scope creep from mainstream data quality. Likewise, the coupling of information security to data quality makes for an extremely broad field of study. DNV GL claims that ‘a high level of maturity of data quality is generally associated with higher levels of information security.’

DQAF takes a DAMA-esque approach, proposing a framework for ‘checking that the quality of a data source matches the criteria appropriate for its context.’ In addition, a data quality maturity framework is proposed for corporate self-assessment and improvement.

Full-blown data quality assessment requires more RPs covering sub-topics and publications, including DNV GL’s ‘DQA for sensor systems and time series data’ (which is not publicly available), ISO 8000-8, data quality maturity models by Loshin and CMMI, W3C data on the web best practices, ISO 31000 risk management, DNVGL-RP-0496 cyber security resilience management and ISO 27000 information security management! More formalism is evidenced in terminological definitions from ISO/IEC 11179-1:2004.

In this reviewer’s opinion, DQAF illustrates the difficulty of the abstract, multi-domain approach to data quality. In navigation data, for instance, quality pitfalls are likely to lurk deep inside the data and may require a deep understanding of the domain to spot and fix. The high level meta model and KPI-style approach to data maturity could lead to a false sense of ‘information security.’


GE Oil & Gas Annual Meeting 2017, Florence

IEA sees growth slowdown but ‘no demand peak.’ Total’s ‘RAID,’ digital diagnostics and maintenance. BP CEO, ‘oil and gas digital way behind aviation.’ Apple/Siri creator on Predix and the digital transformation. GE’s take on the digital twin. Watson-like digital assistant for Predix. System 1 and Meridium mash-up. More from Baker Hughes, Amec and Reliance Industries.

GE reported upwards of 1,000 customers in attendance at the GE Oil & Gas annual meeting held earlier this year in Florence, Italy. There were likely as many GE personnel present. GE’s flagship event includes displays of the company’s diverse line of ‘big iron,’ compressors, pumps and more, alongside its evolving digital solutions platform, Predix. But equally interesting is the snapshot that the GEAM provides of the state of the industry, with commentary from a veritable who’s who of notables from operators, suppliers and politicians.

Fatih Birol was first up with a summary of the IEA’s latest energy outlook for the next couple of decades. In 2016, over half of new capacity was from renewables which are ‘no longer a romantic story.’ The IEA does not see oil demand peaking but there will be a ‘slowdown in growth.’ New conventional project approvals are at the lowest level since the 1950s. 2016 discoveries were the lowest in 70 years. An unprecedented effort is needed to avoid a supply-demand gap ‘in a few years’ time’. US shale oil is reactive but will have a hard time reacting enough. Birol foresees a ‘period of huge oil price volatility.’ With the current focus on climate change, some argue that there is no need for upstream investment. But this is not the IEA’s position. ‘Continued investment in oil and gas remains an important component of a smooth, least cost energy transition.’ Moreover, LNG is a catalyst for the second natural gas revolution with far reaching implications for gas pricing. Digital technology and electrification make for new opportunities and challenges as a major IEA study has shown.

Total’s Arnaud Breuillac recalled the shale revolution that has provided abundant resources but also created a huge downturn cycle! This hurt at first but has resulted in efficiency gains. Total, like the IEA, believes that, despite volatility, oil and gas will remain its core business for decades. Digitalization is key to lowering costs and to turning constraints into opportunities. In this context, Total’s digital ‘Raid’ (remote assistance, intervention and diagnostics) will see some 35,000 sensors installed by 2018. ‘Digital will help cut costs and transform production.’ Operating costs are to fall 50% between 2014 and 2018, from $10 to $5/bbl. Total has reduced its supplier count by two thirds and, in West Africa, downtime is down 25%.

Bob Dudley (BP) concurred on the importance of digital, observing that oil and gas is ‘way behind aviation.’ Big digital data is set to be disruptive to industry. On climate change, BP is a believer and contributor to the Oil and gas climate initiative, a $1 billion fund for methane mitigation and carbon capture. On current trends, with world population set to grow to 9 billion, a 2°C plus rise is ‘all but inevitable.’

GE has hired some impressive talent to bolster its digital team. Darren Haas, ex-Apple and a creator of Siri, came on board last year to help out with Predix and with clients’ digital transformation. While many companies are working on cloud-based platforms for the digital twin (see this month’s lead), ‘a lot are getting it wrong.’ Predix’ embedded graph database will soon allow the complex relationships of sensors on a plant to be captured. The graph database was announced some time ago but seemingly is only really available now, and will be integrated with Predix over the next few months. Along with the cloud option, Predix can be delivered as an on-premises Predix Box, a lightweight analytics appliance. Upcoming ‘data at the edge’ workloads ‘will change the industry.’ These are driven by Predix-ready controllers, gateways, appliances and the cloud, ‘several [of which] are coming online in Q2.’ Haas warned, ‘Note that we are all in on Predix, there is no plan B!’

GE’s Colin Parris provided a captivating explanation of the digital twin (DT) concept. Successful companies like Apple, Amazon and Google have cornered the market in their respective fields, ‘there is no number 2.’ This they achieve by making a numerical model of the consumer, i.e. you and me, to be able to target their advertising. GE is doing the same, not just for asset classes but for each individual compressor or pump. Instead of using averages or assumptions to predict performance, the data for the exact operating environment, costs and so on is rolled up into the digital twin or, as Parris prefers, the asset’s own ‘profit and loss account.’ Along with early warning of bearing failure and continuous prediction of remaining useful life, Predix supports dynamic optimization, balancing KPIs in real time by combining the physical (sensor) and digital (simulation) worlds.

Parris then fired up a rather Siri-like digital assistant that proffered production optimization advice and interaction in a natural language dialog. The speech interface warned of ‘scale build-up in the Miss. lime!’ Parris queried, ‘OK twin, what are my options?’ The machine came back with suggestions for remediation, costs and payouts. The rather contrived dialog was reminiscent of the Woodside/IBM Watson ‘Willow’ interaction that we reported on last year.

Another presentation introduced a calibration scale for companies setting out on the journey to ‘digital maturity,’ with a DAMA-like maturity scale from level 1 (data collection) right up to level 5, AI-supported autonomous, de-manned operations. GE, along with other industry partners, is targeting the ‘$200 billion opportunity’ for industry represented by a cloud-based asset performance management (APM) strategy. Concerns over security? ‘Don’t be worried, these things are being solved.’

GE’s Erik Lindjhem presented a mash-up of GE/Bently Nevada’s System 1 and the Meridium portfolio. System 1 has been re-cast as an ‘edge device’ that assembles information (some of which may not leave the plant) and passes it up to Meridium. The two are linked not by Predix, nor the digital twin, but by an ‘Enterprise Impact’ system that exposes data in different ‘personas.’ Thus, data is presented appropriately to field personnel or top-level strategists. Poster child for the Enterprise Impact approach is the Origin LNG mega project.

Binu Mathew, GE Oil and Gas’ head of software, led a special session on asset performance management (APM). GE had its own APM solution before it acquired Meridium in a $500 million deal last year. GE is in the process of integrating the two, combining Meridium’s strategic top-down approach with its own more tactical toolset. The plan leverages a central data platform and expands APM from downtime mitigation to a more comprehensive role in production optimization, ‘leveraging the power of the digital twin.’ ‘APM in oil and gas has the largest potential of any industry we have seen.’

We chatted with Binu Mathew about the state of the Predix art. Predix is a data abstraction layer developed on cloud technology from Pivotal. This embeds Hadoop, Spark and the BitStew platform for the industrial internet, acquired last year. GE plans to pick up more state of the art software components as Predix develops. We pressed Mathew on how a mature application such as Meridium could now be presented as ‘Predix.’ He explained that the Predix roadmap is underway and that Meridium is being refactored to Predix so it can talk to other apps. ‘True’ Predix apps to be rolled out in 2017 include Predix pipeline corrosion management, which will blend radiography, ultrasonic sensing and flowmeter data. The Predix ecosystem extends beyond oil and gas. We also asked Mathew whether having ‘20,000 developers’ working on Predix was a good thing. Mathew replied, ‘That’s my mission. We have seen a lot of customer-specific work but now we are fixing on common patterns. We are moving back from the army of developers as we rationalize in phases. However, APM is not just one shrink-wrap product, we will need some service-related customization for specific client needs.’

In another keynote, Baker Hughes’ Martin Craighead laid into inefficiencies in the industry supply chain. Globally the industry is ‘at best 50% efficient’ across the supply chain because of hundreds of hours of downtime and millions of dollars in excess costs. ‘This is costing us all in credibility with shareholders.’ Craighead considers return on capital employed (ROCE) the best measure of long term value creation. Here, ‘We need to do better!’ In the period 2009-2014, ROCE was a meagre 2-3% over the cost of capital. In North America the industry was 5% underwater! ‘We need to operate fundamentally differently.’ BHI has started the discussion on ‘radical efficiencies’ with, for instance, R&D in designer chemistry and EOR, ‘a nearly untapped technology.’ Other promising new technologies include virtualization and automation tools that ‘think, act and heal themselves.’ Additive manufacturing, aka 3D printing, will soon bear fruit, with the first 3D printed drill bit announced for later this year.

In a panel session on ‘collaboration,’ AMEC CEO Jonathan Lewis agreed that collaboration across the supply chain needs improvement. AMEC commissioned a study from Bain & Co. which found that oil and gas is in the ‘middle to lowest quartile;’ aerospace does better. Commenting on the digital innovation theme, Lewis asked, ‘are we ready for collaboration and data sharing à la Predix? Some are, others not.’ While there is much talk of collaboration in the C-suite, this does not trickle down. ‘I’m not sure that the recent pain has been enough to force significant change in our way of working internationally.’ In North America however, collaboration has been much more successful, with unconventionals as the disruptive/driving force. Even so, ‘we are inhibited by fine grained specification from procurement departments.’ This contrasts with the mining industry, which has a track record of sharing the big picture with suppliers.

India’s Reliance Industries is in a good position to talk digital. Not only does it operate the 1.2 mmb/d Jamnagar refinery, ‘the world’s largest,’ it also runs India’s Jio telco and has its own digital services business. CIO Manoj Chouthai enumerated some of the tools deployed in Jamnagar, notably Meridium and GE SmartSignal along with AspenTech’s IP21 historian. Reliance is to partner with GE on Predix, now a core component of its digital future, along with an eclectic assembly of third party solutions from Honeywell, Schneider, Emerson and others. These are being co-developed in the Reliance Foundry aka the Digital Innovation Hub.

More from the GEAM home page.


Gexcon/FLACS User Group

Technip-hosted Paris meet hears from research into full scale hydrogen explosions, from Measure, the ‘modelling escalating accident scenarios’ JIP, and on a new integrated validation framework. Gexcon US reports on an RPSEA collaboration that is ‘overturning industry safety design practices.’

The Gexcon/FLACS* user group, held in Technip’s Paris office late last year, heard from Gexcon’s Trygve Skjold on ongoing R&D activities and results from the EU-funded HySEA project and the ‘Measure’ JIP. HySEA is performing full-scale hydrogen explosion experiments in 20-foot ISO containers. Last year’s tests dealt with homogeneous mixtures; the 2017 campaign will extend this work to more realistic releases. HySEA is part funded under the EU Fuel Cells and Hydrogen 2 program.

Gexcon is about to complete the Measure JIP (modelling escalating accident scenarios). A prototype of FLACS with improved turbulence and combustion models will be delivered to sponsors Statoil, Total, Engie, DNV GL, ExxonMobil and BP. Pending sponsors’ agreement, the updated system will be made available to all FLACS users later this year.

Gexcon’s Sunil Lakshmipathy provided an update on the FLACS ‘integrated validation framework.’ This includes a database of summary validation cases favoring ‘over-prediction’ (i.e. conservative results) rather than under-prediction. Other tweaks will make it easier for users of the software to relate the experimental results to their particular application, and hence to evaluate the relevance of the validation cases in the database.

Scott Davis (Gexcon US) presented work performed in an RPSEA-sponsored project to develop advanced CFD** tools to predict explosion pressure and deflagration risk at drilling and production facilities. The project involved building full scale generic models of offshore platforms and then blowing them up! This has so far revealed that current industry safety design practice ‘is in serious error.’ RPSEA and Gexcon are now working to overturn current engineering practices that involve ‘packing the mostest into the smallest’ spaces, in both offshore facilities and refineries.

Gexcon recently announced a new product, FLACS-Risk, offering ‘clear 3D visualization’ of risk to communicate and improve stakeholder understanding of safety issues. The next FLACS User Group will be held in Bergen, Norway on May 3-4.

* FLACS is an ‘industry standard’ tool for modeling gas explosions.

** Computational fluid dynamics.


Folks, facts, orgs ...

Drillinginfo, Alan Turing Institute, American Midstream, Assai Software, Cairn India, Ceiba Energy, ClassNK, Coreworx, CSE Icon, DNV GL, Durham Energy Institute, Emerson, Entero, EPA, EPIM, ERF Wireless, Flowserve, Forum Energy Technologies, GE, Weatherford, Ikon Science, Intelex Technologies, Ion Science, KP Engineering, Maptek, Michael Baker International, NIST, OXY, Optime Subsea, P2 Energy Solutions, PTC, Science Group, Williams, Ziyen, SPEE.

Drillinginfo CEO Allen Gilmer is to step down. He is replaced by Jeff Hughes. Gilmer stays on as executive chairman.

Mark Girolami now leads the Lloyd’s Register Foundation-backed Alan Turing Institute. He hails from IBM.

Rene Casadaban joins American Midstream as senior VP and COO. He succeeds Matthew Roland.

Assai Software Services is now a USPI member. Willem Hendrik van der Jagt is to represent Assai on the USPI board. Cyril Sebastian heads-up the Abu Dhabi office.

Melody Meyer and Atul Gupta have been appointed senior oil and gas advisors at Cairn India.

Ceiba Energy Services announces the resignation of Richard Lane (interim CEO and COO) and Ian Simister (president). Ronald Sifton is now interim CEO.

Toshiyuki Shigemi is now executive VP and executive director of ClassNK. Yasushi Nakamura stays on as advisor.

John Gillberry is chief executive and Steven Airey is director at Coreworx.

Hoon Chew Toh has been promoted to president of CSE’s Icon division.

DNV GL has appointed Peter Boyle as Aberdeen operations manager and John Morgan as UK business development leader for risk advisory services.

Jon Gluyas is now executive director at the Durham Energy Institute.

Gerry Inglis heads-up Emerson’s new flow loop training facility in Aberdeen.

Entero has named Gary Gonzenbach as senior industry advisor in Houston and David Leinweber as senior account executive in Calgary.

Scott Pruitt now heads-up the US Environmental protection agency.

Magnus Svensson is to manage the E&P subsurface domain at EPIM. He hails from Dong E&P Norge.

Steven Sarno, John Barnett and Joey Milan are now interim directors at ERF Wireless following the resignations of Dean Cubley and Bartus Batson.

Jay Roueche is interim CFO at Flowserve following Karyn Ovelmen’s departure.

Forum Energy Technologies has appointed Prady Iyyanki as president and CEO. Chris Gaut is executive chairman.

Maria Sferruzza is now VP global turbomachinery solutions at GE Oil & Gas. Anup Sharma is VP CIO and chief application architect at GE Digital.

Mark Bashforth is the new CEO of Ikon Science. Former CEO and company founder Martyn Millwood Hargrave becomes executive chairman following Peter Dolan’s retirement.

Brett Roberts is MD with Intelex Technologies in Denver. He hails from CH2M.

Xavier Zinsch heads-up Ion Science’s new office in Cavalaire-sur-Mer, France.

Douglas Schnittker has joined KP Engineering as VP of engineering. He hails from CB&I Tyler.

James Moncrieff is general manager for Maptek’s EAME operations.

Danielle Smith is manager of Michael Baker International’s transportation department in Denver. She hails from Jacobs Engineering.

Doug Olson is chief of the office of weights and measures at NIST following the retirement of Carol Hockert.

OXY is to relocate its investor relations office from New York to the Houston HQ. Richard Jackson is VP investor relations and Anthony Cottone senior director.

Jarle Tautra has been appointed chairman of Optime Subsea’s board of directors. He hails from Aker Solutions.

Scott Lockhart is CEO at P2 Energy Solutions. Prior to P2 he was with IHS.

Craig Hayman has been appointed to the newly created role of COO at PTC.

Science Group has named Dan Edwards president of North American operations. Henry St. Aubyn heads-up the Houston office and continues as North America principal consultant, oil and gas.

Mark McCollum is president and CEO at Weatherford. He hails from Halliburton. William Macaulay has been appointed chairman of the board. Robb Voyles is interim CFO and Lyn Beaty is senior VP finance.

Mike Dunn has joined Williams as executive VP and COO.

Shane Fraser is to lead the new oil intelligence division at Ziyen. Andrew Hayes is Middle East liaison.

Floyd Siegle is the new president of the Society of petroleum evaluation engineers.


Done deals

Petroware/Logtek, JWitsml, CB&I, Veritas Capital, EcoStim, Fir Tree, Amec Foster Wheeler.

Startup Petroware AS (a subsidiary of Logtek AS) has acquired jwitsml.org. The latter’s Log Studio platform provides digital well log management functions including read/write access to WITSML 1.3 and 1.4. Petroware is interested in collaborating with organizations operating WITSML 2.0 test servers to continue its work on the JWitsml sub-component.

Private equity unit Veritas Capital has signed a definitive agreement to acquire the capital services business of CB&I for $755 million.

EcoStim Energy Solutions has received a $17 million cash injection from private investment unit Fir Tree Partners. The transaction represents a ‘potential equitization of substantially all of the company’s debt,’ avoiding defaults.

Prior to its acquisition by Wood Group (page 1), Amec Foster Wheeler reported progress on its ‘non-core’ disposal program with aggregate proceeds of £246 million forecast by end of Q2.


Wireless world

WellAware/Landmark IRC. Freewave/Systech ITTT controller. NIST’s ‘factory in-a-box’ testbed.

WellAware recently announced its Integrated Radio and Controller (IRC) for oil and gas producers. Collecting real-time data from remote production sites can be ‘complex and cost prohibitive’ due to network, infrastructure or budget constraints. WellAware’s IRC is certified for hazardous locations and includes a store-and-forward feature to prevent data loss if the network disconnects. The IRC is also available as a service from Landmark’s DecisionSpace Production 365 platform.

FreeWave Technologies has partnered with IoT gateway specialist Systech on a tank level control application that resides on, and executes from, FreeWave’s ZumLink Industrial IoT programmable radio for edge networks. The system includes an easy to use ‘ITTT’ (if-this-then-that) interface that controls analog, digital and RS485 devices. The solution embeds Systech’s SysScrip solution for remote and hard-to-reach environments.
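The if-this-then-that idea is simple enough to sketch in a few lines. The rule below is purely illustrative; tag names, the threshold and the valve API are invented and bear no relation to the actual ZumLink/SysScrip programming model.

```python
# Illustrative ITTT rule: 'if tank level exceeds threshold, then close valve.'
HIGH_LEVEL_FT = 18.0  # hypothetical high-level threshold, feet

def on_sample(tank_level_ft: float, inlet_valve) -> None:
    if tank_level_ft > HIGH_LEVEL_FT:  # this ...
        inlet_valve.close()            # ... then that
```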

NIST has published a study of wireless systems for industrial environments and of how radio frequency signals propagate in various settings. The analysis will be incorporated into a NIST ‘factory in a box’ test bed for future researchers to study the impacts on signal propagation in controlled laboratory conditions.


Going, going ... green

HARC emissions monitoring. Process Ecology’s Methane advisor, Cap-Op eco-efficiency.

HARC has developed ‘REMS,’ a real-time engine emissions monitoring system for oil and gas operations. REMS combines data from gas analyzers, mass flow meters, weather stations, control valves and the engine’s controller area network. Data is edge-processed with a field programmable gate array and National Instruments’ RIO hardware and LabView software before upload to the Amazon EC2 cloud.

In a recent PTAC presentation, Alberto Alva-Argaez outlined Process Ecology’s Methane Advisor for methane emissions management and reporting from flared and vented sources. The solution has been deployed on large Albertan facilities to verify emissions and reduce carbon tax liabilities. The system is also helping operators meet the World Bank’s objective of ‘zero routine flaring by 2030.’

Cap-Op Energy is publishing an upstream oil and gas Eco-Efficiency Handbook, as part of a PTAC Teree initiative. The publication showcases current technologies and will serve as a standards checklist for new builds, upgrades and well site expansions. The idea is to increase awareness of available sustainable technologies, energy efficiency, environmental performance and social license requirements.

In a report to the US Department of Energy, RPSEA states that it has ‘generated over $150 billion in economic value’ and ‘over $40 billion’ in environmental damage mitigation under the ultra-deepwater and unconventional natural gas research program, a component of the 2005 US energy policy act.


Knowledge graph accelerates Maersk’s digital transformation

Maana’s AI to produce financial impact of ‘hundreds of millions of dollars.’

Speaking at the Gartner Data and analytics summit, Maana CEO Babur Ozden and Ibrahim Gokcen, chief digital officer with Maersk, explained how Maersk is using the Maana knowledge graph (MKG) to accelerate its digital transformation.

MKG has been deployed at Maersk’s shipping business where multiple, possibly conflicting, considerations of client needs, scheduling and vessel information are used to optimize operations. MKG facilitates collaboration amongst operations and port logistics experts. Using models of port omissions and routing constraints, all possible options are scored on time and cost.
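A toy sketch of such scoring on time and cost (the options, weights and figures are invented; Maana’s actual models are proprietary):

```python
# Score each routing option on weighted time and cost; lower is better.
options = [
    {"name": "omit Port A, reroute via Port B", "hours": 18, "cost_usd": 120_000},
    {"name": "hold schedule, pay demurrage",    "hours": 6,  "cost_usd": 310_000},
]
W_TIME, W_COST = 0.6, 0.4  # illustrative weights

def score(opt: dict) -> float:
    return W_TIME * opt["hours"] + W_COST * opt["cost_usd"] / 1_000

print(min(options, key=score)["name"])  # -> omit Port A, reroute via Port B
```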

As more inputs are added to the model, the graph ‘grows, learns and adapts.’ With the MKG, Maersk reduced port re-routing decision time from 6 hours to 60 seconds. Financial impact is expected to run to ‘hundreds of millions of dollars.’ More from Maana.


IFS, Marsden Group team on oil and gas IoT

Data discovery platform augmented with IFS’ business connectors.

IFS has teamed with The Marsden Group on an internet of things offering for oil and gas. The partnership combines IFS’ IoT business connector with The Marsden Group’s platform for data discovery, machine learning and advanced analytics. The solution will help operators monitor, capture and analyze IoT data, enabling ‘timely action and optimized decision making’ in areas such as maintenance and supply chain management.

The Marsden Group’s platform is said to plug directly into an existing supply chain system. The Group has offices in The Woodlands, Glasgow and Perth and is a GE/Predix partner. More from IFS and The Marsden Group.


Sales, deployments, partnerships ...

Geosoft, OFS Portal, Aibel, Amec, Archeio, Meridium, Beamex, GR Energy Services, Baker Hughes, CGG, Coreworx, McDermott, OPIS, Trayport, RtTech, Wood Group, DNV GL, EnergyIQ, IHS Markit, Exprodat, GE, Seeq, Honeywell, IDS, Kongsberg, Intel, Irods, Oildex, Phoenix, Foro Energy, RPS Energy, Genesis, Schneider Electric, Canon, Sigfox, OleumTech.

Statoil has implemented the Geosoft DAP server solution in a centralized system for potential field data management. Datasets are managed in SQL Server and a secure Windows file system.

OFS Portal has now reached a total of 312 operator and network agreements since its inception in 2000.

Aqualis Offshore has awarded Aibel a contract for human factors analysis across the EPC’s ongoing projects on the Norwegian continental shelf.

Amec Foster Wheeler has signed a five-year global enterprise framework agreement with Shell Global Solutions for engineering, procurement and construction management services on downstream projects worldwide.

Archeio Technologies is to provide its intelligent well file software to Eland Energy and Sundown Energy to improve operational efficiency and decision making.

Meridium (GE Digital) is to combine its APM calibration management solution with Beamex CMX calibration software.

GR Energy Services and Baker Hughes have entered into a preferred wireline and perforating services agreement.

CGG has secured a six-year agreement with Brunei Shell Petroleum to operate a dedicated processing center at its Seria office in Brunei Darussalam.

An unnamed major oil company has implemented the Coreworx interface management solution on a $12 billion West African FPSO.

Saudi Aramco and McDermott have signed a memorandum of understanding for integrated engineering, procurement, construction and installation of offshore platforms.

IHS Markit unit OPIS and Trayport are to offer a suite of new trading and price discovery screens.

RtTech Software has named Moore Process Controls a reseller for the South African market.

Wood Group has completed the FEED* for Noble Energy’s Leviathan field development project in the Eastern Mediterranean and has started the detailed engineering for the platform. The contracts are worth some $95 million.

DNV GL is to provide FEED verification to Black Sea Oil and Gas’ Midia gas development offshore Romania.

EnergyIQ and IHS Markit are to collaborate on data connectivity solutions for the IHS Kingdom geoscience suite.

Exprodat, a Getech group company, has developed a number of support services targeting the new ArcGIS pipeline referencing extension.

GE Oil and Gas is to provide Venture Global LNG a plant-wide technology solution for its Louisiana LNG export facilities.

Seeq has joined Honeywell’s Connected Plant program to help clients ‘leverage data and insights in operations.’

IDS and Kongsberg Digital are to combine their drilling and well construction solutions, integrating real-time and reporting data.

Intel has joined the Irods consortium and will develop a link between Irods and the Lustre file system for HPC clusters.

Oildex and Phoenix have teamed on a ‘fully integrated’ field management solution for oil and gas.

Petrobras and Foro Energy signed a two year extension to their technology cooperation agreement for the development of a ‘next generation’ laser drilling system.

RPS Energy is teaming with Genesis to offer a range of advisory, technical and operating services.

Schneider Electric is to leverage Canon’s PrismaDirect and PrismaPrepare solutions as integral parts of its plant and process workflow platform.

Sigfox and OleumTech are to offer OleumTech’s patented self-contained, battery-powered wireless transmitter technology.

* Front-end engineering design.


Standards stuff

W3C data on the web. Energistics’ coordinated release. ISO well integrity standard. NIST digital library of math functions. OGC OKs GeoSciML, issues RFI for open geospatial APIs. SIIS’ RP for API 17F.

The W3C has just published a new version of its ‘Data on the web best practices’ recommendations for data that is ‘discoverable and understandable by humans and machines.’

Energistics has announced a coordinated release of its E&P data standards, WITSML v2.0 (drilling), PRODML v2.0 (production) and RESQML v2.1 (reservoir), in conjunction with the Energistics Transfer Protocol, ETP v1.1. A Standards DevKit, an ETP DevKit and other tools are also available to ease implementation of the standards.

The new ISO 16530-1:2017 standard for well integrity targets oil and gas wells regardless of their age, location and type. The standard includes minimum requirements for well integrity management and recommendations that can be scaled to a well’s specific risk characteristics.

NIST has just updated its Digital Library of Mathematical Functions. The DLMF is the modern-day successor to the 1964 classic NBS Handbook, the most widely distributed and most highly cited publication in NIST’s 117-year history.

GeoSciML is now an OGC standard comprising a logical model and GML/XML encoding rules for geological map data, geological time scales, boreholes, and metadata for laboratory analyses. OGC has also just published a white paper on open geospatial APIs.

The Subsea Instrumentation Interface Standardization network has released a recommended practice and standard text for inclusion in API 17F, a standard for subsea production control systems. The protocols cover analogue, digital serial and Ethernet TCP/IP devices.


Regulatory report

RRC’s IT modernization. PPDM’s regulatory data standards committee gains new members. ExxonMobil ducks Canadian reserves reporting. API jubilant as new administration clips EPA’s wings.

The Railroad Commission of Texas has modernized its IT systems. Producers can now file well log data electronically in both .TIFF and .LAS formats via the L-1 e-log status report form on the Commission’s website. Previously, operators had to print and submit paper logs.

At a recent meeting, the PPDM regulatory data standards (RDS) committee welcomed new members from BP and the Oklahoma and Colorado state regulators. They join representatives from the US (Bureau of land management and states), Australia, Alberta and Saskatchewan. An RDS survey on ‘well milestones disambiguation’ has found that ‘semantic confusion on dates is widespread.’ PPDM invites others to take the survey. RDS has also initiated contact with the API on its RP78 well directional survey data standard and with PODS. Energistics will be pleased to note that henceforth, PPDM RDS committee members will ‘mention the work underway at the National Data Repository meeting in Norway.’

ExxonMobil reports that, by agreement with the Alberta and Ontario authorities, it no longer needs to comply with Canadian standards for reserves reporting. From now on, compliance with US reporting rules will suffice.

Meanwhile in the US, the American Petroleum Institute is celebrating the new administration’s clipping of the US Environmental protection agency’s wings. The API downstream group welcomed the administrative stay on the EPA’s risk management plan rule while its upstream unit applauded the Senate vote to disapprove the Bureau of land management’s proposed new rules for domestic oil and gas development on public lands.


Falkonry AI for PI

'Nascent’ pattern recognition technology ‘understands' miner’s time series data.

At a recent ARC Advisory Group industry forum, miner Ciner Resources teamed with AI specialist Falkonry to present the latter’s pattern recognition technology, which has improved yields and reduced waste. Falkonry’s AI plugs into the OSIsoft PI system and is claimed to ‘innately understand’ time series data.

Falkonry’s technology combines signal processing and machine learning. An intuitive GUI and software architecture ‘puts advanced condition monitoring into the hands of operations’ owners and subject matter experts.’ Ciner reported that relevant patterns in its data were spotted ‘in a matter of days.’ Along with PI, Falkonry offers plug-ins for Splunk and Microsoft Azure IoT.
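Falkonry’s models are proprietary, so the following is illustrative only: a generic rolling z-score test that conveys the flavor of condition monitoring on sensor time series, flagging samples that stray from a rolling baseline.

```python
# Illustrative only -- not Falkonry's algorithm. A rolling z-score flags
# samples that stray more than `threshold` standard deviations from the
# mean of the preceding `window` samples, a crude stand-in for pattern
# recognition on sensor time series.
from statistics import mean, stdev

def flag_anomalies(signal, window=60, threshold=3.0):
    """Yield (index, value) for samples far outside the rolling baseline."""
    for i in range(window, len(signal)):
        base = signal[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma and abs(signal[i] - mu) > threshold * sigma:
            yield i, signal[i]

# e.g. list(flag_anomalies(pump_pressure_readings))  # any list of floats
```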

Venture capitalists Zetta Venture Partners and Polaris Partners recently injected some $5.3 million into Falkonry to ‘transition Falkonry’s innovations from a nascent technology to a truly industry-impacting software offering.’ The technology is said to be applicable in data-intensive industries including oil and gas. More from OSIsoft.


Safety first ...

OESI reports on human factors offshore. CVS’ scannable ID cards. Soon-to-be ‘eliminated’ CSB reports on 2013 Geismar, LA fire. Shellback’s ’smartphone safes.’ APEGA fines CNR for 2007 fatality.

The Ocean Energy Safety Institute at Texas A&M has produced a 56-page report on human factors and ergonomics in offshore drilling and production, subtitled ‘the implications for drilling safety.’ The report is a review of some 200 published papers. OESI was set up in the wake of the 2010 Deepwater Horizon disaster. The OESI report sparked off an interesting discussion on the SPE human factors technical section’s bulletin board.

New technology from CVS lets energy and pipeline workers access safety procedures and emergency contacts from a smartphone. The functionality is embedded in CVS’s scannable ID cards.

IOGP has just published RP577 covering recommended practices for addressing safety risks and hazardous activities on fabrication sites.

US Chemical Safety Board chairperson Vanessa Allen Sutherland is understandably ‘disappointed’ with the Presidential proposal to eliminate the agency.

In what might be, in view of the preceding announcement, one of its last, the CSB has just issued its final report into the 2013 dual-fatality explosion and fire at Williams’ Geismar, LA olefins plant. The report found process safety management program ‘weaknesses’ at the facility in the 12 years leading up to the incident. These include deficiencies in change management, in the pre-startup safety review and in process hazard analysis. The CSB is to issue recommendations to the American Petroleum Institute to help prevent similar incidents industry-wide.

Shellback is to provide some 400 ‘Smartphone Safes’ to Contract Callers. CC’s employees will no longer be able to use their cell phones while driving.

APEGA, the Alberta regulator, has fined Canadian Natural Resources and the Horizon Oil Sands Project $10,000 for a 2007 fatal construction site accident. CNRL admitted to unprofessional conduct in its engagement and supervision of engineering contractors and is to work with APEGA on a new standard for outsourced engineering and geoscience work.


Ta-ta tape, forget flash ... enter ‘Flape'

Iron Mountain heralds flash/tape hardware combo as ‘paradigm shift’ for long-term data storage.

A recent technical white paper from IBM describes tests on a high end data storage solution that combines tape, flash* and software-defined storage. ‘Flape,’ as the offering has come to be known, provides the low cost and high capacity of tape with the speed of a flash-based front end.

The idea has been around for a while but has proven hard to implement. Now IBM, working with a team from the Ennovar laboratory at Wichita State University, has leveraged the software-defined storage (SDS) technology of the IBM Spectrum Storage range to enable an ‘easily-managed, high-performance, low-cost’ Flape solution.
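To illustrate the concept (and only the concept), here is a toy read path for a flash-fronted tape tier: hot objects come from flash, misses trigger a tape recall and populate the cache. Real SDS products implement this with policies, not application code.

```python
# A toy sketch of the Flape read path, illustrative only: a small flash
# tier (here an in-memory LRU cache) fronts a large, slow tape tier.
from collections import OrderedDict

class FlapeStore:
    def __init__(self, tape, cache_size=1000):
        self.tape = tape                     # stand-in for the tape library
        self.flash = OrderedDict()           # LRU flash cache
        self.cache_size = cache_size

    def read(self, name):
        if name in self.flash:               # flash hit: the fast path
            self.flash.move_to_end(name)
            return self.flash[name]
        data = self.tape[name]               # tape recall: the slow path
        self.flash[name] = data              # promote to flash for next time
        if len(self.flash) > self.cache_size:
            self.flash.popitem(last=False)   # evict least-recently-used
        return data
```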

IBM describes Flape as a ‘back to the future, next-generation architecture.’ Recently tape has experienced a deployment renaissance, driven by exploding data volumes and tape’s ‘ultra-low’ storage cost. While some predict that the cloud might replace tape, IBM claims that tape offers ‘many advantages over the cloud.’ Exactly what these are is not so clear since, as IBM notes, ‘many cloud service providers are deploying tape in their own infrastructure.’

In a separate release last month, Iron Mountain described Flape as a ‘paradigm shift for long-term data retention.’

* Solid state memory.


Docker ’swarm’ for FME’s geoprocessing

'Ugly details’ of parallel processing hidden. Swarms of workers perform compute-intensive tasks.

FME’s trials with Docker continue. Bloggers Don Murray and Grant Arnold report on how Docker’s virtualization technology is changing the way FME is deployed both on-premises and in the cloud. Docker is the ‘ultimate abstraction layer’ in which to deploy fault-tolerant, scalable solutions. First Docker abstracted compute, then it abstracted networking, and soon it will abstract storage.

FME has been experimenting with Docker ‘swarms.’ The blog provides videos and code samples to show how a straightforward but potentially compute-intensive task can be submitted to a swarm of ‘workers,’ instances of the FME calculation engine. The initiating job doesn’t wait for the task to finish but simply submits all requests to the workers. In fact, the driver does not know it is talking to a swarm; the operation is identical to a submittal to a standalone FME engine. Docker Swarm ‘magically’ looks after the networking details of sharing the connection with the workers, regardless of whether they are on a single machine or spread across multiple machines. Docker Swarm ‘hides and deals with all the ugly details.’
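By way of illustration, the Docker SDK for Python can stand up such a replicated worker service in a few lines. The worker image name below is a placeholder, not one from the FME blog, and the snippet assumes the daemon is already in swarm mode (‘docker swarm init’).

```python
# Illustration only: standing up a replicated worker service on a Docker
# swarm with the Docker SDK for Python (pip install docker). Assumes the
# daemon is already in swarm mode; the image name is hypothetical.
import docker

client = docker.from_env()                    # connect to the local daemon
service = client.services.create(
    image="example/fme-worker:latest",        # placeholder worker image
    name="fme-workers",
    mode=docker.types.ServiceMode("replicated", replicas=4),
)
# Swarm load-balances requests to the service name across the replicas;
# the submitting 'driver' is none the wiser when we scale out:
service.scale(8)
```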


Teradata open sources Kylo data lake pipeline

Hadoop, Spark, NiFi bundle released under Apache 2.0 license to enable ‘fit for purpose’ data lakes.

Teradata has open sourced its ‘Kylo’ data pipeline technology that is said to simplify and accelerate data lake deployment. Kylo embeds a suite of open source technologies including Apache Hadoop, Spark and NiFi. The Teradata-sponsored project has now been released under an Apache 2.0 license. Kylo evolved from code harvested from data lake engagements led by Think Big Analytics, a company that Teradata acquired in 2014.

Commenting on the release, Enterprise Strategy Group’s Nik Rouda observed that for many, implementing the Hadoop stack is a complex endeavor: ‘Big data technologies are heavily oriented to software engineering, developers and system administrators.’ ESG research found that many struggle to staff teams with BI and analytics talent. Big data and open source solution expertise is even harder to come by. Most of those surveyed said that their big data initiatives take between seven months and three years to show significant business value. Even when a data lake has been built, it may fail to attract users who find it difficult to explore.

Kylo addresses such challenges by simplifying development of the data pipeline and common data management tasks. Kylo’s user interface allows for code-free, self-service data ingest and wrangling, while reusable templates increase productivity. Teradata’s Duncan Irving told Oil IT Journal, ‘Kylo looks really promising for oil and gas as part of the data governance story around populating a fit-for-purpose data lake.’
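Kylo’s templates are code-free, but under the hood they drive Spark jobs of roughly the following shape. This is a hypothetical ingest/validate step, not Kylo’s actual API; the paths and the 10-digit api_number rule are illustrative assumptions.

```python
# Not Kylo's API: a hypothetical PySpark ingest/validate step of the kind
# a data-lake pipeline template automates. Paths and the validation rule
# are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("/landing/wells.csv")
std = raw.withColumn("api_number", F.trim(F.col("api_number")))

valid = std.filter(F.col("api_number").rlike(r"^\d{10}$"))  # rows passing validation
invalid = std.subtract(valid)                               # everything else

valid.write.mode("append").parquet("/lake/wells/valid")     # into the lake
invalid.write.mode("append").parquet("/lake/wells/quarantine")
```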


Blockchain: BP joins Ethereum alliance. LR on engineering use

Crowdfunding evolves to enterprise smart contract system. Can blockchain carry CAD models?

BP is a founder member of the Enterprise Ethereum Alliance, a group of enterprises, startups, academics and vendors using Ethereum’s blockchain-based ‘smart contract’ system. Smart contracts are applications that run ‘exactly as programmed without any possibility of downtime, censorship, fraud or third party interference.’

The decentralized Ethereum platform runs on a customized blockchain. Developers can create markets, store registries of debts or promises and move funds ‘without a middle man or counterparty risk.’ Initial development was crowdfunded back in 2014 by the Swiss nonprofit Ethereum Foundation. The newly-announced Enterprise Ethereum Alliance sets out to build ‘enterprise-grade’ software using the technology.

The engineering applications of blockchain were the subject of a recent workshop organized by LR’s Alan Turing Institute. Delegates explored industry sector challenges where blockchain technologies might provide a solution, along with the pros and cons of blockchain in engineering projects. The practicality of storing large engineering drawings and CAD models on a blockchain was questioned.
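On that last question, the standard engineering answer is to keep the model off-chain and anchor only its cryptographic digest on the blockchain, as in this minimal sketch (the on-chain transaction itself is omitted; the filename is illustrative).

```python
# A minimal sketch of the off-chain/on-chain split: hash the (possibly
# multi-gigabyte) CAD file locally and anchor only the digest on-chain.
import hashlib

def model_digest(path, chunk_size=1 << 20):
    """Stream a large file and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# digest = model_digest("platform_topside.step")   # filename illustrative
# A smart contract then records (document_id, digest, timestamp). Anyone
# holding the file can re-hash it and compare digests to prove integrity,
# with no need to store the model itself on the blockchain.
```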


© 1996-2021 The Data Room SARL. All rights reserved.