Oil IT Journal: Volume 21 Number 8


Interface Fluidics

Calgary startup repurposes healthcare analytical nanotechnology to measure oil properties at high temperature and pressure, at greatly reduced cost and increased speed over traditional HP/HT cells.

Interface Fluidics (IF), a spin-out from the University of Toronto, is developing a nano-scale device, a.k.a. a laboratory-on-a-chip, that is claimed to perform physical reservoir modeling faster and at a fraction of the cost of traditional methods. The technique was originally developed by David Sinton (now Interface Fluidics’ CTO), who has repurposed microfluidic technology developed for healthcare to address the problems faced by Alberta’s heavy oil producers.

Microfluidics is used in medical devices such as pregnancy testers and, more controversially, by Theranos to analyze minute samples of blood or saliva.

For Alberta’s heavy oil producers this means testing chemical additives, solvents, surfactants and nano-particles. This can provide an understanding of pore-scale fluid mechanics and is used to de-risk new oil production methods and optimize existing reservoirs.

Interface Fluidics’ ‘chip’ is in fact a glass or silicon slide that is etched with hydrofluoric acid to produce textured microchannels into which microscopic quantities of the fluids under test can be placed at strategic locations. The slides are placed in a small cell where it is possible to visualize simulated flow through the ‘porous media’ at reservoir relevant temperatures and pressures. Critical characteristics such as emulsion size and distribution, surface wetting, diffusion, dispersion and condensation fronts can be obtained by direct measurement.

As Alberta’s heavy oil activity declines, Interface Fluidics is turning to the international arena with a broader offering, and has conventional PVT analysts in its sights.

Interface Fluidics’ Tom de Haas told Oil IT Journal, ‘We can put a nanoliter of oil on the chip and see how it reacts at reservoir temperature and pressure to, say, a CO2 or steam flood or to study CO2 sequestration. It acts as a microscopic fluid cell for subsurface PVT analysis. We can perform the same analyses as a traditional HP/HT cell but with a billionth of the amount of fluid and much faster.’

A combination of microscopy, optical and thermal imaging is used in the analysis, sometimes along with fluorescence micrography.

de Haas concluded, ‘The device costs around $1,000 as opposed to perhaps $200k for a traditional PVT setup. We can perform an entire fluid phase diagram, thousands of measurements, in one go!’

Computationally the system is straightforward, with grey scale gradients and minimal image processing involved. Extra computation is used to scale up the results. More from Interface Fluidics.
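For illustration only, the kind of greyscale-gradient processing described above can be sketched in a few lines of Python. The synthetic intensity profile and all numbers below are assumptions for the sake of a runnable example, not Interface Fluidics’ actual pipeline.

```python
import numpy as np

# Synthetic greyscale intensity profile along a microchannel (hypothetical data):
# bright (oil-filled) on the left, darker (CO2-swept) on the right, noisy step between.
x = np.linspace(0.0, 1.0, 500)                     # normalised position along the channel
profile = 200.0 / (1.0 + np.exp((x - 0.6) * 80)) + 30.0
profile += np.random.default_rng(0).normal(0.0, 2.0, x.size)

# The steepest greyscale gradient marks the displacement front
gradient = np.gradient(profile, x)
front = x[np.argmax(np.abs(gradient))]
print(f"Estimated front position: {front:.2f} of channel length")
```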


Baker sale round 2

After Baker Hughes fell at the post in an aborted sale to Halliburton, a new suitor has emerged: Baker is to merge with GE Oil & Gas. But what of the software?

GE is to merge its oil and gas unit with Baker Hughes, paying BH shareholders $7 billion and keeping a 62% share of the merged company, which will be led by GE’s Lorenzo Simonelli. The deal raises some interesting questions on the software front and GE CEO Jeff Immelt is counting on growth opportunities for the Predix big data platform.

Earlier this year GE announced a partnership with Paradigm and opened up the possibility that this upstream software suite might integrate with Predix. The merger with Baker Hughes brings another geoscience toolkit in the form of Jewel Suite and another candidate for Predix integration.

Elsewhere GE and Predix have been in overdrive of late with the announcement of a Predix-based ‘digital mine’ and new software solutions for ‘digital water.’ Should you be interested in joining the new behemoth, GE is hiring, in particular at its Knowledge discovery lab, which is seeking folks with big data skills including ‘text mining, semantic technologies, massive-scale/big data processing and real-time complex event processing.’ Languages (Python, Matlab, R, Java, C++) and semantic web experience (Prolog, JENA, OWL/RDFS, SPARQL) are a plus. And you can apply here.


More than meets the eye in ‘commercial’ vs. ‘not for profit'

Neil McNaughton reports back from the 2016 Intelligent Energy conference on its role in breaking down the silo walls between IT and 'the business.’ While IE and its Digital Energy twin do a good job, they face increasing competition from the ‘for-profit’ user groups such as the Esri PUG, the OSIsoft user conference and the Spotfire Energy conference. All to be reported from ‘real soon now!’

I have been editing Oil IT Journal since 1996. That rather long time frame means that when I go to conferences today I am often greeted by some rather old (like myself) folks for whom there is nothing new under the sun.

It has not always been thus. In the early days, although I was no spring chicken myself, I felt somewhat junior to the white-haired individuals who held forth about this and that. The deference of youth and the realization that I had a lot to learn colored my reporting. I was a bit more open to new technology than I am now, although I carried some of the prejudices that I picked up from my earlier years as an end user. Notably a disdain for things ‘commercial’ and a more open mind for ‘not-for-profit’ SPE/SEG style output.

One ‘raison d’être’ of what was originally Petroleum Data Manager and is now Oil IT Journal was the pleasure that I took in pointing out that much of what is presented as new is, in fact, old. In my twenty years or so (before Oil IT Journal) of working for oil and gas and service companies, I observed that some, but not all, projects are launched on the premise that ‘there is nothing in the market place that does this.’ Not finding something ‘out there’ is almost always due to an extremely perfunctory search. Hence Oil IT Journal’s exhaustive (well, quite) enumeration of stuff that is there and that folks ought to be aware of before embarking on something ‘new.’

As we have now been publishing for twenty years or so, I am myself in the grey-haired brigade of folks who probably should have been ‘crew changed’ but have not been, for various reasons which I do not propose to go into right now. Meanwhile, the young ‘generation X’ folks and millennials have moved up the pecking order and are now calling the shots and even kicking off projects here and there.

I would love to be able to say that this younger cohort of managers are all avid readers of Oil IT Journal and that it has permeated their reasoning. But unfortunately, this is not the case. While we have a reasonable subscriber base in oil and gas and service companies of all sizes, it is clear to me that we have a lot more to do in terms of reaching a new generation of decision makers and helping them with some history and culture of what has gone before and what is going on now in parts of the industry and world that they may not be familiar with.

A major reason for our lack of penetration and, incidentally, an obstacle to making real progress in matters IT is the way that, despite calls for ‘collaboration and cooperation,’ industry loves to put up silo walls and live within them. Thus, someone with an IT hat will read Information Week. Someone with an engineering hat will read the Journal of Petroleum Technology. Asking members of either of these communities to spend an extra couple of hours a month reading Oil IT Journal is a tough call, even though we know it is worth the effort, don’t we!

The Intelligent Energy (a.k.a. Digital Energy) community whose members met last month in Aberdeen is likewise a silo-crossing initiative. Oil IT Journal subscribers can read our report from the event on page 6 of this issue. The IE/DE events have been going for ten years or so and have thus seen the aging and replacement of the community and the slow evolution of digital culture.

At the 2007 Digital Energy event, discussion centered on the need for a new breed of ‘renaissance’ engineers with IT knowledge. There was a debate as to whether engineers should be trained in IT or vice versa. I’m not sure what the outcome was then or what the situation is now. But I see a parallel with upstream data management which likewise requires folks with a dual competency in the business and in IT. In data management, it is only now that such courses are getting going (see last month’s issue).

Judging from the 2016 Intelligent Energy event, I’m not sure if the movement has resulted in renaissance engineers. The content is interesting if a little repetitive. It could be that the event has become a victim of its own success in that presentations are sometimes indistinguishable in content from what might be presented in, say, the SPE ATCE. And why not indeed? ‘Digital’ is now pervasive and here to stay. The question now arises: do we still need IE/DE? The answer is definitely yes, because although digital may be pervasive, it is still at risk from the silo effect. IE/DE is one of the main events where the silo walls are breached, and to good effect.

You may be wondering what other events do a similar job of spanning IT and the business. I will put you out of your misery straight away and try to bring the unraveling threads of this editorial together. The events that challenge IE/DE for primacy in silo busting are the commercial, for-profit user groups. In the last couple of months, I have attended the OSIsoft user conference and the Esri Petroleum User Group and will be reporting from these in future issues. Also next month, we will be reporting virtually from the excellent Spotfire Energy conference which also took place recently. The quality of these events – despite their ‘commercial’ nature – is well up to the standard of IE/DE. In fact the user group nature of these events can mean that they are an outlet for more impactful examples of real technology use than some of the IE/DE material.

OK, there are a few raveled threads left here. In our next issue, I will explain why a good knowledge of the history of upstream IT is important, what the grey-haired brigade achieved and provide more evidence of how ‘commercial’ has taken up the baton dropped by the not-for-profits.

@neilmcn


Preserving geological assets

A joint Petroleum Exploration Society of Great Britain/Geological Society of London event discussed how geological assets, both physical and digital, can be held for the long term. New UK regulations have clarified operators’ responsibilities and the UK is planning a National data repository. Attendees also learned of PGS’ big seismic data solution, of the use of ‘Linq’ to communicate between IT and executives, of machine learning in large text collections and of the digitization of ExxonMobil’s thin sections.

Some 80 delegates attended the PESGB/Geolsoc conference, Preserving and protecting geological assets, held earlier this year in the UK. Chairman Paul Duller (Tribal and Geolsoc geoscience information group) observed that extraordinary budget pressures facing exploration companies and geological surveys are placing corporate and national collections in peril. The conference set out to offer different strategies and options to ensure that geological assets (physical samples, hard copy and electronic documents, digital data) are preserved, protected and not lost in downsizing, layoffs, mergers and closures.

Malcolm Fleming (Common Data Access) outlined the new regulatory environment of the UK Energy Act 2016 which has devolved new powers to the Oil & Gas Authority. The 2016 Act clarifies licensees’ obligations in respect of retention, reporting and release of geo-information and samples and is underpinned by escalating sanctions for non-compliance. Plans are afoot for a UK National data repository, described as a ‘conventional’ NDR to satisfy licensees’ reporting and retention obligations and allow data distribution and transfer through entitlement-setting.

Mike Howe presented on the British Geological Survey’s National geological repository (NGR), a large collection of geological materials, cores and samples collected from the early 1800s up to the present day. The NGR holds core and cuttings from every UK onshore and offshore well. The current financial climate has impacted the NGR’s budget significantly: income from commercial users has halved while operators, cutting back on their holdings, are offering even more material. BGS’ answer is digitization, with databases and high resolution core imagery complementing sample storage. Mention was also made of the UK National geoscience data centre (NGDC), the NGR’s digital twin, and of the National hydrocarbons data archive (NHDA).

We were confused so we asked BGS’ Andy Kingdon for clarification: ‘The NGR/NGDC division in our collections is somewhat arbitrary and partly due to different funding sources. In essence the NGR is the ‘core store’ for our own material and that which is archived by statute from the oil and gas industry and others. The NGDC holds digital data in databases. So while a sidewall core is physically located in the NGR, its analytical data will be served from the NGDC.’

Alan Smith (Luchelan) presented a practical solution to big (seismic) data management that was delivered by Ovation Data and Talus Technologies to PGS to manage its multi-client seismic data.

Andrew Zolnai enumerated various approaches to upstream data access and workflows before homing in on his current favorite Linq, a toolset that ‘bridges the communication gap between the IT experts and the non-technical executives via an information supply chain connecting information sources to decision support outputs.’ With Linq, executives can lead improvement initiatives by asking IT experts to describe technical proposals in terms of improved information flows.

Paul Cleverley, wearing his Robert Gordon University cap, described the use of machine learning and text analytics to exploit the ‘exponentially growing volumes of unstructured information’ that simultaneously offer the potential for information overload and serendipitous information discovery. Cleverley surveyed 53 geoscientists from an oil & gas operator and geoscience consultancy to study how novel machine learning based search could improve interaction with information collections such as the SPE’s OnePetro, the Geolsoc’s Lyell Collection and the American Geological Institute’s library. The study found that word co-occurrence techniques facilitate serendipitous discovery to a ‘statistically significant’ extent.
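A minimal sketch of the word co-occurrence idea follows, assuming a toy corpus and scikit-learn; this is an illustration of the general technique, not Cleverley’s actual method or data.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer

# Toy corpus standing in for geoscience abstracts (hypothetical text)
docs = [
    "turbidite sandstone reservoir in deepwater fan systems",
    "porosity and permeability of turbidite sandstone cores",
    "carbonate platform diagenesis and porosity evolution",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)              # documents x terms
cooc = (X.T @ X).toarray()               # term x term co-occurrence counts
np.fill_diagonal(cooc, 0)                # ignore self co-occurrence

terms = list(vec.get_feature_names_out())
i = terms.index("turbidite")
related = [terms[j] for j in np.argsort(cooc[i])[::-1][:3]]
print("Terms co-occurring with 'turbidite':", related)
```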

Barry Wells (Conwy Valley Systems) provided a closer look at petrographic data management. Capturing a meaningful description of a thin section is a tricky business. The field of view of a five megapixel camera means that around 250 images are needed to cover a typical thin section. A company like ExxonMobil is reported to have millions of thin sections – so digitizing these is a major undertaking*. Images need to be stitched together digitally for use. Tools of the trade include Gigapan and even ArcGIS whose raster image capability can be repurposed for slide data.

Alan Shipman (Group 5) presented on issues surrounding records management and companies’ legal obligations for retention. All this in the context of the UK Energy Act 2016 with its new dictates regarding retention of information and samples. As yet, these do not include details of what or how long information needs to be kept. This is work in progress under the auspices of the OGA. Whatever the outcome, Shipman is an advocate of Adobe PDF as the preferred means of document retention. PDF is an ‘open’ standard format. In its PDF/A manifestation it assures visual appearance and supports metadata management and ‘guarantees’ future accessibility.

* A possibly naïve multiplication would suggest that ExxonMobil’s thin section collection would, if digitized as above, represent several exabytes!
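The footnote’s back-of-envelope estimate can be reproduced as follows; every figure below is an illustrative assumption rather than a reported number.

```python
# Back-of-envelope storage estimate for digitizing a large thin-section collection.
# All figures below are illustrative assumptions, not reported values.
pixels_per_image   = 5_000_000      # 5 megapixel camera (from the talk)
images_per_section = 250            # tiles needed to cover one thin section
bytes_per_pixel    = 3              # 8-bit RGB, uncompressed
sections           = 5_000_000      # "millions" of thin sections, assumed 5 million here
imaging_conditions = 1              # plane-polarized only; several angles multiply this

total_bytes = (pixels_per_image * images_per_section *
               bytes_per_pixel * sections * imaging_conditions)
print(f"{total_bytes / 1e15:.1f} PB")
# ~19 PB under these assumptions; imaging each slide under many polarization
# angles and at higher bit depth pushes the total up by an order of magnitude
# or two, toward the exabyte scale suggested above.
```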


IRM-UK 2016 Enterprise Architecture/BPM conference

Gazprom leverages BPMN template in framework. VNG Norge builds management information system on Qualiware. The Open Group presents ArchiMate 3.0.

Speaking at the 2016 IRM-UK conference on enterprise architecture and business process management (EA/BPM) earlier this year, Andrew Pincott explained how Gazprom implemented a business architecture at its marketing and trading unit in under four months. Gazprom’s business architecture model is a complete overview of all of its activities, incorporating processes for all commodities in each geography. A standard BPMN template was loaded into Visio and pruned to a subset of the full BPMN symbology. Gazprom advocates a ‘keep it simple’ approach that aims for fit for purpose technology used where it is most needed. Business processes were evaluated for their suitability via a matrix of their importance to the business, their criticality, and complexity. The exercise has produced a BPM framework and company standards for the architecture. This now underpins the deal lifecycle and controls framework, building on Visio and Orbus Software’s iServer.

More evidence of the ‘keep it simple’ principle came in Inger Anette Backer’s presentation of VNG Norge’s new management system. VNG has deployed Norway-based Qualisoft’s QualiWare to ‘streamline’ the business. The project followed on from VNG’s success with the Bue and Pil discoveries and the need for a structured process around FPSO development and commissioning. Qualisoft claims 160 worldwide clients and a significant oil and gas footprint (ConocoPhillips, Shell, Maersk, OMV, Wintershall, Statoil and more). The toolset maps processes through stage gates and across stakeholders, adding regulatory compliance and risk management to the mix. The Qualisoft reference framework for oil and gas has so far produced some 200 live work processes at DNV. The system is also plugged in to other tools, notably DNV-GL’s Synergi asset integrity management system.

On the standards front Andrew Josey introduced The Open Group’s new ArchiMate 3.0 specification. ArchiMate provides a language for architecture description, a framework to organize architecture descriptions and a graphical notation and visualization system. The new version offers improved alignment with other Open Group standards, notably Togaf, and is claimed to better embrace both business strategy and ‘physical,’ internet of things modeling.


Alberta Energy Regulator on future regulatory IT

PPDM meet hears of AER vision for ‘agile’ reference architecture.

Speaking at a PPDM Association Houston data management event earlier this year, Pavan Kumar of the Alberta Energy Regulator outlined the AER’s vision for ‘future regulatory systems.’ AER is working on a reference architecture that is intended to assure efficiency in its overall energy development activities, adding ‘agility’ across the energy lifecycle. A modern user experience is needed to help provide transparency to all stakeholders.

The project begins with an integrated decision model and capability map of AER’s business, including standardized ‘risk-informed’ processes to drive regulatory oversight. The life-cycle approach includes ‘cradle-to-grave’ management of operators and their environmental performance.

The future regulatory architecture targets information sharing with electronic systems that support dynamic workflows and review processes, automated alerts and notifications based on specific geographical thresholds. A central repository holds all information related to energy development and will support novel technologies including predictive analytics.

AER is launching a pilot project to investigate how it can ‘automate and systemize’ how it receives, assesses and reviews information submitted as part of the reclamation certificate process. The pilot builds on existing components at AER including a database from Informatica, Esri and Geocortex GIS functionality, a Tibco service bus and reporting tools from Tableau. New capabilities to be developed under the pilot include dynamic case management, an integrated transaction system and the user interface.


Modeling best practices benchmark survey

Lone Star’s early results and invitation to participate in cross industry ‘MBP2’ survey.

Lone Star Analysis CEO Steve Roemerman has kindly agreed to share the interim results from the company’s cross-industry modeling best practices benchmarking project (MBP2). The project spans upstream ‘digital oilfield’ simulation and modeling, healthcare clinical trial use cases and behavioral economics models. MBP2 began in 2015 with the development of an industry-independent vocabulary and test interviews to validate survey design. Data collection is now underway and follow up interviews and publication are scheduled for 2017.

So far MBP2 has gathered data from a mix of communities. Modeling professionals spanning more than 40 disciplines from a dozen industries have responded to date. The survey has found that although most are involved in generic modeling activities like operations research, few report involvement in ‘big data.’ Moreover, none so far report building data-driven models. Over 50% support some form of decision analysis. Accommodating uncertainty was found to be the most important consideration in creating a useful model. However, modelers are not confident in their tools, processes, and ability to cope with uncertainty. Over 5% say they don’t even attempt to represent uncertainty in their modeling. Less than a third say they fully meet related standards promoted by most professional societies and government regulations. Oil IT Journal readers can take part in the survey here.


Software, hardware short takes

Schneider Electric, dGB, MVE, Integrated Informatics, ESTD, Exprodat, Blue Marble, Venture, Logi Analytics, Cognitive Geology, Schlumberger.

Schneider Electric has announced a Unit performance suite for oil and gas, a bundle providing closed-loop, real-time optimization, monitoring and workflow for refineries.

dGB has released OpendTect v6.0.4 with a new, inversion-based approach to creating 3D horizon cubes. dGB has also opened the OpendTect Pro Webstore, facilitating access to OpendTect Pro and its commercial plugins. Software modules can be rented from the webstore.

MVE’s Move2017 adds a new ribbon-style interface, a 3D PDF export option, 3D rendering of multiple layers and map overlays and the ability to transfer volume attributes into Petrel.

V3.0 of Integrated Informatics’ Geomancy Decision Engine adds functionality for offshore routing and enhanced well pad siting capability for the inclusion of faults and fault zones. Geomancy is now compatible with ArcGIS 10.3 and up and is said to be ‘ArcGIS Pro-ready.’

Retina, a ‘next generation’ reservoir simulator from Iran’s Engineering support and technology development (ESTD) is optimized with a suite of solvers that promise a ‘stable solution with large time steps in huge, complex reservoirs.’

A new version of Exprodat’s Unconventionals Analyst ArcGIS desktop extension (V222) allows users to create well pads and laterals across sets of contiguous sections, view all results in 3D and assess compliance with the SPEE Monograph 3 reserve estimation protocol. Exprodat has also updated its Data Assistant, now compatible with ArcGIS Desktop 10.4 and 10.4.1 and adding new data formats. Microseismic data can now be represented with ‘beach-ball’ plots of foci, magnitude and orientation.

V18 of Blue Marble Geographics’ Global Mapper adds functionality for 3D data visualization and processing with dynamic rendering of terrain and LiDAR data and concurrent display of multiple terrain surfaces. A new ‘infinite view’ displays all loaded 3D data and dynamic rendering updates map detail on zoom in. A new flexible single user license facilitates work between office and field.

Venture Information Management’s Lead 2 Asset solution leverages a SharePoint-based portal, GIS software and Logi Analytics’ business intelligence platform to display key reserves, subsurface, licence and production information for the entire corporate portfolio.

Cognitive Geology has announced ‘Hutton,’ a scenario-based property modeling plugin for Petrel that develops geologically realistic alternatives prior to geostatistical workflows.

Schlumberger’s Petrel GeoTesting plug-in features global sensitivity analysis to target geological features of interest and incorporate geological uncertainty into well test design and execution.


USPI-NL 2016 member meeting, Amersfoort, NL

Cfihos V1.2 engineering data handover update. Cfihos from the EPC viewpoint. Chevron, ‘it’s hard to actually use standards!’ BP avoids ‘chaos’ with Mimosa OIIE PoC. Aligning BP, Chevron, Shell RDLs.

USPI-NL, the Dutch standards body, held its annual member meeting earlier this year in Amersfoort. USPI director Paul Van Exel and Anders Thostrup (Shell and USPI chair) kicked off the event with a review of recent activity, in particular the flagship Cfihos* project. Cfihos originated in Shell where it has been in use since 2004 and is now at V1.2. It was successfully deployed in the Mimosa OGI use case 1 (information handover) and membership now includes BP, Total, Amec/FW, Petrofac, L&T, Datum 360, PhusionIM and Intergraph. Transition to an ISO standard is slotted for 2017/18. Cfihos has achieved critical mass in participation, now it needs critical mass in usage!

A ‘safari’ visit to engineering contractor CB&I allowed Cfihos to be seen from the EPC viewpoint and compared with some 20 other handover requirements. There is a huge difference in the quality and volume of handover specs. These can be a vague entreaty to ‘use ISO 15926’ or a detailed 400 page document. Cfihos lies between these extremes and is thought to address most owner operators’ needs. Meanwhile, USPI and POSC/Caesar continue with maintenance of ISO 15926-4 and are working on document classification and metadata for inclusion in 15926 although ‘funding remains an issue.’

Although Vic Samuel (Chevron) is on the IOGP standards committee (ISC), he recognizes that it is hard to persuade his company to actually use standards. The ISC is working to improve standards’ effectiveness with a set of operator priorities and to coordinate standards organizations’ activity. To date industry has adopted relatively few information management standards but the EPIM ILAP and Cfihos are seen as likely quick wins.

Peter Whittal outlined BP’s approach to project and asset information flow. This involves a phased handover of quality info during build along with reporting and visualization of the 3D model involving ‘60-70 systems,’ working from a central information store for document and tag management. All of which must be up and running at startup. BP ‘avoids chaos’ with a class library of the same terminology and data for use throughout. The system (actually the Mimosa OIIE) was delivered as a proof of concept in 2015 with help from Bentley, SAP and OSIsoft and is said to have ‘a lot in common’ with Cfihos (see also Oil IT Journal N° 5 2015).

Josh Vincent (Chevron) provided a progress report on the alignment of industry reference data libraries from BP, Chevron and Shell. These are currently not aligned so the plan is for ‘minimal’ additions to Cfihos to support all three. So far a ‘philosophy’ of alignment has emerged. A discussion ensued as to the real work involved here. The Fiatech Jord completion effort was evaluated at a notional cost of ‘around $1.5 billion dollars, spread over a 20-year period’ (Oil IT Journal May 2014). Indeed legacy standards work has seen ‘orders of magnitude’ more effort than Cfihos. The hope/expectation is that Cfihos can achieve a lightweight mapping with a realistic amount of work.

* USPI-NL’s Capital facilities information hand-over standard.


SPE Intelligent Energy 2016, Aberdeen

Is IE the way out of the downturn and now the time to try new technology? Investors like the ‘cost deflation’ story. BP on doing more in digital. ConocoPhillips on raising ‘tool time’ KPI. Dynamic bow ties and Macondo. OMG ‘oil and gas is not different!’ CiSoft Solutions break down the silo walls. Yokogawa’s Exapilot. CiSoft’s SOS-Net. IET’s MTO methodology. Optique update. Exxon GLOWS.

The organizers reported over 1400 attendees at the 2016 SPE/Reed Exhibitions Intelligent Energy conference in Aberdeen. In the opening plenary, Schlumberger retiree Walt Aldred opined that now, at the bottom of the cycle, was the best time to try new stuff. Autonomous drilling systems will be out ‘in the next year or two’ and automation will be ‘pervasive’ across our industry.

Redburn Associates’ Rob West observed that E&P share prices have held up better since 2014 than the oil price would have suggested. Investors appear to be expecting a price recovery and have bought into the ‘cost deflation’ story. Intriguingly, the majors created most shareholder value in a ten-year ‘sweet spot’ from 1993-2002 when prices were low. Cost cutting includes a scale back from ‘over maintenance’ in the Macondo aftermath. Maintenance spend (as judged by shutdowns) doubled after Macondo but is now down to 2004 levels. Labor costs in bbl/employee quadrupled during the period from 1980 to 2002 but have halved since then.

A panel discussed how intelligent energy could show the way out of the downturn. Greg Hickey stated that BP has achieved a lot in the digital space but that ‘doing more of the same only takes us so far.’ BP has set out on a journey to ‘transform, and focus on margin’ by ‘digitizing the upstream with a manufacturing focus.’ Toyota was cited as an exemplar of what BP is trying to achieve but it is GE that is supplying the toolset. BP is to address the enduring problems of equipment downtime and drilling inefficiencies with ‘smart sensors, cognitive computing and wearables.’ Standardization will move work to where it can be best accomplished. Why has this not been done already? Essentially because digital platforms could not support such activity. The cloud has changed all this, specifically GE’s Predix which is to provide BP with across-the-board analytics. GE’s industrial internet will provide ‘digital twins’ of infrastructure and a test bed for field and plant-wide optimization. Pretty well all of the above exists today, but in silos. The key now is to integrate and automate, ‘make the computers do the heavy lifting 24x7.’

David Boyle (ConocoPhillips) offered some interesting observations on productivity. Despite the sexy control rooms and remote operations center, ‘tool time,’ the time offshore workers actually spend on the job, was a ‘consistently embarrassing’ two hours out of a twelve-hour shift. A renewed focus on identifying bottlenecks led to efficiency improvements and tool time is up to 6 hours per shift. Platform reliability is also up, with shutdowns down from every 2 weeks to every 4-5 weeks.

According to Johan Atema, Shell believes in the ‘lower for longer’ scenario and wants to be making money at $40 oil. Which, incidentally, is not really a ‘low’ oil price, rather a historical average. Shell wants to change its ‘arrogant, inward-looking attitude’ by learning from industry and from the outside world. Shell’s digital effort already has its sweet spots of equipment monitoring and maximizing production at least cost. The data infrastructure achieves very high uptime. In Oman the focus is shifting from the large operations center to the ‘smart mobile worker’ with an augmented way of working. Operators may be kitted out with a thermal camera, a GoPro, iPhones, gas monitors and good communications. A permit to work may be issued on the spot as required.

Mark Edgerton (Chevron) sees intelligent energy as comprising a large, growing toolkit. All Chevron platforms operated out of Aberdeen have condition-based monitoring systems that communicate with Houston HQ and to equipment manufacturers. Production is now maximized in real time through multiple small tweaks. Edgerton doesn’t like the word ‘workflows’ but he does like what they do! Offshore data flows into the iOps real time center and is used to improve maintenance planning and find out which teams are most effective. The future will bring ‘more digitization and more opportunities.’

Presentations/papers

Andrew Hartigan (Lone Star Analysis) has applied a ‘dynamic bow tie’ approach to risk management, developed for the aviation industry, to a retrospective analysis of the Macondo/Deepwater Horizon blowout and fire. Lone Star’s technology translates bow tie diagrams into a ‘dynamic model of trigger events, activities and barriers.’ Model nodes are filled with auditable data and relationships mapped as connected lines and probabilistic math. Hartigan warned of ‘duplicitous data’ that is present in many nodes. In Macondo, decisions were made by people unaware of the current state of play. Pressure, flow and historical data were input along with subject matter evaluations. Rolling up the whole model Hartigan concluded that prior to the event there was a ‘30% probability’ of a blow out as compared to a ‘nominal’ 0.045% probability. Comment: ‘nominal’ 0.045% seems rather high while 30% is clearly too low! More from the paper SPE-181036-MS and from the Lone Star video.
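A minimal sketch of how a bow tie rolls up to a single top-event probability follows, assuming independent threat paths and invented per-barrier failure probabilities; this illustrates the general roll-up idea, not Lone Star’s model or Macondo data.

```python
# Minimal bow-tie roll-up sketch (illustrative only; not Lone Star's model).
# Each threat path reaches the top event only if every barrier on it fails.
# Per-barrier failure probabilities below are invented for illustration.
paths = {
    "cement job":             [0.30, 0.40],
    "negative pressure test": [0.50, 0.20],
    "kick detection":         [0.25, 0.40],
}

def path_probability(barrier_failures):
    """Probability that every barrier on a single path fails."""
    p = 1.0
    for f in barrier_failures:
        p *= f
    return p

# The top event occurs if at least one path gets all the way through
p_no_event = 1.0
for name, failures in paths.items():
    p_no_event *= 1.0 - path_probability(failures)

print(f"Rolled-up top-event probability: {1.0 - p_no_event:.1%}")
```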

Claude Baudoin (Object Management Group and Cébé IT) observed that while the ‘money’ in the industrial internet is mostly in smart grid related activity, oil and gas should not think ‘we are different.’ An IIoT demonstrator at a refinery tracked employees and tagged high risk areas using ‘smart helmet’ technology and other wearables. On the other hand IT/OT convergence is exposing systems to the risk of hacking. In a survey, over half of industry respondents said that standards are important for the IIoT. But what standards? The Standards Leadership Council has carved up the standards space into multiple bailiwicks but this is ‘more in the intent than in the execution.’ The SLC is ‘work in progress.’ The Open Systems Interconnection model IoT standards are an ‘alphabet soup.’ We need IT and OT to collaborate on the IIoT, perhaps starting with the Industrial internet consortium’s free reference architecture. On the security front Baudoin cited the hack of a control system on the BTC pipeline, attributed to the PKK, the Kurdistan Workers Party. The skill set required as OT migrates into IT is broader than in the old days of scada. Companies should keep IT architecture, security, governance and sourcing in-house as core skills not to be outsourced. Oh, and ‘expect to be attacked.’ In the Q&A, Baudoin was taken to task for his ‘do not outsource’ dictum. He relented somewhat, saying ‘OK, just keep governance in house.’ But he also cited the case of a chemical company that had all its IT in Bangalore and did not know enough about its systems to re-bid the contract – SPE-181107-MS.

Mike Hauser presented work performed at the Chevron-sponsored CiSoft Center for interactive smart oilfield technologies at the University of Southern California. CiSoft is working to ‘break down the silo walls,’ to enhance efficiency and improve HSE in a move from ‘conventional’ data management to ‘smart’ IT. The lab’s output is commercialized through Hauser’s ‘CiSoft Solutions’ unit. Initial focus for commercialization has been on four ‘high priority’ inventions. Hauser was not very forthcoming as to what these were, but a visit to the CiSoft website located PDFs describing ‘integrating data sources,’ a ‘smart engineering apprentice’ and ‘visual grammar,’ a.k.a. ‘data analytics for users without a technical background’ – SPE-181068-MS.

Despite its large footprint in the field, Yokogawa has not been terribly audible in the SPE/intelligent energy community. So it was good to hear Maurice Wilkins on the company’s role in automating procedures for efficiency and safety. Wilkins’ talk revolved around the importance of standard procedures that are about to ‘change the industry.’ Today the main cause of plant trips and accidents is human error and frailty. An ExxonMobil study of transient operations found that although they only represent 10% of a facility’s life span, they are responsible for 50% of incidents. Enter standard operating procedures (as practiced in aviation) and standards-based decision support. Standards of relevance include ISA 18.2 (alarm management), ISA 101 (HMI management) and the ISA 106/88 procedure automation standard (of which Wilkins is an instigator). Citing the Mogford report on the 2005 Texas City explosion which found that the plant’s systems were ‘too complicated to start up manually,’ Wilkins offered a quiet plug for Yokogawa’s Exapilot procedural assistant. Exapilot would have halted the plant operations as soon as it detected that the alarms were not working. The Abnormal situation management consortium also got a plug – SPE-181019-MS.

Eric Cayeux from Norway’s IRIS R&D organization has been researching automated drilling performance and risk. There are many sources of uncertainty in drilling but few are generally considered. Requirements for good drilling performance may be various and complex. Here Cayeux has analyzed the risk ‘big picture’ for an extended reach well to optimize the drilling plan in the face of uncertainty. Monte Carlo simulation was performed across the wide range of inputs to check that safety thresholds are not breached and to figure the optimum path through the multi-dimensional parameter space. Current drilling scenarios are ‘far too deterministic.’ The IRIS DrillWell center and software also got a plug – SPE-181018-MS.
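A minimal Monte Carlo sketch of checking one drilling safety threshold under input uncertainty follows; the distributions and all numbers are assumptions for illustration, not IRIS’s model.

```python
import numpy as np

# Monte Carlo check of a single drilling safety threshold (illustrative only).
# Uncertain inputs are sampled; we count how often the equivalent circulating
# density (ECD) exceeds the fracture gradient. All distributions are assumed.
rng = np.random.default_rng(42)
n = 100_000

mud_weight   = rng.normal(1.55, 0.02, n)               # SG, assumed uncertainty
annular_loss = rng.lognormal(np.log(0.06), 0.3, n)     # SG-equivalent friction loss
frac_grad    = rng.normal(1.68, 0.03, n)               # SG, assumed uncertainty

ecd = mud_weight + annular_loss
p_breach = np.mean(ecd > frac_grad)
print(f"Probability of exceeding the fracture gradient: {p_breach:.2%}")
```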

Jim Crompton, presenting on behalf of Chevron/CiSoft, described SOSNet, a.k.a. the smart oilfield safety net. At its core is a machine learning component developed at the USC data science lab that leverages a large image base of photographs of corroded pipe. The imagery was combined with physical inline inspection data and used to train a neural network to look for defects, rolling in equipment tags and other data sources. The SOSNet information bus links information across multiple heterogeneous data sources via a ‘semantic asset repository.’ This is an ontology-based semantic-web style repository (a triple store?) that can be queried with ‘automatically generated’ Sparql. Information extraction from drawings and images is described as a ‘robust and fully automated’ process – SPE-181048-MS.
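A minimal sketch of the ontology-backed repository idea follows, using rdflib and a hand-written Sparql query; the namespace, classes and severity property are hypothetical, not SOSNet’s actual schema.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

# Minimal sketch of an ontology-backed asset repository query (illustrative only;
# the namespace, classes and properties below are hypothetical, not SOSNet's).
EX = Namespace("http://example.org/asset#")
g = Graph()
g.add((EX.pipe_17, RDF.type, EX.PipeSegment))
g.add((EX.pipe_17, EX.hasDefect, EX.defect_3))
g.add((EX.defect_3, EX.severity, Literal(0.82)))

# Find pipe segments carrying a defect above a severity threshold
results = g.query("""
    PREFIX ex: <http://example.org/asset#>
    SELECT ?pipe ?sev WHERE {
        ?pipe a ex:PipeSegment ;
              ex:hasDefect ?d .
        ?d ex:severity ?sev .
        FILTER (?sev > 0.5)
    }""")
for pipe, sev in results:
    print(pipe, sev)
```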

Asgeir Drøivoldsmo of Norway’s Institute for Energy Technology introduced the ‘Man technology and organization’ methodology for optimizing operations and maintenance staffing levels of greenfield projects. The corollary of a move from time-based to condition-based maintenance is that staffing levels are no longer pre-determined. To reap the benefits of condition-based maintenance, a flexible workforce is required. MTO advocates a campaign-oriented workforce with a minimal crew on site plus a ‘campaign crew’ on call – SPE-181102-MS.

David Cameron (University of Oslo) presented the results of the EU Optique program that promised ‘simple oil and gas-oriented access to big data.’ Optique provides ‘ontology-based data access.’ Disparate data sources can be accessed through a graphical query generator that understands terms like ‘well bore.’ Statoil is said to be piloting the approach, as is Siemens for gas turbine maintenance. The ontology was ‘bootstrapped’ from existing database schemas. A demonstrator is said to have shown that federation across six technical databases was possible. An open day-cum-summit was held at Oxford University as Optique enters its final year – the project is to end in November 2017. The final year will include training of IT experts in the use of the system and integration with a geoscience desktop. In the Q&A Cameron was asked if Optique was going to be used ‘commercially’ in Statoil. He replied that this was the case*. Another questioner asked how the Optique approach differed from the many ‘data virtualization’ offerings on the market. The answer was unclear – SPE-181111-MS.

Zachary Borden presented ExxonMobil’s gas lift optimization workflows (Glows) that are automating its gas lift surveillance and optimization effort. In a large asset, there is a good chance that an underperforming gas lift well will go unnoticed. A lot of ExxonMobil production comes from gas lift wells but there are relatively few gas lift specialists. ExxonMobil has tried data-driven and physics-based models to arrive at the conclusion that there are ‘horses for courses.’ Gas lift problems include slugging, intermittent lifting, tubing casing communication and more. Various physical or machine learning tools and classifiers are good at solving different problems. Support vector machines, random forest and naïve Bayesian classifiers are all available. The trick with Glows is recognizing which tool should be used in what circumstances. The Glows event detector is said to be very successful and easily distinguishes between normal flow and slugging. Glows also performs physics-based wellbore hydraulic models with embedded software (Prosper from Petroleum Experts). The Valve Performance Clearinghouse database at Louisiana State University was also used. Following field trials Glows is now considered a ‘one stop shop’ for well performance monitoring and has contributed to significant production hikes – SPE-181048-MS.
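A minimal sketch of the classification idea follows, training a random forest to separate slugging from normal flow on synthetic rate statistics; the features and data are invented for illustration and are not Glows’ actual inputs.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Illustrative sketch: classify slugging vs. normal flow from simple rate statistics.
# Features and labels below are synthetic; Glows' actual features are not public.
rng = np.random.default_rng(1)
n = 1000
# feature 1: coefficient of variation of liquid rate
# feature 2: dominant oscillation period (minutes)
normal   = np.column_stack([rng.normal(0.05, 0.02, n), rng.normal(2, 1, n)])
slugging = np.column_stack([rng.normal(0.40, 0.10, n), rng.normal(15, 5, n)])
X = np.vstack([normal, slugging])
y = np.array([0] * n + [1] * n)          # 0 = normal flow, 1 = slugging

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"Hold-out accuracy: {clf.score(X_test, y_test):.2f}")
```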

* We have it on reasonably good authority that this may not in fact be quite accurate.


Folks, facts, orgs ...

Aker Solutions, API, Aqualis, CB&I, CGI, Chevron, DataCo, EnTouch, Exterran, Flowserve, Geoscience BC, GSE, Enable Midstream, New Century Software, Oxy, Quorum, Qinterra, Russell Reynolds Associates, Siemens, Seismic Image Processing, Tellurian, Ikon Science, NIST, Coats.

Dean Watson is COO of Aker Solutions, Egil Boyum is head of products and Mark Riding is head of strategy.

Mario Salazar is now the API’s ‘external mobilization manager.’

Ben Lazenby is director at Aqualis Middle East. Reuben Segar is COO at Aqualis Offshore.

Patrick Mullen is now COO at CB&I.

CGI has named George Schindler as president, CEO and member of the board, succeeding retiree Mike Roach.

Rhonda Morris is now Chevron’s corporate VP HR.

Grahame Blakey has joined DataCo as director of geoinformation.

Samantha Foley is now EnTouch’s CMO. She hails from Allegro.

Girish Saligram is president, global services at Exterran. He was previously with GE Oil & Gas.

Mark Blinn is to retire from his role as president and CEO at Flowserve. The company is seeking a replacement.

Geoscience BC has added Jared Kuehl, John Milne, and Alan Winter to its board of directors.

GSE has appointed Suresh Sundaram as class II director and chairman of the nominating committee. Jim Stanker is chairman of the audit committee. Sheldon Glashow is to retire from the board.

Enable Midstream Partners has named Craig Harris as EVP and chief commercial officer.

Mike Ortiz is now VP strategic development at New Century Software.

Mario Aguirre has been appointed director petrotechnical and geospatial data management chez Oxy.

Jordan Copland has joined Quorum as EVP and chief financial officer. He hails from Omnitracs.

Peter Keilen is VP marketing and communication at Qinterra.

Marc Baca is now member of Russell Reynolds Associates’ industrial and energy and natural resources practices.

Helmuth Ludwig is to succeed retiree Norbert Kleinjohann as Siemens CIO.

John Green is now IT Manager at Seismic Image Processing.

Meg Gentle is now president and CEO of Tellurian. She hails from Cheniere.

Deaths

Ikon Science reports the death of Mike Bacon, the company’s principal geoscientist, ‘renowned writer and stylist.’

NIST has announced the death of Katharine Blodgett Gebbie, a ‘visionary physicist and senior government research administrator who supervised and mentored four Nobel laureates in physics.’

Engineer and ‘founder of the commercial reservoir simulation industry,’ Keith H. Coats died on September 13th.


Done deals

Bluware, Headwave, Arcos, Samsix, ExproSoft, Miriam, GE Digital, Meridium, MSA Safety, Senscient, OpenText, Dell EMC, Documentum, Rockwell Automation, Maverick Technologies, Automation Control Products, Quest Offshore, Verisk Analytics, Weatherford International, SEC.

Bluware, Hue AS and Headwave are to merge into a new company to ‘drive the 3rd wave’ of E&P software. Lars Olrik will be CEO of the combined company which will have over 100 technology professionals. Bluware provides software development services to majors, notably Shell US.

Emergency resource management software house Arcos has acquired mobile damage assessment developer Samsix. The acquisition combines Arcos’ personnel and equipment management solution with Samsix’ mobile damage assessment and crew location services.

Well integrity and reliability software house ExproSoft has acquired reliability, availability, and maintainability specialist Miriam. Miriam’s cloud-based RAM Studio helps oil and gas operators achieve high uptime at oil and gas fields and processing facilities. Integration of RAM Studio with ExproSoft’s WellMaster will enable prediction of well failure, downtime and intervention costs.

GE Digital has acquired asset performance management specialist Meridium. The APM toolset will be integrated with GE’s Predix industrial internet platform. GE took a 26% stake in the company back in 2014. The overall (100%) cost of the acquisition is put at $495 million.

MSA Safety has acquired Senscient, a provider of laser-based gas detection technology used in a broad range of applications including oil and gas, where its patented ELDS technology can detect a wide range of toxic and flammable gases such as H2S.

In a $1.62 billion deal, OpenText has acquired Dell EMC’s Enterprise Content Division, including Documentum. Barclays advised on the deal and provided a $1.0 billion ‘debt commitment’ in support of the transaction.

Rockwell Automation has acquired systems integrator Maverick Technologies. The acquisition strengthens Rockwell’s expertise in key process and batch applications adding platform-independent domain expertise. Rockwell also recently acquired Automation Control Products, a provider of centralized thin client, remote desktop and server management software.

Quest Offshore Resources has sold its data and subscriptions business to Verisk Analytics. The business will be integrated into Verisk’s Wood Mackenzie unit. The consulting arm and conference businesses were not sold and continue to operate under Quest Offshore.

Weatherford International is to pay a $140 million fine to settle a case brought by the SEC that it inflated its earnings by $900 million over a five year period by using ‘deceptive income tax accounting.’ More from Law360.


Big data, internet of things news

Recent big data/IoT news from NIST, PrismTech, Quorum, BitStew, Honeywell, NTT, OneM2M.

The US NIST IT Laboratory has published an introduction to the ‘Networks of things,’ proposing a foundational science and common vocabulary in an academic fashion. Whether this will stand up against the terminological onslaught of the vendor community is moot.

An explanation of IoT concepts and technology choices is available in PrismTech’s white paper, ‘Vortex at the edge.’ Vortex is Prismtech’s implementation of the DDS protocol for fog/cloud computing.

Quorum’s white paper, ‘Energy driven by innovation’ provides a top-level view of the cloud, with a list of cloud providers, NoSQL database options and machine learning toolsets. Big data analytics (BDA) has application in land decision making, drilling and reserves estimation.

BitStew’s presentation of its Mix Core framework for machine intelligence outlines various analytical options and a real-world use case analyzing information streams from ‘full authority digital engine control’ (Fadec) sensors on jet engines.

Honeywell’s new ‘connected performance services’ for refiners and gas processing plants provides cloud-based IoT analytics for design and optimization of catalysis.

NTT Japan’s Virtual Engineering Community and Mitsui Chemicals report data-driven modeling success in predicting gas plant outputs from 51 input data streams.
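A minimal sketch of such a data-driven plant model follows, regressing one output on many input streams; the data here is synthetic and the model choice is an assumption, not NTT’s or Mitsui’s method.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Illustrative sketch of a data-driven plant model: predict one output from many
# input streams. The 51 inputs here are synthetic noise plus a few true drivers.
rng = np.random.default_rng(7)
X = rng.normal(size=(2000, 51))                  # 51 input data streams
y = 3.0 * X[:, 0] - 1.5 * X[:, 7] + 0.8 * X[:, 20] + rng.normal(0, 0.5, 2000)

model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"Cross-validated R^2: {scores.mean():.2f}")
```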

On the standards front, OneM2M, a global standards initiative for machine-to-machine communications, has published Release 2 of its protocol for connecting IoT applications and devices.


Emerson on tank management

White paper advocates a ‘system engineering’ approach to shale well site equipment monitoring.

A white paper from Emerson addresses the optimization of shale operations in the face of difficult market conditions. A study found that a major problem in moving to a new drill site was variability in pad facilities. A lack of hardware uniformity made it hard to deploy uniform solutions and to take advantage of reusable engineering.

Emerson used a system engineering approach that leverages a small number of critical devices and systems placed strategically to deliver data through a consistent architecture. Guided wave radar level sensors feed into a data historian such that all information on oil pumped into a specific tank can be analyzed without the need to send a technician to the tank roof. The solution offers condition monitoring of pumps and compressors with acoustic and vibration sensors feeding into the historian. It is claimed that the modular platform for well site equipment and operation ‘does not require a major integration project.’ Emerson claims 30,000 licenses have been sold to date worldwide.
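A minimal sketch of the level-to-volume bookkeeping such a setup enables follows; the tank geometry and level history below are hypothetical and this is not Emerson’s software.

```python
import math

# Illustrative sketch (not Emerson's software): turn level readings from a guided
# wave radar gauge into volume added to a vertical cylindrical tank.
TANK_DIAMETER_M = 3.0                      # assumed tank geometry
AREA_M2 = math.pi * (TANK_DIAMETER_M / 2) ** 2

# Hypothetical level history pulled from a historian, in metres
levels = [1.20, 1.24, 1.31, 1.36, 1.36, 1.42]

# Sum only level increases (pump-in events); draw-downs are ignored here
added_m3 = sum(max(b - a, 0.0) * AREA_M2 for a, b in zip(levels, levels[1:]))
print(f"Oil added over the period: {added_m3:.2f} m3 "
      f"({added_m3 * 6.2898:.1f} bbl)")
```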


National Instruments on accurate sensor measurement

There’s more, much more than meets the eye at the sharp end of the IoT!

In the world of big data it is easy to forget just how hard it can be to get a decent measurement of temperature, pressure and other process parameters. A lot of tricky issues need to be handled ‘at the edge’ if one is to avoid digital garbage.

A new 29 page white paper from National Instruments, ‘An engineer’s guide to accurate sensor measurement,’ explains in some depth how physical phenomena are converted into electrical signals and digitized. Some sensors do not respond linearly to change and may require signal conditioning. Additional components and circuitry may be needed to take advantage of the full dynamic range of measurement hardware and to reduce noise.
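As an illustration of why linearization matters, the sketch below converts an NTC thermistor resistance into temperature with the Steinhart-Hart equation; the coefficients are typical published values for a 10 kOhm device, assumed here for illustration rather than taken from the NI guide.

```python
import math

# Illustrative signal-conditioning sketch: an NTC thermistor responds very
# non-linearly to temperature, so the raw resistance reading is linearized with
# the Steinhart-Hart equation. Coefficients are typical 10 kOhm values (assumed).
A, B, C = 1.129241e-3, 2.341077e-4, 8.775468e-8

def thermistor_temperature_c(resistance_ohm: float) -> float:
    ln_r = math.log(resistance_ohm)
    inv_t = A + B * ln_r + C * ln_r ** 3      # 1/T, with T in kelvin
    return 1.0 / inv_t - 273.15

for r in (25_000, 10_000, 4_000):             # cool -> ambient -> warm readings
    print(f"{r:>6} ohm -> {thermistor_temperature_c(r):6.1f} degC")
```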

The Guide explains how different sensors work and what best practices are required to connect them with instrumentation, to implement signal conditioning and minimize measurement error. Coverage includes design, operations and calibration of temperature sensors, strain gauges and acoustic devices. Big data/IoT aficionados might like to read through the publication to get a feel for the potential for error in field deployed devices and how much effort goes into trying to get the basic measurement right.


In Deep and still digging?

EU Exascale project - €700 million for HPC R&D.

The EU Exascale project has published a ‘Lookback on 5 years of EU Exascale R&D,’ summarizing progress on various high performance computing projects. The goal of the Horizon 2020 program is for a ‘supercomputer based on EU technology featuring among the world top three by 2022.’ Europe has committed €700M to supercomputing. Lookback covers the previous FP7 framework that ran from 2011 to 2016, during which eight HPC projects consumed some €50 million.

The Deep/Deep-ER project, with sponsorship from CGG and Fraunhofer, targeted development of a new Exascale-ready HPC platform and a standards-based, easy-to-use software stack. A Deep prototype, Jureca, has been installed at the Jülich Supercomputing Centre, Germany, and is to be extended with a ten PetaFLOP/s ‘Booster’ system.


Sales, deployments, partnerships ...

Archeio, GE Oil & Gas, Modec, IBM, Enzen, Bit Stew, Aker Solutions, Subsea 7, Emerson, Intergraph, Lloyd’s Register, Opsens, Precise Downhole Services, OvationData, Western Digital, BetaZi, Schlumberger, Landmark, Boss Controls, Cisco, Aqualis Offshore, Aptomar, Kolos Marine.

Dallas-based startup Archeio’s ‘next generation,’ hosted well file software is helping Parsley Energy access oilfield data efficiently.

GE Oil & Gas has signed with Modec to apply data analytics and gas turbine expertise to Brazilian FPSOs.

IBM has supplied Slovenian retailer Petrol d.d. with a big data/analytics appliance comprising an IBM DB2 analytics accelerator for z/OS and Cognos. The solution provides employees with access to sales data for effective ‘suggest-sell’ at point of sale.

Enzen is to leverage Bit Stew’s Mix Core data intelligence platform to offer complex data integration services to its clients in India, Europe, Australia and the US.

Aker Solutions, Det Norske Oljeselskap and Subsea 7 are to combine their expertise to find the most cost-effective solutions for developing Det Norske’s Norwegian subsea fields.

Emerson has secured a new five-year contract with Nexen to provide metering management services for Nexen’s UK operations. Emerson has also provided Shell Philippines Exploration with upgraded automation technology and services for its Malampaya operation.

Gazprom Neft is to use Intergraph SmartPlant for owner operators to create an engineering data management system. Hyundai Engineering plans to utilize Intergraph Smart 3D on all its projects after achieving cost savings of 20% plus using Intergraph’s intelligent design (sic) basis verification solution.

Maersk Oil has awarded Lloyd’s Register a five-year contract for technical expertise on pressure systems, structures and subsea equipment.

Opsens has partnered with Precise Downhole Services for the commercialization of its product line in Canada including exclusive distribution of the OPP-W sensor product line.

OvationData has chosen Western Digital’s HGST active archive and Versity storage management joint solution to build private, cloud-scale storage for verticals including oil and gas.

BetaZi has performed studies of the major US shale provinces using its proprietary physics-based predictive analytics solution and P2 Energy Solutions’ Tobin data.

Rosneft, BP and Schlumberger are to collaborate on seismic R&D. Rosneft joins as an ‘equal partner’ in BP’s ongoing project with WesternGeco to develop innovative cable-less onshore seismic acquisition technology.

Petronas has used Landmark’s NETool software to plan enhanced oil recovery projects in the Malay Basin offshore Malaysia. The NETool simulator was used to compare different completion scenarios to optimize production and estimate ultimate recovery.

Boss Controls has joined the Cisco solution partner program and can now quickly create and deploy ‘internet of energy’ solutions.

Aqualis Offshore, Aptomar and Kolos Marine are to jointly provide oil companies and rig operators with an integrated offering of marine operations and field monitoring services on the Norwegian continental shelf.


Standards stuff

NSF on reproducibility. Energistics, OPC Foundation map Witsml to OPC UA. New IOGP Geomatics guidance note. Open Geospatial Consortium approves CDB simulator standard. Oracle donates EBOs to Open Applications Group. New Chem eStandard released.

The US National Science Foundation has just published a ‘Dear Colleague’ letter advocating and encouraging reproducibility in computing and communications research. The push for reproducibility is designed to correct research bias toward positive results, to avoid overemphasis on presenting ‘breakthroughs’ and to correct a lack of incentives for researchers to retract irreproducible findings.

A joint Energistics/OPC Foundation workgroup reports progress mapping Energistics’ Witsml model into the OPC UA object model. The group has identified initial target Witsml models and use cases for the companion specification. Work also progresses on the semi-automatic translation of the XML Schema and/or UML that is used by Energistics to define the Witsml model into OPC UA models. The work is likely to extend to Prodml to OPC UA translation.

IOGP has published Report 373-23, Geomatics Guidance Note 23 that clarifies issues associated with the usage of the web Mercator coordinate reference system for web mapping in oil and gas.

The Open Geospatial Consortium has approved the ‘common database’ (CDB) as an OGC standard. Originally developed for battlefield simulation, the CDB can also be used as a geospatial data repository for legacy systems. A CDB ‘synthetic environment’ data store contains geospatial data that represents the natural environment, including terrain relief, imagery and 3D models of static and moving objects.

Oracle has contributed its Enterprise business objects and associated IP to the Open Applications Group. While the Oracle EBOs are ‘no longer supported by Oracle,’ OAGi takes the gift as ‘proof of Oracle’s commitment to Open Standards.’ OAGi also announced the release of Chem eStandards Release 5.4 which includes mapping to the Japanese chemical industry’s JPCA-BP EDI standard.


Going, going ... green!

UK CCS Association, Carbon sequestration leadership forum, DoE, CCS Cost Network, Quest CCS.

The UK Carbon capture and storage (CCS) association has published a lessons-learned report from the UK’s CCS program 2008–2015. The 36-page free download was issued following the UK Government’s 2015 decision to cancel its CCS commercialization program. The report found that a full-chain CCS project could have been delivered at Peterhead, using the Goldeneye store, but that success for the White Rose project at Drax would have required significant adjustments to the terms of the program. It appears that the UK Government considers CCS too costly and ‘there is no appetite from any developer to participate in a further UK CCS competition.’

The Carbon sequestration leadership forum (Cslf) met this month in Tokyo and has produced a 145-page report with updates from CCS initiatives in Australia, Japan, the US, UK and Norway. Japan’s flagship Tomakomai CCS demonstrator is now up and running and is scheduled to sequester 100,000 tons of CO2 per year through 2018.

The US Department of Energy has announced awards of $13 million to ‘quantify and mitigate methane emissions from natural gas infrastructure.’ The DoE has also selected eight new research and development projects to receive a total of $11.5 million in federal funding under its ‘Crosscut’ initiative. The projects include geothermal energy and CCS.

The CCS Cost Network 2016 workshop, held earlier this year at MIT, has just published a 125-page proceedings document. The workshop discussed currently available information on the cost of CCS and the outlook for future deployment. One interesting finding is the opposing commercial logic in the US and Europe: in the US, physical CO2 is sold and used in secondary recovery; in Europe, there are just ‘paper contracts of uncertain value.’

One oil company that is playing the CCS game is Shell Canada, whose Quest CCS project has reached a one-year milestone. Quest has captured and stored one million tons of CO2, about one third of the emissions from Shell’s Scotford Upgrader near Fort Saskatchewan, Alberta. Shell and its JV partners Chevron and Marathon have put the engineering plans for Quest into the public domain so that others can use them to build future CCS facilities.


Cyber security round-up

Cyber news and views from SAP, Maana, Alert Logic, Attivo, Bayshore, Mocana and US NIST.

SAP blogger Brent Potts reports that cyberattacks currently cost businesses as much as $400 billion a year globally, a figure he sees rising to $90 trillion by 2030. Malicious cyberattacks are on the rise and the IoT revolution is adding to security concerns. Petrobras’ network was reportedly hacked by the US National Security Agency!

Maana is applying its AI system to phishing detection, a ‘major threat to oil and gas companies.’ Maana’s natural language processing helps analysts identify phishing attempts.

A study by Forrester Consulting for Alert Logic has found that ‘companies struggle to adapt security operations to cloud.’ The study, based on a survey of 100 IT professionals, found that businesses are challenged to recruit staff and secure funding for in-house security. The findings are music to the ears of Alert Logic, a provider of ‘security-as-a-service’ solutions for the cloud.

Attivo Networks received the 2016 Best of Show award in security at Interop Tokyo for its Deception platform, which provides inside-the-network threat detection for networks, clouds and ICS/scada environments.

Bayshore Networks’ IT/OT Gateway now has the capability to protect against malware attacks such as BlackEnergy, IronGate and Stuxnet. Policy-based security detects ICS/scada malware.

Mocana has teamed with Infineon to embed the latter’s Optiga trusted platform module into its ‘security of things’ platform. Optiga’s TPMs are standalone security controllers based on the international standards of the Trusted Computing Group.

US NIST Special Publication 800-46 is a 50-plus-page guide to enterprise telework, remote access and bring-your-own-device security. NIST has also released its strategic plan for NICE, the National initiative for cybersecurity education.


DNV-GL white paper on uncertainty in risk assessment

How to counter the perception that addressing uncertainty creates uncertainty.

A white paper from DNV GL investigates uncertainty in risk assessment. The 24-page document by Andreas Hafver et al. builds on research from the University of Stavanger. Regulators increasingly require uncertainty in risk assessment to be addressed as a prerequisite for improving safety. This brings a challenge in that addressing uncertainty may be perceived as creating uncertainty. For some, it is not clear how more focus on uncertainty adds value to decision-making.

The paper shows that uncertainty is an integral part of risk. Assessing and communicating uncertainty can help operators take safety-critical decisions with more confidence by understanding which uncertainties are important and if they can be reduced without compromising safety.

A picture that includes all uncertainties may overload the decision-maker and blur important messages. However, if uncertainties and their influence are understood, analysts can sharpen their message around those that matter. DNV advocates an iterative, top-down approach to risk assessment that is designed to minimize the assessor’s own bias and preferences.


Aveva, Capgemini team on engineering data in the cloud

Digital twin for engineering information management in the Amazon cloud announced.

At its 2016 World Summit user conference in New Orleans this month, Aveva announced a partnership with Capgemini to deliver a digital asset solution in the cloud. Aveva Connect is an engineering, design and information management software-as-a-service ecosystem to be delivered via Amazon web services.

The first component to be released is Aveva NET Connect, an IM offering to be delivered by Capgemini’s Technology Services unit. Capgemini’s Abdelmajid Boutayeb observed, ‘NET Connect means that a digital asset can be rapidly deployed in the cloud for cross-disciplinary information sharing. But instead of a simple ‘lift and shift’ to the cloud, we use our ‘Cloud choice’ methodology to enable the enterprise-ready, hosted solution.’

Aveva also announced a partnership with Aegex for the delivery of digital asset solutions in hazardous environments using Aegex’ ‘LFM Netview’ laser scan visualization and Aveva NET IM on the AEGEX10 intrinsically safe tablet.


Blockchain for gas transactions

SunGard coders compete to leverage Bitcoin-style technology in e-payment applications.

FIS unit SunGard, a financial software and technology specialist, reports on the ongoing operational and commercial changes in the US natural gas market, where midstream operators’ processes have been impacted by the development of shale reserves and the rise of gas production in regions previously underserved by gathering and transmission systems.

LNG and pipeline-based gas exports are opening up new commercial opportunities, particularly with the liberalization of the Mexican market. These changes argue for migrating operators’ IT systems to the cloud to mitigate spiraling costs and complexity.

Looking further ahead, FIS’ coders have demonstrated that blockchain technology (as used by Bitcoin) can underpin financial settlement, trading, clearing and payment against delivery. The prototype solutions were developed for FIS’ 2016 annual hackathon, held in Singapore last month, where 13 finalist teams competed. The winning team used open source frameworks to embed blockchain techniques into various ‘smart’ contracts.
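
To make the idea concrete, here is a minimal, generic Python sketch of the hash-chained ledger that underpins such applications. It is not FIS’ prototype; the contract identifier, volumes and payment figures are invented for illustration.

import hashlib, json, time

def make_block(payload, previous_hash):
    """Create one ledger entry whose hash commits to its payload and its predecessor."""
    block = {"timestamp": time.time(), "payload": payload, "previous_hash": previous_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

# Hypothetical delivery-versus-payment records for a gas nomination.
chain = [make_block({"genesis": True}, previous_hash="0" * 64)]
chain.append(make_block({"contract": "GAS-2016-11", "MMBtu": 10000, "delivered": True}, chain[-1]["hash"]))
chain.append(make_block({"contract": "GAS-2016-11", "payment_usd": 28500, "settled": True}, chain[-1]["hash"]))

def verify(chain):
    """Recompute each hash and check the back-links; any tampering breaks the chain."""
    for prev, curr in zip(chain, chain[1:]):
        body = {k: v for k, v in curr.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if curr["hash"] != recomputed or curr["previous_hash"] != prev["hash"]:
            return False
    return True

print(verify(chain))  # True until any record is altered after the fact

Each entry commits, through its hash, to its own content and to its predecessor, so a counterparty can detect any retrospective change to a delivery or payment record. A production settlement system would add digital signatures and a consensus mechanism on top.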


Warwick Analytics, Venture team on predictive analytics

University of Warwick spin-out offers big data analysis on-a-budget.

UK-based Venture Information Management has collaborated with Warwick Analytics on the use of predictive analytics to solve business challenges ‘on a budget.’ The buzz around big data and the internet of things has made decision makers aware of the potential of predictive analytics. But the approach usually means large budgets, long lead times, multi-stakeholder engagement and a team of data scientists.

Warwick Analytics, a spin-out of the University of Warwick, was built on ten years of academic research resulting in a number of ‘unique proprietary algorithms’ and features that are claimed to position it to solve business challenges ‘without the need for the expensive compute infrastructure that analytics requires.’

The solution is designed to work with dirty or incomplete data without the need for extensive clean-up projects. Data requires minimal manipulation compared to traditional analytical methods. WA’s toolkit classifies unstructured content with an automated information retrieval function. Up to 85% of the analytical process is automated, minimizing the cost of data scientist support.
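
By way of a generic illustration of the pattern, and emphatically not Warwick Analytics’ proprietary algorithms, the Python sketch below, which assumes the scikit-learn library, vectorizes free-text maintenance reports and trains a simple classifier to route new work orders to a likely failure category. The sample records and labels are invented.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training examples: raw maintenance text and an analyst-assigned category.
reports = [
    "pump seal leaking, vibration high on drive end",
    "compressor tripped on high discharge temperature",
    "valve actuator slow to respond, instrument air supply suspect",
    "seal oil loss on export pump, bearing noise reported",
]
labels = ["seal failure", "overheating", "instrumentation", "seal failure"]

# Bag-of-words features plus a linear classifier stand in for the automated
# information retrieval and classification step described above.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
model.fit(reports, labels)

print(model.predict(["new work order: leaking mechanical seal on injection pump"]))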

Venture envisages applications in maintenance optimization, with real-time integration of sensor data and analysis of maintenance reports and other data to locate the root cause of issues in downstream and upstream assets. Another use case is production optimization, increasing data re-use and reviewing historic upsets to eliminate losses and reduce inefficiencies. WA’s SigmaGuardian adds root cause analysis functionality to the offering.


Energy from gravity?

Dutch scientists and architects come up with a ‘sustainable energy generator.’ Do try this at home!

Shakespeare’s Falstaff, in Henry IV Part 2, twisted the Chief Justice’s words on the ‘effect of gravity’ into a wisecrack on the ‘effect of gravy, gravy, gravy.’ However, these Dutch architects and scientists are not joking with their claim to have ‘physical proof for energy from gravity.’

Janjaap Ruijssenaars (Universe Architecture, Amsterdam) and scientists from VIRO in Hengelo have been working on what they claim is a ‘sustainable home energy generator.’ The device combines gravity with mechanical instability to ‘improve the efficiency of a piezoelectric generator.’

An explanatory video can be viewed on the Gravity Energy website, where it is clear to anyone with a passing acquaintance with science that the contraption cannot possibly generate anything other than hot air. Nonetheless, 10% of the shares of Gravity Energy, the Amsterdam-based company that will license the patented invention, were snapped up by angel investor Jeroen van den Hamer, who believes that ‘this could be a game changer.’ Or, as Falstaff might have put it, a first step on the gravy train.

