March 2014


200th issue survey results

Oil IT Journal’s anniversary survey results show strong endorsement for expanded coverage. Opinions divided on print vs. online. Readers’ ideas on what to fix and how to stay relevant!

Last month’s survey brought a strong endorsement for Oil IT Journal’s evolution from a relatively narrow publication focusing on E&P data to its current broad coverage of IT as it impacts oil and gas. Almost 90% of responses were ‘broad coverage is good,’ supported by comments such as, ‘we can learn from nearby industries,’ ‘you reflect the growing inter-connectedness of systems’ and ‘interdisciplinary use of data means the broader scope makes sense.’

Opinions were divided on the print vs. online edition. 60% read Oil IT Journal mostly online, 30% read both editions and 11% of respondents read only on paper although this is likely a biased sample.

All of our regular features were rated as ‘important, a regular read’ with the ‘Software, hardware short takes’ section just beating the Editorial for the top slot. Similarly, coverage of all of the ‘occasional’ features was rated as ‘about right’ with, interestingly, the ‘Going green’ section just leading the ‘Consortium corner’ feature for popularity. Negative (‘do not read’) scores for both regular and occasional features were all around or below 10%.

Asking ‘Why do you read Oil IT Journal’ was, of course, fishing for compliments and we were not disappointed. An upstream IT professional working for a large independent summed things up as follows: ‘I have been working in oil and gas IT for over 20 years, and Oil IT is my leading source for information. Editor Neil McNaughton has a keen mind and a talent for absorbing and summarizing information, providing context across an impressive breadth of topics. It is valuable to find a publication that takes such highly technical subject matter and renders it readable for those who are not experts in the field. Reading Oil IT piques my curiosity and encourages me to learn more! It’s helpful to read about the conferences that I cannot attend, and even more interesting to compare Oil IT with my own notes on conferences that I did attend, to get additional insight and see what I may have missed.’

Ideas for improving Oil IT Journal included a ‘major website re-vamp’ (the current site was described as like a village newsletter from the early days of the web), an iPad edition, more interviews and more on operational and implementation experiences. There were also calls for some graphics, better article tagging, fewer acronyms and a blog for reader feedback.

One reader just advised, ‘Stay relevant and focused on what you do best, Oil IT Journal is one of the very few magazines that most of my personnel read cover to cover every month. It does a great job and fulfils its mandate well.’ We are keeping the survey live for a few more weeks so you can still contribute to the debate on OilIT.com.


iOps Center, Austin

Emerson proposes a move from ‘distant and dangerous’ workplaces to a centralized integrated operations lab for ‘next generation’ decision making.

Emerson Process Management has announced an ‘integrated operations’ (IO) initiative that combines its technology, a real world ‘experiential’ lab and consulting services. The initiative promotes accessible expertise and the ‘safe, collaborative collocation of essential personnel.’ Emerson’s Peter Zornio explained, ‘Running production operations is challenging in today’s dull, distant, dirty and dangerous locations where few people want to work. A problem compounded by the cost and scarcity of skilled workers.’

IO allows cross-functional teams to operate from ‘more desirable’ locales, leveraging video conferencing and real-time access to asset data to streamline decision making. Emerson is offering a test bed for IO deployment in the form of its ‘iOps Center’ located in Austin, Texas. Here a working model of a production enterprise lets customers experience Emerson’s ‘next generation’ cross discipline decision making.

Emerson’s Jim Nyquist added, ‘We have partnered with Dell, Barco, Cisco, OSIsoft and others and can now give customers a clear vision of what’s possible.’ Emerson is now looking to expand its iOps network to other locations. More from Emerson.


Why has IT not made us more productive?

Editor Neil McNaughton offers a couple of ‘third party’ debunks before investigating why information technology’s early productivity gains have stalled. The answer is that vendors, from Intel through Microsoft to Apple, get paid on a ‘per seat’ basis. Anyone for a conspiracy theory?

The hugely gratifying response to our questionnaire last month showed (see this month’s lead), inter alia, that readers like Oil IT Journal when it is debunking stuff. Personally I like it even better when we can report on others who debunk stuff for us. So just to kick off this 200th editorial, a couple of recent third party debunks...
You have probably heard many speakers exhort you to ‘pick the low hanging fruit,’ with the implication that this will lead to the best allocation of resources. Well our first debunker, McLaren Software’s Tim Fleet said earlier this year1, ‘don’t pick the low hanging fruit.’ Doing so makes for projects that, while easy to implement, are unlikely to represent the real world. Shell’s Johan Stockmann should get a debunker’s award for his two contributions made at the SMi E&P data conference last month2. Stockmann urges us to ‘forget the single version of truth’ because there isn’t one and also to avoid ‘buy not build’ blindness. In other words there is plenty of stuff that is best done in-house.

~

In the February 4th issue of the Financial Times Martin Wolf was inspired by a new book3 on the ‘next big thing’ to hit us all, the arrival of smart machines that will eat (more of) our jobs. Not having read the book I will refrain from comment on this theme. But one of Wolf’s asides got me thinking along lines that have previously
landed me in trouble. Which is a good reason to revisit the topic. Wolf cited Robert Solow who, back in 1987, quipped that, ‘we see information technology everywhere except in the productivity statistics.’ Wolf reports trends in output per hour in the US as ‘quite mediocre.’ I would like to suggest why this might be so.

A remark attributed to IBM founder Thomas Watson put the worldwide market for computers at ‘maybe five.’ While this is both apocryphal4 and wrong, it reflects what was at the time (1943!) a possible trajectory for the fledgling industry. The following decades saw a market of more than five. But the paradigm of big machines doing a lot of stuff, increasing productivity and destroying clerical jobs largely continued.

Early computers were also designed to be programmed by subject matter experts. They were programmed in languages that were tuned to the task in hand. Cobol for business use and Fortran for scientists and engineers. This situation offered an effective way to harness the power of computers in service of the business—a highly efficient paradigm that brought the early productivity gains. Both languages by the way are still in widespread use5.

Then came the personal computer which threw a spanner in the works, leading to the situation we have today of a huge IT industry that is dependent on having a large number of ‘bums on seats,’ all using software that is tuned to individual use. Practically all upstream software outside of seismic processing falls into the same ‘single user’ category. In fact, although big computers are commonplace, their use cases are somewhat forced. Running multiple simulations for Monte Carlo analysis is OK, but what about running the business?

Across industry, in finance, engineering and just about everywhere else, the worst offending (and most ‘personally productive’) application is of course, Excel. The ubiquitous spreadsheet allows the individual subject matter expert to develop ‘solutions’ to just about anything. But their scope is limited to the user’s own desktop. Gathering the results from such dispersed activity is hugely problematic in the enterprise.

If we now turn to the tools of the trade, the old domain specific languages, these get short shrift from the programming community. Their crime? They are just old and worn out. Vendors have replaced them with a plethora of languages that suit their own purposes. These may include worthy aims (although not necessarily yours), like working well with the web. Some aims are less worthy, like vendor lock-in. Subject matter experts wanting to develop solutions to business problems are faced with near insurmountable barriers of programming complexity and may have to throw in the towel and abandon development to ‘IT.’

You may think this is all a bit cranky and it probably is. But bear with me and try an economic analysis of the situation—in other words, ‘follow the money.’ Imagine that somehow, a substantial productivity gain, as seen in the early days, was achieved today. What would be the effect of this on the likes of Intel, Microsoft and Apple? It would be disastrous! Fewer bums on seats, fewer chips, software licenses and hardware sold! This is definitely not how these players see their future. Much marketing effort is directed at avoiding any real rationalization, pitching instead, tools for oxymoronic ‘personal productivity.’

In the 1870s, Mr. Remington designed his typewriter keyboard to decrease productivity, with a sub-optimal placing of the keys to slow down typing and avoid jams. In my considered and conspiratorial opinion, Microsoft’s ‘ribbon’ was designed with similar intent—to slow things down and reduce productivity. I sincerely hope that the ribbon will not be with us for as long as the qwerty keyboard. If it is though, what a triumph for futzing!

~

I realize now that our 200th edition online questionnaire may have disenfranchised some of our print copy readers. If you would like to contribute to the debate, the questionnaire will remain up for another month and you can email us at info@oilit.com. Hey, you can even write us a letter and send it to The Data Room, 7 Rue des Verrieres, 92310 Sevres, France.

1 Oil ITJ January 2014.

2 See page 6 of this issue.

3 The Second Machine Age, Erik Brynjolfsson and Andrew McAfee.

4 Wikipedia.

5 ibid.

Follow @neilmcn


Review - Handbook of Scada/Control systems security

Robert Radvanovsky and Jacob Brodsky’s Handbook looks into what has gone wrong in the ongoing convergence of IT and process control from a mostly managerial standpoint.

The 350 page Handbook* offers an exhaustive analysis of what could and, to a degree, has gone wrong in the ongoing convergence of IT and process control systems. A chapter on ‘social and cultural aspects’ emphasizes the different cultures and opinions of the stakeholders. Engineers may doubt the reality or likelihood of cyber attacks on their system or, when confronted with the evidence, may just want to unplug control systems from the business network. For the authors, convergence is inevitable and unplugging is unrealistic.

The handbook is multi-authored and although the chapter titles ostensibly distinguish different facets of the problem set, there is inevitably overlap and repetition. A long chapter on ‘risk management’ is followed by a whole section on ‘governance and management.’ The book takes something of a ‘managerial’ view of the problem. In this reviewer’s opinion it is rather lacking in the technical detail one might expect in a ‘handbook.’

For instance, more detailed information on different scada vintages and their vulnerabilities would have been useful. But to whom? That is the problem with the whole field of security, scada or otherwise. As one of the concluding chapters has it, there are ‘issues’ with sharing information and data in that it may be used by the hacking community.

The chapter on ‘physical security’ makes a good point that security requires an enterprise approach and that the network is not ‘the center of the universe.’ This is followed by a chapter on ‘red/blue team tabletop exercises.’ All jolly good stuff but not really scada specific.

Brodsky’s chapters on communication and network topologies get more technical but that’s only about 20 pages worth! A short chapter on data management asks, ‘why do we collect and keep so much log file data,’ without giving a clear cut answer, although an earlier chapter on integrity monitoring advocates the use of log files as an ‘oft-overlooked but critical component of a secure scada system.’

An appendix mentions a project of potential interest to Oil IT Journal readers—Linking the oil and gas industry to improve cybersecurity (Logiic), an ongoing initiative run by five major oils, the US Department of homeland security and the Automation federation.

The authors state that the Handbook is ‘work in progress.’ It is also, as far as we know, the only show in town. As such it makes a good first pass at collecting a lot of diverse information but it suffers from its multiple authorship and from a governance-led, rather than technology-led, approach. It would have been good, for instance, to have made clearer the distinction between IT security and what is specific to scada. Readers are likely to come from either the IT camp or process control and more technical explanatory detail might facilitate communication between the two camps.

* A handbook of Scada/Control systems security by Robert Radvanovsky and Jacob Brodsky—CRC Press 2013. ISBN 9781466502260.


Cyber security round-up

CSC’s incident response. Industrial Defender acquired by Lockheed Martin. NIST’s cybersecurity framework. Scada ‘pain points.’ Rockwell on why ‘Pandora is really out of the box!’

Computer Sciences Corp. (CSC) is offering a cyber incident response service to clients worldwide. Following a cyber-security incident, CSC provides access to trained professionals along with technical and strategic capabilities delivered through a ‘retained pricing’ model that is claimed to lower the financial and technical risks of future incidents.

While CSC’s public sector clients have benefitted from the service for the past two decades, commercial organizations are ‘ill-equipped and unprepared for cybersecurity incidents.’ A 2013 Ponemon Institute study, ‘The Post-Breach Boom’ found that only 41% of US companies had the tools, personnel and funding to prevent breaches. Moreover only 39% could minimize damages if breached. CSC has teamed with Co3 Systems and Alvarez & Marsal on the new offering. More from CSC.

Lockheed Martin has acquired Industrial Defender, a provider of cyber security solutions for control systems in the oil and gas, utility and chemical industries. The terms of the agreement were not disclosed but are ‘not material to’ Lockheed’s operations.

Lockheed chairman Marillyn Hewson said, ‘ID’s experience of addressing cyber threats to industrial control systems complements our own IT cyber security expertise.’ ID CEO Brian Ahern added ‘We share a perspective on the importance of protecting critical infrastructure from an increasingly hostile landscape. Our combined capabilities make up a comprehensive suite of technology and services designed to face modern day threats to operations and the business.’

The US National Institute of Standards and Technology (NIST) has released version 1.0 of its cybersecurity framework designed to protect the nation’s financial, energy, health care and other critical systems from cyber attack. The 40 page document provides a taxonomy and mechanism for organizations to assess their current cybersecurity posture and set a ‘target state’ and a means to achieve the same. NIST has also released a roadmap document to accompany the framework.

Inductive Automation (IA) has published a Scada ‘pain point’ graphic explaining how Scada is broken and what IA’s technology does to fix it. Some 19 pain points are enumerated, including the absence of a SQL database, no support for OPC-UA and more.

Rockwell Automation and Cisco have also jumped on the cybersecurity bandwagon with a new ‘Converged plant-wide Ethernet architecture,’ as reported in a recent Control Magazine piece by Jim Montague. This includes an amusing citation from Cisco’s Rick Esker viz., ‘With Stuxnet and its 85 families of worms, Pandora is really out of the box.’ Rick needs to re-read his classics!


Norwegian EPIM updates its data hubs

CGI, Computas, TopQuadrant and OpenText power LogisticsHub and License2Share.

The E&P Information Management Association (EPIM), a not-for-profit organization run by oil and gas companies operating on the Norwegian continental shelf, has issued two contracts for its upstream information hubs. EPIM has awarded semantic data specialist TopQuadrant and systems integrator Computas a contract to develop the EPIM LogisticsHub (ELH).

ELH will leverage TopQuadrant’s ‘TopBraid’ semantic technology to improve tracking of cargo carrying units (CCUs) used to ferry equipment to and from offshore platforms and drilling rigs. The solution will serve as the common knowledge base for sharing event data received from RFID tags to capture data across the Norwegian offshore supply chain. A pilot implementation of the hub is to be delivered in 2014, followed by full-scale implementation in 2015. TopQuadrant’s technology has already been used in EPIM’s environment and reporting hubs. More from Computas and TopQuadrant.
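The knowledge-base approach described above can be illustrated with a minimal RDF sketch. The Python/rdflib snippet below records a hypothetical CCU tracking event as triples; the namespace, class and property names are assumptions for illustration only, not the actual ELH vocabulary.

# Minimal sketch of an RFID tracking event recorded as RDF triples.
# The vocabulary (namespace, classes, properties) is hypothetical - it is
# not the actual EPIM LogisticsHub model.
from rdflib import Graph, Namespace, Literal, RDF
from rdflib.namespace import XSD

ELH = Namespace("http://example.org/elh#")   # assumed namespace
g = Graph()
g.bind("elh", ELH)

ccu = ELH["CCU-12345"]                       # a cargo carrying unit
event = ELH["event-2014-03-01-0001"]         # one RFID read event

g.add((ccu, RDF.type, ELH.CargoCarryingUnit))
g.add((event, RDF.type, ELH.TrackingEvent))
g.add((event, ELH.concernsUnit, ccu))
g.add((event, ELH.readAt, ELH["SupplyBase-A"]))
g.add((event, ELH.timestamp,
       Literal("2014-03-01T08:30:00", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))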

EPIM has also awarded a contract to IT behemoth CGI for the operation and further development of its cloud-based License2Share (L2S) system that manages oil and gas activities on the Norwegian continental shelf and internationally. The contract runs through to 2017 and has a total value of 74 million NOK. L2S provides operators, partners, engineering companies and governments with common IT tools to use in their day-to-day work. L2S is based on OpenText’s content management system and the current solution contains over three million documents and is used by over 7,500 users.

L2S has had an interesting history. In 2011, EPIM engaged Logica (now CGI) to develop a substitute for the LicenseWeb and AuthorityWeb reporting standards. In 2012, L2S was accepting reports from operators as PDF documents. Last year the initiative was presented (by Logica—0603) as a poster child for none other than Microsoft’s ‘Mura’ upstream reference architecture. L2S was said to ‘demonstrate the power of Mura’s foundational principles.’ More from CGI and EPIM.


FracFocus report, citing public concern, advocates more openness

Task force recommends reducing trade secret wriggle room for chemical disclosure.

The US Secretary of Energy’s task force has issued its report on FracFocus 2.0, a public registry for the disclosure of the chemical constituents of frac fluids. The review was kicked off last year to assess improvements in shale gas production, best practices and disclosure with the objective of reducing the environmental impact of unconventional shale gas production.

A key finding of the review is that some 84% of the wells registered with FracFocus invoked the ‘trade secret’ exemption for at least one chemical. The reviewers concluded that the data ‘does not suggest the level of transparency and disclosure urged by the task force. [...] More can be done.’ The goal is for ‘very few’ trade secret exemption claims from disclosure. The public is clearly concerned about the nature of the chemicals used in hydraulic fracturing and it is in the industry’s interests to meet this concern.

On the plus side, the upgrading of the old FracFocus registry to a ‘risk-based data management system’ is a step in the right direction. The task force recommends that DOE fund FracFocus to enhance the website, suggesting improvements such as improved public search, improved accessibility for data scrapers and batch downloads of PDFs. A more radical suggestion is that the whole database should be downloadable in ‘raw, machine-readable form.’

The authors believe that such improvements could be realized ‘at modest additional cost’ and would enhance FracFocus’ position as a valued source of publicly disclosed data with a ‘significant degree of uniformity.’ Download the task force’s report.


Remote geosciences data visualization

ISN trials Nvidia/Cisco proof of concept graphics system to ‘see if hype is justified.’

In our interview with Paradigm’s Urvish Vashi last month we discussed the problems of remote visualization of highly graphical applications in the cloud. ISN CTO Paul Downe addressed this in a blog post last month, outlining the rationale behind running geosciences apps remotely, a trend that he describes as ‘one of the newest and most interesting IT concepts in the industry.’

Remoting geosciences apps brings opportunities for data consolidation and centralisation, enhanced security, better collaboration, reduced data duplication and improved version control. Many companies today operate with high-end graphics-intensive workstations with local data storage. Frequently there will be multiple workstations in different locations around the world, each with copies of the same data. Moving to the cloud centralizes data and graphics-intensive processing power that is accessible from anywhere and from any device.

ISN and partners Cisco and Nvidia are running a proof of concept system to trial remote applications on real customer data ‘to see if the hype is justified.’ The setup comprises a Cisco C240 M3 server and Nvidia’s VGX K2 at the data center. This offers ‘outstanding’ rendering performance from twin Kepler GPUs, each with 1536 CUDA cores and 4GB of video RAM.

At the client end, ISN proposes a Citrix XenDesktop with HDX remote desktop. WAN capability is optimized with Citrix’ CloudBridge which reduces bandwidth requirement and application traffic considerably. Downe will be reporting on the system tests over the next few weeks. Follow him here.


Software, hardware short takes

SCM, Blue Marble, Calsep, Baker Hughes, WellDog, Dassault Systèmes, Invensys, Exprodat, Qbase/MetaCarta, Midland Valley, Norsar, ExxonMobil, Providence Photonics, PVI, Noumenon, Orange.

SCM E&P Solutions has released new ‘tips and tricks’ for Schlumberger’s Petrel Studio 2012, a compendium of ergonomic shortcuts for the popular geosciences interpretation suite.

The 2014 edition of Blue Marble’s Geographic calculator includes display of EPSG ‘area of use’ polygon data, improved vertical coordinate system handling and new ‘land survey summaries’ for use with Canadian survey systems.

Calsep has added a new interface to Halliburton’s Wellcat engineering software to the 21.2 release of its PVTsim fluid compositional modeler.

Baker Hughes and WellDog’s co-developed ‘AquaTracker’ is a permanent multi-zone aquifer monitoring system for coal seam gas producers.

Dassault Systèmes has released the 2014 edition of its 3DExperience product design and lifecycle management platform, heralding a ‘broad move’ of its software to the cloud.

Invensys’ SimSci APC 2014 combines an intuitive graphical user interface with a ‘rigorous and robust’ calculation engine for advanced process control applications, leveraging a ‘natural workflow’ that includes support for model case file development and connectivity with digital control systems.

Exprodat has upgraded its suite of Esri ArcGIS-based upstream applications with new play-based exploration workflows. A data assistant transfers data between commonly used oil and gas data formats and ArcGIS. And Team-GIS’ ‘Unconventionals analyst’ streamlines reserves booking and well pad planning workflows for factory drillers.

Qbase has announced MetaCarta for SharePoint 2013, a plug-in that ‘geo-enables’ structured and unstructured content inside the SharePoint environment. MetaCarta combines map-based geographic search with traditional text and keyword search. Its geo-parsing solution uses natural language processing and what is claimed to be ‘the world’s largest gazetteer’ to identify and disambiguate geographic references.

Midland Valley Exploration (MVE) has retooled its structural geological modeler around a core geological toolkit, ‘Move.’ The 2014 release combines model building, analysis and surface management functions. A new stress analysis tool analyzes fault and fracture systems under a user-defined 3D stress state for risk evaluation of complex reservoirs, earthquakes and CO2 storage. The new release also offers 3D seismic data and imagery/terrain model manipulation.

Norsar’s SeisRox 3.0 introduces a new full-field workflow for forward modeling of the pre-stack depth migration response of large reservoir models. Models can be imported from Schlumberger’s Eclipse or built from imported surfaces or NORSAR-3D models. The new release models coil and other complex shooting geometries. SeisRox runs on 64-bit Windows and Linux.

ExxonMobil Upstream Research has awarded a commercial license of its InteliRed gas detection system to Providence Photonics. InteliRed uses artificial intelligence to analyze infrared camera images and detect escaping hydrocarbon gases. The system, which was co-developed with PP, has application in refineries, LNG and gas processing facilities.

Pegasus Vertex’s Cempro+ adds zonal isolation quality assurance calculations, integrating mud removal optimization and solving 3D momentum and continuity equations for fluid concentrations and displacement efficiency.

Noumenon has announced a ‘speed reader’ for relational databases. Originally developed for use with its SP3/XMpLant plant data store, the technique promises hundred-fold speedup for any large database, especially Oracle.

Orange Business Services has hiked its satellite service to the Americas with more bandwidth, new offshore licenses and two new teleports. The offering targets inter alia the oil and gas vertical.


Energy Exchange Smart fields summit, Houston

HighMount on automation, data management and Spotfire-based analytics. Texas Energy Network.

Around a hundred attended the inaugural Energy Exchange Smart field summit held earlier this year in Houston. John Argo described HighMount E&P’s data management and automation strategies to optimize operations at its key Sonora asset. Argo described Sonora as a ‘gas factory’ with its 6,000 wells and one of the largest deployments of electronic flow meters in the world. A high bandwidth wireless Scada network connects these to the office.

HighMount has replaced its legacy Excel analytics with Spotfire-based tools for production surveillance and analysis. These are used to target underperforming wells for intervention, prioritizing high potential wells currently producing, say, under half of their 120-day average.
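As a rough illustration of the screening rule described above (not HighMount’s actual Spotfire logic), a minimal pandas sketch could flag wells whose latest rate has fallen below half of their trailing 120-day average:

# Minimal surveillance screen: flag wells producing under half of their
# trailing 120-day average rate. Column names and the 50% threshold are
# illustrative assumptions, not HighMount's actual rules.
import pandas as pd

def flag_underperformers(df, threshold=0.5, window_days=120):
    # df columns assumed: well, date, rate (one row per well per day)
    df = df.sort_values(["well", "date"])
    df["avg_120d"] = (df.groupby("well")["rate"]
                        .transform(lambda s: s.rolling(window_days, min_periods=30).mean()))
    latest = df.groupby("well").tail(1)         # most recent reading per well
    return latest[latest["rate"] < threshold * latest["avg_120d"]]

# usage: candidates = flag_underperformers(production_df)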

Information is presented to stakeholders along with the economics and estimated return on investment of the intervention. Historical data is leveraged in a well failure analyzer that tracks failures back to 2006. More historical data has allowed HighMount to optimize its routing. This was previously designed to minimize drive time between locations. Now routing includes a weighting for individual opportunities based on well type and characteristics. Route optimization and increased automation have significantly reduced manpower requirements and redirected efforts to higher potential horizontal oil wells. Overall the initiative has resulted in around $2 million in annual savings along with around $5 million worth of increased production.

Greg Casey presented the Texas energy network (TEN), an LTE network dedicated to oil field connectivity. TEN provides internet connectivity thanks to a secure ‘carrier class’ 4G network, an oil and gas focus and specialized customer equipment. TEN promises rapid deployment of mobile field offices on drilling rigs and transmission of telemetry data to the office. An ‘off-net’ function provides connectivity with sites outside of the TEN coverage area. TEN operates in Texas’ Permian basin, Eagle Ford and panhandle. Clients include Shell, Devon, Apache, Anadarko and Chesapeake. More from TEN. The Energy Exchange is planning a second edition of the Summit in January 2015.


SMi E&P Data Management, London

BG on managing ‘annoying’ Petrel data. Shell, ‘upstream data is different, avoid buy not build blindness.’ Big data capability and JBoss/Teiid data virtualization for Landmark’s re-vamped DecisionSpace. Repsol transforms with information. ExxonMobil, ‘give end users data management tools.’ OMV formalizes its information management following reorganization. GDF Suez’ SharePoint-based ‘one stop’ information shop.

Chairman Fleming Rolle (Dong), introducing the first speaker, BG Group’s Sherwin Francis, observed that ‘few have cracked the (ubiquitous) data management problem around Schlumberger’s Petrel.’ As Francis explained, Petrel’s file-based data structure and absence of a database make central control challenging. Users can create projects at will, duplicating projects and reference data. Such flexibility is fine for users but a nightmare for data managers. Petrel project management tools exist: Schlumberger’s own ProSource and the Petrel project reference tool. The Innerlogix data cleansing suite can also be used to manage projects and a new Petrel Studio database tool is currently in pilot. Third party tools such as Blueback’s project tracker are also under study.

Petrel project managers must choose between various scenarios and data duplication strategies. These include a single master reference project, multiple connected reference projects or many stand alone projects. This kind of challenge did not exist in the days of OpenWorks or Geoframe. BG currently uses a Seabed/Recall master database and Innerlogix to create Petrel master and reference projects from which individual working projects are derived. In the future, Studio will sit between Seabed and individual projects, feeding them directly in what is described as more like the old Geoframe/OpenWorks type of scenario. Francis wondered why Schlumberger had not provided this functionality years ago. Further improvement should be achievable using Blueback’s Project Tracker which gives a good view of shared storage through scheduled scans, populating a SQL database and providing a geographical representation, pinpointing inconsistencies in data and reference relationships, monitoring use and managing disk space. The effort has been worthwhile for BG which now has an automated process for managing projects globally and fewer unused projects and duplicates.
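The kind of scheduled scan described above can be sketched generically: walk shared storage, record each Petrel project folder and its size in a small SQL database for later reporting. The snippet below is a generic illustration, assuming projects are identified by a .pet file; it is not Blueback Project Tracker code.

# Generic project-tracking scan: find Petrel projects on shared storage
# (assumed to be identified by a .pet file) and log them to SQLite.
# Illustration only - not Blueback Project Tracker.
import os, sqlite3, time

def scan_projects(root, db_path="projects.db"):
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS projects
                   (path TEXT PRIMARY KEY, size_mb REAL, scanned REAL)""")
    for dirpath, dirnames, filenames in os.walk(root):
        if any(f.lower().endswith(".pet") for f in filenames):
            size = sum(os.path.getsize(os.path.join(dp, f))
                       for dp, _, fs in os.walk(dirpath) for f in fs)
            con.execute("INSERT OR REPLACE INTO projects VALUES (?,?,?)",
                        (dirpath, size / 2**20, time.time()))
            dirnames[:] = []   # do not descend further into this project
    con.commit()
    con.close()

# usage: scan_projects(r"\\server\petrel_shared")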

For instance, Blueback Tracker found six different Petrel versions in use at one small asset where around 180 projects were compressed to about 20. Keys to success were management support, a rational directory structure and the implementation of interpretation standards, best practices and archival. It is also a good idea not to run more than one or two Petrel versions.

Francis added that onboarding users, by explaining the benefits of the structured approach to Petrel projects, was key. An observer agreed that the key to data management success is participation. Data managers should not sit on the sideline. They need to know what the team is doing and talk daily with end users. ‘Petrel is a very annoying tool. Blueback’s tools are a way of getting them out of trouble but it is not a solution.’

Johan Stockmann offered Shell’s perspectives on data architecture and management. The upstream is different. While the aim is, as in other verticals, for trusted data, this needs to be set against the ‘large volumes of expensive, idiosyncratic E&P data.’ There are moreover, ‘high levels of uncertainty’ and importantly, ‘no single version of the truth—forget it!’ Enterprise architecture is designed to accommodate such multiple parallel truths. Shell’s EA is loosely based on the Open Group’s framework (TOGAF). The ‘system agnostic’ data model is defined down to a level that will support automation. Key data definitions are owned by senior staff with a methodological mindset and a great network.

Stockmann was scathing of ‘buy vs. build blindness’ and of the ‘local lobby’ which has it that ‘central standards are too expensive’. Quality is not the only problem. Today, solution providers need to include a fully specified data model and knowledge as to where data will come from. Web services are OK but you still need to understand sources, creators and usage—‘it all needs to be thought through.’ On the E&P ‘difference,’ iterative workflows are particularly demanding of data management, multi dimensional data is not well suited for relational databases or models. And real time sensor data makes for ‘interesting architectures.’ While Shell is an active supporter of standards both industry-specific and horizontal, ‘no single standard supports all our needs’.

Far too much time is spent on data migration, search and clean-up—even by senior execs who shouldn’t be futzing with spreadsheets at midnight! By doing the architecture right, Stockmann expects to reduce the data overhead by 66%.

According to Chaminada Peries, Landmark’s software unit has had its head down for the past couple of years but is now about to release new products under the DecisionSpace banner. These will include workflow tools, dev kits, ‘big data’ capabilities and, because desktop deployment ‘is no longer an option,’ cloud functionality. The new framework will ‘acknowledge and work with third party apps’ (Petrel was not specifically mentioned but that is what we understood).

Landmark is trying to regain the enterprise high-ground with a solution that, it is claimed, reduces data duplication and the dependency on files and application data stores, while improving security and data audit. This is to be achieved thanks to ubiquitous data virtualization, leveraging the open source JBoss/Teiid virtualization engine. This underpins the DecisionSpace data server with connectors to Petrel, Recall, PPDM, SAP and other industry staples.

Data is retrieved as OData services, a ‘powerful and open protocol.’ Halliburton is a contributor to the OData community. Also on offer is a ‘unified API’ for text and spatial search across all data, GIS integration and Hadoop-based business intelligence. Stand-alone apps like WOW and PowerExplorer have been re-tooled to the web platform to take advantage of the ‘new reality’ of the cloud. DecisionSpace is now presented as a ‘platform as a service.’ Developers write plug-ins to the cloud-based infrastructure. More from Landmark.
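OData queries are plain HTTP with standard query options, so consuming such a data server needs little more than a REST call. The sketch below shows a generic OData request in Python; the service URL and entity/field names are hypothetical and do not correspond to an actual DecisionSpace schema.

# Generic OData consumer: query an entity set with $filter/$select/$top.
# The service root and entity/field names are hypothetical assumptions.
import requests

BASE = "https://example.com/odata"              # assumed service root

def get_wells(field_name, top=50):
    params = {
        "$filter": "Field eq '%s'" % field_name,
        "$select": "WellName,SpudDate,TotalDepth",
        "$top": str(top),
        "$format": "json",
    }
    r = requests.get(BASE + "/Wells", params=params, timeout=30)
    r.raise_for_status()
    return r.json().get("value", [])            # OData wraps results in 'value'

# usage: for w in get_wells("Forties"): print(w["WellName"])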

Pat Merony and Gema Santos Martin presented Repsol’s ‘Transform with information’ initiative. This blends people, processes, tools and a ‘lean/kaizen’ approach. At the heart is an in-house developed ‘GeoSite’ portal providing a single point of access to information. Repsol has elevated information management to be on an equal footing with other upstream disciplines like geology, geophysics and reservoir engineering. A comprehensive service catalog has been drafted leveraging concepts from DAMA, CDA and PPDM. Attention has also been placed on career paths, with potential promotion to data advisor and/or architect roles, and on competency development. Asked if candidates were ‘knocking at your door now?’ Santos Martin replied modestly, ‘Not yet but they are not busting down the door to get out!’

Keith Roberts (ExxonMobil) has been working in the upstream for 33 years and remembers the days of the drawing office and typing pool. Things changed with the arrival of the PC and desktop applications that let people do stuff for themselves. Exxon’s aim is to do the same for data management—putting it in the hands of the end user.

The data manager’s role will have to change in the face of trends like the big crew change (already a reality for Exxon), which is driving productivity, increasing data volumes and pressure on cost. The future will see greater technology integration, ‘Petrel is just the start of it.’ The plan then is to ‘give users the tools they need for data management and get out of their way.’ Companies also need to move on from data ‘schlepping,’ shoveling stuff around, to adding value through data science and forensics. Exxon is looking at the cloud on a case by case basis but has concerns regarding bandwidth and security.

Juergen Mischker traced OMV’s three year journey to formalize its information management following a major reorganization in 2011. This saw the introduction of corporate information governance and data management disciplines—both anchored in E&P (rather than in IT). IM is now a key process owned by a senior E&P VP. A new data enhancement project has improved the quality and quantity of well data.

David Lloyd described how GDF Suez is supporting its growing operations with a one stop shop, The Portal, providing access to information, documents and data. Users are presented with a customizable SharePoint-based web client. SharePoint has its critics, but it makes for an affordable solution and provides basic document management functionality out of the box.

Deployers can buy third party web parts to plug SharePoint’s gaps. Lloyd advocates a ‘reasonable’ information architecture with drill down by function, asset, project and well. SharePoint TeamSites is used for calendars, meetings, workflows and ‘presence management,’ showing when people are online.

Data cleanup was a prerequisite. On just one shared drive, there were 2.5 million files (almost half a million duplicates) in 184,000 folders. Getting rid of the mess can even reduce your carbon footprint!
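A first-pass duplicate hunt of the kind needed for such a cleanup can be done by grouping files on size and then hashing the candidates. The sketch below is a generic illustration, not GDF Suez’s actual tooling.

# Minimal duplicate-file finder: group by size first (cheap), then confirm
# with an MD5 hash. Generic illustration only.
import os, hashlib
from collections import defaultdict

def find_duplicates(root):
    by_size = defaultdict(list)
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                by_size[os.path.getsize(path)].append(path)
            except OSError:
                pass                              # skip unreadable files
    by_hash = defaultdict(list)
    for paths in by_size.values():
        if len(paths) < 2:
            continue                              # unique size, cannot be a dupe
        for path in paths:
            with open(path, "rb") as f:
                by_hash[hashlib.md5(f.read()).hexdigest()].append(path)
    return [group for group in by_hash.values() if len(group) > 1]

# usage: for group in find_duplicates(r"\\server\shared"): print(group)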

But the real challenge is convincing users of the merits (and obligations) of good information management. Other components of GDF Suez’ IM solution include Flare’s E&P Catalog, Livelink and OSIsoft PI. The Portal also opens up to vendor data from Whatamap, Hannon Westwood, IHS, DEAL, CDA and others. Unified communications and video chat with Microsoft Lync also ran. SharePoint may not be the best user interface around but it provides flexibility. GDF has three developers working on customization. There are issues with SkyDrive and various Office vintages. TeamSites can ‘spiral out of control.’ All of which need managing. There is ‘a lot going on backstage.’ GDF uses PRINCE II and ‘agile’ methods. It has been a ‘massive scary journey but fun!’ More from SMi Conferences.


Standards leadership council public meeting, Paris

Total CTO on need for collaboration. BP on the 21st century digital oilfield. More from PPDM, OMG.

In his keynote address to the Standards Leadership Council, Total Group CTO Jean-Francois Minster described a ‘new world’ of numerical data, connected objects, gigaflop supercomputers and exabytes of new data per year. The changes are impacting E&P but also sales and marketing where, ‘even Amazon sells lube oil!’ Opportunities abound—from digital asset management, operations and in drilling automation. A common language is needed to combine different data sources sharing definitions and formats. External stakeholders increasingly expect transparency and if our terms are fuzzy this creates suspicion in the public mind. Data has a long lifetime and formats need to ‘stay alive’ for years. Collaborating on data and formats is not easy—there is inertia in the installed base! Even in Total, data formats differ from one unit to another. Total welcomes the efforts of the SLC to collaborate on this ‘long term issue.’

PPDM Association CEO Trudy Curtis welcomed a new SLC member, the Object Management Group (OMG) and introduced a nice marketing retrofit to Energistics’ standards, now categorized as supporting ‘data in motion’ (presumably leaving PPDM with ‘data at rest’). Curtis asked rhetorically, ‘how come there are so many standards?’ and proposed a roadmap of tactical projects like PPDM to Witsml mapping and the units of measure standardization initiative. A white paper covering information and awareness of standards bodies is in preparation.

BP’s Mark Brunton observed that although upstream digital technology is now mature, there remain limitations on data sharing. Integrated standards are seen as essential to BP’s redefined 21st century digital oilfield. BP is working to make global data available centrally and to offer a standard corporate look and feel so that engineers can operate from any location.

The SLC needs to be more than a forum for conversation—it needs to act in the interests of industry. To achieve this it needs a governance framework and a ‘clear agenda, setting out how the individual standards come together,’ with direction from owner operators.

OMG CEO Richard Soley set out his storefront extolling the merits of UML as the ‘way forward.’ OMG’s modeling tools are graphical; code and text are deprecated. The OMG’s business process modeling notation got a plug as did the meta object facility. The consortium for software quality also ran as did the cloud standards customer council.

Energistics CEO Jerry Hubbard adds the following. ‘Our Forums in Paris and Houston attracted more than 80 delegates each. Our strategic planning sessions have focused on global standards adoption, initiating collaborative projects between standards rather than simply identifying intersection points. The SLC is to initiate operators and adoption advisory groups to provide strategic guidance and outreach. We are also holding an event in Utrecht just prior to Intelligent Energy.’ More from Energistics.


Folks, facts, orgs ...

Altair, Wood Group, Ansys, ArkEx, BEG, Charles River Associates, CyrusOne, Emerson, IHRDC, Kepware, Landmark, Meridium, Merrick, Nabors Industries, NASA, SPE, Oceaneering, Oiltanking, PG&E, PODS, Rajant, SAP, SRI, Tendeka, Unique Maritime, USC Viterbi, SCA.

Altair has appointed Detlef Schneider senior VP EMEA and Pietro Cervellera as operations manager for Germany, Austria, Switzerland, and Eastern Europe.

Wood Group has appointed Ian Marchant as chairman. He succeeds retiree Allister Langlands. Andrew Stewart is MD of the Australia and Asia Pacific business.

Andrew Yang is to step down as VP and president of Ansys’ Apache design unit.

Jim White, former president at Discovery Acquisition Services has been named president of ArkEx.

Azure Midstream has hired Eric Kalamaras as CFO. He hails from Valerus Energy Holdings.

Changbing Yang is lead investigator on the Texas Bureau of Economic Geology’s real-time, in-situ CO2 monitoring (RICO2M) trial, a large-scale deployment of an optical sensor network supplied by Intelligent Optical Systems to monitor a combined EOR and CCS project.

Robin Cohen has joined Charles River Associates as a VP of the energy practice. He hails from Deloitte.

Data center services provider CyrusOne is to build a third data center at its 45-acre Houston West data center services campus.

Emerson Process Management has opened an ‘innovation center’ near Austin, TX, a 282,000-square-foot, $70 million facility.

Bruce Peters has joined IHRDC as senior accounts manager in Houston.

Brett Austin is now president of Kepware Technologies. Tony Paine is CEO.

Nagaraj Srinivasan has been promoted to VP Landmark Software and Services.

Patti Foye is now Meridium’s CMO.

Scott Raphael is now Merrick’s CEO. He was formerly with Schlumberger.

William Restrepo is CFO of Nabors Industries. He hails from Pacific Drilling.

Brian Muirhead, chief engineer of NASA’s Jet Propulsion Lab, will open the SPE Intelligent Energy conference in Utrecht, Netherlands next month.

Oceaneering International has appointed Eric Silva as VP and CIO. He hails from El Paso Corp.

Ken Owen is president and CEO of Oiltanking Partners and Christian Flach is chairman of the board.

Pacific Gas and Electric has appointed Sumeet Singh as VP asset and risk management of gas operations.

Christa Freeman has joined the PODS staff as technical coordinator. She hails from IBM.

Sagar Chandra has joined Rajant as VP business development, Latin America. He joins from Geovia.

SAP Ventures, SAP’s independent VC affiliate has named Rami Branitzky as MD of its market development team. He hails from Grok Solutions.

SRI International has named Mariann Byerwalter as chairman of its board. She was formerly CFO of Stanford University.

Tendeka has promoted Annabel Green to VP of strategy and marketing. She was formerly with Weatherford.

Scott Jamieson has joined Unique Maritime Group as business development manager at its head office in Sharjah, UAE. He hails from Divex.

Jim Brink has been named Chevron Fellow at the USC Viterbi school of engineering.

Dan Patience has been named president and director of Well Power.

Death

Subsurface Consultants & Associates has announced the death of its founder and chairman emeritus, Daniel John Tearpock following a three year battle with pancreatic cancer.


Done deals

Schneider Electric, Invensys, Aker Solutions, EQT, Chesapeake, Ziebel, Dassault Systèmes, Accelrys, NuoDB, Fugro, Intertek, ION Geophysical, KKR, Quest Global, Siemens, TUV Rheinland.

France’s Schneider Electric has completed its $5.2 billion acquisition of UK-based industrial automation specialist Invensys.

Aker Solutions has sold its well-intervention business to Swedish EQT for NOK 4 billion plus an earn-out provision.

Chesapeake Energy is to dispose of its oilfield services division as either a spin-off to shareholders or an outright sale.

Investinor portfolio company Ziebel has raised $10 million from existing investors and ConocoPhillips Technology Ventures and Chevron Technology Ventures.

Dassault Systèmes has commenced a cash tender offer for all of the outstanding shares of Accelrys.

NuoDB’s latest round of funding includes new investor Dassault Systèmes along with existing venture investors.

Fugro has completed the acquisition of high-resolution mapping services provider Roames Asset Services from Ergon Energy.

Intertek has acquired International Inspection Services, a provider of non-destructive testing services to the energy, oil and gas industries based in the Middle East.

ION Geophysical now owns 70% of OceanGeo (formerly known as GeoRXT).

KKR has closed its Energy Income and Growth Fund I, a $2 billion vehicle targeting the North American unconventional sector.

Engineering solutions provider Quest Global has acquired Beeken TechQuest and its engineering services unit.

Siemens’ industry sector and venture capital unit are launching a $100 million ‘industry of the future Fund’ targeting ‘young and dynamic’ companies. Investments to date include Lagoa (Montreal), a provider of cloud-based 3D visualization software and cyber security specialist CounterTack (Boston).

TÜV Rheinland is acquiring UK-based Risktec Solutions, a provider of risk and safety services.


Artificial lift R&D Council gas lift workshop

Exxon on gas lift optimization potential. Petrobras’ Fortran/Python models. OVS for Wintershall.

The 37th annual ASME/ALRDC international gas-lift workshop held last month in Houston underscored the growing use of IT in modeling complex gas lift operations. In his keynote, ExxonMobil’s Mark Agnew revealed that around one third of Exxon’s worldwide wells, producing almost one million barrels per day, are gas lifted, and most of these have optimization potential. Exxon’s global artificial lift center of expertise is using CO2 tracers to analyze wells in real time. Exxon’s current focus is on optimizing its top tier gas lift wells.

Researchers from Petrobras and Brazil’s Unicamp research faculty presented a decade-long R&D program on a combined experimental and numerical simulator for intermittent gas-lift, a technique that is widely used in Brazil’s low pressure mature fields. Full physics models have been developed for key well components such as casing, gas core, liquid slug and more—a nonlinear system of some 20 plus equations. The modular codes were implemented in Fortran 90 and run under a graphical user interface developed in Python and the PySide/Qt library.
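Coupling Fortran 90 numerics with a Python/Qt front end is typically done by compiling the Fortran into a Python extension (for example with numpy’s f2py) and calling it from the GUI code. The sketch below illustrates the pattern under assumed module and routine names; it is not the Petrobras/Unicamp code.

# Sketch of wiring a compiled Fortran 90 model into a PySide GUI.
# Assumes the Fortran sources were first built into a Python extension, e.g.:
#   f2py -c gaslift_model.f90 -m gaslift_model
# Module and routine names are illustrative, not the Petrobras/Unicamp code.
import sys
from PySide.QtGui import QApplication, QPushButton

import gaslift_model                    # hypothetical f2py-built extension

def run_cycle():
    # hypothetical routine returning liquid produced in one intermittent cycle
    produced = gaslift_model.simulate_cycle(injection_pressure=80.0,
                                            valve_open_time=120.0)
    print("Liquid produced this cycle: %.1f bbl" % produced)

app = QApplication(sys.argv)
button = QPushButton("Run intermittent gas-lift cycle")
button.clicked.connect(run_cycle)
button.show()
sys.exit(app.exec_())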

A joint presentation from OVS Group and Wintershall Libya showed how OVS’ ‘One virtual source’ product was used to fix data gathering and model calibration issues prior to a gas lift optimization and automation program. Solutions were developed for back allocation, virtual metering and optimization. More from OVS. Read the ALRDC presentations.


Chevron backs OpenSPL standard

New standard addresses ‘spatial computing’ with support from Maxeler, Stanford and Imperial.

Chevron has backed a new consortium that sets out to promote ‘OpenSPL,’ a new standard for ‘spatial computing.’ In a spatial computer, programs ‘execute in space rather than in a time sequence.’ Application data is laid out on a chip and operations execute in parallel. The new paradigm promises ‘dramatic increases’ in performance per watt and per cubic foot compared to conventional machines.

Maxeler Technologies’ presence in OpenSPL suggests that its ‘multiscale dataflow computing’ is a model for the new initiative. Maxeler’s hardware accelerators minimize data movement through the use of ‘large scale spatial computing.’ Analyst Frost & Sullivan awarded its 2014 European technology innovation leadership award to Maxeler, inter alia for its work on seismic processing and reservoir modelling. Oil IT Journal has been reporting on Maxeler’s field programmable gate array (FPGA) technology since 2007.

OpenSPL has application in other verticals as witnessed by the presence of the CME Group derivatives marketplace that handles ‘3 billion contracts worth $1 quadrillion annually.’ CME CIO Kevin Kometer said, ‘CME Group has long been a supporter of open efforts including the FIX Protocol and the Linux foundation. We will leverage spatial computing for our critical high performance computing needs.’ Other OpenSPL members are Juniper Networks, Imperial College London, Stanford University, University of Tokyo and Tsinghua University. More from OpenSPL and Maxeler.


Ontological foundations of petroleum modeling

Upcoming PNEC presentation from Endeeper proposes novel approach to interoperability.

Brazilian Endeeper has released details of a paper that it is to present at the 2014 PNEC data integration conference to be held in Houston next May. The abstract has it that all software design implies building a conceptual model of reality. But model terms may have different meanings for different stakeholders. A fault may mean one thing to a field geologist and another to the seismic interpreter. These different views are pitfalls to the data modeler and for software interoperation. Endeeper analyzes industry-standard formats (LAS, WITSML, PRODML, and RESQML) to derive an ontological foundation for the reservoir modeler.

Endeeper’s approach is claimed to ‘overcome the ambiguities of geological terminology.’ A conceptual model of geological objects (well, wellbore, horizon, fault, lithological unit) is then mappable across data exchange formats. Read the abstract and visit PNEC.


From PetroTech to petro-toke!

PetroTech Oil and Gas diversifies into pot production.

A curious release from PetroTech Oil and Gas unit LP US Management Group reveals a radical diversification from its core business of enhanced oil recovery. The new unit is to ‘leverage its expertise in natural resource development’ [...] ‘to secure a leadership position in the rapidly developing marketplace around legalized cannabis and hemp production.’

LP US is to produce ‘medicinal and recreational’ marijuana for home and commercial use from a facility in Telluride, Colorado with growers in Colorado and Washington. The presence of ‘professional producer and comedian’ Jae Benjamin, who serves as president, makes us wonder if this is an early April 1st tease. Seemingly it is not. More from PetroTech.


Sales, deployments, partnerships ...

EcoSys, Flare, Schneider Electric, Aveva, Wood Group, Emerson, Eurotech, GE, IFS, Lloyd’s Register, Meridium, NDB, Paradigm, Rock Solid Images, CGG, Taqa, Wood Group, Tendeka, Tieto, IDC Energy Insights, Petrosys, Rock Flow Dynamics, Welltec.

Engineer CH2M HILL has selected EcoSys’ enterprise planning and cost controls software as its cost management system for client projects.

Wintershall is to deploy Flare’s E&P Catalog suite to classify its upstream electronic information and hardcopy archive and to provide visibility of its information asset.

Schneider Electric has certified Industrial Defender’s Automation systems manager for use in its Scada and oil and gas applications.

Technology development and licensing company, NGLTech, has chosen Aveva Everything3D as its primary 3D design tool.

Wood Group has been awarded a one year operations and maintenance contract extension by Chevron North Sea Limited for services to the Captain and Alba fields.

Emerson Process Management has been awarded a $7 million contract to provide its DeltaV operator training solution for BP’s Quad 204 North Sea FPSO replacement project. Statoil has awarded Emerson a $2.7 million contract to supply an integrated condition and performance monitoring system for the Gina Krog platform.

Orsyp has appointed Eurotech Computer Services to resell its Streamcore WAN optimisation control and ‘application-aware’ network performance management solutions in the UK and Ireland.

Statoil has chosen GE’s Proficy SmartSignal predictive-diagnostic to improve uptime and increase reliability of its heavy rotating equipment. The tool will be embedded in Statoil’s condition monitoring predictive analytics solution.

GE Oil & Gas is to deploy its ReliabilityMax predictive maintenance solution at a BG coal seam gas-to-liquefied natural gas facility on Curtis Island, Queensland, Australia.

Driller Songa Offshore is to deploy IFS Applications in support of its onshore and offshore operations in a contract worth NOK 40 million.

Lloyd’s Register Consulting has secured a risk analysis contract with Aker Solutions for the Johan Sverdrup development leveraging its RiskSpectrum software.

Meridium’s asset management software is now SAP-endorsed, integrating with SAP’s own enterprise asset management solution.

NDB Asia Pacific has secured a contract to gather, QC and load well data for the NZ Petroleum and Mines Dept.

Mexico seismic services company Comesa has licensed Paradigm’s suite including GeoDepth, SeisEarth and Geolog.

OMV Norge and Rock Solid Images have entered into a one-year R&D partnership on quantitative interpretation in exploration.

CGG has signed a framework agreement with Taqa ‘strengthening’ their partnership in the Middle East. More on the rather complicated arrangement from CGG.

Talisman Sinopec Energy has awarded Wood Group a $500 million, five year contract for engineering and modification services to eleven offshore assets.

Tendeka has won a two-year contract from China’s Zonton Energy Technology Company for the supply of autonomous inflow control devices.

Tieto’s hydrocarbon accounting software came out ahead of the pack in a recent MarketScape analysis by IDC Energy Insights. Download the report excerpt.

Petrosys has signed an exclusive three-year agreement to distribute Rock Flow Dynamics’ tNavigator reservoir simulation solution in Australia, New Zealand, PNG and Timor-Leste.

Danish based Welltec Corporation has won a $5 million, four year contract from Ecopetrol Colombia for the supply of well tractor conveyance and well cutter services.


Standards stuff

OMG re-pitches oil and gas. APM/Open Group partner on Togaf. Ansys and the Functional mockup interface. OpenGeospatial and Sensor web enablement.

The Object management group (OMG) has toned down its pitch to the oil and gas vertical (as Oil IT Journal reported back in November 2013). OMG still believes that, in the data sharing space, ‘little has been done since the adoption of Witsml,’ but no longer maintains its superiority claim over OPC-UA. OMG believes that the upstream ‘has much to gain from the extensive body of work done on process and information modeling and exchange in other domains’ and is proposing ‘an industry-specific task force’ to cross-pollinate the efforts of other OMG subgroups.

The APM Group (APMG) and The Open Group are partnering to provide accreditation services for The Open Group’s products with an initial focus on Togaf and ArchiMate. APMG’s multi-lingual assessors will help The Open Group enter new markets and ensure quality support of existing standards.

Ansys blogger Sameer Kher reports on a new exchange standard for simulation-driven product development, the Functional mockup interface. FMI helps OEMs communicate with their suppliers and share vendor-neutral descriptions of models. Ansys’ ‘Scade’ tools generate FMI-compliant models that can be re-used in third party tools such as Modelica.

The Open geospatial consortium has approved its Sensor model language (SensorML) 2.0 encoding standard, a component of its Sensor web enablement suite of standards used in satellite mission planning, monitoring and alerting. SWE standards are said to be key enablers for the Internet of things.


Opto 22 on ‘high performance’ HMI

White paper deprecates P&ID-based human-machine interface in favor of Groov toolkit.

A new white paper from Opto 22 takes issue with current human machine interfaces (HMIs), which may have contributed to major disasters such as the BP Texas City refinery explosion. Early attempts to represent real-time processes on computer screens derived their displays from piping and instrumentation drawings (P&IDs), as this ‘seemed logical.’ Most HMIs are still based on P&IDs which, as Bill Hollifield and Ian Nimmo, authors of The high performance HMI handbook, observe, are ‘tools for designing rather than controlling a process.’ Current HMI ‘focuses too much on the process hardware rather than on the operator’s mind.’

The white paper offers concrete suggestions for improved HMI incorporating ergonomics and user-centered design to provide operators with consistent information in context.

The authors acknowledge that market leaders Honeywell, with its Experion PKS, and Emerson, with DeltaV, have done a good job of improving the operator interface for large industrial systems. For less complex systems, Opto 22 recommends its own ‘Groov’ toolset for building simple operator interfaces that can be deployed on a variety of platforms, from smartphones to web-enabled large-screen TVs. More from Opto 22.


GE announces ‘Directive’ tool for directional drilling

Improved rate of penetration and less maintenance downtime from vibration monitoring.

GE Oil & Gas has announced a new ‘Directive’ directional drilling system that is said to improve drilling performance by monitoring shocks and vibration in real time. GE’s Yokima Davison told Oil IT Journal, ‘The Directive system captures shock and temperature events to an onboard 32MB memory. This allows service companies to monitor out-of-spec events and perform condition based repairs and maintenance. The diagnostics ability combined with improved stability of sensor calibration allows service companies to predict calibration and maintenance schedules, optimizing use and reducing repair costs.’

Asked if the Directive tool would feed into GE’s Proficy/Predictivity solution set, Davison replied, ‘Downhole tools do not have the ability to predict the onset of vibrations, but can detect vibrations early enough so that drillers can take remedial measures before they reach a level of severity that causes equipment damage or reduces rate of penetration.’
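
By way of illustration only, and not GE’s implementation, the following sketch shows the general pattern of logging ‘out-of-spec’ shock and temperature events against fixed limits; the thresholds, field names and record layout are invented for the example:

    # Hypothetical sketch of threshold-based event logging for downhole
    # shock and temperature readings. Limits and record layout are assumed.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    SHOCK_LIMIT_G = 50.0   # assumed shock limit in g
    TEMP_LIMIT_C = 150.0   # assumed temperature limit in degrees C

    @dataclass
    class Event:
        time: datetime
        kind: str
        value: float

    def check_sample(shock_g: float, temp_c: float, log: list) -> None:
        # Append an out-of-spec event to the log when a limit is exceeded.
        now = datetime.now(timezone.utc)
        if shock_g > SHOCK_LIMIT_G:
            log.append(Event(now, 'shock', shock_g))
        if temp_c > TEMP_LIMIT_C:
            log.append(Event(now, 'temperature', temp_c))

    events = []
    check_sample(shock_g=62.5, temp_c=141.0, log=events)  # records one shock event
    print(events)

A real tool would of course persist such events to the onboard memory described above and retrieve them at surface for condition-based maintenance planning.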


UK Energy Institute publishes process safety guidelines

New publications describe ‘high level framework’ for process safety management.

The UK-based Energy Institute is publishing a series of process safety management (PSM) guidance documents that describe a high-level framework for PSM. According to the Institute, a better understanding of PSM is required to ensure the efficient running of operations in major accident hazard organisations. The first two volumes of the series are available for free download.

Process safety is a blend of engineering and management skills focused on preventing catastrophic accidents and near hits. According to the EI, many organisations are looking to adopt a more holistic and systematic approach to assuring the integrity of their operations. The high-level framework is divided into four focus areas: leadership, risk assessment, risk management and improvement. The first publication, ‘Leadership, commitment and responsibility,’ is a 50-plus page document that proposes 15 ‘expectations’ for PSM and a (rather complicated) workflow plan for their realization. This is followed by a detailed drill-down into the means of achieving the objectives and monitoring the results against the EI process safety survey (EIPSS) benchmark.

The second publication addresses compliance with industry standards, targeting different points in the overall workflow diagram with a similar approach and offering a similar EIPSS rating scale. An appendix provides a ‘non-exhaustive’ list of standards-setting bodies, reflecting a complex regulatory landscape of international guidance publications, codes of practice, standards and tools.


Iron Mountain survey finds major ‘gaps’ in RIM accountability

Records and information management still struggles in the face of poor employee engagement.

A ‘comprehensive’ benchmark survey* of some 1,300 records and information professionals finds that accountability gaps threaten corporate record keeping and compliance. While companies lay claim to strong records and information management (RIM) programs, the survey finds that many fall short of their objectives. The shortcomings prevent organizations from realizing cost savings and expose them to fines for non-compliance with industry and federal regulations.

While 87% of respondents have a RIM governance program in place, these often lack key elements of an effective program. Only 17% conduct internal audits and assess compliance. A mere 8% can provide metrics on program effectiveness and employee improvement; in fact, only 7% report that employees are ‘engaged and supportive’ of the program. 75% complained that a lack of automation was preventing the implementation of an ‘efficient and defensible’ retention and destruction program, and a healthy 64% acknowledged an inability to break the ‘keep everything’ culture of their organization.

* RIM benchmark survey by Iron Mountain, Cohasset Associates, AIIM and ARMA.


Ansys embeds Nvidia GPU accelerator in Fluent 15.0

Poster child Parametric Solutions reports on accelerator use in structural and CFD design.

Ansys and Nvidia are claiming a ‘first’ for their use of graphics processing units (GPUs) to speed computational fluid dynamics simulation of large complex models. The new functionality, available in the 15.0 release of Ansys’ Fluent solver, combines multi-core parallel processing with Nvidia’s GPU hardware.
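
For readers who want to try the new functionality, here is a hedged sketch of how a GPU-accelerated Fluent batch run might be launched; the option spellings (-t, -gpgpu=, -g, -i), the process and GPU counts and the journal file name are assumptions to be checked against the Fluent 15.0 launcher documentation:

    # Hedged sketch: launch an Ansys Fluent batch job with GPU acceleration.
    # Flags and values are assumptions; consult the Fluent 15.0 documentation.
    import subprocess

    cmd = [
        'fluent', '3ddp',   # double-precision 3D solver
        '-t8',              # 8 CPU solver processes (assumed)
        '-gpgpu=2',         # 2 GPUs per machine (assumed)
        '-g',               # run without the GUI
        '-i', 'run.jou',    # placeholder journal file driving the batch run
    ]
    subprocess.run(cmd, check=True)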

Gas turbine designer and manufacturer Parametric Solutions has been using Nvidia Maximus appliances, comprising Quadro GPUs and Tesla compute engines, to run Ansys structural mechanical simulations for a couple of years. Parametric CTO David Cusano reported a performance ‘doubling’ from the Maximus. Cusano expects similar performance gains from Fluent 15.0, ‘especially when paired with the new, high performance Tesla K40 accelerators.’

Ansys’ Barbara Hutchings added that support for Nvidia’s accelerator is also available in Ansys’ HFSS electromagnetic simulator. ‘Customers can now apply GPUs to speed up fluid dynamics, mechanics, and electromagnetic simulations.’ More from Ansys and Nvidia.


WellMetrics addresses the data challenge

GIS interface and M-File-based document management said to fix ‘arcane’ E&P well data stores.

WellMetrics has released a Prezi slideshow explaining its M-Files-based solution to the E&P data silo ‘challenge.’ E&P companies are organized by discipline and use commercial software to store and manage data in discipline silos. Gathering corporate knowledge across disciplines is hard and relies on subject matter experts to broker requests for information. This often involves time-consuming data re-formatting prior to use.

Much relevant information is held in Excel spreadsheets, Word documents and PDFs on the Windows file system. WellMetrics describes this storage mechanism as ‘arcane.’ Retrieving data from such systems is ‘100% dependent on human memory and logic.’

Enter WellMetrics’ geographic information system (GIS) for cross-discipline data collection. WellMetrics uses the M-Files document management system to connect directly to data silos. Data access is ‘self-brokered’ by anyone with security rights to access the application. All data is kept natively in the organization’s applications of choice. WellMetrics leverages web services to serve data in its GIS mapping system alongside vendor data from Dwight’s, IHS and Tobin. The M-Files DMS adds generic access to all relevant information associated with a well.
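
As a purely hypothetical sketch of the web-service-to-GIS pattern described above (WellMetrics’ actual services, schemas and endpoints are not published here), the following pulls well headers from a placeholder service and wraps them as GeoJSON for display in a mapping system:

    # Hypothetical sketch: fetch well headers from a placeholder web service
    # and expose them to a GIS as GeoJSON. Endpoint and field names are invented.
    import json
    import urllib.request

    SERVICE_URL = 'https://example.com/api/wells'  # placeholder endpoint

    def wells_to_geojson(url):
        # Fetch a JSON list of wells (assumed lon/lat fields) and wrap as GeoJSON.
        with urllib.request.urlopen(url) as resp:
            wells = json.load(resp)
        return {
            'type': 'FeatureCollection',
            'features': [
                {
                    'type': 'Feature',
                    'geometry': {'type': 'Point',
                                 'coordinates': [w['lon'], w['lat']]},
                    'properties': {'name': w.get('name'), 'uwi': w.get('uwi')},
                }
                for w in wells
            ],
        }

    if __name__ == '__main__':
        print(json.dumps(wells_to_geojson(SERVICE_URL), indent=2))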


Hazardous area ‘Blaster’ robot to enter Total’s Argos challenge

Colorado School of Mines’ prototype trialed by Abu Dhabi institute in refinery context.

The Colorado School of Mines (CSM) has developed ‘Blaster,’ a robot designed to operate in hazardous areas and situations such as those arising in oil and gas plants. The prototype system is being trialled by the Petroleum Institute of Abu Dhabi to increase safety in oil and gas refineries.

According to CSM professor John Steele, refinery operators are currently exposed to potential explosions, gas leaks and extreme weather conditions. ‘We are trying to get robots to do the same operations humans can do, but by taking the human out of harm’s way, we are increasing safety.’

Blaster is equipped with a methane gas sensor, video camera, microphone, thermal imaging camera, GPS, digital compass, laser rangefinder and Wi-Fi. The CSM is developing applications that monitor and log the sensors’ data.

The CSM has entered France’s ‘Argos’ challenge. Argos, a.k.a. the Autonomous Robot for Gas & Oil Sites challenge, sets out to encourage the development of robotic systems for use in extreme heat and cold and in hazardous environments where they can reduce workers’ exposure to risk.

Argos has backing from Total and the French National Research Agency (ANR). The plan is to ‘attract the attention of actors across the robotics world to the difficulties encountered by the oil and gas industry [and to] pool competencies to produce the robots of tomorrow.’

Winners of the initial challenge will have two and a half years to further develop their technology. Robots should be able to ‘accomplish repetitive tasks and respond in an emergency.’ More from CSM and from Argos.


EarthCube - digitizing Yosemite

National Science Foundation ‘transforms’ geological field work with ‘earth centered communication.’

EarthCube, a US National Science Foundation (NSF) initiative to ‘transform research through a cyberinfrastructure,’ is planning a digital field trip to Yosemite to trial novel field data gathering technology. The trials will be conducted under the ‘Earth-centered communication for cyberinfrastructure’ (EC3) program targeting field-based geosciences.

The field trip sets out to ‘facilitate a dialogue’ between geologists, computer scientists and psychologists, to break down silo boundaries and to allow scientists to work efficiently and innovatively. Specific challenges to be addressed include data management and the digitization of analog data and its incorporation into field-based workflows. Data collection sensors, metadata collection, ‘smart’ mobile apps, standards and machine learning are also on the menu. The first field trip is planned for August 2014 and includes ‘breakfast, a picnic lunch and dinner.’ More from the NSF.

