September 2011


BOP Monitor

A software solution from Lloyd's Register companies ModuSpec and Scandpower repurposes real-time risk assessment modeling technology developed for nuclear reactor monitoring.

The 2010 Macondo disaster brought the blowout preventer (BOP) into the public eye with its spectacular failure to perform. Long before Macondo, a 1999 study (www.oilit.com/links/1109_2) by Norway's Sintef research unit for the US regulator, the Minerals Management Service, now Boemre, found that BOP failures accounted for nearly 4% of drilling downtime. A total of 117 failures were observed during the two-year study of 83 wells drilled in the deepwater Gulf of Mexico. Worst case events included a control system failure that caused a total loss of BOP control and a shear ram that failed to close.

Today, Lloyd's Register, the UK-based classification and risk management organization, is working to develop software for managing BOP risk. Lloyd's Register Energy companies ModuSpec and Scandpower have teamed on 'BOP Monitor,' an application that provides BOP assessment while drilling.

BOP Monitor was derived from Scandpower’s proprietary ‘RiskSpectrum’ package originally developed for the nuclear power industry and now deployed in 50% of the world’s nuclear plants. The product includes tools for fault and event tree analysis, documentation, risk monitoring and failure mode analysis. BOPs from two manufacturers have been modeled with more to follow.

Scandpower CEO Bjorn Inge Bakken explained, ‘BOP Monitor uses a risk model based on high quality data to assess operational risk levels in real time, including the identification of faulty components.’

Duco de Haan, ModuSpec's CEO, added, 'Current BOP risk assessment lacks a consistent structure. Decisions on whether or not to pull a BOP for inspection and maintenance can be difficult for senior management and regulators to understand.'

BOP Monitor embeds data drawn up by a team of operational experts working from current documentation and drawings. Risk assessment is then performed in a controlled setting leveraging Scandpower’s methodology which is claimed to improve on traditional ‘ad-hoc’ assessments. In the event of an incident, the system provides rig owners, operators and regulators with advice on the BOP ‘within hours,’ allowing for ‘informed decisions on whether or not to continue operations.’

A product development review panel has been established with members from the public and private sectors. Experts from ModuSpec, Scandpower and the nuclear and drilling industries will update the panel with feedback from deployments. Panel members will receive failure mode and criticality analyses, reliability data and the results of fault-tree analyses. Lloyd's Register describes itself as a group of 'charities and non-charitable companies.' Lloyd's acquired Scandpower in 2009. More from www.oilit.com/links/1109_3 (Lloyd's) and www.oilit.com/links/1109_4 (RiskSpectrum).


Baker Hughes’ Artificial Intelligence

Following an equity stake, case-based reasoning technology from Verdande is to be embedded in the BEACON remote operations platform.

Baker Hughes has acquired a minority stake in Verdande Energy, a unit of Verdande Technology of Trondheim, Norway. Verdande supplies a case-based reasoning (CBR) application, 'DrillEdge,' for real-time identification of drilling problems. DrillEdge works from a database of previously drilled wells to provide an 'intervention-while-drilling' response, a.k.a. a 'bridge between past experience and current operations.'

Scott Schmidt, president of drilling and evaluation for Baker Hughes said, ‘DrillEdge can lower risk, increase rate of penetration and reduce non-productive time. The solution complements our Beacon remote operations platform and will help customers understand their more technically challenging wells.’ Verdande CEO Lars Olrik added, ‘This deal strengthens our place in the market with global support and will accelerate development of our technology.’

Other Verdande investors include Statoil Venture/ProVenture Seed AS, Investinor, the founders and employees. Last year (Oil ITJ November 2010), DrillEdge was deployed at Shell’s network of real-time operating centers for 24/7 monitoring of ‘high visibility and challenging wells’ worldwide. Baker Hughes will be involved in future DrillEdge development. More from www.oilit.com/links/1109_5.


Enterprise architectures for anyone?

Neil McNaughton walks through the content of this month's Oil IT Journal wearing his enterprise architect's hat. The interoperability problem statement remains essentially the same. But there are a lot of potential solutions out there and more are coming. Will the semantic web-focused integrated operations initiative come good? Will vendor solutions prevail?

Reporting from frequent conferences means often listening to presentations that repeat themselves. Several presentations at this month's ECIM data management conference in Haugesund, Norway addressed the problem of software interoperability and data exchange across the enterprise. This issue has been with us since we began reporting back in 1996 and it was something of an 'old chestnut' even then. Why then are we still talking about interoperability, silos and applications as data thieves? One reason is that the early promise of an architectural approach to the problem, built around a database (Finder was doing pretty well at the time and POSC, now Energistics, was getting a lot of attention with its Epicentre data model), has largely failed. Meanwhile, stand-alone apps, the Petrels, Kingdoms, proliferating Excel spreadsheets and innumerable tools for production data have actually increased in prominence. Statoil's Lars Olav Grøvik related that in one joint venture, the operator reported real oil production from a non-existent well and that looking for data still delays projects. Unitization is a pain point even though modern information systems 'should be able to produce an error free unitization database' (page 6).

~

While such issues persist, I would like to turn now to some of the solutions that are available—particularly as this issue of Oil IT Journal is full of interoperability-oriented offerings. If you are into upstream applications, most can now trade data with each other. Boutiques like Petris, iStore and others have developed buses and data translators that let you play one app to the tune of another. In other fields like process control, tools like OSIsoft's PI System (see page 5 for our report from the 2011 EU regional seminar) have more data adapters than you can imagine—as do systems from companies like ISS (see our interview with CEO Richard Pang on page 3). If you are of a strong disposition, you may still hanker after a central database solution to the problem in which case the folks from Teradata would love to talk to you about their work with ConocoPhillips and Western Refining (page 7). On a smaller scale, this is the approach that Geologic Systems is working on with its PPDM-in-a-Box solution (page 10). Your integration needs may go beyond the upstream. What if you want to roll in data from your ERP system? Well there are plenty of solutions here too—from the aforementioned companies and, as of this month, from Stonebridge, which has announced a master data management solution tuned to the P2 Energy Solutions software line-up. In next month's issue we will be reporting on a new solution from Tibco's OpenSpirit unit, which has added ERP integration leveraging Tibco's ActiveMatrix BusinessWorks architecture.

There are plenty of tools out there, so why haven't we got further along the road to interoperability? IBM data guru Sunil Soares presented an ECIM keynote on data governance and the enterprise architecture question (page 6). Soares believes that it is both desirable and feasible to bring the whole enchilada into a single IT framework. His use case was the question 'how many employees do we have?' This simple question seemingly defies current IT systems. Do you only count full time staff in the HR system? What about part time consultants in the badging system? Or some other category in SAP? Anyone who has worked on an integration effort will realize that this sort of question crops up all the time. If life were as simple as mapping from N_EMP in one system to EMPCOUNT in another, the 'problem' would have been solved way back. Perhaps the real problem is the problem itself. If you can't define what 'an employee' is, then no information system in the world will give you an unequivocal answer to the question 'how many do we have?'

There is another difficulty with the EA approach and that is the extent to which you have to throw away all your existing applications and infrastructure to make it work. The modeling approach is OK if you can be sure that your applications will be reading data from the EA. This may or may not be possible. If your solution is of the middleware/SOA variety, will it be able to talk to your existing data stores? How much of your data resides in the applications? Geologic Systems’ Wes Baird describes the problem of what happens here when you update a well there (page 11). Grøvik also touched on this issue calling for better separation between applications and data stores.

Which leads me to another Norwegian data effort, the 'integrated operations' initiative in its latest 'high north' (IOHN) flavor, which was highlighted at the 2011 Semantic Days conference in Norway earlier this year (page 7). This interoperability effort is based on the premise that all data sources and all applications will be retooled around semantic web standards, or at least will see semweb wrappers developed. This is not going to happen any time soon.

Let me now add another requirement to the design brief. What if we need to roll in data from all our systems, ERP/SAP and so on, and to share entitled data with our partners? This is, on the face of it, quite a big 'other' requirement. We have already had to rewrite all our data servers and all our apps, and now all our partners have to do the same? This is not going to happen either, largely because for every joule of energy spent on investigating a 'standard' approach, a kilojoule goes into developing and marketing commercial systems. Witness last month's lead, where we reported on the OSIsoft/Industrial Evolution solution to partner data sharing in the Gulf of Mexico.

~

To wind up on a lighter note I will leave you with an observation made by Landmark’s Janet Hicks who thinks that data management is like dusting, ‘you may have done it today, but you will have to do it again tomorrow.’ Which leads me to the corollary of the repetitive presentation, the repetitive editorial. I plead guilty.


Oil IT Journal interview—Richard Pang, ISS Group

Richard Pang is CEO and MD of ISS Group, developer of the BabelFish platform and solutions for well testing, production loss accounting and more. Pang covers successes with BP and Hess and the deal with Schlumberger, and offers his take on MURA, the digital oilfield and production optimization.

How did BabelFish1 originate?

ISS started 16 years ago with a process industry orientation, working for refiners, particularly at BP’s Kwinana, Western Australia refinery where BabelFish originated. The technology later found application in the upstream with the development of oil and gas specific applications. These later became the Oil and Gas Suite—our production data management system (PDMS). Key early clients were Petronas Carigali and Hess in Asia. We also work with the mining, water and consumer goods verticals—making chocolate is a process industry too!

What sort of applications are we talking about?

Production loss accounting, well testing, gas nomination systems and a plethora of data visualization and reporting applications. We also provide solutions to major LNG projects from wellhead to ship. In mining these are dubbed 'pit to port.' One significant recent sale was to Hess Corp., which piloted our production loss accounting system 'A-Plus' on its Pangkah, East Java site. A-Plus was used to model capacity and track and report losses. This has led Hess to develop a standard capacity model and loss accounting methodology across all its assets. Loss accounting is now performed consistently across all assets with the flexibility to cater for local asset-specific operational requirements.

Do you operate up or downstream of the data historian?

We capture data that does not exist and validate what does. Our 'Verify' application does limit checking, eliminates multiple instances of data and captures data manually—for well tests, loss and operator logs at shift handover. The cleansed data is published to another repository—either our own or, for example, SAP-BI.

You announced a major deal with Schlumberger in 2008. Is that still active?

Yes, for the next few years anyway. This is a global agreement. Schlumberger is a distributor for BabelFish, but not for the Oil & Gas Suite which we still sell direct.

Have you got much mileage from your OpenSpirit/BabelFish adapter?

Not really, especially since OpenSpirit was bought by Tibco, which now has a similar data bus...

And the deal with Aveva?

No. We have drifted away from engineering. We did the Woodside project with Aveva, which does have some great tools like Smart P&ID.

So engineering and upstream remain worlds apart?

Yes there is a big gap. But it is one that we and others are interested in trying to fill even if the situation is not changing much. A lot of money is still wasted when engineering hands over to operations.

What’s your take on the ‘digital oilfield?’

We learned a lot from one 'full blown' digital oilfield tender from an Asian independent. The company terminated the tender when they saw the price folks were asking! For them, and us, the 'digital oilfield' is, more pragmatically, the PDMS rather than the utopian vision being advocated by some of the majors. That vision may sound great, but in general, the value proposition of downhole smart meters and other high end stuff is unclear. The client, by the way, calls the solution a 'digital oilfield,' but it is really our PDMS!

You announced a pilot with Saudi Aramco in 2007. How has that panned out?

The deal fell apart. Maybe we should have opened an office there. Aramco’s 77 gas oil separation plants are now monitored with Siemens’ XHQ solution.

How does a client decide between your PDMS, XHQ and say OSIsoft’s PI ProcessBook?

Verify, BabelFish and the other PDMS components offer a modular solution that integrates with the heterogeneous environments that prevail in most companies today—tools like Merrick and Halliburton's TOW. ProcessBook is tied to PI System data.

What do you make of interoperability initiatives and standards?

For sure there is a lot of redundant data out there and we need agreement on WITSML, PRODML and so on. But there is still an issue with real world data. A well test for instance uses tubing head pressure and there is the assumption that this number is available somewhere. It is not hard to align such data with a given standard. But the problem is that there may be 100 'standard' attributes per well every 5 seconds. They may include the one you need, or maybe they won't! There will be even more data flowing from 'smart' wells. The truth is that most people are not interested in most attributes...

Is the Microsoft upstream reference architecture (MURA) for real?

We’ve attended the MURA meetings. There is some interesting stuff there even if it is not moving very quickly. But it is generally a good idea to fit with Microsoft’s SharePoint architecture. All our solutions all operate within the Microsoft platform.

So MURA equates to ‘it runs in SharePoint?’

If you like (laughs).

What about production optimization?

If you mean stuff like GAP, ISM, Production Universe and neural nets, no. The idea of pressing a button and getting more oil sounds great. But with these model based approaches, model upkeep is a big overhead. It's not really our focus. We are more concerned with presenting relevant information to operators and other stakeholders. One of our major clients defines 'production optimization' as its program of work. Our approach in both refining and the upstream is to provide information visualization. If you can see that a well is acting differently, it could just mean that a model is at fault, or it could represent an opportunity. More from sales@issgroup.com.au.

1 ISS’ flagship framework for data capture from field equipment, process control and management systems.


Energy Institute whitepaper on safety critical task analysis

UK body investigates ‘disparity’ in accident reporting, proposes methodology for safety analysis.

The UK Energy Institute has just released a 60-page whitepaper titled 'Guidance on human factors safety critical task analysis (SCTA)' which addresses a 'disparity' in major accident safety reports. These tend to focus on 'technical' failure, neglecting human factors. This despite 'widespread awareness' that accidents like Piper Alpha, Chernobyl and Texas City were in part caused by human failure.

SCTA can be performed at design time to anticipate risks by identifying hazards and deciding which tasks are safety critical. Next, understand and categorize critical tasks to pinpoint potential human failures. Then work to remediate them.

For brownfields, accident reports and risk assessments may be available. For new builds, SCTA needs to start early on. The report refers to a variety of UK and international standards impacting SCTA. The approach also includes risk matrices, checklists and various scoring systems. The interplay between human and machine may be unclear in the face of automated alarms, trips, and relief valves. The approach is to ‘think through’ each task to identify SCTs. Data collection is key—a trio of documents, interviews and ‘interactive observation.’

Hierarchical task analysis is used to break down SCTs into sub-activities to identify critical situations and design mitigation procedures, with the intent of reducing risk to 'as low as reasonably practicable' (ALARP).
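To make the idea concrete, here is a minimal Python sketch of a hierarchical task breakdown with per-step criticality scoring. The task, sub-activities and scores are our own illustrative assumptions, not worked examples from the EI report.

    # A safety-critical task broken into sub-activities, each scored
    # for criticality (1 = low, 5 = high). Illustrative values only.
    TASK = {
        "name": "Isolate pump for maintenance",
        "subtasks": [
            {"name": "Identify the correct pump", "criticality": 3},
            {"name": "Close suction and discharge valves", "criticality": 4},
            {"name": "Fit lock and tag", "criticality": 5},
        ],
    }

    def critical_steps(task, threshold=4):
        """Return sub-activities at or above the criticality threshold."""
        return [s["name"] for s in task["subtasks"] if s["criticality"] >= threshold]

    print(critical_steps(TASK))  # the steps that merit mitigation design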

The EI whitepaper is undoubtedly a useful contribution to the canon even if it does raise the specter of 'paralysis by analysis.' But there is far too much jargon and too many acronyms for comfort. Using more straightforward English would have helped get the message out to a non-specialist audience in a less painful manner. The report is a free download on www.oilit.com/links/1109_9.


UK announces ‘PEARS’ online licensing system

UK Department of Energy and Climate Change adds web-based licensing and relinquishments.

The Energy Development Unit of the UK government has announced a new web-enabled licensing system to go live later this year. 'PEARS,' the petroleum e-licensing assignments and relinquishments system, will let license holders manage offshore licenses online, including operator approvals, equity interest changes and relinquishments. The new system is designed to provide security and ease of use and will be part of the UK Oil Portal, which already takes online applications for new licenses. More from www.oilit.com/links/1109_10.


Hadoop/MapReduce LIDAR data and the cloud

Cyberinfrastructure presentation reports successful use of ‘big data’ solution for geoscience data.

Speaking at the Cyberinfrastructure Summer Institute for Geoscientists last month, Sriram Krishnan of the San Diego Supercomputer Center investigated the use of Hadoop, from the open source Apache Software Foundation, to manage Lidar data. Hadoop is a 'big data' solution derived from Google's 'MapReduce.' The system is leveraged in cloud computing environments and is used by Yahoo!, eBay and Facebook.

Lidar data, often used in pipeline and seismic line routing, is usually obtained from airborne laser surveys. SDSC uses IBM's DB2 spatial extender to index Lidar data. The test data set covers a wide area of the San Andreas fault and is stored on an extensive computer cluster. Krishnan outlined the pros and cons of the current database approach: it offers SQL-based query in a production quality environment, but suffers from data loading and retrieval overhead, scalability limits and the cost of high end hardware and software.

A hybrid solution, with point cloud data stored as files in the industry standard ASPRS LAS data format and metadata in the relational database, offered better price performance and proved more amenable to the cloud computing paradigm. Hadoop appears to be a promising programming environment for large scale data processing on commodity resources. These can be public or private clouds or a 'traditional' HPC cluster. In a previous publication (www.oilit.com/links/1109_12), Krishnan reported a significant code size reduction, with 700 lines of Hadoop Java code equivalent to 2,900 lines of C++. Read Krishnan's presentation on www.oilit.com/links/1109_11.
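For readers curious what such a MapReduce job might look like, here is a minimal Hadoop Streaming sketch in Python that grids point cloud data into cells and outputs a mean elevation per cell, a basic digital elevation model product. The field layout, cell size and file names are our own illustrative assumptions, not SDSC's code.

    # mapper.py: reads one 'x y z' point per line, keys it to a grid cell.
    import sys

    CELL = 100.0  # grid cell size in survey units (assumption)

    for line in sys.stdin:
        fields = line.split()
        if len(fields) < 3:
            continue  # skip malformed records
        x, y = float(fields[0]), float(fields[1])
        print("%d,%d\t%s" % (int(x // CELL), int(y // CELL), fields[2]))

    # reducer.py: Hadoop sorts by key, so cells arrive grouped;
    # emit the mean elevation per cell.
    import sys

    key, total, count = None, 0.0, 0
    for line in sys.stdin:
        cell, z = line.rstrip("\n").split("\t")
        if cell != key and key is not None:
            print("%s\t%f" % (key, total / count))
            total, count = 0.0, 0
        key = cell
        total += float(z)
        count += 1
    if key is not None:
        print("%s\t%f" % (key, total / count))

The pair would be run with Hadoop's streaming jar, pointing -mapper and -reducer at the two scripts and -input at the point files.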


Austin GeoModeling gets patent for 3D geological interpretation

Broad patent covers geological interpretation, well log correlation and ‘dynamic’ cross sections.

Austin GeoModeling has been awarded a patent by the US Patent and Trademark Office for its 'next-generation' 3D geological interpretation technology. US Patent number 7,986,319 B2 is described as a 'system for performing geological interpretation operations [..] comprising: a storage medium [..] a plurality of instructions for execution and a processor for [..] well log correlation, dynamic cross-sections [etc.].' According to CEO Robin Dommisse, 'The patent describes our integrated 3D geological interpretation software and encompasses numerous innovations in the field of geological interpretation software development.'

What appears to be the essence of the patent is the three-dimensional nature of the tool and its dynamic interpretation and modeling system. This is described in no fewer than 93 claims for different aspects of the technology, reading a bit like a dump of the Recon manual and marketing material. But, the patent warns, coverage is 'not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.' AGM president Tron Isaksen predicts increasing revenues as 'industry moves to adopt dynamic 3D geological software solutions more broadly.' Read the patent on www.oilit.com/links/1109_13.


Software, hardware short takes

ESSI Corp., ES&H Consulting, Mechdyne, Expro, Fugro, Zeiss, IDS, BGS, Object Reservoir, OSIsoft, Tecplot, P2 Energy Solutions.

Black Elk Energy has deployed sensors from ESSI Corporation on its Gulf of Mexico platforms. The sensors provide early warning of hurricane damage or capsizing. ES&H Consulting designed the system—www.oilit.com/links/1109_50.

The 3.0 release of Mechdyne's Conduit software for immersive environments supports multiple users and motion capture across applications including Enovia/Catia, Pro/Engineer, Maya, Google Sketchup/Earth and more—www.oilit.com/links/1109_51.

Expro now offers a BOP inspection service using its multi-finger caliper and ‘ViewMax’ down-hole camera systems to provide in-situ imagery and integrity data on BOPs, casing and risers—www.oilit.com/links/1109_52.

Field tests are currently underway on the Fugro/Zeiss ‘RoqScan’ portable rock properties analyzer. The system analyzes wellbore cuttings and core samples to provide mineralogical and textural datasets within an hour—www.oilit.com/links/1109_53.

IDS has announced TourNet, which 'transforms the IADC Tour Sheet from an e-document to a digital database.' TourNet provides multi-rig analytics and a bi-directional Witsml capability—www.oilit.com/links/1109_54.

The British Geological Survey’s ‘iGeology’ was voted best ‘community favorite’ mobile app at the ESRI International User Conference—www.oilit.com/links/1109_55.

Object Reservoir has announced ‘Limits 3.0,’ a shale gas focused reservoir engineering tool. Limits uses stochastic forecasting to predict future well performance—www.oilit.com/links/1109_56.

OSIsoft's PI Coresight is a new front end to PI System data. The tool was designed for 'ad hoc' data analysis and provides gauges, trends, data values, tables or bars. Coresight works in 'any web browser that supports Microsoft Silverlight'—www.oilit.com/links/1109_58.

The 2011 release of Tecplot RS adds support for Computer Modeling Group’s IMEX, STARS and GEM simulators—www.oilit.com/links/1109_59.

Tobin Map Data is now available online as part of P2 Energy Solutions' Tobin All Access subscription service. Tobin Online users can access map data directly from P2 Energy Solutions' servers, ensuring the data is the latest available. GIS technicians and geologists can also access Tobin Online as a web map service from tools such as ESRI's desktop ArcGIS—www.oilit.com/links/1109_60.


OSIsoft Barcelona Regional Symposium

GDF Suez and RasGas present PI System use cases. Total fields Magion’s complex event detection.

The European OSIsoft PI System user meet was held in Barcelona earlier this year. René Thomassen presented the use of the PI System at GDF Suez, now an €80 billion revenue utility with operations around the world. GDF Suez has seen major growth through acquisition over recent years. Thomassen was not sure that he had even located all the PI Systems in the group (he counted around 30 in the EU alone) and he invited other GDF PI users to come forward. PI System forms the core of several in-house developed tools for operational support and plant and process information management. GDF's platform combines SQL Server, PI System and ABB's MicroSCADA. Thomassen's presentation showed how prevalent PI System has become in GDF's power generation, trading and marketing operations. PI is used for operations dashboarding through a GIS front end, in operational 'cockpits' leveraging PI WebParts for drill down into plant data, equipment monitoring and bespoke applications to track specific activities. PI Asset Framework is currently being rolled out to replace the legacy reporting system from Ecodata. Thomassen says that quantifying the value of a PI framework is hard but he is 'sure that the economics are good.' Detailed performance monitoring reveals opportunities for improvement. Real time data enables proactive maintenance and avoids downtime. PI can be used during plant commissioning to analyze supplier data and 'force improvements or refunds.' PI's rich toolset means GDF can achieve more with internal resources and avoid third party development.

Jassim Al-Shammali and Mohammad Shamsul showed how RasGas was building a real time information system (RTIS) including a monitoring system built around PI. RTIS monitors activity at the LNG plant and sends alerts by SMS or email in the event of an abnormal situation. RTIS' user base is growing as new interfaces are added and more critical data becomes available from the system. The PI System surveillance tools and data security were cited as key wins.

Hervé Delesalle (Total) and Tjidde Boers (Magion) described the use of complex event detection for oil field monitoring. Netherlands-based Magion is a process control engineering software house. Total's field monitoring R&D project seeks to turn raw data into valuable information via a monitoring platform architecture and 'production intelligence' in the form of complex event detection. Early results have demonstrated a significant reduction in the time spent on data collection with a noticeable impact on production. The platform includes PI-ACE, WebParts, PI-Asset Framework and Notifications. Magion's 'µSuite' is being trialed for event detection and recognition. The platform allows for asset model management including site structure definition and connectivity to data sources. Microsoft SharePoint and PI WebParts provide visualization. The test case was a shortfall (production loss) detector to identify and 'understand' underperforming wells. Production rates are estimated from real time pressure measurements. Wells are modeled as polynomials stored in the production database. As shortfalls are identified, their causes progressively enrich the µSuite database with flow rate, well potential and context, as defined in Total's standard classification table. PI-AF allowed good data integration with other reference data sources like SAP. µSuite provides graphical tools for designing business algorithms, integrating with the PI System and PI AF. Presentations available on www.oilit.com/links/1109_15.
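As a rough illustration of the shortfall-detection logic described above, the following Python sketch evaluates a per-well rate polynomial against real time pressure and flags wells producing below a fraction of their potential. Coefficients, potentials and the threshold are invented for illustration and are not Magion's or Total's code.

    # Well models: polynomial coefficients (highest power first) that map
    # tubing head pressure to an estimated rate, plus a well potential.
    WELLS = {
        "A-01": ([-0.002, 0.9, 120.0], 200.0),
        "A-02": ([-0.001, 0.5, 80.0], 90.0),
    }
    THRESHOLD = 0.9  # flag wells below 90% of potential (assumption)

    def estimated_rate(coeffs, pressure):
        """Evaluate the rate polynomial at the measured pressure (Horner's rule)."""
        rate = 0.0
        for c in coeffs:
            rate = rate * pressure + c
        return rate

    def shortfalls(latest_pressures):
        """Yield (well, estimated rate, potential) for underperforming wells."""
        for well, (coeffs, potential) in WELLS.items():
            rate = estimated_rate(coeffs, latest_pressures.get(well, 0.0))
            if rate < THRESHOLD * potential:
                yield well, rate, potential

    for well, rate, potential in shortfalls({"A-01": 55.0, "A-02": 60.0}):
        print("shortfall on %s: %.1f vs potential %.1f" % (well, rate, potential))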


2011 ECIM Data Management, Haugesund

Upstream data event heard from IBM info-guru Sunil Soares on data governance, Spring Energy on data management in small oils and Statoil’s quest for an information architecture and ‘big data.’

ECIM retained its claim as the premier E&P data management forum with a full house of over 300 attendees—some 35% from outside Norway. In an entertaining keynote, Eirik Newth, who describes himself as a 'futurologist,' focused on mobile computing and social media. Mobile computing today is at the same stage as the PC market was 20 years ago. While games like Angry Birds and Word Feud are 'huge productivity hogs,' there is as yet nothing revolutionary in mobile, 'but that will come.' One problem is the lock-in that vendors' 'honey traps' bring—such as Apple's App Store. Industry needs fewer 'killer apps' and more standards like HTML 5. This should let users access content across multiple devices. Facebook has 'defined social media' and demonstrates strong synergy with mobile computing. Facebook's 'product' is its customers, i.e. you! FB is a very futuristic company. This can be good and bad. Facebook may have played a role in the Arab spring, but it can be used against people by the secret services and police forces. Newth left Facebook six months ago. You should be 'very wary!'

Lars Gaseby (Shell) kicked off the proceedings arguing that it is better to get your data into shape early in the life of a field. If you wait, the well count and data volumes will rise and you will be competing for resources with other activities when production is declining and costs are squeezed. What should you be doing about your data? You need to manage it and assure quality by fixing defects. How? These are the questions that ECIM seeks to answer.

IBM information management guru Sunil Soares observed that 'governance' has achieved buzzword status. What is 'governance?' When Soares was writing the blurb for his latest book on the topic (www.oilit.com/links/1109_38), his publisher suggested he 'write something catchy.' In fact 'no one agrees what it is,' but it is 'really about quality.' For a large organization, the question 'how many employees do you have?' may be hard to answer. You may have to look at different systems (SAP, badging, HR). But then there is the question of what is an employee? Staff, contingent, supplemental? Governance involves the formulation of policies for information and data management (considered a 'technical issue'). Governance in E&P may mean aligning multiple definitions of a well to 'optimize the information asset.' Governance spans data quality, standard names for equipment and assets, accounting, HSE and regulatory compliance. Metadata is the key to tying information across sources via consistent definitions.

Soares' discourse morphed into enterprise IT, data stewards and an envisaged data governance manager 'sitting in IT.' The RACI matrix (www.oilit.com/links/1109_39) also got a mention. Comment—Soares' talk was, to an extent, subliminal advocacy for IBM's 'Infosphere' master data platform. Now who might be best placed to implement such?

Johan Kinck of Spring Energy described the E&P data landscape from the viewpoint of a small oil company. Smaller companies' IT/DM function requires a 'jack of all trades.' Typically, key geotechnical IT is managed in house, with help from hosted services such as Petrobank, L2S, NPD etc. 'Commodity' IT is outsourced. The application environment will likely include tools like Petrel, SMT and FFA, possibly linked together with OpenSpirit. Data quality is just as important an issue for small companies as it is for the majors. Imperfect E&P data standardization is problematic, as is tool diversity. A complex IT landscape spans Linux, SQL, Oracle, Access and various flavors of Windows. DM capability from vendors is 'uncertain.' Vendors are not 'evil' although they do have their own agenda. 'Sometimes we feel a bit neglected.' Small companies can rarely afford high-end solutions and avoid large investments in hardware or tools. Spring, along with other smaller Norwegian operators, has set up 'Information Management Oslo' to share insight and experience.

Statoil's Eldar Bjorge has seen a lot of data management initiatives—from Geoshare, DAEX and POSC to PPDM. These tend to focus on data modeling rather than information management. Statoil's SCORE project was about an enterprise architecture before the term was invented. If the oil industry was a leader a decade ago, other verticals have caught up and brought new concepts and powerful resources to play. EIM is the subject of many publications and reports from consulting houses. Bjorge noted contributions from Accenture, Forrester Research, DAMA, Schlumberger and The Open Group. While E&P is different, 'we have something to learn and something to share.' Statoil has used the DAMA data dictionary as the basis for its own data management Wiki. Statoil's IM functions and domains have been mapped into a DAMA-esque framework. Statoil now has an IM governance framework in place and an enterprise data model based on the DAMA framework.

A follow-up presentation from Liv Stordahl Borud fleshed out Bjorge's presentation, revealing that Statoil is currently rolling out Software AG's Aris tool (www.oilit.com/links/1109_32) to underpin its information architecture. A master data pilot worked in parallel with the EA project, providing input on the information layer in the architecture. Initial IM maps and catalogs are now being migrated from Excel to Aris. The information model includes subject areas and related processes, stewards, data sources and master storage.

Lars Olav Grøvik offered an entertaining take on the current state of data management, which remains a pain point for many operators. One joint venture reported real oil production from a non-existent well. There may be different numbers of wells in different systems. Looking for data still delays projects and it is the same for most large oil companies. Re-determination and unitization should be easy with modern information systems but they are not. There are always errors in the unitization database. Departments initiate 'shadow' data systems because of the challenges of corporate IT, leading to reporting and compliance issues. A recent communication from NPD, the Norwegian regulator, said that no Norwegian operators were in total compliance! Statoil was in the mid to upper range. The good news is that things are still working despite the data explosion. The not so good news is that data is still 'exploding,' with even more data generated by permanent seismic arrays and petabytes of seismic on disk. New uses for data mean that we need to break the link between databases and applications. Grøvik sees possible salvation from the 'big data' solutions deployed at Yahoo, eBay and others.


Semantic Days 2011, Oslo

Integrated operations in the high north update. Statoil’s new environmental monitoring solution. Model-driven access to OPC. IBM’s ‘safe technology roadmap.’ EPSIS’ oil platform topology.

In June, the annual 'Semantic Days' conference was held in Oslo, Norway. Semantic Days is billed as a 'meeting place for industrial and public sector use of semantic web technologies.' The conference included a special session on Norway's richly-endowed Integrated Operations in the High North (IOHN—Oil ITJ March 2010) project, one of the industry's most ambitious test beds for semweb technology.

Karl Johnny Hersvik traced Statoil’s long march on the interop road with earlier initiatives of an ‘integrated information platform,’ ‘global operation data integration’ (GODI) and now, IOHN, real time environmental monitoring and ‘open data for innovation.’

Statoil's Vidar Hepsoe unveiled the 'new collaboration model' for environmental monitoring. Complex operations in the high north mandate a paradigm shift from 'expeditionary' offline sampling to continuous monitoring. Operators need to demonstrate that they are not harming sensitive cold water coral structures with real time monitoring of seawater chemistry and video. This means blending data from echo sounders (fish radar), cameras, sonar Doppler current profilers, hydrophones and hydrocarbon sniffers. This is a multi-disciplinary, multi-system activity with 'disparate systems and non standard, point solutions.' The solution? System integration based on semantic technologies, along with ontologies and modeling rules that 'understand' what 'environmental impact' means in different contexts. Semantics and ontologies promise shared understanding, improved processes and better collaboration.

Trond Solberg, solutions architect for plant integration with Statoil, described data modeling in industrial domains. A tender following the GODI program resulted in the selection of IBM's information integration core (IIC) solution. IIC provides 'model-driven' access to OPC connectors into the IBM Infosphere service bus. Statoil is currently piloting the technology at four assets. The idea is to have enterprise-wide access to plant and equipment related data, through standardized information models feeding data from diverse sources into end-user applications. Solberg advocates a 'bottom up' approach, focusing on relevant use-cases and assessing what equipment attributes are needed. 'Decide what these are and what they should be called. Model them and nothing else. But keep in mind that you will have to expand the model later so don't do anything that will restrict expansion.' No single standard provides all the required functionality. Solberg concluded enigmatically that 'an integration layer is not always the correct answer to a given problem.'

IBM's Ron Montgomery compared the 'walled garden' of proprietary operational eco-systems with the open source movement. Open source may appear 'chaotic' with uncertain support. But it can be the key to unlocking the garden gate—as a 'new industry solutions model where systems of systems interoperate in an industry eco-system based on open, supplier neutral standards.' Enter IBM's 'safe technology roadmap,' a combined 'reference and execution environment' spanning OpenO&M, Mimosa and ISO 15926. The system is being tested at the Northwest Upgrading/Redwater Partnership tar sands project. Engineering consultant Assetricity provides a range of OpenO&M tools to the initiative including a 'model-based information transformation engine' to map between OpenO&M, ISO 15926 and the information models of historians and control systems. Ultimately use cases will include field maintenance, asset configuration updates and semi-automatic triggering of condition-based maintenance.

Project manager Frédéric Verhelst kicked off the IOHN session. IOHN is a four-year, $15 million project that is to complete next year. IOHN turns on the 'application of semantic models in ISO 15926' for 'proactive monitoring and management of production critical sub-systems in collaboration with external expert centers.' Two proofs of concept are underway. One, sand control, leverages 'data standardization and abstraction of domain knowledge using a semantic model.' The other, erosion monitoring, applies 'autonomous decision making using a knowledge model based on an ontology.'

Baard Henning Tvedt (EPSIS) walked through the process of building a topological model of an oil platform, a hub linking a reference data library, production accounting (via SQL) and other data sources (via SPARQL). He concluded, perhaps unsurprisingly, that 'the graphical (UML) modeling approach better included domain experts than the RDF triples,' but that the Visio UML modeling tool was 'unsuited to automated model transformation.' Those with a strong constitution may like to check out more ISO 15926 presentations on www.oilit.com/links/1109_9.
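To give a flavor of what SPARQL access to a topology model looks like, here is a minimal Python sketch using the rdflib library. The 'feeds' vocabulary and node names are invented for illustration; IOHN's actual models are built on ISO 15926, which is considerably richer.

    from rdflib import Graph, Namespace

    EX = Namespace("http://example.org/platform#")
    g = Graph()
    # A three-node toy topology: well -> separator -> export pump.
    g.add((EX.wellA, EX.feeds, EX.separator1))
    g.add((EX.separator1, EX.feeds, EX.exportPump))

    # Find everything downstream of wellA via a transitive property path.
    query = """
        PREFIX ex: <http://example.org/platform#>
        SELECT ?downstream WHERE { ex:wellA ex:feeds+ ?downstream }
    """
    for row in g.query(query):
        print(row.downstream)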


Fuse, Western Refining, ConocoPhillips on Teradata

Fuse demos seismic on Teradata SRDS. Teradata user conference to highlight oil and gas use cases.

At ECIM, Fuse was showing its XSeis/Streamline seismic data server running on Teradata, providing access to a full trace database. 'Regular' SQL provides random access to inline, crossline and (real soon now) horizon slices. This flavor of Streamline leverages Teradata's spatio-temporal registered data structure (SRDS), the subject of a technical presentation at last year's American Geophysical Union fall meeting (www.oilit.com/links/1109_30). The system can also perform trace-based processing and attribute analysis. All of which 'could be done in Oracle but would lack Teradata's scalability,' as demonstrated in flagship client eBay's 40 petabyte system. Fuse has also developed a Petrel plug-in for XSeis to stream data from any data source.
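By way of illustration, random trace access of the kind described might look something like the following Python/ODBC sketch. The DSN, table and column names are our own assumptions; Fuse's actual SRDS schema has not been published.

    import pyodbc  # assumes a configured Teradata ODBC data source

    conn = pyodbc.connect("DSN=teradata_seis")
    cur = conn.cursor()
    # Random access to a single crossline of a 3D survey.
    cur.execute("""
        SELECT inline_no, xline_no, trace_samples
        FROM seismic_traces
        WHERE survey_id = ? AND xline_no = ?
        ORDER BY inline_no
    """, ("GOM_2011", 1042))
    for inline_no, xline_no, samples in cur.fetchall():
        pass  # hand each trace off to processing or attribute analysis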

Teradata got a vicarious plug in Lars Olav Grøvik's keynote. Grøvik bemoaned the current state of play in upstream data and implied that things were 'better at eBay,' a large Teradata shop. Other oil and gas Teradata references were unveiled in the agenda (www.oilit.com/links/1109_31) of the Teradata user conference to be held next month in San Diego. Presenters include Western Refining CIO Blake Larsen and Paul Kissell, VP IT at ConocoPhillips' Eagle Ford/Integrated Operations of the Future unit. Kissell is to discuss the ConocoPhillips and Teradata 'journey' to turn data into information that can be used throughout the organization to impact real-time operational decisions. More from www.oilit.com/links/1109_33 (Fuse) and www.oilit.com/links/1109_34 (Teradata).


Folks, facts, orgs ...

ITF, Absoft, Aker, Altair, Baker Hughes, CiSoft, UtiliPoint, Decision Strategies, Dow, eLynx, ENGlobal, UT, MIT, IFP, ISS, OFS Portal, Oiltanking Partners, Pioneer, PwC, Select Energy Services, Senergy, Schlumberger, Spectraseis, Oracle, SMT, Westheimer.

ITF is inviting proposals for R&D into non-conventional oil reservoirs—www.oilit.com/links/1109_37.

Aberdeen-based software specialist Absoft has recruited Eric McAdam as head of professional services.

Aker Solutions has appointed Bengt Larssen as head of its geo business. Larssen was previously with Western Geco.

Mauro Guglielminotti has been appointed General Director for Southern Europe and Africa of Altair Engineering France.

Andy Sallis is now president of AMEC Oil & Gas Americas.

Martin Craighead is to succeed Chad Deaton as CEO and President of Baker Hughes in January 2012. Darrell Howard has joined the company as president, integrated operations. Howard hails from VICO Indonesia, a BP/Eni joint venture.

CiSoft has appointed Cauligi Raghavendra vice dean for global academic initiatives and Joe Qin as vice dean for academic initiatives, Asia.

Patrick Reames has rejoined UtiliPoint division CommodityPoint as analyst and MD for the Americas.

Patrick Leach has been named CEO of Decision Strategies replacing Chris Reinsvold who has resigned. Steve Jacobs is the new COO.

Jim Fitterling, president of Dow's corporate development and hydrocarbons, is to assume executive oversight of the Chemicals and Energy Division.

Gary Tootle heads up eLynx Technologies' first international office in Calgary.

ENGlobal has appointed Dennis Pisula as Senior VP, business development. He hails from WorleyParsons.

The University of Texas and MIT are seeking funding for a new ‘research center for environmental protection at hydrocarbon energy production frontiers.’ REEF will be co-located at UT’s campus in Austin and MIT.

Jean-Jacques Lacour has been appointed Director of Strategy for French Petroleum Institute IFP Energies Nouvelles.

Shane Attwell, founder and MD of ISS Group, has stepped down as MD and is replaced by CEO Richard Pang.

Weatherford International’s president of IT, Michael Dove, has joined the OFS Portal board.

Former TransMontaigne Partners CEO Randall Larson has been appointed to the board of Oiltanking Partners.

Current President and CEO of Pacific Star Energy, Ken Thompson, has been appointed to Pioneer Natural Resources’ Board of Directors.

PwC has appointed its former US Chief Diversity Officer Niloufar Molavi as US Energy Leader, heading up its Houston office. The company has also promoted Craig Friou and Jade Walle to partner in its energy practice.

Select Energy Services has named Brady Crouch VP HR & HSSE. Crouch was formerly with ConocoPhillips.

Stuart McAuley and Murray Douglas head up Senergy's new facilities engineering and project delivery unit, Senergy Development Solutions. McAuley previously founded HPHT Solutions. Douglas hails from AMEC.

Schlumberger Information Solutions has hired Anderson Lopes as data management specialist, Olorunsola Abdul as reservoir engineer, Oghenevwoke Ighure as market analyst, Dorcas Karikari as reservoir engineer and Christophe Lallement as software engineer.

Todd Chuckry heads up Spectraseis' new Calgary office. Rob Oulton is microseismic sales and marketing representative.

Jay Hollingsworth is director of Oracle’s oil and gas industry business unit. He was previously with Schlumberger.

Andy James is now director of software development at Seismic Micro-Technology.

Tom Ripley is now CTO at Westheimer Energy Consultants.


Done Deals

Cameron, LeTourneau, Joy Global, Dell, Force10, Emerson, Net Safety Monitoring, SensorTran, Halliburton, HP, Autonomy, Lawson Software, Approva, Intertek, QinetiQ, Lufkin, Pentagon Optimization Services, Nova Metrix, Sensornet, Tendeka, SGI, OpenCFD, Spectraseis, VSG, Noesis.

Cameron is to buy LeTourneau Technologies' drilling systems and offshore products divisions from Joy Global for approximately $375 million in cash.

Dell is to acquire data center networking company Force10 Networks.

Emerson Process Management has acquired Calgary-based Net Safety Monitoring, a provider of toxic and combustible gas detectors, flame detectors, and safety systems.

Distributed temperature sensing (DTS) systems producer SensorTran has become a wholly-owned subsidiary of Halliburton. The DTS business will become a component of Halliburton's fiber optic center of excellence in Houston.

HP is to acquire enterprise information management software house Autonomy Corporation for £25.50 per share in cash.

Infor affiliate Lawson Software Americas has acquired continuous controls monitoring software developer Approva.

Safety solutions provider Intertek has acquired the QinetiQ fuels and lubricants testing business.

Lufkin Industries is to buy Canadian well optimization company Pentagon Optimization Services.

Nova Metrix has completed the acquisition of UK based asset monitoring solutions provider Sensornet. Sensornet’s in-well oil and gas business is being retained by its former owner, Tendeka.

SGI has completed its acquisition of open source computational fluid dynamics company OpenCFD.

Spectraseis has received new investment of CHF 2.8 million from SVC-Ltd, a wholly owned Credit Suisse Group unit.

Visualization Sciences Group is to acquire image processing and analysis software provider Noesis.


Wireless World

Fern Communications for Qatargas. M2M and Skywave. CalAmp and Matrikon. RigNet and Intelsat.

Qatargas is deploying 'FRX-1' radio repeaters from UK-based Fern Communications. The system has solved a problem created during loading operations, when tankers were blocking line of sight communications. The units are now standard components of Qatargas' onboard communications equipment. More from www.oilit.com/links/1109_20.

M2M Data Corp. now offers remote oilfield monitoring services through SkyWave’s IsatData Pro solution, a 10kB packet data service over the Inmarsat global satellite service. More from www.oilit.com/links/1109_21 (M2M) and www.oilit.com/links/1109_24 (SkyWave).

CalAmp has joined MatrikonOPC's global partner network. CalAmp provides mission-critical communications using licensed, spread spectrum and cellular technology. CalAmp will integrate Matrikon's OPC servers into its mission critical wireless solutions. More from www.oilit.com/links/1109_22.

Oil and gas communications service provider RigNet has awarded contracts to Intelsat for new satellite capacity. The service hike targets the Middle East, the Gulf of Mexico and other locations. IntelsatOne will provide connectivity to head offices via Intelsat's teleport and terrestrial facilities. More from www.oilit.com/links/1109_23.

A new wireless broadband standard promises to ‘open rural America to the internet.’ More on the IEEE 802.22-2011 ‘multipoint’ WAN from www.oilit.com/links/1109_25.


2011 Cyberinfrastructure Summer Institute for Geoscientists

Caltech shows impact of ‘big data and big computing’ with spectacular whole earth models.

Speaking at CSIG'11, the 'Big Data and Big Computing in the Geosciences' event, Michael Gurnis, director of Caltech's seismological laboratory, presented an overview of 3D and 4D dynamic earth models. These are used to study the 'large scale space time pattern of the geological record, drivers of sea level change and the forces behind plate tectonics.' Such processes are best understood by connecting disparate observations in a 4D dynamic earth model—going beyond plate tectonics to model the whole earth from core to surface in a time lapse, 4D supermodel. This, as one can imagine, raises several software and computational 'issues.' Gurnis reviewed the state of play, with academic efforts such as the GPlates (www.oilit.com/links/1109_26) Python framework for plate visualization, CitcomS (www.oilit.com/links/1109_27) and Rhea/p4est (www.oilit.com/links/1109_29).

The 80 megabyte PowerPoint is a free download (www.oilit.com/links/1109_28) and provides some great imagery from these models. Caltech's own 'Rhea' model is an ambitious attempt to perform full physics modeling across the whole earth. Rhea uses adaptive mesh refinement (AMR) to adapt model resolution to local complexity. An initial 22 million cell mesh translated into a 65 million cell model that took 15 hours to run on the Texas Advanced Computing Center's 6,000-core 'Ranger' cluster. More from CSIG on www.oilit.com/links/1109_27.


Visage shows analytics on Geologic Systems' dataset

Bertrand Groulx and Nathalie Saint Hilaire investigate Montney natural gas and Bakken oil shale.

In a series of recent postings, Visage Information Solutions president Bertrand Groulx and Geologic Systems' blogger Nathalie Saint Hilaire have been showing off Visage's 'visual analytics' capability using Geologic's hosted Canadian well data. Visage provides an infrastructure layer atop public and in-house systems from third-party vendors through its 'Dynamic ETL Technology,' which assembles information on the fly and offers 'in-memory' data processing.

Combining Geologic's data with other sources has provided insights into production from British Columbia's prolific Montney natural gas province. Plotting production against horizontal well azimuth shows the strong influence of direction on production. Spatial plots show the Montney 'sweet spots' and allow further drill down to investigate the influence of frac count and spacing on productivity. The ad-hoc investigative nature of Visage lends itself to interactive analysis—such as understanding the impact of a gas processing plant shutdown and the influence of gas price on activity. Groulx has also investigated the Bakken oil shale play of SE Saskatchewan. Early indications are that initial 2011 production rates are 10% up on 2010. More from www.oilit.com/links/1109_6 (Visage) and www.oilit.com/links/1109_7 (Geologic).
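For those who want to try this at home, an azimuth-versus-production analysis of the sort Groulx describes boils down to a few lines of Python/pandas. The column names, sample values and 30 degree binning are illustrative; Visage's in-memory engine is not shown.

    import pandas as pd

    wells = pd.DataFrame({
        "azimuth": [10, 95, 100, 170, 85, 12],    # horizontal well azimuth, degrees
        "ip90": [210, 480, 455, 230, 500, 190],   # 90-day initial production
    })
    # Bin wells into 30 degree azimuth sectors and compare average rates.
    sector = pd.cut(wells["azimuth"], bins=list(range(0, 211, 30)))
    print(wells.groupby(sector)["ip90"].mean())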


BP plans Houston real time operations center

Company pledges revised drilling standards to Boemre.

In a letter to Michael Bromwich, director of the US regulator Boemre, BP E&P outlined revised deepwater oil and gas drilling standards for its Gulf of Mexico operations. The ‘voluntary’ standards go beyond regulatory obligations and reflect the company’s ‘determination’ to apply lessons it learned from the Deepwater Horizon accident. The new standards address blowout preventer design and testing, cement testing and enhanced spill response planning. In addition to the new performance standards, BP has also implemented several actions including the establishment of a real-time drilling operations center in Houston. The company is also to share its experience of the use of remotely operated vehicles and management of marine response vessels and activities. Download BP’s letter to Boemre from www.oilit.com/links/1109_28.


Sales, contracts, partnerships and deployments

Kongsberg Oil & Gas, IDS, Energy Solutions, ABB, KBR, Assai, McLaren Software, CJV, Dow, Saudi Aramco, Emerson Process Management, Rockwell Automation, Honeywell, SAP, Industrial Defender, CoreTrace, Kinesix, Opsens, Paradigm, Acceleware, Meridium, Sintef, TNO, IFP, Hart, VSG.

Kongsberg Oil & Gas Technologies and Independent Data Services are teaming on a reporting platform to merge real-time and offline oilfield data for web delivery.

MOL unit Slovnaft is to deploy Infotechnics’ ‘Opralog’ shift logging software at its Bratislava refinery.

Energy Solutions International is supplying a gas management system to Petronas’ gas transmission pipeline project. Solution delivery will be executed by local agent Sedia Teguh Sdn Bhd.

GDF Suez unit Storengy has awarded ABB a main automation and instrumentation contract for a new natural gas storage facility in Cheshire, UK. ABB will provide engineering design services for the automation, safety instrumented system and 800xA integrated process control and safety systems.

KBR has won a pre-Feed contract with Anadarko Mozambique for its proposed LNG plant in Mozambique. KBR has also won an engineering design and procurement support services contract from Hyundai Heavy Industries for the BP Quad 204 FPSO Project west of Shetland and a contract from BP Norge for engineering studies for the Hod revamp.

EnQuest has chosen AssaiDCMS and AssaiNET as its corporate document control and management system.

McLaren Software has been awarded a $1.4 million contract by Chevron Australia for its Enterprise Engineer software and services on the Western Australia Gorgon Project.

Chiyoda, CB&I and Saipem joint venture CJV has been awarded a Feed contract for the Arrow LNG project on Curtis Island, Queensland, Australia.

Dow and Saudi Aramco are forming a joint venture, Sadara Chemical Company, to build and operate a chemicals complex in Jubail Industrial City, Saudi Arabia. Total project value is $20 billion.

FH Tank Storage is using Emerson Process Management’s Rosemount wireless level and pressure transmitters to provide overspill protection at its Kalmar storage terminal on the east coast of Sweden. The deal includes a plant-wide Smart Wireless network, DeltaV digital automation system and AMS Suite predictive maintenance software to automate tank storage level monitoring.

Canadian Grizzly Oil Sands has awarded a $4 million order for Rockwell Automation’s PlantPAx process automation system for its Algar Lake Project.

Honeywell and SAP are teaming to combine Honeywell’s production planning and scheduling tools with SAP’s ERP toolset.

Industrial Defender has signed an exclusive deal with CoreTrace to market its ‘Bouncer 6’ host intrusion prevention system to the oil and gas market.

Invensys has requested new licenses for Kinesix’ ‘Sammi’ graphics application-development suite for a ‘large Petrochemical Corporation’ in Beijing.

Opsens has received an order from a major Alberta oil producer to instrument 21 wells using its OPP-W fiber optic pressure sensors.

Fugro has selected the Paradigm/Acceleware reverse time migration solution for depth imaging of complex geologies.

Gladstone LNG has selected Meridium as the platform for its enterprise-wide asset performance management (APM) initiative.

Norway’s Sintef, TNO in the Netherlands and IFP Energies Nouvelles (France) are teaming on a new ‘Tri4CCS Alliance’ which aims to make the capture, transport and storage of CO2 safer and more cost-effective.

TractBuilder has teamed with Hart Energy to provide GIS datasets for the energy sector. These include natural gas, crude oil, wells, products and more.

Visualization Sciences Group has signed a global partnership with Schlumberger for expanded integration of VSG’s Open Inventor 3D graphics technology across the whole range of Schlumberger software products.


Standards Stuff

Boemre embeds OGP spec. ECCMA announces data quality summit. Energistics’ geographic metadata profile. IFIT releases IT glossaries. Wellstorm’s Ruby Witsml tools. CO-LaN new release.

Boemre is to incorporate large sections of the new drilling hazard site survey guidelines produced by the OGP Geomatics Committee into its regulations.

ECCMA’s 12th annual data quality solutions summit will be held in Tannersville, Pennsylvania on October 25th-27th. The summit focuses on implementation of the ISO 8000 standard—www.oilit.com/links/1109_40.

Energistics’ energy industry profile of ISO/DIS 19115-1 v1.0 release candidate (EIP) is now available for community review. The EIP geographic metadata standard is designed to enable the ‘discovery, evaluation and retrieval of distributed information resources, including datasets, documents and physical resources’—www.oilit.com/links/1109_41.
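
For orientation, the sketch below is our own schematic, not the EIP or ISO 19115 itself: the field names are simplified stand-ins for the standard’s actual element names. It shows, in Python, the kind of geographic ‘discovery metadata’ record such a profile standardizes, and how a catalog might filter records on spatial extent.

# Schematic only: illustrates the kind of geographic 'discovery
# metadata' that profiles of ISO 19115 standardize. Field names are
# simplified stand-ins, not the EIP's or ISO's actual element names.
dataset_metadata = {
    "title": "Demo 3D seismic survey",           # what the resource is
    "abstract": "Post-stack depth migrated volume, 2010 reprocessing.",
    "resource_type": "dataset",                   # dataset, document, physical...
    "geographic_extent": {                        # bounding box for spatial search
        "west": -90.5, "east": -89.8,
        "south": 27.1, "north": 27.6,
    },
    "reference_system": "EPSG:4326",              # CRS of the extent
    "contact": "data-manager@example.com",        # who to ask for access
    "online_resource": "https://example.com/archive/demo-survey",
}

# A catalog indexing such records lets users 'discover, evaluate and
# retrieve' resources, e.g. by filtering on geographic extent:
def intersects(md, west, east, south, north):
    e = md["geographic_extent"]
    return not (e["east"] < west or e["west"] > east
                or e["north"] < south or e["south"] > north)

print(intersects(dataset_metadata, -91.0, -90.0, 27.0, 28.0))  # True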

The International Foundation for Information Technology has released a number of new taxonomies including a ‘glossary of glossaries,’ terms and definitions. A software taxonomy covers enterprise and IT software—www.oilit.com/links/1109_42.

Wellstorm is sharing its Ruby software library for Witsml. The library is used in tests and simulations and comes as a Witsml runtime and command line tool—www.oilit.com/links/1109_43.
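
By way of illustration, the following Python sketch (our own, not Wellstorm’s Ruby code; the sample document and its values are invented) parses the curve mnemonics and data rows of a Witsml 1.3.1-style log object using only the standard library.

# Invented sample in the shape of a Witsml 1.3.1 log object; real
# documents are returned by a Witsml store server.
import xml.etree.ElementTree as ET

SAMPLE = """<logs xmlns="http://www.witsml.org/schemas/131">
  <log uidWell="W-1" uidWellbore="WB-1" uid="L-1">
    <logCurveInfo uid="DEPT"><mnemonic>DEPT</mnemonic><unit>m</unit></logCurveInfo>
    <logCurveInfo uid="ROP"><mnemonic>ROP</mnemonic><unit>m/h</unit></logCurveInfo>
    <logData>
      <data>1500.0,24.3</data>
      <data>1500.5,22.1</data>
    </logData>
  </log>
</logs>"""

NS = {"w": "http://www.witsml.org/schemas/131"}
log = ET.fromstring(SAMPLE).find("w:log", NS)

# Curve names come from logCurveInfo; rows are comma-separated strings.
mnemonics = [c.findtext("w:mnemonic", namespaces=NS)
             for c in log.findall("w:logCurveInfo", NS)]
rows = [dict(zip(mnemonics, d.text.split(",")))
        for d in log.findall("w:logData/w:data", NS)]
print(rows)  # [{'DEPT': '1500.0', 'ROP': '24.3'}, {'DEPT': '1500.5', 'ROP': '22.1'}]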

Piping systems research & engineering company NTP Truboprovod OOO has joined the CO-LaN board. CO-LaN has also released a new thermodynamic and physical properties interface specification, v1.1.


TA Cook Research on turnaround/shutdown software

Review finds lost production ‘opportunity costs’ of turnarounds exceed the cost of the turnaround itself.

A 118-page review by TA Cook Research investigates scheduling practices for turnarounds and shutdowns. The report includes interviews and contributed articles from industry thought leaders. To give some examples, Martin Karges (Voith) offers a case history of the 2011 conversion of a catalytic unit at Total’s Antwerp refinery. TA Cook’s own Gert Müller writes on advanced schedule optimization. Tectura’s Udo Ramin reports on shutdown management with Microsoft Project Server. Tim Taylor, McLaren Software, discusses document management and control in support of ‘successful and efficient plant shutdowns and turnarounds.’

Taylor observed that lost production during process industry turnarounds can seriously impact the bottom line. Such ‘opportunity costs’ often exceed the labor and material costs of the turnaround itself, which highlights the value of planning, scheduling and procurement. Also key is good document management in support of planning: as-built documentation needs to be revised and new documentation created prior to shutdown. After the intervention, new documentation must be validated before use by operations.

While there is significant marketing content in the study, the report does provide up-to-date information on turnaround and project management and offers a good introduction to the major players in the EU. The study is in two parts. Part I: Current status & market insights (117 pages) is a free download. Part II: Best practices and potential (121 pages) costs €2,200. More from www.oilit.com/links/1109_16.


Baird blogs on PPDM in a Box

Geologic Systems’ pre-populated PPDM implementation now ready for ‘at scale’ testing.

In a recent posting, Wes Baird describes the state of play on GeoLogic Systems’ ‘PPDM in a Box’ (PiaB—Oil ITJ December 2009). PiaB is a pre-populated PPDM 3.8 implementation including master data, asset lifecycle and business process management. PiaB targets start-ups, ‘green field’ developments, agencies and national oil companies. Turning the bare bones of the PPDM data model into an industrial-strength product that includes Sarbanes-Oxley compliant data audit across 1,700 tables requires a considerable amount of extra work. The latest ‘preliminary build’ release of PiaB (V1.2) relies on database triggers to ensure coherency, since a single update may touch multiple columns in multiple tables. Triggers ‘mindlessly and consistently perform these simple tasks over and over again,’ although they ‘can be a maintenance nightmare’ and ‘need a carefully planned architecture’ to work.
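
A toy example may help fix ideas. The sketch below is ours, not GeoLogic’s: the table and column names are invented rather than PPDM’s, and it runs on SQLite so as to be self-contained, where PiaB itself targets Oracle and SQL Server. It shows a trigger ‘mindlessly and consistently’ stamping and journaling every well name change.

# Illustrative audit trigger in the spirit of the approach described
# above. Table/column names (well, well_audit) are invented, not PPDM's.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE well (
  uwi TEXT PRIMARY KEY,
  well_name TEXT,
  row_changed_date TEXT);
CREATE TABLE well_audit (
  uwi TEXT, old_name TEXT, new_name TEXT, changed_at TEXT);

-- The trigger performs its simple tasks over and over again: stamp
-- the row and journal the change, so application code cannot forget.
CREATE TRIGGER trg_well_audit AFTER UPDATE OF well_name ON well
BEGIN
  UPDATE well SET row_changed_date = datetime('now')
   WHERE uwi = NEW.uwi;
  INSERT INTO well_audit
   VALUES (NEW.uwi, OLD.well_name, NEW.well_name, datetime('now'));
END;
""")

con.execute("INSERT INTO well (uwi, well_name) VALUES ('100/01-01', 'Discovery 1')")
con.execute("UPDATE well SET well_name = 'Discovery 1A' WHERE uwi = '100/01-01'")
print(con.execute("SELECT * FROM well_audit").fetchall())
# [('100/01-01', 'Discovery 1', 'Discovery 1A', '2011-09-...')]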

Baird has created database triggers for both the Oracle and SQL Server flavors of PiaB; these have been running trouble-free for a couple of months. The next task is to test ‘at scale’ on GeoLogic’s full-size data set of ‘a couple of billion production rows.’ Baird concludes, ‘Database triggers save time and money and add consistency. They just need some care and feeding and architecting to make them add business value.’ More from www.oilit.com/links/1109_18.


RPSEA—government largesse for US shale gas explorers

‘And the winner is ...’ funding for fracture technologies and AI applied to poroperm prediction.

The research partnership to secure energy for America (RPSEA) has announced several award candidates under its unconventional resources program and its small producer program. Candidates are invited to enter into negotiations prior to the award of a research contract from the US department of energy’s national energy technology lab. Proposals must provide a minimum of 20% cost share with up to 50% for field demonstration projects.

Unconventional subject areas include research on ‘non-contaminating’ cryogenic fracturing technology for shale and gas reservoirs (Colorado School of Mines and Carbo Ceramics); geomechanical analysis of fracturing (Texas A&M, TerraTek and Apex HiPoint); fracture monitoring by downhole temperature measurement (Texas A&M, Hess and Shell); permeability prediction in the Piceance (Colorado School of Mines, Houston Advanced Research Center).

Awards in the small producers program cover porosity and fluid prediction from mud logs and drilling information using artificial intelligence (Correlations Company) and technology for residual oil development in the Permian Basin (University of Texas of the Permian Basin, Advanced Resources International and others). More from www.oilit.com/links/1109_19.


Stonebridge ESB for P2ES environment

Enterprise service bus provides master data management for Excalibur, Bolo and other tools.

Oil and gas IT consultant Stonebridge has announced a master data management (MDM) solution leveraging data from P2 Energy Solutions’ upstream software suite comprising Excalibur, BOLO and Enterprise Essentials. Stonebridge’s methodology extends P2ES’ software to create MDM-based business intelligence deliverables including key performance indicator dashboards, mashups, spatial applications and visualization. P2ES director of product management Timothy Wadle said, ‘Stonebridge has developed a repeatable approach for creating MDM solutions that extend our software, collecting key well data into a single information source.’ The MDM solution extends a standard Excalibur deployment and lets companies combine data from daily field reports, reserves, regulatory, engineering and other functions. Data can be repurposed across the organization through workflow and other productivity-enhancing concepts. The solution offers a well data ‘gold standard,’ a.k.a. a ‘single version of the truth.’ More from www.oilit.com/links/1109_35.
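
The ‘single version of the truth’ idea can be sketched in a few lines of Python (a toy of our own devising: the source systems, precedence rules and field names are invented and bear no relation to Stonebridge’s actual implementation). Attributes are merged per well, with the most authoritative source winning each attribute.

# Toy master data management merge: source and field names invented.
from typing import Dict, List

# Higher rank wins when two sources disagree on an attribute.
PRECEDENCE = {"excalibur": 3, "bolo": 2, "field_report": 1}

def merge_well(records: List[Dict]) -> Dict:
    """Collapse per-source records for one well into a master record."""
    best: Dict[str, tuple] = {}          # attribute -> (rank, value)
    for rec in records:
        rank = PRECEDENCE[rec["source"]]
        for key, value in rec.items():
            if key == "source" or value is None:
                continue
            if key not in best or rank > best[key][0]:
                best[key] = (rank, value)
    return {k: v for k, (_, v) in best.items()}

records = [
    {"source": "field_report", "uwi": "100/01-01", "status": "producing"},
    {"source": "bolo",         "uwi": "100/01-01", "operator": "Acme Oil"},
    {"source": "excalibur",    "uwi": "100/01-01", "status": "shut in"},
]
print(merge_well(records))
# {'uwi': '100/01-01', 'status': 'shut in', 'operator': 'Acme Oil'}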


Well integrity management package for Tullow Oil

Expro Well Services’ ‘SafeWells’ package underpins global operations.

Tullow Oil’s ‘spectacular growth’ (Oil IT Journal March 2011) continues with a new wildcat success in French Guiana. The company is also extending its IT infrastructure with the global deployment of Expro Well Services’ (EWS) ‘SafeWells’ integrity management software. Expro’s Simon Copping told Oil IT Journal, ‘Work on SafeWells started six years ago with a successful implementation at Marathon Oil. Today clients include Marathon, Tullow, Ithaca, Star Energy, ConocoPhillips Indonesia, CNR and Taqa. We hold six-monthly user forums to align the tools with clients’ needs. SafeWells is flexible and easy to deploy and can be tailored to a company’s operating policies and procedures without the need for bespoke development.’ Tullow required a centrally managed integrity system to monitor its worldwide operations covering onshore and offshore wells including subsea completions. SafeWells was initially rolled out on Tullow’s Bangora gas field in Bangladesh and is now deployed on its UK and Ghana operations.

Tullow’s well integrity manager Simon Sparke said, ‘SafeWells is to be used throughout Tullow and is being rolled out company-wide. With ten different well types in challenging engineering and cultural environments, having a software package that was user friendly, flexible and working seamlessly in the background was crucial.’ Other EWS clients include BP, Centrica and Shell. More from www.oilit.com/links/1109_1.


Aberdeen Business School identifies safety ‘information gap’

Aveva-sponsored study finds poor training and lack of clarity in HSE procedures and documentation.

An Aveva-sponsored study by Professor Rita Marcella and Tracy Pirie of the Aberdeen Business School at Robert Gordon University has investigated the health and safety ‘information gap.’ The headlines from the report are reassuring: 92% of respondents reported that their company has an information system to support health and safety and 80% described these systems as ‘uniform and consistent across the organization.’ Over 80% of respondents felt that the systems providing core metrics for safety management ‘supported them in assessing and improving safety and in responding to an emergency.’

However, over 30% had never received training on how to access information needed to operate safely. Respondents found that accessing information was challenging due to systems failures, procedures not covering specific circumstances, filing issues, missing data, poor local communications infrastructure, complex systems and information overload. A similar proportion of respondents reported a lack of clarity in the safety information provided and the circumstances in which it should be used. 40% felt that information systems and information quality needed improvement. 35% were aware of incidents where they or their colleagues had not recorded information related to near misses.

Whilst respondents were typically confident that they were sharing information with others, 24% felt that relevant information was not being shared back! Respondents saw a need for more open communication across the industry of where things had gone wrong, and of what ‘best practice’ actually means.

The efficacy of corporate commitment to safety was also questioned. Some feel that, ‘despite the rhetoric,’ safety may still be secondary to compliance or reputational imperatives. Safety processes comply with regulations, but there is confusion over regulatory standards. Working for multiple clients, each with different systems and expectations as to how data should be presented, can be challenging and resource-intensive, particularly where environmental regulations and risk are concerned. Interviewees also highlighted a tension between the heightened demand for information during an incident and what many staff see as the burden and sheer scale of the documentation they handle day to day. More from www.oilit.com/links/1109_16.


‘Fracture predictive technology’ models cracks in Macondo riser

MIT’s lab floats ‘substantial’ research program in oil country tubular failure analysis.

Researchers from the Massachusetts Institute of Technology have used technology developed for automobile crash testing to investigate the broken riser pipe from the 2010 Deepwater Horizon accident. Tomasz Wierzbicki, professor and director of the MIT Impact and Crashworthiness Lab, believes that these simulations might help identify stronger or more flexible pipe materials that would minimize the impact of a future large-scale accident. Wierzbicki has developed a technique called ‘fracture predictive technology’ (FPT) through his work in car-crash safety testing. FPT combines physical experiments with computer simulations to predict the strength and behavior of materials under severe impacts.

The computer model of the riser, developed in a custom implementation of Dassault Systèmes’ Abaqus, included a simulation of the Deepwater Horizon blowout, the sinking of the rig and the bending moment on the riser. The predicted failure points closely matched imagery from the ROVs. The researchers believe that although no material could have survived the Deepwater Horizon disaster, many improvements could be made to enhance oil and gas tubulars.

Wierzbicki told Oil IT Journal, ‘About the time of the Macondo incident we received a small grant from Shell and we have invested some of our own resources to prepare a paper for the ISOPE conference held on Maui last June. Our presentation triggered interest from the oil and gas industry and we are now in contact with three majors. It is hard to predict the results of this initial excitement because old and traditional methods are deeply entrenched in this industry—as are the API standards. We are now working on a fundamental paper that will compare the advantages and disadvantages of the different approaches.’ A two-day fracture workshop will be held at MIT on October 6 and 7, 2011; participation is free. MIT is floating the idea of a ‘substantial research program supported by a JIP.’ More from www.oilit.com/links/1109_36.

