A presentation at the Fiatech conference, held in Austin, Texas earlier this year, underscored the gap between emerging data standards for process plant information management (IM) and the reality of today’s ‘giga-scale’ projects. Bob Donaho from Dow Chemical offered some IM lessons learned building the huge Ras Tanura Integrated Project (RTIP), a grassroots petrochemical joint venture between Dow and Saudi Aramco.
Ras Tanura, currently in the front end engineering design (FEED) phase, comprises some 25 process plants, each a major project in itself, with multiple project management teams and contractors located on two continents. IM is considered a strategic facet of RTIP as the project’s critical capabilities are addressed in the context of ‘marketplace constraints.’ The joint venture has selected various engineering ‘staples’ including Intergraph’s SmartPlant, Primavera, Kbase, ETAP and Microstation. Other ‘giga-project’ enabling tools include Active Risk Manager, Skire cost and project management, Adobe LiveCycle, RTIPSupplier (contracts) and HP Quality Center (requirements).
Donaho described how the project set out to find ‘best of breed’ owner or project manager tools but found that there weren’t any! RTIP set out to build one, with the idea of sending standardized specs out to contractors and receiving consistent data back. But the reality was that such an approach proved unsustainable in view of the number of contractors and tools involved.
The initial vision also called for ISO 15926-based data for ingestion by an as-yet unspecified joint venture system. Again this proved unrealistic, as it was impossible to locate a data delivery requirements document that was a) understandable by all and b) biddable by engineering contractors.
RTIP’s vision of engineering systems with rich/robust functionality was also thwarted by the fact that the industry ‘standard’ is Excel. Engineers think in lists—the simpler the better! Contractors will commit to using Excel but ‘freeze up’ when talking databases, where there are ‘too many unknowns and too much impact on internal processes.’ Donaho thinks that the industry could use a portal-type entry point for information regarding ISO 15926, along with a biddable delivery requirements document specifying ISO 15926 or a derivative. ‘Non-invasive’ data validation tools could ensure compliance with owners’ data delivery requirements. A common 3D review platform would be nice too.
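The ‘non-invasive’ validation Donaho calls for could be as simple as a script run against contractors’ spreadsheet exports, leaving their internal systems untouched. The sketch below is purely illustrative: the column names, rules and CSV layout are invented, and a real tool would validate tags against an ISO 15926 reference data library rather than ad-hoc checks.

```python
import csv
import io

# Hypothetical owner delivery requirements: required columns and
# simple per-column checks. A real ISO 15926 validator would map
# each tag to reference data library classes instead.
REQUIRED = {
    "tag": lambda v: bool(v.strip()),
    "description": lambda v: bool(v.strip()),
    "design_pressure_bar": lambda v: bool(v) and float(v) > 0,
}

def validate(csv_text):
    """Return a list of (row_number, column, value) failures."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = set(REQUIRED) - set(reader.fieldnames or [])
    if missing:
        return [(0, col, "<column missing>") for col in sorted(missing)]
    failures = []
    for i, row in enumerate(reader, start=2):  # row 1 is the header
        for col, check in REQUIRED.items():
            try:
                ok = check(row[col])
            except ValueError:  # e.g. non-numeric pressure
                ok = False
            if not ok:
                failures.append((i, col, row[col]))
    return failures

sample = "tag,description,design_pressure_bar\nP-101,Feed pump,16\nP-102,,abc\n"
print(validate(sample))
```

Because the script only reads a flat file that contractors already know how to produce, it sidesteps the ‘freeze up’ around databases while still giving the owner a compliance report.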
Another Fiatech presentation from Emerson’s Duane Toaves echoed the Dow findings, noting that the key common terminology section of ISO 15926 was only 15% complete. More from Fiatech in our report on page 7, on industry standards in Neil McNaughton’s editorial on page 3 and on Fiatech from www.fiatech.org.
A new report from the International Association of Oil & Gas Producers (OGP) confirms the old adage that ‘standards are like toothbrushes, everyone has one, but nobody wants to use yours.’ The report, ‘Regulators’ use of Standards,’ reflects use of standards in regulatory documentation, particularly for materials, equipment, systems and structures for the offshore petroleum industry. A comparison of standards usage across some 20 countries found 1,140 different standard titles cited, of which 87% were referenced by only one regulator! Standards emanated from no less than 60 different international, national and industry bodies.
Standards from the American Petroleum Institute dominate, with 225 references, including 49 from the API Manual of Petroleum Measurement. ISO follows with 152 of the regulatory standards referenced (59 from ISO/TC 67). Elsewhere, regulatory standards often overlap and duplicate each other—notably for offshore structures and pipelines. Most references are to undated revisions of standards. Compliance appears to be voluntary in most jurisdictions. The study prepares the ground for future OGP interaction with regulators via the International Regulators Forum (IRF) and will support further work in ISO/TC 67.
This month’s issue has a distinctly engineering flavor with a lead on the Dow/Saudi Aramco Ras Tanura ‘giga project’ and a report from the 2010 Fiatech conference. Fiatech and its Norwegian analog, the POSC-Caesar Association (PCA), seem to have drunk the ‘semantic web’ Kool-Aid to the full. The latest manifestation of the ISO 15926 engineering data standard has been used in several flagship offshore projects such as BP’s $4 billion Greater Plutonio development, Chevron’s Agbami FPSO, Petronas Carigali’s Information Management system and Woodside’s ALIS. What stands out from the 2010 Fiatech meet is the fact that semantic web technology has been applied to a major engineering problem—interoperability and handover. But as the Ras Tanura experience shows, it’s proving hard to wean engineers from Excel. There is however a lot of interest in data sharing between owner operators and engineering contractors, and this collaboration is probably more important than the protocol.
When I first started out writing about data management, someone observed that this was not likely to be a very interesting topic since, at the time (1996), E&P data management was pretty much a done deal—notably with the then new and authoritative POSC Epicentre data model. This did give me cause for thought—those embarking on a new venture are particularly sensitive to suggestions that their ideas or business models are anything but brilliant! But the topic seemed worthy of pursuit in a journalistic way if only because there was, at the time, a lot of politics involved, with competing data models from POSC and PPDM and vendor solutions—Landmark’s Petrobank and Schlumberger’s Finder were battling for world dominance. Plugging on and publishing the newsletter turned out to be a good call because, although data models may come and go, the vendors are still slugging it out and politics is alive and well as I will now relate.
The PNEC conferences (report on the recent 2010 edition next month) started back in 1998. Historically, PNEC has been an unofficial meeting point for the upstream standards organizations. It started out as the Geoshare user group and conference and has had a long time close relationship with PPDM and POSC (now Energistics). But Energistics was conspicuously absent from the 2010 PNEC and appears to be backing an ‘upstart’ data management event organized by the ‘World Research Group’ in September. While competition is a great thing, you have to ask whether the industry really needs yet another data show.
Getting back to the Journal, I like to think that our reports from PNEC (along with other significant conferences organized by SMi and IQPC) constitute something of a data management body of knowledge—most of which is in the public domain (all Oil IT Journal issues over a year old are freely available on oilit.com.)
I use the wording ‘data management body of knowledge’ advisedly. At the 2010 PNEC, Schlumberger’s Steve Hawtin introduced the Data Management Association (DAMA) to the upstream community. As assiduous readers of Oil IT Journal will know, we have been following DAMA since 2003, particularly when it mooted an E&P SIG (March 2008). DAMA’s current claim to fame is its Data Management Body of Knowledge (DMBoK), published last year, which we will review in the near future. Hawtin revealed that the Society of Petroleum Engineers is getting exercised about data management too, and is setting up to create, ex nihilo, an E&P DMBoK. We hope that we will be invited to contribute to this effort and are putting our ‘back catalog’ at the DMBoK’s disposal.
Meanwhile on the other side of the pond, more E&P data politicking is going on in respect of the PESGB’s Data Management initiative. This has been kept aloft by one enthusiastic PESGB member, Paul Duller, who had the temerity to create a PESGB Data Management group on LinkedIn. Well the powers that be did not like this and Paul has had to pull the PESGB tag from his group.
These goings on suggest to me that there is a goldilocks phenomenon that affects the relationship between the Societies and their members. If you do very little, then, surprise, surprise, nothing gets done. If you do a lot, it seems like the Societies get a bit iffy about what is perceived as ‘commercialism’ with consequences such as the above! Having been personally involved in a couple of Society-sponsored initiatives I have to say that in general, the effort that goes into them is a lot less than that which is put into your average ‘commercial’ venture. There is nothing quite like the thought of a meal at the end of the day to drive initiative—and there is nothing quite like a loosely assembled committee of folks with a daytime job to suck it out of the air!
Oh, I nearly forgot to mention ECIM—the Norwegian E&P data management body. ECIM is different. It is Norwegian after all. In Norway, if you need data you go to DISKOS. If you need data management, you go to ECIM. Simple isn’t it. Well almost, except that if you want subsurface data management it’s ECIM. If you want the topside data then you will have to check in with PCA!
So is E&P data management a ‘done deal’? In the May 2010 issue of E&P Magazine, Energistics CEO Randy Clark states ‘The E&P industry is still at the early stage of common standards development.’ This, 20 years after POSC’s founding. PCA’s ISO 15926 is likewise reported as being on the verge of at-scale deployment, 20 years after its predecessor, the Caesar Offshore Program, kicked off.
Like everyone, I’ve been following events in the Gulf of Mexico, in particular the live feed from the sea bed. The silent ballet of the ROVs with their Black & Deckers hacking away at the riser is mesmerizing. It is curious to think that on the surface the recriminations fly while on the seabed, robotic heroes are acting out a reprise of Apollo 13.
The inaugural International Digital Oilfield Conference (IDOC) was held in Abu Dhabi last month. The show had a good turnout (138) and a strong technical program. Schlumberger’s Elie Daher, who heads up the NExT training establishment, defined the digital oilfield as ‘an integrated asset underpinned by automated decision support and proactive asset management.’ For Schlumberger, DecisionPoint can be deployed as the hub of a digital oilfield as illustrated with an example from Petrobras.
SAIC’s Bart Stafford offered a different slant, describing the ‘digital oilfield of the future’ as ‘more than software.’ SAIC has teamed with ESRI on a ‘branded’ digital oilfield offering, ‘DOF 2.0.’ This leverages a combination of SCADA data, collected over a WiMax network and an integrated asset model. For Stafford, while ‘the oilfield runs on data,’ under-investment and lack of standards reduce data quality and complicate access. Today’s reality is that ‘there are few if any broad consistent corporate digital oilfield deployments,’ although there are many pilots. DOF 2.0 leverages a ‘process-based’ model as opposed to the usual ‘application-based’ approach. The underlying visualization, collaboration and workflow tools are now mature enough to make DOF 2.0 a reality. A separate SAIC presentation from Damon Brady described how SAIC’s EngineeringEdge approach to designing and delivering solutions has been used to develop a DOF reference architecture.
Nicolas Kessler reviewed Total’s real time asset management initiatives including a ‘Full Control of Wells’ (FCW) program. This uses control algorithms implemented in the DCS for electric submersible pumps and other optimizations. FCW was first applied on the West African N’Kossa development and is now a standard for Total’s 500 operated wells worldwide. Another Total success was the pipeline management system using a dynamic multi-phase simulation tool developed for the Gulf of Mexico Canyon Express pipeline. This proved an ‘amazing success’ with a 100% increase in liquid handling capacity. On Girassol, a well performance monitoring system feeds field and facility monitoring and notification systems to provide continuous validation of online models.
Field monitoring defines Total’s approach to the digital oilfield. Kessler recommends ‘thinking before you break legacy systems’ and providing comprehensive documentation. ‘Success comes from countless failures, you need to plan for failure and roll out successes quickly.’
Matrikon’s Joel Chacon-Fonseca described an ‘out of the box’ well performance management system used by Dubai Petroleum. The system, co-developed with EnQuest and Eclipse Petroleum Technology includes well surveillance and visualization of gas lift and ESP wells. A successful pilot has now been scaled up to 340 wells with robust IT and training for 50 users. Colored knowledge maps are used to show gas lift rates and well efficiency.
Klaus Muller reviewed Shell E&P’s multi-year well and reservoir management (WRM) program. This is building a ‘smart field’ collaborative working environment (CWE) for WRM monitoring, control and automation. The CWE is extended to field workers with Pixavi’s VisiWear cameras for remote collaboration and situation awareness. WRM’s exception-based surveillance has hiked production by 1%. More from www.idoc-uae.com.
Speaking at the 2010 Palisade Risk Conference in London last month, Statoil’s John Zhao advocated ‘putting more science in cost risk analyses.’ Quantitative analysis allows the measurement of risk probability and the forecast of consequences. According to Zhao, such analysis performed by Wall Street’s ‘quants’ forecast the recent economic crisis but ‘they were ignored by the management generals.’ While Monte Carlo analysis using tools such as Palisade’s @RISK is popular in oil and gas, the simplistic line-item ranging exercise fails to capture large capital project contingencies. Empirical data has shown that many disastrous cost overruns were due to poorly evaluated contingent risks. To show management a complete risk picture, both systemic risks that history shows to be likely and specific risks with ‘discrete’ probabilities need to be considered. The technique is to blend continuous probability distribution functions (PDF) for project cost estimates with discrete PDFs from a project risk register.
Zhao describes current approaches as ‘delinquent’ because they ‘intuitively respond to risks, derive statistics from mathematical models and lack empirical validation and business knowledge.’ This leads to non-credible analyses that are not trusted by management.
Zhao works through a cost risk analysis example to illustrate the effects of various inputs such as the type of probability distribution, correlations and dependencies and historical data. Zhao recommends ‘double triangle’ distributions and a risk register table augmented with discrete MC risk functions and contingencies. Statoil’s approach also handles correlated risks, such as the way field piping hours correlate with scaffold labor hours.
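Zhao’s blended technique can be sketched in a few lines of Monte Carlo. All figures below are invented for illustration (they are not Statoil data), the standard library’s single triangular distribution stands in for the ‘double triangle’ PDFs he recommends, and correlations between risks are omitted for brevity.

```python
import random

def simulate_project_cost(n_trials=20000, seed=42):
    """Blend continuous line-item cost estimates (triangular PDFs)
    with discrete risk-register events, returning P10/P90 totals.
    All figures ($M) are hypothetical."""
    random.seed(seed)
    # Line items: (low, mode, high) cost ranges in $M
    line_items = [(40, 50, 70), (25, 30, 45), (60, 75, 110)]
    # Risk register: (probability of occurrence, impact in $M)
    risk_register = [(0.30, 15), (0.10, 40), (0.05, 90)]
    totals = []
    for _ in range(n_trials):
        # Continuous part: sum of ranged line items
        cost = sum(random.triangular(lo, hi, mode)
                   for lo, mode, hi in line_items)
        # Discrete part: each register risk fires with probability p
        cost += sum(impact for p, impact in risk_register
                    if random.random() < p)
        totals.append(cost)
    totals.sort()
    return totals[int(0.10 * n_trials)], totals[int(0.90 * n_trials)]

p10, p90 = simulate_project_cost()
print(f"P10 ${p10:.0f}M  P90 ${p90:.0f}M")
```

The point of the blend is visible in the output: the discrete events fatten the upper tail in a way that line-item ranging alone cannot, which is exactly the contingency that Zhao says simplistic analyses miss.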
Historical data can hold valuable clues to model building and for reality checks. But ‘the sad fact is that companies are poor at keeping historical cost data’. One analysis suggested project cost in the range $170-230 million at 90% confidence. But calibrating with historical data brought the confidence rating down to 66%.
Risk analysis is a practice that can be very confusing, even for the experts. There is insufficient academic research and very little mature, pragmatic empiricism. Putting more science into risk analyses is a start but there are more unexplored facets to the overall goal of integrating the entire value chain of oil and gas risk assessment.
Zhao’s novel approach also aligns with the AACEi’s1 Recommended Practices (RP 2009) for cost risk analyses. More from www.oilit.com/links/1005_7.
1 Association for the Advancement of Cost Engineering— www.oilit.com/links/1005_8.
A new ‘virtual institute,’ OpenEM.org has been launched following last year’s Berkeley workshop on the future of electromagnetic geophysics in the USA (NSF/EAR-0901046). The web-based institute is a showcase for electromagnetic (EM) applications and R&D in geophysics, including the joint interpretation of EM with seismic and other methods. The community-maintained ‘virtual institute’ aims for an open dialogue and information exchange. Anchor communitarians hail from Schlumberger, USGS, NSF, BP, Western Geco and ExxonMobil—along with a cohort of US institutes and universities (but non US members are welcome). OpenEM.org currently has 190 members. More from, err, www.OpenEM.org!
The 3.1 release of Tibco’s Spotfire business analytics platform claims to make predictive analytics a ‘mainstream feature’ of any business application. Users can perform ‘what-if’ scenarios on demand and find new insights in complex data sets. Ramona Hovey, senior VP products and services with oil patch data provider DrillingInfo Inc. said, ‘Spotfire continues to evolve, putting more power into the hands of our analytics team and allowing us to cut time to decision, while guiding users to strategic insights.’
‘We use Spotfire’s web-based analytics to build responsive, real-time business models for our internal and external user base.’ A new Statistics Services layer allows for the central deployment and execution of statistical scripts written in ‘R.’ R, originally developed at the University of Auckland, New Zealand, has become a flagship free software project and a de-facto standard for statisticians. Spotfire 3.1 also adds integration with SAP, Siebel and Oracle along with a new ‘mashup’ API. More on Spotfire from www.tibco.com and on R from www.oilit.com/links/1005_11.
Petris Technology has released a veritable cornucopia of software enhancements to its PetrisWinds (PW) platform. PW Analytics is a business intelligence tool for energy company performance management and measurement. PWA combines the data integration capability of PetrisWinds Enterprise application adaptors with the WebFocus business intelligence toolset from Information Builders. CEO Jim Pritchett explained, ‘The lifeblood of a company is its business intelligence. PWA lets oil and gas company managers measure and manage performance by viewing trusted data combined from multiple sources into a single dashboard.’
PWA blends realtime technical and financial data into ‘rich’ interactive reports and online dashboards and transforms dispersed data into concise reports to identify patterns and trends.
Another new application is Petris’ Well Lifecycle Manager (WLM), an extension to the PetrisWinds Operations Management Suite. WLM provides access to a well’s entire history—from prognosis and construction through abandonment. Product manager Ed Castillo said, ‘WLM captures large amounts of data from rigsite and office, adds QA and provides knowledge-based decision support for well construction and operations.’ Managed data types include inventory, budget, drilling, operations and maintenance.
Other announcements this month from Petris concern product upgrades to ZEH Montage Professional 3.0, now available on Solaris and Linux. New features include PDF export, transparency, CGM Level 3 support and an enhanced GUI. The 1.7 release of PetrisWinds DrillNET adds new modules for torque and drag for liner cementing and hydraulics for high temperature, high pressure environments. Built on the Microsoft .NET Framework, DrillNET is available in English, Spanish, Russian and Chinese. More from www.petris.com.
Two French geostatistical boutiques, Geovariances and Estimages have opened the Moving-GeoStatistics (M-GS) Consortium for sponsorship. M-GS, an R&D project that targets the oil and gas industry, plans to leverage Estimages’ patented ‘moving geostatistics’ technique for local optimization of geostatistical parameters. It is claimed that the M-GS approach offers more realistic geological models and better assessment of uncertainty. M-GS spatially optimizes structural and computational variables as a set of dependent parameters. The process may be guided by objective or subjective criteria and is claimed to better align geostatistical models with the data.
The Consortium is to develop an automated parameter optimization process, algorithms for interpolation of anisotropy orientations and the implementation of a new ‘mathematical morphology’ approach. The geostatistical theory will be developed in partnership with the Geostatistics Group and the Centre for Mathematical Morphology of Mines ParisTech. M-GS is scheduled to run for two years. More on M-GS from www.oilit.com/links/1005_12, from www.geovariances.com and www.estimages.com.
Caesar Systems has announced the 2010 release of PetroVR with enhancements to reservoir functions, and enhanced validation tracking and resolution.
The 6.5.1 release of Coreworx’ capital project management toolset introduces cost mitigation control and schedule risk management, laying the groundwork for a new ‘interface management solution.’
The 3.3 release of IHS’ Petra geological and engineering toolset sees new modules for directional well planning and decline curve analysis. A new three-panel display offers map, profile and spreadsheet views, and an improved link to Petra 3-D Viz. More from www.oilit.com/links/1005_24.
Data Matters’ ‘dM Rings’ PPDM explorer is now available from the Apple App Store—see www.oilit.com/links/1005_20.
Venezuela-based Egal is looking for partners to market its ‘Sirius’ surface to subsurface geological interpretation toolset. More from www.oilit.com/links/1005_21.
Emerson has rolled out the Roxar Sensor Retrieval System for remote subsea production systems’ sensor replacement. The company is to donate the interface to the industry to ‘protect customers from proprietary systems.’ The remote operating vehicle (ROV) retrieval system can replace a sensor at ‘a fraction of the cost of traditional methods.’
The 201 version of Exprodat’s Team-GIS Segment Analyst, an ArcGIS extension for common risk segment mapping of exploration play fairways, improves usability and adds ‘combo’ probability calculations. More from www.oilit.com/links/1005_22.
Fugro-Rovtech has taken delivery of the first ‘DeepTouch’ ROV pilot training simulator from sister company Fugro General Robotics. The system will be used to evaluate subsea intervention engineering designs and train pilots. The system offers full force-modeled physics simulation for realistic ‘touch and feel’ interactivity.
New Century Software has announced version 4.0 of Gas HCA Analyst. The solution helps gas pipeline operators identify risks (‘high consequence areas’) and develop an integrity management framework. More from www.oilit.com/links/1005_23.
Petrolink recommends OpenOffice to users of its license-free ‘PowerShare’ oilfield file sharing service.
Oracle’s Primavera Inspire for SAP 7.0 now leverages components of SAP’s materials management solution. New capabilities include better stock visibility, e-mail confirmation of scheduling and performance enhancements.
Barco has teamed with Microsoft on a new ‘Envisioning Center’ demonstrator located at Microsoft’s Technology Center, Paris. The facility, which offers a media experience designed to ‘overwhelm’ customers, includes a Galaxy NW-12 three-chip DLP WUXGA 3D stereo projector and XDS Control Center software.
Merrick Systems has teamed with Cognex to incorporate the Cognex DataMan ID into its own RFID-based solutions for oil and gas asset tracking. Cognex’ solution adds Data Matrix 2D barcodes to Merrick’s ‘DynaCap’ life cycle traceability software and ATEX certified Diamond RFID Tags, including new HPHT tags unveiled at the Offshore Technology Conference this month.
Work on a seismic data processing system at Germany’s Fraunhofer Institute for Industrial Mathematics (ITWM) has resulted in a general purpose API for multicore architectures and high performance computing. The Global Address Space Programming Interface (GPI) promises high interconnect speed and scalability—an issue with the incumbent MPI standard for HPC. GPI is available for C, C++ and Fortran under Linux.
GPI, previously known as the Fraunhofer Virtual Machine, underpins Fraunhofer’s PrestackPRO seismic processing toolset, now marketed by startup SharpReflections. GPI has also been used in new production code for angle domain migration of anisotropic and wide azimuth seismics that Fraunhofer has developed for Statoil. This is being extended into a new interactive imaging tool—the Seismic Development and Processing Architecture (SDPA), currently under development by a four company consortium. Fraunhofer’s Franz-Josef Pfreundt told Oil IT Journal—‘GPI has given us a technology advantage over our competitors. Moreover, GPI has the potential to completely replace MPI. The oil and gas market, with its high demand for HPC, is the first vertical to pick up this technology.’ GPI is marketed by Fraunhofer spinoff Scapos. More from www.scapos.com and www.oilit.com/links/1005_10.
Schlumberger Information Solutions has announced the 2010 release of its flagship Petrel upstream geomodeler. Petrel 2010 adds new tools for exploration ‘risk management,’ management of multiple, large seismic volumes, a new modeling-while-interpreting function that automates structural framework building and more. A new ‘Petrel Database’ is claimed to improve scalability and multi-user collaboration. Other novelties include a drilling visualization plug-in, well path modification while drilling and automated design, placement, and completion optimization.
This month also sees the official opening of the Ocean Store—a shop window for Petrel plug-ins from third parties (see also Oil IT Journal November 2009). Plug-ins are rented, not sold, and prices range from $50 to over $50,000 for a one year rental. More from www.oilit.com/links/1005_28.
Chesapeake founder and CEO Aubrey McClendon provided the keynote talk on ‘Shale gas and America’s energy future.’ Shale gas is ‘abundant’ and a ‘superior’ molecular structure means ‘freedom’ from dirty coal and ‘dangerous foreign oil.’ But producers of the new unconventional resource are being thwarted by the powerful coal lobby. McClendon sees ‘100 to 200 years’ of gas supply making for a ‘completely radical’ view of the future. More opposition comes from the chemical companies who say, ‘Not so fast. We don’t want new uses for natural gas.’ It behooves the industry to spread the word about gas in transport and power generation. Gas fits better with the environment, ‘Gas beats coal on CO2, sulfur and particulates. Nobody’s trying to coalify gas!’ McClendon finds it ‘amazing’ that Americans don’t have access to this fuel at the equivalent of $1.25 per gallon.
Richard Nehring, Nehring Associates, was more circumspect about the ‘100 years or more’ of shale gas. Since reserve decline set in during the 1970s, there have been several false dawns for new energy sources, including self-sourced, deep and ultradeep onshore and deepwater gas. These were perceived as offering great potential, but such ‘transitional’ gas resources are now below 5% of the total. Transitional production peaked in 2003 and is currently declining rapidly. Energy policy options depend on ‘the timely and accurate assessment of the potential of the largest shale plays.’ But it might take decades to reduce the uncertainty. Near term implications are clearer—shale gas and CBM already make ‘LNG unnecessary for 20 years.’ In the Q&A Nehring noted a similar uncertainty over coal, ‘Coal is getting deeper and sourer. Easy coal is gone. It is a paradoxical resource that may be too expensive to produce!’
A presentation by Mark Taylor, BP Algeria, showed how a stripped down ‘fit for purpose’ geological model and a selection of application software supports complex multilateral drilling on BP Algeria’s Teguntour onshore gas field. Here complex multilateral wells are drilled in a geomechanically sensitive reservoir. Baker Hughes’ ‘AziTrak’ azimuthal deep reading resistivity tool is used to keep the hole in the 3-5m thick reservoir. EarthVision’s Coviz is the key application and is duplicated on the rig. A RACI Chart1 ensures that ‘everyone knows what they are going to do.’ This single point of accountability replaced ‘geosteering by committee.’ Taylor described CoViz as a ‘godsend,’ bringing functional 3D models to the rig to ‘keep the drillers on board.’
Shell’s Matthew Wolinsky highlighted issues in sedimentological modeling as processes are upscaled from bed to basin. Process-based numerical modeling is a popular research activity, with projects such as the SAFL2 Jurassic Tank and the Delft3D3 model. Models of basins at the macro scale show lobe switching in a delta to be ‘self organizing’ under constant forcing. As models are used to investigate longer time intervals, they become sensitive to initial boundary conditions. Such ‘butterfly effects’ are well understood in weather models, which need constant tweaking with new data. This is not an option for the sedimentologist. Models are also highly compute intensive. Even if the next decade sees a 1,000 fold speedup in computing, it would still take weeks to simulate a millennium.
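The ‘butterfly effect’ Wolinsky invokes is easy to demonstrate with a toy chaotic system; here the logistic map stands in for a basin-scale process model. Two runs whose initial conditions differ by one part in ten billion track each other closely at first, then diverge completely within a few dozen iterations.

```python
def logistic_run(x0, steps, r=4.0):
    """Iterate the chaotic logistic map x -> r*x*(1-x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_run(0.4, 60)
b = logistic_run(0.4 + 1e-10, 60)  # perturb by one part in 10^10
diffs = [abs(x - y) for x, y in zip(a, b)]
# Early on the two runs are indistinguishable; by step ~40 they have
# decorrelated entirely, i.e. the model has 'forgotten' its initial state
print(max(diffs[:10]), max(diffs[40:]))
```

A weather forecaster handles this by continually re-initializing the model with fresh observations; as Wolinsky notes, the sedimentologist modeling a million-year record has no such option.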
Jess Kozman (CLTC) described CO2 capture and storage (CCS) as leveraging the same data set as conventional E&P. But the hope/expectation is that the new industry ‘will do a better job of managing its data.’ ‘Better’ in this context means the provision of a ‘sound’ data set into the public domain, managed over a ‘century plus’ project lifetime. A case study was presented using a PPDM back end and workflow application to deliver field study results to asset teams. The application from 3GIG was originally designed for gas production, but has been re-purposed to track CCS, connecting to other data stores and apps. Kozman noted that Schlumberger is setting up a carbon services group to advise power generators.
Cornell’s Gregory Kirkpatrick presented Matlab-based finite element code, ‘PorousM3’ tuned for CCS. The fully parallelized code can run four million degrees of freedom on an 8 gigabyte node and is being tested on a proposed CCS site near the AES Cayuga power station in Lansing, New York.
Gary Kinsland (Louisiana at Lafayette) described a collaborative effort between computer scientists and geologists which has resulted in an immersive 3D Virtual Reality (3DVR) system combining digital well logs and Shuttle Radar Topography Mission (SRTM) surfaces. A study of the coal and CBM potential of Northern Louisiana involved a VR investigation using over 1,000 logs.
Richard Denne’s (Marathon Oil) paper on microfossil taxonomy traced the subject’s rise in the 20th century with applications in paleoenvironment, sequence stratigraphy, age modeling, and pollution. The advent of computer modeling has distanced researchers from the fossils which are now ‘just a series of data points.’ Denne fears a future penury of micropaleontologists who can generate quality data and that the future will see misidentifications and the loss of local markers and zonation schemes. The answer lies in new methods of digital photography to capture key type specimens. In this context, the Zeiss Universal microscope provides a depth of field sufficient to capture a complete image of a specimen.
Anthony Gary (Utah) reported on use of semantic technologies to enable ‘real-time interaction between specialists and data stores.’ A semantic foundation is under development by the Commission for the Management and Application of Geoscience Information. Utah has started work on an ontology of foraminifera.
A more practical presentation came from Stephen Hasiotis (Kansas), demonstrating the use of multistripe laser triangulation (MLT) to characterize trace fossil morphology and ‘ichnopedologic’ sedimentary fabrics.
Ingelise Schmidt’s (Maersk Oil) presentation gave strong backing to Eliis’ PaleoScan tool to ‘fast-track’ seismic interpretation and modeling. PaleoScan uses an optimization technique to create a ‘continuous geo-model’ using ‘every seismic sample.’
1 Responsible, accountable, consulted, informed—www.oilit.com/links/1005_25.
2 St Anthony Falls Laboratory Jurassic Tank—www.oilit.com/links/1005_26.
Fiatech is on a roll: new members in 2009 included ExxonMobil, Oracle/Primavera, Petronas and Siemens. Current projects of interest to the oil and gas vertical include project planning, a global valve e-catalog, automating equipment interchange, an RFID cookbook, operations and maintenance and the flagship ‘accelerating ISO 15926 deployment Phase II.’ The latter builds on the Camelot infrastructure and the ‘iRing’ interoperability demonstrator, now delivered into the public domain and available for business use. A board of directors mandate says, ‘Make ISO 15926 operational in 2010.’ Looking ahead to 2015, Fiatech plans to ‘institutionalize’ ISO 15926 as a strategic initiative to ‘project Fiatech across the industry.’
Andrew Hall provided an update on a portal called ‘ALIS,’ Woodside’s Asset Lifecycle Information System for engineering data management for owner operators (Oil IT Journal November 2009). ALIS has been a four-year journey for Woodside and a success for application software provider Aveva. Integration has been achieved via a ‘tag’ paradigm across P&ID, CMMS, DMS, GIS, models and training/certification. There are some 400,000 tags in the ALIS system. Woodside is to hand over its engineering characteristics library to POSC/Caesar as a contribution to the ISO 15926 reference data library.
Thore Langeland, integrated operations manager at the Norwegian OLF operators’ trade body, explained the evolving role of the Norwegian E&P Information Management Association (EPIM). EPIM was established to manage common industry solutions such as LicenseWeb, AuthorityWeb and EnvironmentWeb. EPIM also manages the content and format of XML schemas used for drilling and production reporting. From July 2010, EPIM will operate NorHub1, a repository for standard equipment information, a.k.a. ‘EqHub.’ The plan is for EqHub to ‘cover at least 80% of the standard equipment in use in the E&P sector. NorHub is building on ISO 15926.’ OLF has also announced RFID guidelines2.
Emerson’s Duane Toaves provided a working perspective on interoperability and ISO 15926. The data model (Part 2) is now complete, but the common terminology (Part 4), described as an ‘open source Wikipedia-like dictionary,’ is only about 15% complete. Other standards components, the templates (Part 7) and ‘Facades’ (Part 9), are up and running. Work on ISO 15926 began in 1993 and is now ‘moving towards industry-wide deployment’ with some 40 oil and gas projects. Emerson is ‘embracing’ the standard with a multi-purpose tool to browse equipment schemas and a new data transformation server. This will map between Rosemount schemas for P/T transmitters and ISO 15926.
Another engineering contractor, CH2M Hill, is using an ISO 15926-based schema to manage its catalogs, as Renee Lamoureux explained. CH2M Hill leverages Bentley Exchange to read and write ISO-compliant schemas and ‘improve project execution and performance.’
AspenTech’s Andrew McBrien made a passionate argument for ISO 15926, recalling earlier standards work with CAPE Open. Here, production-scale implementation revealed ‘ambiguity, errors and impracticalities’ that drove significant evolution of the standard, which is now embedded in all mainstream process simulators. Another standard, the 1995 pdXi data model, was orphaned by lack of take-up—but resurfaced in the ZYQAD Process Workbench (now Aspen Basic Engineering). AspenTech’s ISO 15926 implementation revolves around XMpLant—described as an easy entry-point with a strong network-effect and a good platform to contribute enhancements to reference data. McBrien concluded, saying, ‘ISO 15926 delivers value today, bake it into your IT strategy!’
A presentation from Fluor on work process technology for mega capital projects introduced IMpart, an engineering information management solution developed for Fluor by Coreworx. IMpart is also known as Coreworx Interface Management. More from www.fiatech.org.
Bangalore, India-based IT services behemoth (95,000 employees) Wipro Technologies is starting to productize its oil and gas SAP-centric services. The first offering is a pre-packaged service for implementers of an SAP-based enterprise information management (EIM) service tuned to the oil and gas vertical. The new service was unveiled by Wipro’s UK unit at the 2010 SAPPHIRE NOW conference in California this month.
Wipro’s EIM service allows consolidation, matching and merging for the purpose of unifying master data and analyzing global spend, products and customer footprints. The solution claims improved data visibility and compliance with multiple regulatory requirements. The EIM service leverages SAP’s NetWeaver master data management component and SAP BusinessObjects data quality management. SAP executive VP Mark From-Poulsen commented, ‘This offering complements our strategy to strengthen the SAP ecosystem and demonstrates co-innovation between SAP and Wipro, a key partner in our ecosystem.’
Wipro’s oil and gas expertise stems in part from work with Shell as a supplier of development services, packaged implementations, consulting, and application support. This includes an interesting proof of concept (POC) demonstrator that involved the migration of Shell’s retail businesses’ B2B order placement and invoice provisioning system to a hosted environment. Wipro leveraged Microsoft’s Azure application and data hosting offering and technology adoption program to allow customers ordering fuels and lubricants to place orders, view order status, and obtain account information from the ‘cloud.’ The POC has demonstrated that Windows Azure ‘can be used to build a B2B integration channel between Shell and its customers.’ However, it is unclear whether Shell is ready to bite the bullet and migrate its tried-and-tested, EDI standards-based B2B infrastructure to Microsoft’s Azure cloud.
Eivind Reiten has been named chairman of AGR Group’s board succeeding Hugo Maurstad who continues as director. Reiten is a former Norwegian Minister of Energy and ex-CEO of Norsk Hydro.
Thor Arne Håland has joined Badger Explorer as manager quality, risk and supply chain.
Deepak Venugopal has joined Belsim as petroleum engineer.
Mamatha Chamarthi has been named CIO of CMS Energy. She hails from Daimler Financial Services.
Energistics has a new CTO, Tracy Terrell, formerly MD of EnterSys Group. He replaces Alan Doniger who is to become principal consultant to Energistics while pursuing other professional opportunities.
Expro Group has named Chris Mawtus as COO. He was previously with Expro’s North America unit.
Gary Yu is Geotrace’s first CTO.
Grid Petroleum has appointed Maarten Middelburg as chief petroleum engineering advisor. Middelburg was previously with Bluewater Energy Services.
Sophie Jullian is Scientific Director of the French Petroleum Institute (IFP), succeeding Philippe Ungerer.
The UK-based oil Industry Technology Facilitator has appointed Brian Mercer, innovation manager at Production Services Network (PSN), to its board of directors.
In the wake of the Deepwater Horizon incident, US Secretary of Interior Salazar intends to restructure the Minerals Management Service (MMS) to establish a separate and independent safety and environmental enforcement entity.
Clare Bond is leaving Midland Valley to take up a research position at the University of Aberdeen. The company is in the process of completing an agency agreement with El Paso-based Keith Cardon to sell its ‘Move’ software to clients in the US.
James Lamb has been promoted to executive VP of sales operations for Paradigm.
PAS has named William Dennison as VP and general manager of Europe and Africa operations. Dennison recently joined PAS from Honeywell.
Chevron’s Henry Posamentier has been awarded the London Geological Society’s William Smith Medal for his work in seismic stratigraphy and geomorphology.
Carlos Albano is to head-up SMT’s new sales and support office in Rio de Janeiro. Albano was formerly with Schlumberger.
RPS Group has appointed Luis Felipe de Paula as country manager of its RPS Consultores do Brasil unit.
Wellpoint Systems has appointed Andrew Bird as senior VP marketing and Alan Hopp as VP professional services. Bird hails from Configuresoft, Hopp from ePartners.
HP has appointed Tom Iannotti, currently MD and senior VP Americas and Enterprise Business at HP, to head up its Enterprise Services unit.
PTC has named Jim Heppelmann CEO, succeeding Richard Harrison, who becomes executive chairman later this year. Heppelmann joined PTC when Windchill Technology was acquired in 1998.
The Society of Petroleum Evaluation Engineers (SPEE) has elected Tim Smith as vice chairman of the UN Expert Group on Resource Classification.
Mike Bowyer is Senergy’s first COO. He was formerly with Halliburton.
SensorTran has opened a Sales and Service Support office in Houston.
SGI has named Philip Chua MD and VP Asia Pacific and Japan (APJ). Chua hails from HP.
Triple Point Technology has named Wah Chu as Chief Customer Officer for the Americas. Simon Woods is CCF Asia Pacific. Woods was formerly with Ikat Capital, Chu with Netkey.
Philip Wade is now operations director, Western hemisphere with Trondheim, Norway-based Verdande Technology.
Erik Rhein-Knudsen is to head up the new Schlumberger/WesternGeco Oslo Technology Center, offering a collaborative environment for clients, technical experts and researchers. The unit houses engineering, testing, integration and support facilities for the organization’s proprietary seismic and electromagnetics acquisition technology.
Energy Ventures, in partnership with National Oilwell Varco and Scottish Enterprise’s Scottish Venture Fund, has invested more than $6 million in Sigma Offshore. The deal marks the third round of funding for the company.
Energy Ventures is also lead investor in the downhole technology service provider Read Well Services, which provides cased hole logging services and hydraulically expandable well construction and tubing/casing repair services. Viking Venture, KLP and the management team of RWS are also involved in the transaction. The investors and management will own 100% of the shares in RWS post-transaction, for a consideration based on an agreed enterprise value of NOK 170 million.
Real-time simulation and training solutions provider GSE Systems has acquired TAS Holdings in a complex transaction including $123,000 cash plus shares and an income-related payment.
Honeywell has acquired Matrikon for approximately $142 million and will integrate the unit into Honeywell Process Solutions, part of its Automation and Control Solutions business group.
Research In Motion is to acquire QNX Software Systems from Harman International.
Thomson Reuters is to acquire Point Carbon A/S, a Norwegian-based provider of trading analytics, news and content for the energy and environmental markets. Point Carbon content will be available through Thomson Reuters Eikon, the company’s new desktop offering to be launched later this year. Terms were not disclosed.
Sergio Morales, design and installations manager with Pemex, vaunts the merits of the AuraPortal Business Process Management Suite (BPMS) in a recent webcast. Pemex has been using AuraPortal for the last four years. The tool supports purchase and procurement, supply chain management, sales and invoicing. AuraPortal displaced an earlier manual system following an evaluation of 40 BPM vendors. Morales appreciates AuraPortal’s configurability without programming and reports that the solution has eliminated non-conformance and that thousands of items are now processed error free. ‘The best thing is the fact that you can easily design it and have the processes you need immediately without IT programming. Previously we needed an expert to make changes.’
Morales rates AuraPortal as, ‘Better than the solutions from the major IT vendors. I give it 9 out of 10 today. It will qualify for a 10 when it gets rolled out across Pemex. AuraPortal provides great support, immediate results and ROI. Our people are happy!’ AuraPortal describes its HQ as a ‘bicephalous entity’ split between Boston and The Netherlands. Its software is developed in Spain. Clients include Coca-Cola, PepsiCo, Frito-Lay, Toyota and Yamaha. More from AuraPortal.com. Watch the video and brush up your Spanish on www.oilit.com/links/1005_13.
A recent paper by Ipcos’ Edwin Weetink, presented at a Petronas-sponsored conference earlier this year, investigates how process control best practices help sustain operational excellence. The paper covers control optimization in ammonia and urea plants but has implications for oil producers and refiners. Process industries need to optimize operations in the face of constantly changing requirements, variations in feedstock quality and operator interventions. Relatively small short term disturbances can have a big impact on longer term operations. Best practices in automated control are reducing the frequency of ‘excursions’ resulting from disturbances or sub-optimal operator intervention. Examples of best practices range from optimized base-layer controls to complete ‘auto-pilots’ for the entire plant.
Weetink’s thesis is that operators’ reactions to alarms and excursions frequently make for suboptimal setpoint changes that drive the process away from optimal, leading to reduced production rates. Control acts at two levels—‘base layer’ control in PID loops providing second-by-second control of flow, temperature and pressure, and ‘advanced process control’ (APC), which acts as an ‘auto-pilot’ for the plant. The full paper is available on www.oilit.com/links/1005_14 (login required).
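The two control levels Weetink describes can be illustrated with a toy base-layer loop. The sketch below is a generic discrete PID controller regulating an invented first-order process, not Ipcos’ implementation; all gains, time constants and the disturbance are made up for illustration.

```python
# Minimal discrete PID 'base layer' loop on a toy first-order process.
# Gains and process constants are invented for the sketch.
def simulate_pid(setpoint=50.0, steps=400, kp=2.0, ki=1.0, kd=0.05, dt=0.1):
    pv, integral, prev_err = 20.0, 0.0, setpoint - 20.0
    for t in range(steps):
        err = setpoint - pv
        integral += err * dt                      # integral action removes offset
        derivative = (err - prev_err) / dt        # derivative action adds damping
        out = kp * err + ki * integral + kd * derivative
        prev_err = err
        # First-order process response, plus a step disturbance mid-run
        # (the kind of 'excursion' the base layer is meant to reject)
        disturbance = -5.0 if t == steps // 2 else 0.0
        pv += (out - pv) * dt + disturbance
    return pv

final = simulate_pid()
print(round(final, 1))  # ≈ 50.0, back at setpoint despite the disturbance
```

APC would sit above a loop like this, moving the setpoint itself to keep the whole plant near its optimum rather than leaving that judgment to operator reaction.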
IBM, Acclimatise and the Carbon Disclosure Project have produced a report on the challenges that climate change will bring. The report, ‘Building Business Resilience to Inevitable Climate Change’ finds that companies do not fully recognize the changing ‘risk landscape’ for oil and gas companies. Companies need to reassess their strategies and business models in the light of increasing temperatures, rainfall, rising sea levels and other climatic changes. Such ‘increasingly severe’ impacts create ‘new and enhanced risks for the oil and gas sector.’ Moreover the current reported value of proved reserves may not be taking into account the full impact of a changing climate. The authors have prepared a ‘Prepare-Adapt’ questionnaire to help oil and gas companies take the right steps towards building corporate resilience to inevitable climate change. More from www.oilit.com/links/1005_14.
Peloton has extended its SiteView well site management application to include emissions tracking and regulatory compliance. SiteView tracks drill site equipment along with operational run and down times. The program creates emission composition profiles and manages priority pollutants and greenhouse gasses. Data can be pushed to Excel or to specific reporting formats to satisfy State and Federal regulations. Loadout, venting and flaring events can also be recorded. More from www.peloton.com.
The new ‘Energy Analytics’ solution from Iconics leverages Microsoft’s Stream-Insight complex event processing (CEP) to assist companies’ energy conservation efforts by offering ‘actionable intelligence.’ Users can reduce energy consumption, aggregate and summarize energy usage by location, and forecast energy costs over time and reduce their carbon footprint. More from www.iconics.com.
RF Code has released a new version of its eponymous wireless environmental monitoring solution. New features include BIRT, an open source Eclipse-based reporting system to turn different types of environmental data into bar charts, scatter grams and pie charts, bookmarks for frequently accessed filters, and on-demand graphing. More from www.rfcode.com.
Petrobras is registering its products and substances under the EU REACH regulations (REACH stands for Registration, Evaluation, Authorization and Restriction of Chemicals). REACH aims to ensure a high level of protection of human health and the environment. The first date on the registration timeline is December 1, 2010.
Genesis Energy has chosen energy trading and risk management (ETRM) solutions provider Allegro Development Corp’s Allegro 8 platform to manage its crude oil gathering and refined products operations.
Aker Solutions has won a four-year, NOK 170 million, detailed engineering contract with Kebabangan Petroleum Operating Company for the Kebabangan Northern Hub development.
AMEC has been selected by Chevron subsidiary Cabinda Gulf Oil to perform front-end engineering design for part of the Mafumeira Sul development project.
Coreworx has announced a partnership with Empired Ltd. to market Coreworx’ software products in Asia-Pacific and to offer application hosting from the Empired data centre.
Sharecat Solutions has secured a contract from China Oilfield Services subsidiary COSL Drilling Europe to support the spares management activity for a project at Yantai Raffles Shipyard.
Petrofac unit Eclipse Petroleum Technology has teamed with Weatherford to further develop and market its PetroAtlas production engineering mentoring and guidance system.
Emerson’s Smart Wireless technology has been chosen by Lion Oil to check product inventory and improve employee safety at its Arkansas facility.
ENGlobal has been selected by Denver, Colorado-based Merchant Energy Partners to provide front-end engineering and design services for Phase One of the 90 million East Cheyenne Gas Storage Project.
The New Zealand Refining Company has selected Energy Solutions International’s PipelineManager software for its leak detection capabilities. ESI has also signed an agreement with Buckeye Partners to license the training simulation application of its PipelineManager software.
Al Shaheen Energy Services and GE Oil & Gas have entered into a strategic partnership to support the continued growth of Qatar’s oil and gas and energy industries and consolidate GE’s presence in the Middle East.
ENI Saipem has chosen Intergraph’s SmartPlant as its worldwide standard.
IPCOS Aptitude has appointed BPC as its partner and agent for the State of Kuwait.
Petredec Services has selected Triple Point to manage commodity trading, enterprise risk, and vessel operations across its global LPG business.
Murphy Oil and an unnamed Calgary-based energy trust have just gone live on the Cortex Trading Partner Network.
3D visualization software and services provider Octaga has committed to provide Technia with strategic support for selling 3D solutions to all vertical markets.
Foster Findlay Associates has signed a 6-year worldwide agreement with Statoil for its AVI software.
Invensys Operations Management has signed a $12.4 million contract to upgrade and modernize a distributed control system for Malaysia Liquefied Natural Gas.
Shell has chosen AT&T to provide an enterprise-wide unified communications service to its 150,000 users in more than 90 countries. The three-year, $90 million initiative for new UC services forms part of an estimated $1.6bn global multiyear strategic sourcing agreement signed in 2008, when Shell selected AT&T to manage a significant part of its IT infrastructure and telecommunications services.
Techno House has entered into a product development agreement with Octaga for integration of Octaga’s 3D visualization platform Octaga Enterprise.
Wood Group subsidiary J P Kenny is to carry out front end engineering design (FEED) for the subsea development of Apache’s Julimar Development Project in the Carnarvon basin, offshore Western Australia.
The Association of International Petroleum Negotiators has approved a Data Exchange Agreement to facilitate trades of seismic and well data. The DEA recognizes host governments’ interests in data and prompts companies to ensure all necessary approvals are sought for data transfers. A model schedule establishes standard terminology to assist with data identification. More from www.aipn.org.
The ISO TC 184/SC4 committee for industrial data standards has launched a preliminary work item on ‘Oil and gas asset management and maintenance operations’ in response to a proposal from the US and Norway, and an ad hoc group to define the scope of any additional standardization requirements in the field of ‘mechatronics.’ More from www.tc184-sc4.org.
Technical Toolboxes has announced the ‘Fitness-for-Service Toolbox’ to help engineers and inspectors check structural integrity against the American Petroleum Institute’s (API) 579 specification. The package also offers new analytical procedures for analysis and remaining life assessment of existing equipment.
Total CIO Philippe Malzac, in an ad appearing on the Energistics website, offers a strong endorsement to the standards body stating, ‘Total is implementing Energistics open data exchange standards to simplify IT architectures and contribute efficiently to the exploration and production of increasingly complex fields.’
The W3C has announced XProc, a new XML Pipeline Language for managing XML-based business processes.
Tracey Thorliefson (Eagle Information Mapping) is writing a noteworthy and entertaining blog offering advice on how to select a standard pipeline data model. This boils down to a choice between the Pipeline Open Data Standard (PODS, www.pods.org), the ArcGIS Pipeline Data Model (APDM, www.apdm.net) and a Spatialized variant of PODS.
As a relational model, PODS is implemented on a Relational Database Management System (RDBMS) platform such as Oracle or Microsoft SQL Server. PODS is also Geographic Information System (GIS) neutral. Advantages of RDBMS technology include relational integrity, ease of access via SQL and data processing with stored procedures. One PODS downside, however, is that its GIS neutrality has led to a multiplicity of vendor solutions for mapping from a PODS database, which can hamper interoperability.
On the other hand, the ArcGIS Pipeline Data Model (APDM) builds on ESRI’s Geodatabase object-relational technology. This facilitates the creation of class hierarchies—useful for describing pipeline-specific features like valves and for creating templates for extending the model in a compliant manner. The downside is that SQL cannot in general be used on the data and relational integrity is not strictly enforced. We eagerly await Thorliefson’s next posting—on the newest of the pipeline data models, PODS Spatial. Follow Tracey on www.oilit.com/links/1005_6.
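The relational integrity and SQL access that Thorliefson credits to the PODS approach can be sketched with a toy two-table schema. The table and column names below are invented for illustration and are far simpler than the real PODS model; SQLite stands in for Oracle or SQL Server.

```python
import sqlite3

# Toy schema in the spirit of a relational pipeline model: a route table
# and a linear-referenced valve table. Names are hypothetical, not PODS.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # enforce relational integrity
con.execute("CREATE TABLE route (route_id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
con.execute("""CREATE TABLE valve (
    valve_id INTEGER PRIMARY KEY,
    route_id INTEGER NOT NULL REFERENCES route(route_id),
    station  REAL NOT NULL)""")  # position along the route, e.g. in km
con.execute("INSERT INTO route VALUES (1, 'Mainline A')")
con.executemany("INSERT INTO valve VALUES (?, ?, ?)",
                [(10, 1, 0.0), (11, 1, 12.5), (12, 1, 30.2)])

# Plain SQL answers 'which valves lie on Mainline A?' with no GIS involved
rows = con.execute("""SELECT v.valve_id, v.station FROM valve v
                      JOIN route r ON r.route_id = v.route_id
                      WHERE r.name = 'Mainline A' ORDER BY v.station""").fetchall()
print(rows)  # [(10, 0.0), (11, 12.5), (12, 30.2)]

# An orphan valve on a non-existent route is rejected: enforced integrity
try:
    con.execute("INSERT INTO valve VALUES (13, 99, 5.0)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

The geodatabase alternative trades this kind of ad hoc SQL for richer feature classes and behaviors managed through the GIS API.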
Swiss-based Sensile Technologies has embedded Telit’s GE865 quad-band GSM modem into a new ‘Netris’ product line of autonomous telemetry units. These can be installed on tanks or connected to existing meters to transmit tank levels over the cellular network to a central server. Customers monitor tank levels from a secure webpage or from their ERP systems to optimize inventory levels. Sensile Technologies currently monitors more than 25,000 tanks across Europe and has systems installed in Africa, Asia, Australia and South America. An operational life of up to 10 years is claimed for the new units. More from www.sensile.com.
Pedigree Technologies has announced ‘Tag-n-Track’ a hosted solution for tracking high-value assets for field service automation. Tag-n-Track is a turnkey, web-based solution that provides location updates for a company’s tracked assets. The application combines satellite tracking data with location and performance information from enterprise assets such as vehicles, stationary equipment and people to provide a unified view of operations.
Pedigree CEO Alex Warner said, ‘We developed this in response to our customers’ need to be able to find mobile equipment and assets in large, open spaces like farms and oil fields. Tag-n-Track shortens the search time from days to minutes.’ Tag-n-Track is the latest of the Pedigree’s OneView applications designed to address operational inefficiencies in plants and equipment-intensive industries.
Ian McPherson, VP marketing at Pedigree added, ‘This solution starts with basic location awareness and extends to condition-based monitoring for sense-and-respond maintenance, logistics and replenishment.’ The application also provides equipment performance monitoring, inventory levels and environmental condition data. More from www.locateandmonitor.com.
Shell has selected Airbiquity to act as integrator for the telematics element of its new ‘FuelSave’ solution for commercial transport fleet operators. FuelSave links customers’ Shell fuel cards, vehicle sensors and a central database and can save up to 10% on fuel. FuelSave also calculates fuel-related CO2 emissions on a per tonne-km basis. Airbiquity will manage key components of the telematics service including installation and wireless provisioning. More from www.oilit.com/links/1005_9.
Invensys’ Operations Management unit has teamed with Byres Security Inc. and MTL Instruments to deliver what is claimed to be a ‘ground-breaking’ cyber security solution. The new Triconex/Tofino OPC firewall will harden industrial safety systems against network accidents and attacks. Invensys embeds OPC servers in its Triconex safety systems to enhance interoperability. To ensure security, Byres’ Modbus content inspection firewall has been added to the system. MTL Instruments builds the new hardware.
Invensys portfolio architect Joe Scalia, said, ‘Processors are continuously threatened by new and increasingly dangerous cyber attacks which requires greater vigilance and security. The OPC firewall mitigates those risks, ensuring that an incursion will not compromise integrated communications between the safety and critical control systems and supervisory HMI or distributed control systems.’
The OPC firewall protects against malicious attacks and other threats to network operations, stopping attacks and traffic storms before they reach the safety and critical control system. It automatically mitigates risks related to previously published DCOM vulnerabilities, while providing packet management and rate limiting to prevent network traffic problems that could have an adverse effect on system stability.
OPC Foundation president Tom Burke added, ‘The next generation of the OPC Foundation interoperability specifications, the OPC Unified Architecture, incorporates similar cyber security protection. Invensys’ solution is an important milestone in demonstrating that users can expose OPC Classic solutions to other applications without worrying about cyber security.’ More from www.oilit.com/links/1005_8.
Invensys Operations Management of Plano, TX has upgraded its ‘IntelaTrac’ mobile workforce and decision support system, which received an enthusiastic endorsement from BP earlier this year. IntelaTrac 4.0 offers mobile workers access to information in Invensys’ Wonderware platform, allowing them to react to evolving plant conditions in real time.
At last year’s Microsoft Global Energy Forum, BP’s Danny Williams described IntelaTrac as a key component of BP’s drive to ‘operational excellence’ through the development and deployment of an operations management system (OMS)1. BP’s OMS, a ‘system of systems,’ provides BP’s workforce with clear expectations as to what needs to be managed and how good is ‘good enough.’ The OMS supports performance metrics, risk mitigation and continuous improvement. BP’s OMS hinges on Microsoft’s SharePoint/PerformancePoint combo—a.k.a. MOSS. Wonderware/IntelaTrac plugs into this infrastructure and enables ‘intelligent workflows’ as opposed to traditional, pre-planned linear processes.
The new IntelaTrac release sees the addition of a dedicated oil and gas module with specialty calculations for tank volumes and pressure chart conversions along with data delivery to hydrocarbon accounting systems such as Tieto’s Energy Components. IntelaTrac 4.0 also enhances ‘operator-driven reliability’, ‘cost-effective regulatory compliance’ and is claimed to mitigate workforce turnover with mobile on-the-job learning capabilities.
BP’s IntelaTrac implementation was delivered by system integrator SAIC. Last year, Royal Dutch Shell’s downstream division deployed IntelaTrac as a component of its ‘ensure safe production’ (ESP) initiative (Oil IT Journal - Dec 2009). More from www.invensys.com.
A new publication1 from IBM’s Global Business Services unit describes the approach taken in its Turnaround (TAR) Optimization Solution (TAR-OS). Turnaround is the process of shutting down part or all of an offshore or onshore facility for maintenance or upgrade. While necessary, turnarounds act as brakes on productivity and profitability—downtime means lost or deferred production and costs associated with turnarounds are high and overruns are common.
TAR-OS uses structured information in enterprise resource planning tools, maintenance management systems (notably IBM’s Maximo) and unstructured engineering information to anticipate problems and to optimize turnaround frequency and duration. The new TAR-OS toolset provides decision support, risk analysis and planning along with an integrated view of the turnaround.
TAR-OS comprises three components: performance monitoring, performance analysis and TAR optimization. The solution embeds techniques such as linear/nonlinear programming and derivative methods. A TAR Key Performance Indicator (KPI) Dashboard tracks turnaround execution with KPIs aligned with the enterprise KPI tree, adding loss factors, availability targets, duration, scope creep, budget, and HSE performance. IBM believes that the Performance Analyzer/Optimizer combo, with its tools for TAR scenario evaluation, lets companies select a program that minimizes production loss, enabling the decision-maker to ‘use more facts and less gut feeling in the process.’ TAR-OS is a component of IBM’s ‘Smarter Oilfields’ initiative. More from www.oilit.com/links/1005_3.
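The turnaround frequency trade-off at the heart of TAR optimization can be illustrated with a toy cost model: annualized turnaround cost falls as the interval lengthens, while the risk of unplanned outages grows. All figures and the linear risk assumption below are invented for the sketch and bear no relation to IBM's actual TAR-OS formulation.

```python
# Toy turnaround-interval optimization: minimize annualized cost.
# Every number here is hypothetical, for illustration only.
def annual_cost(interval_years, tar_days=30, lost_prod_per_day=1.5e6,
                tar_fixed_cost=20e6, risk_cost_per_year=4e6):
    # Lost production during the shutdown plus fixed turnaround cost,
    # spread over the years between turnarounds
    tar_cost = (tar_days * lost_prod_per_day + tar_fixed_cost) / interval_years
    # Unplanned-outage risk assumed to grow linearly with time since
    # the last turnaround
    risk = risk_cost_per_year * interval_years
    return tar_cost + risk

# Grid search over candidate intervals; real tools would use LP/NLP solvers
best = min(range(1, 11), key=annual_cost)
print(best, round(annual_cost(best) / 1e6, 2))  # 4-year interval, $32.25M/yr
```

A real formulation would add scope, scheduling and HSE constraints, which is where the linear/nonlinear programming the article mentions comes in.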
Kongsberg Oil & Gas Technologies has released ‘LedaFlow,’ the fruit of a decade-long collaboration between Total, ConocoPhillips and Norway’s Sintef R&D unit. Kongsberg UK MD Mike Topp said, ‘LedaFlow was developed to fulfill market demand for improved dynamic multiphase flow simulation, particularly in offshore production from complex, deepwater and Arctic environments with multiphase transport and long tie-backs.’
LedaFlow adds fine-grained physico-chemical dynamic modeling of flowline behavior to Kongsberg’s K-Spice dynamic process simulator. K-Spice is an integrated asset modeling solution used to design and monitor the complete facility, from wells, flowlines, subsea processing and risers through onshore and offshore processing to export. LedaFlow models water and oil dispersions and gas bubbles in the liquid phase. It also allows for modeling of three-phase flows with solid particles such as sand and hydrates.
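For a flavor of the quantities such a simulator tracks, the sketch below computes the textbook no-slip liquid holdup and mixture density from superficial phase velocities; this is a first-order, steady-state calculation, vastly simpler than LedaFlow's dynamic model, and the fluid properties are invented.

```python
# No-slip liquid holdup and mixture density: the simplest multiphase
# flow bookkeeping. Input values are illustrative only.
def no_slip_mixture(v_sl, v_sg, rho_l=850.0, rho_g=50.0):
    """v_sl/v_sg: superficial liquid/gas velocities (m/s); rho in kg/m3."""
    holdup = v_sl / (v_sl + v_sg)             # no-slip liquid volume fraction
    rho_mix = holdup * rho_l + (1 - holdup) * rho_g
    return holdup, rho_mix

h, rho = no_slip_mixture(1.0, 3.0)
print(h, rho)  # 0.25 250.0
```

Mechanistic simulators go far beyond this, resolving slip between phases, flow regimes, and transient behavior along the line.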
LedaFlow has been calibrated in extensive field trials at Sintef’s multiphase flow laboratory in Trondheim, Norway. Here, Sintef’s ‘Tilda’ test data set was augmented with new experimental data on novel geometries and configurations. These include up to 12” diameter flowlines operating at 90 bars and different combinations of multi-phase flow of gas, oil, water and particulate matter in transport. Consortium partners ConocoPhillips and Total have also validated LedaFlow against data from their operated fields covering gas/condensate and oil dominated systems and a range of pipe diameters.
Kongsberg’s simulator clients include BP, which used the toolset in an operator training simulator for its Angolan Greater Plutonio development, and Statoil, which has built a dynamic production model for its Snøhvit field in the Barents Sea. In all, 80 Kongsberg simulation solutions have been implemented worldwide. More from www.oilit.com/links/1005_4.