Late again! It smarts to be producing a 2010 issue in the early days of 2011. At least I am not writing about the previous millennium as I am sure I did in 2000. In our defense I would point out that after 16 years of publishing, a week late isn’t too bad. Also as a rather literal person, I feel there is an argument to be made for a ‘December’ issue relating what happened in December. Although this approach is clearly not taken by many magazines that already have their February or even March issues on the news stands!
2010 website stats for www.oilit.com
Publishing late does give me the opportunity of providing definitive stats for the www.oilit.com website for the past year (see above). The stats are provided by our ISP’s Urchin 5 monitor and are raw, unaudited numbers. They include visits from robots as well as real people. But taking them at face value, we had nearly half a million visitors in the year for just over three million hits. But the stat I find most gratifying is the 10 minute average length of session. Taken literally, that means that a total of 5 million minutes were spent on the site. Humanity, including its robot forms, spent collectively nearly ten years reading Oil IT Journal online during 2010!
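For the literal-minded, the back-of-envelope arithmetic checks out. A quick sketch using the rounded figures quoted above (the raw, unaudited numbers):

```python
# Back-of-envelope check on the collective reading-time claim,
# using the rounded figures quoted in the text.
sessions = 500_000          # visits in 2010
minutes = sessions * 10     # 10 minute average session length
years = minutes / (60 * 24 * 365)
print(round(years, 1))      # ~9.5 years
```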
As many of you know, Oil IT Journal’s publisher, The Data Room, has been producing Technology Watch reports from major tradeshows and conferences for the last decade or so. We have elected to discontinue the Technology Watch service in its present form as of year end 2010. Oil IT Journal and its successful www.oilit.com companion website are unaffected by the change, as indeed are our conference attendances. We have a full program planned for 2011 and will be bringing you our usual concise reports from all the major events. In fact we are planning to re-invent the Technology Watch report service later this year with a slimmed-down, more ‘sustainable’ edition highlighting major happenings and technology. If you would like to hear more about this service, which will re-launch mid-year, please send an email to email@example.com.
Looking over the website with a degree of self-satisfaction made me realize that we omitted to thank our 2010/2011 sponsors this year. They are as follows
Belated thanks to these great companies who have demonstrated their commitment to the oil and gas software business with a generous contribution to our running costs, helping to keep the million-plus word public domain archive on stream and make that 10 minutes average reading time worthwhile. More from www.oilit.com.
In a webcast this month, SAP CTO Vishal Sikka unveiled the latest buzz-cum-breakthrough to hit the business intelligence scene, ‘in-memory’ processing of high volume data streams a.k.a. complex event processing. SAP’s High-Performance Analytic Appliance ‘HANA’ is a hardware and software combo that can scale from a few hundred to 1,000 plus cores and beyond, enabling ‘massive parallelism for enterprise applications.’
According to Sikka, the first HANA-enabled applications have demonstrated the ‘disruptive’ nature of the new technology and the ‘unprecedented’ speed of in-memory computing. HANA has also shown up the ‘latency’ of clients’ current IT systems. HANA promises to wipe out layers of IT complexity and goes beyond real time to be a foundation for SAP’s next generation applications in planning, forecasting and ‘big data.’
Delivered on a high end cluster from suppliers such as IBM and HP, HANA exposes terabytes of memory for compute-intense tasks. SAP claims HANA pilots have shown a 1,000 fold speedup for common business scenarios. CEP is of course not new, Microsoft, OSIsoft and others have offerings in this space (OITJ May 2009). Moreover, HANA’s big iron appliance-based approach is not limited to SAP. Oracle, Teradata and Netezza have similar offerings.
An SAP TechED presentation earlier this year featured an application that presages HANA usage in the oil and gas vertical. The SAP Oil & Gas Dashboard is currently built on a NetWeaver, BusinessObjects and Sybase stack. A demo showed real time data streaming from Norwegian oil rigs where around 600,000 sensor events are recorded per minute. ‘Very low’ bandwidth between the rigs and the central office mandated that high volume processing was performed offshore, in real time, with the SAP complex event processing (CEP) stack. The solution has allowed Statoil to drill-down through summary field-level data to pinpoint an underperforming well and fix compressor problems in a timely manner.
Under the hood, SAP’s CEP development environment was configured to track and monitor significant real-time events. Today, this can take a lot of ‘painful programming,’ in SAP’s ‘NetWeaver’ web services-based environment along with the ‘EventInsight’ CEP enterprise portal component.
The HANA concept, as Sikka states, does indeed sound disruptive, heralding a possible retreat from ‘loosely coupled’ sluggish web services-based enterprise systems to hardwired systems with gigabyte per second processing bandwidth. In fact, if HANA is half as good as it sounds, those most ‘disrupted’ may be SAP’s current business intelligence customers—but it will be in a good cause! More from www.sap.com.
Shell has signed an R&D cooperation deal with Schlumberger covering two research projects: reservoir surveillance for enhanced oil recovery (EOR) projects and a ‘digital rock’ (DR) project covering fine-grained numerical modeling of the pore space. Gerald Schotman, Royal Dutch Shell CTO, described the ‘multiyear’ agreement as marking ‘another step in our technology strategy of delivering energy solutions through open innovation. The cooperation will enable us to continuously improve recovery factors and at the same time lower unit costs.’
The deal will see Schlumberger’s formation evaluation and reservoir characterization know-how combined with Shell’s subsurface laboratory and reservoir expertise. The game plan is to develop new tools for field data acquisition, better numerical models and ‘enhanced field development methods.’ DR targets pore scale investigations of sandstone and carbonate reservoirs, building on recent developments in scanning technology, fluid dynamics, modeling, and high-performance computing. The research will be conducted in the US, UK, Russia, Oman and the Netherlands. More from www.slb.com.
Houston-based seismic imaging boutique Panorama Technologies reports a six-fold speed-up using Nvidia’s Tesla GPUs over CPU-only hardware. Panorama’s ‘Merlin’ seismic acquisition design tool simulated a 12 second, wide azimuth shot over a 16 sq. km. block in 12 minutes using a single Tesla board. Panorama also uses Teslas for imaging with its Marvell code base. CEO Chris Bednar told Oil IT Journal, ‘$250,000 worth of hardware can turn around a 30,000 shot deepwater wide azimuth survey in a couple of weeks.’ Code was ported to the Tesla at the compiler level. While CUDA and OpenCL are supposed to hide the complexity of the GPU, ‘optimization is always hard work.’ 25% of Panorama’s compute capacity is now GPU-based. This will double as more Tesla GPUs based on the Fermi architecture are installed. More from www.oilit.com/links/1012_8.
Hard on the heels of Petris’ release of its new Operations Management Suite, Oil IT Journal chatted with CEO Jim Pritchett and project manager Greg Palmer about how the new tool was designed.
JP—We acquired the original Operations Center (OpsCenter) when we bought Production Access back in 2007. OpsCenter is a drilling and production database built on Microsoft Access. It was a functional product but its scope was limited to US units of measure and an English language interface. It was not saleable internationally. But it was unique in the marketplace as the only integrated drilling and production system. OpsCenter has around 45 corporate clients today. We undertook a major revamp and re-wrote the whole thing with .NET, ported the database to SQL Server and made it language and currency independent. This took two years!
JP—Today’s application market is dominated by Halliburton’s TOW,1 which is a great product but one that we believe is nearing the end of its life. We believe the market is ready for a new toolset. Also many, perhaps most, potential customers still use spreadsheets for drilling and production. There is a huge potential value to a company in weaning its engineers from the spreadsheet.
What main technologies have you selected?
JP—Along with the Microsoft stack I just described, we have deployed an extended ‘superset’ of the PPDM data model. We also enable users to define their own workflows using technology we developed in our PetrisWinds Enterprise product. Users can, for instance, connect to SCADA for real-time data, while PetrisWinds Analytics compares real-time data with costs in the ERP system.
How do your clients make sense out of RT data?
JP—Actually we use WITSML objects which provide summary data for, say, 24 hours of drilling parameters. It is not raw SCADA tag data.
How did you adapt to international reporting requirements?
JP—Reporting is a done deal for North America. Internationalization can be done in the framework leveraging the new units of measure functionality and Microsoft’s SQL Server Reporting Services.
GP—We see an evolving market with the global search for shale gas, HPHT2 and harder to find stuff. Operators are driving better information access. Running operations from an Excel spreadsheet is inaccurate and error prone. Users need integration and better data management. There is a push for more collaboration around an open solution. One that works across drilling and production and through planning, execution, operations at both well and asset levels.
Where did you extend PPDM?
GP—The main extensions concern AFE workflows and field modules. We also plan to offer API access real soon now. The offering is available in-house or hosted by our partner data center SunGard. More from www.petris.com.
1 The Oilfield Workstation.
2 High pressure, high temperature.
Philipp Janert’s new book1 ‘Data analysis with open source tools’ (DA), described as a ‘hands-on guide for programmers and data scientists,’ is more than that. Janert sets out to address two common data analysis ‘risk areas.’ One is the use of statistical concepts ‘with a limited understanding of what they really mean.’ The other is the deployment of ‘complicated, expensive black box solutions’ in place of a ‘simple, transparent approach.’
One scene-setting anecdote involves a client whose IT department recommended a cluster-based neural net to analyze product defect data. The solution Janert found was rather more economical—a one-line calculation, not even code! Janert offers some useful advice on educating the customer, observing that ‘few clients are in a good position to ask meaningful questions.’ This inevitably means that the statistician needs to be cognizant of the client’s business and terminology.
This book offers a deep approach to real-world tasks. There is much more text than code. Janert’s contention that ‘statistics is usually equated with a college class that made no sense at all’ will chime with many. His promise to ‘explain what statistics really is’ should excite. He also provides insight into computational issues. For instance, the values of the sine and cosine functions for large x eventually degenerate into effectively random numbers as the limit of a float’s resolution is reached.
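A quick sketch of why this happens (ours, not the book’s): near 10^16, the spacing between adjacent double-precision floats is comparable to π, so the argument passed to sin() is unknowable to within a full cycle and the returned value is essentially arbitrary.

```python
import math

# Spacing between adjacent doubles grows with magnitude.
for x in (1.0, 1e8, 1e16):
    print(x, math.ulp(x))

# Near 1e16, adjacent representable floats are 2.0 apart -- comparable
# to pi -- so sin(x) at that magnitude is effectively a random value in
# [-1, 1]: the true argument is unknowable to within a full cycle.
print(math.sin(1e16))
```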
While the thrust of DA is solving business (rather than scientific) problems, Janert is a polymath with a genuine interest in his subject. This comes across particularly in the chapter on classical statistics, which includes exposés on significance and design of experiment and a fascinating section on the Bayesian and frequentist approaches—particularly apropos for the seismic imaging community, as we learned last month at the SEG’s Albert Tarantola memorial.
Also of interest to the oil and gas community is the section on financial analysis with clear explanations of net present value, risk analysis and opportunity costs.
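To give a flavor of the material, a minimal net present value calculation might look like the following. This is our sketch, not the book’s code; the cash flows and the 10% discount rate are illustrative assumptions.

```python
# Minimal NPV sketch: discount a stream of yearly cash flows at a
# fixed rate. Cash flows and the 10% rate are illustrative only.
def npv(rate, cash_flows):
    """cash_flows[0] occurs now, cash_flows[t] at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# A project costing 100 now and returning 40 per year for three years
# is marginally value-destroying at a 10% discount rate:
project = [-100, 40, 40, 40]
print(round(npv(0.10, project), 2))  # -0.53
```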
Janert’s tools of choice are Python (with NumPy and SciPy) and Unix. On which topic, Janert curiously saves some unequivocal advice for page 494, ‘Work on Unix—I mean it. Unix was developed for precisely the kind of ad-hoc programming [ ] that encourages you to devise solutions.’
It is hard to fault this book except perhaps that at nearly 500 pages, it is too short! The couple of pages on Map/Reduce, for instance, fall way short of Janert’s pedagogical aims. But quibbles apart, DA gets a double thumbs up from this reviewer.
1 O’Reilly, ISBN 9780596802356.
IHS Herold’s Oilfield Services Sector Review sees ‘a healthier outlook’ for 2011 despite the Gulf of Mexico drilling moratorium. The recovery is driven by ‘rising oil prices, unconventional drilling in North America and multiple opportunities offshore worldwide.’ The review compares financial data from major service companies including Baker Hughes, Halliburton and Schlumberger. More from www.ihs.com.
An audit by the Norwegian Petroleum Safety Authority (PSA) of Statoil’s 34/10-C-06A well highlights the ‘increasing risk’ of drilling in the Gullfaks area, with its abnormal pressure régime. Following a series of well control incidents, the well was plugged and temporarily abandoned in July 2010. The audit found ‘serious deficiencies’ in well planning. Non-conformances were identified in the areas of risk management, knowledge of and compliance with governing documents, documentation of decision-making processes and planning of managed pressure drilling operations. Well planning ‘made insufficient use of experience from earlier wells, such as well incidents, pressure measurements and general knowledge in the area.’ More from www.npd.no.
Following the huge success of non-conventional gas production in the US, Cheniere Energy Partners has engaged SG Americas Securities to advise on the development of liquefaction facilities at its Sabine Pass LNG terminal. Meanwhile, ConocoPhillips received the first LNG shipment from its Qatargas 3 project delivered to Canaport terminal in Saint John, New Brunswick, Canada. More from www.cheniere.com.
The Extractive Industries Transparency Initiative reports that Cameroon, Gabon, Kyrgyzstan and Nigeria are ‘close to compliance’ with EITI’s transparency rules. Indonesia and Togo have signed as candidates. To date, 33 countries have ‘started to implement’ EITI transparency standards. More from www.eiti.org.
The news ‘silly season’ was opened this month with an oil price forecast from Seismic Micro Technology. Respondents to SMT’s ‘Geoscientist View of the Future’ survey expect the oil price to ‘close in on $100 per barrel in 2011, and move towards $150 by 2015.’ Perhaps more reliably, 41% reported increased exploration expenditure during 2010 (16% saw a decline). Spend was especially high for organizations engaged in unconventional exploration. Microseismics was ranked as top new technology for 2011. More from www.seismicmicro.com.
Leo King, writing in Computerworld UK1 reports that National Oilwell Varco refused to provide access to its proprietary ‘HiTech’ application claiming this could ‘mislead’ government investigators. Investigator Fred Bartlit wrote to the Oil Spill Commission complaining of ‘a roadblock’ in the investigation claiming NOV was ‘generally uncooperative, either in the form of refusal or delay.’
Access to the package is required to allow investigators to recreate the surveillance data available to the Deepwater Horizon’s crew in the hours prior to the explosion. Investigators note BP has already provided data, and cementing contractor Halliburton is ‘in talks to provide key algorithms for data conversion.’ NOV responded claiming ‘manufacturing guesses as to what was displayed on the rig’s computers runs a serious risk of producing a misleading picture of what actually happened.’ Computerworld also reported that a slide prepared by investigators was displayed briefly on the OSC website indicating that BP had ignored the ‘advice’ of Halliburton’s cement modeling software in an attempt to save time.
A posting2 on the UK Guardian newspaper’s website, citing ‘confidential correspondence’ obtained by the newspaper, described the effects of an oil spill computer modeling exercise performed by Chevron on the West of Shetlands Lagavulin prospect.
The study investigated the effects of a ‘worst-case’ scenario of a 77,000 bopd spill lasting 14 days. The following day Computerworld’s redoubtable Leo King added to the story3 with the information that the Microsoft Windows-based Oil Spill Information System from BMT Argoss crashed repeatedly during modeling, limiting the run time to the 14 day time frame. In an email to the Offshore Inspectorate Chevron said it was working ‘at the boundaries of modeling capability’.
Geotrace’s Data Integration Services unit (a.k.a. Tigress) has announced GeoBrowse 4.0, a major update to its map-based E&P data browser. The new release includes an optional ArcGIS front end (legacy GeoBrowse is MapObjects based), data links to several new interpretation systems and a new deployment model that separates front and back office functions for enhanced IT integration. A secure ‘cloud’ implementation is also now available leveraging either Amazon’s EC2 or an enterprise private cloud based on the open source cloud solution from Eucalyptus Systems.
GeoBrowse 4 also makes extensive use of saved sessions and macros to automate data loading workflows with templates for (e.g.) daily production data. Tigress general manager Stephen Shorey told Oil IT Journal, ‘Usability and integration with existing systems may be old sentiments but they remain key to our philosophy. Users don’t want to replace existing data repositories. We offer support and customization services to improve data usability and security. And a solution for specialist and non-specialists alike – one that works as an exploration aid rather than just another work flow tool.’ More from firstname.lastname@example.org.
Release 202 of Exprodat’s Team-GIS KBridge adds licensing flexibility to the SMT Kingdom to ESRI ArcMap data link – www.oilit.com/links/1012_10.
Wilson Urdaneta has just released a beta of ‘eXtendedSU,’ his new Seismic Un*x GUI – www.oilit.com/links/1012_11.
Reality Mobile’s RealityVision 3.0 video collaboration software now supports Android, allowing users of smartphones from Motorola, Samsung and HTC to contribute to the platform. A Screencast feature lets users turn their computer screen into a live video feed – www.oilit.com/links/1012_13.
Hampson-Russell Suite Version 9 sees a major functional rationalization and introduces pre-defined workflows. A demo video using the Colony sand dataset from Alberta is available from www.oilit.com/links/1012_12.
Merrick Systems and HDD Rotary Sales have demonstrated that Merrick’s High Pressure High Temperature (HPHT) Diamond RFID Tag survives ‘hardbanding,’ a protective coating applied to drill pipe for use in abrasive formations – www.merricksystems.com.
V4.0 of Senergy Software’s Interactive Petrophysics (IP) adds an image log processing and analysis module providing workflows for processing and interpreting wireline and LWD image logs leveraging patented technology – www.senergyworld.com.
Kalido’s Data Governance Director offers data policy management via a centralized policy layer spanning business process, systems and data. A snip at $200,000 - www.kalido.com.
Transzap/Oildex Spendworks 4 claims speedier access to operational data and enhancements derived from a Spendworks usability study. More from www.transzap.com.
P2 Energy Solutions has re-jigged Excalibur V7’s Land and Gas Processing modules leveraging UniData’s SystemBuilder user interface development environment – www.p2es.com.
V 8.3 of Caesar Systems’ PetroVR upstream decision support tool adds functionality in sensitivity tools, scenario builder and validation –
Quorum Business Solutions’ Pipeline Transaction Manager now includes revisions to the NAESB 1.9 standard – www.qbsol.com.
V10 of ExperTune’s PlantTriage now offers real-time trending of process variables on a smart phone giving users access to control-room data while out in the plant. Users can browse reports and dashboards to pinpoint instrument and valve issues or tuning problems – www.planttriage.com.
Emerson Process Management’s Raptor tank gauging system includes instrumentation and inventory management software. Emerson also reports that it has achieved Achilles certification for its Smart Wireless Gateway. Achilles provides independent benchmarks for assessing network security and infrastructure resilience – www.emerson.com.
ERDAS 2011 includes global localization capability, support for Bing Maps imagery and map data, and a new product, ERDAS Engine, an application accelerator for ERDAS IMAGINE and LPS – www.erdas.com.
Expro’s HawkEye IV downhole camera enables an operator to control the lighthead and switch views. A ‘turbo’ mode ups the frame rate to 30 fps. Images are stored in the camera and transmitted by batch to the surface. The higher rates are used to identify fluid type and entry location – www.exprogroup.com.
Energistics’ 2010 Annual Member Meeting was held in Houston last month. President & CEO Randy Clark kicked off the proceedings noting the addition of 20 new member companies including new members in China, Russia and Indonesia. Energistics’ standards effort revolves around the trinity of Witsml, Prodml and the emerging Resqml standards for, respectively, drilling, production and reservoir description. Reporting from a Witsml meet held earlier in the month, Jerry Hubbard noted the extension of the standard to include a ‘StimJob’ object for fracturing reporting, a new error model object and ‘tightened’ schemas for better interoperability.
Prodml V2.1 is scheduled for release during Q2 2011 and includes extensions for wireline formation testing, facility reporting and better documentation. Prodml is being adapted to comply with the new ‘Energyml’ standard, an overarching protocol that is set to subsume all Energistics modeling efforts.
Also new is the Prodml shared asset model (SAM), designed to help maintain a hierarchical organization of assets spanning different geographies, organizations and operations. SAM will provide a cross reference of asset identifiers and a directory of services to retrieve XML data objects via the Generic Data Access (GDA) protocol, one of the first Energyml components that provides a single ‘CRUD1’-style service for all data types. Implemented in Prodml, GDA supports all Witsml and Resqml documents.
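By way of illustration, a single CRUD-style service fronting heterogeneous data objects might be sketched as follows. The class, method and URI names here are our own invention for illustration, not the Energistics GDA API:

```python
# Hypothetical sketch of a single CRUD-style service spanning multiple
# data types, in the spirit of GDA as described above. Names are
# illustrative, not the actual Energistics interface.
class GenericDataAccess:
    def __init__(self):
        self._store = {}  # uri -> XML document (any Witsml/Resqml type)

    def create(self, uri, doc):
        self._store[uri] = doc

    def read(self, uri):
        return self._store.get(uri)

    def update(self, uri, doc):
        if uri not in self._store:
            raise KeyError(uri)
        self._store[uri] = doc

    def delete(self, uri):
        self._store.pop(uri, None)

# One service, many object types: the caller addresses everything by URI.
gda = GenericDataAccess()
gda.create("eml://well/w-1", "<well name='w-1'/>")
print(gda.read("eml://well/w-1"))
```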
Energyml is both an installable XML object and a specification for ‘supportability and interoperability’.
Energyml will be leveraged across all Energistics schemas, services and technical architectures. Energyml is scheduled for release in February 2011. A possible migration from Energistics current Soap to ‘Restful’ bindings is under evaluation.
Resqml V1.0 is on track for publication around year end 2010 and will include most of the functionality of the older Rescue standard, including handling of large data models through the hierarchical data format (HDF5), traceability and georeferencing.
For 2011 Energistics is to expand its website and collaboration centre and plans to work with members and standards organizations to ‘identify synergies and potential avenues for collaboration.’ More from www.energistics.org.
1Create, read, update and delete.
The inaugural SMi Energy Data Storage conference held in London last month was interesting in that solution providers hailed from both upstream oil and gas and utilities. But there was a dearth of end user presentations and something of an excess of sales pitches—albeit of a fairly educational nature. More on SMi Conferences from www.smi-online.co.uk.
Volantice’s Ugur Algan set the scene with a general introduction to E&P data types and the niceties of managing entitlements, documents, GIS and legacy media. Data volumes continue to rise vertiginously. Algan recently came across a single survey of 100TB – and this will be 4D, repeat acquisition. Complicating issues for the upstream data manager include replication, archive and backup, ‘data just multiplies!’ Storage infrastructure is only a small part of the story. The future will see high capacity everything, a likely move to the cloud and better data management with software for de-dupe and replication. But Algan insisted, ‘The issue goes beyond storage, the real problem is data management.’ More from www.volantice.com.
Neil Brown (De Montfort University, UK) has been investigating how ‘intelligent’ corporations are with their energy use. Not very, it seems. We generate lots of energy that is thrown away, as can be seen from a nighttime visit to London’s Canary Wharf business center where most offices remain ablaze with light. Also, around 30-40% of PCs never get switched off. For a large corporation, it easily pays to have a dedicated person go around switching PCs off—or you can get the cleaners to do it. Justifying such action requires ‘longitudinal’ energy data to pinpoint problems. You can use water meter data as a proxy for building occupancy and compare it with energy use. Smart meters also got a mention. More from www.oilit.com/links/1012_6.
Andy McDonald described how Isilon is addressing the ‘petabyte problem.’ Isilon claims traction in energy with over 50 large customers. Seismic processing is a target market as is interpretation and visualization. Isilon’s claims to fame include low sysadmin overhead, commodity components and its own software, in particular the proprietary OneFS file system. McDonald warns against using less performant open source file systems. OneFS exposes a file system ‘like a 10 petabyte USB key!’ Energy clients include Kelman and Spectrum Energy. More from www.isilon.com1.
Sajjad Khazipura described how Wipro has leveraged its experience building data centers for its upstream clients noting that ‘for every dollar spent on data, $8 is spent on storage.’ Wipro’s upstream storage reference architecture (WUSRA) is described as a ‘business process aware storage application.’ WUSRA ‘understands’ seismic processing environments like Omega and ProMax and software including OpenWorks R5000, Petrel, Eclipse and VIP. WUSRA is designed ‘to assure data access across the workflow.’
To address proliferating data volumes (growing at 70-80% per year!) customers are asking for a ‘step change.’ Wipro’s answer is tiering, starting with the high performance storage for the cluster and moving out to lower cost solutions—ultimately to the cloud. The cloud is now a ‘viable option’ according to Wipro which is working with partner Microsoft to leverage its ‘Azure’ solution—although Wipro claims to be supplier agnostic. The idea is to ‘combine business process outsourcing with infrastructure optimization.’ The user just sees a portal. More from www.wipro.com.
NetApp’s Peter Ferri showed off a telling slide of upstream storage clients for its unified approach to multi (storage) vendor data management. The advent of 64 bit Microsoft Windows desktop platforms with the CPU and graphics horsepower necessary to drive data intensive interpretation and modeling applications is impacting networking and storage needs. NetApp’s latest concept is the ‘Petrotechnical Cloud,’ consolidating resources and offering users thin client-based access to 2D and 3D visualization. More from www.netapp.com.
Simon Mitchell (Spectra Logic) put in a stalwart defense of tape storage noting that while storage demands explode, ‘70% of all capacity is misused’—inert, allocated but not used, orphaned etc. Tape is now reliable, very high density and has a smaller footprint than disk. The Ultrium LTO roadmap scales out to 13TB/tape and 470MB/s data rates (both uncompressed). More from www.spectralogic.com.
John Bell outlined EMC’s attempt to catch up with NetApp in the upstream. NetApp had successfully ‘wiped out’ HP—what could EMC do? Client surveys determined pain points such as ‘cumbersome’ applications that run slow, exploding real time data volumes and storage growth. EMC’s answer is ‘fully automated storage tiering’ and multi path file share. This has cut BP’s storage costs by 20% and is used by BG Group in an ‘aggressive global expansion.’ EMC’s technology powers Schlumberger’s new data profiling service that optimizes Petrel, GeoFrame and other upstream data management. More from www.emc.com.
Panasas’ Derek Burke showed off another blue chip client list including BP, Statoil, ConocoPhillips and Aramco. Prestack depth migration testing with Paradigm has shown Panasas’ Parallel NFS to be ‘2.5 times faster’ than vanilla NFS. Similar results are claimed for Landmark’s ProMax. More from www.panasas.com.
Another ‘in praise of tape’ presentation came from Quotium’s Fernando Moreira. Tape is ‘green,’ high performance and offers good lifetime if properly managed. Quotium’s StorSentry runs in parallel to an archival application, collecting quality and performance data and advising on drive and tape changes. Tape is still the most cost effective solution for archives of over a petabyte. For long term storage and compliance, ‘tape remains the only choice.’ As capacity reaches 10TB/tape, real time tape monitoring is critical. More from www.quotium.com.
1 Shortly after the event, Isilon was acquired by EMC.
This article is an abstract of a longer Technology Report produced by Oil IT Journal’s publisher, The Data Room. More from www.oilit.com/tech.
Martin Rayson described how Blue Marble’s Geographic Calculator has been leveraged inside Shell’s geodetic framework (SGF). The SGF was initiated to bring order to geodetic parameter usage across Shell’s application portfolio. Applications tend to use different coordinate transformations and units of measure with little indication of authority. A central geodetic database has been established to feed reliable information to applications and enforce naming conventions. This, the Shell Geodetic Parameter Registry (SGPR), is a proprietary version of the European Petroleum Survey Group’s (EPSG) database. SGPR was developed for Shell by Galdos Systems in 2009.
Shell’s set-up presupposes that vendor applications are capable of consuming such information in an automated fashion, which is not always the case. Where they are not, manual data population is required. This is the situation for many E&P apps including R5000, ArcGIS, PowerExplorer and others. Others, notably Blue Marble’s Geographic Calculator, allow for ‘hardwired’ automated population from the registry.
For the geodetics purist, Noel Zinn of Hydrometronics (Zinn was previously in ExxonMobil’s survey department) described an earth-centered earth-fixed (ECEF) scheme for geodetically rigorous 3D visualization, powered by Blue Marble’s Geographic Calculator. ECEF schemes locate points on the earth’s surface (or anywhere else) with Cartesian coordinates, avoiding the distortion of map projections – so long as you have a 3D visualization system. The advent of ‘Globe’ GIS systems such as ArcGlobe and Google Earth has brought increased attention to ECEF schemes – although neither of these actually deploys one. It is in geoscience visualization that the ECEF scheme comes into its own – especially as we try to achieve ‘plate to pore scale’ visibility.
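The geodetic-to-ECEF conversion itself is standard textbook material. The following minimal sketch uses the published WGS84 ellipsoid constants; it is illustrative only and is not Blue Marble’s or Hydrometronics’ implementation:

```python
import math

# Minimal geodetic-to-ECEF conversion using published WGS84 constants.
A = 6378137.0                # semi-major axis (m)
F = 1 / 298.257223563        # flattening
E2 = F * (2 - F)             # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h=0.0):
    """Return Cartesian (x, y, z) in metres for geodetic lat/lon in degrees."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

# A point on the equator at the prime meridian sits one semi-major
# axis from the earth's centre, along the x axis:
print(geodetic_to_ecef(0.0, 0.0))  # (6378137.0, 0.0, 0.0)
```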
Zinn proposes a ‘revolutionary’ step that uses an ECEF coordinate system to bring a 3D Earth into the visualization environment, maintaining geodetic rigor and eliminating projection distortion. At which point, ‘Each prospect can be worked locally, all projects fit together globally and are suitable for both local and regional studies.’ More on this interesting presentation from www.hydrometronics.com and www.bluemarble.com.
Jim Stolle (TGS-Nopec) advocates somehow capturing well data quality, a.k.a. ‘goodness,’ in the PPDM data model—particularly with regard to spatial data. According to Stolle, the reality is that between 10 and 20% of wellbores are missing from all industry sources. Frequently missing are pilot holes, wellbores to alternate targets and some redrills. Stolle suggests adding quality fields to the data model to indicate inadequate or absent documentation for wellbore definition. Stolle followed up by describing a string of geodetic ‘howlers,’ including a database of surveys from ‘around the world’ which were actually all from the Barnett Shale area, where survey reporting is notoriously unstandardized and inconsistent.
Paul Haines (Noah Consulting) outlined how geotechnical data in a PPDM data store can be blended with financial data from enterprise resource planning applications such as SAP. Many frequently asked questions (by management) can be rather hard to answer with current business systems – e.g. ‘Have we decreased work orders or improved work order turnarounds?’ or again, ‘Have we decreased unplanned downtime or optimized maintenance procedures based on market conditions?’ And in both cases, ‘What is the financial impact?’ Such questions are hard because accounting inhabits a different world from operations – in fact, ‘We may as well be speaking different languages.’
Brian Boulmay (Tibco OpenSpirit) described how the OpenSpirit reference value catalog is used to add ‘out of the box’ conversion of units and coordinates to a multi vendor environment – including PPDM thanks to the new PPDM data connector. OpenSpirit supplies a lot of the extra ‘stuff’ that is required to make PPDM work with applications, GIS and data in general.
Deborah Henderson of DAMA International (the Data Management Association) outlined the benefits of the recent alliance between DAMA and the PPDM Association. These include complementary skill sets and mutualized member services to meet users’ information and data management needs. Members now benefit from reciprocal membership.
Rama Manne’s (Infosys) talk on master data management for E&P included a detailed comparison of solutions from Oracle, Informatica, Landmark, Schlumberger and Petris. Oracle MDM provides strong data integration and a built-in business rules engine and strong reporting but it lacks E&P specificity particularly in terms of its data quality engine. Informatica MDM shares these weaknesses but offers strong ETL capability and good integration with BPM/workflow solutions. Manne went on to compare Landmark’s PowerHub/CDS combo, Schlumberger’s Seabed and Petris’ OMS in rather more detail than we have space for (although we have to report that Petris’ OMS got a tick for PPDM compliance). More on Infosys in oil and gas from www.oilit.com/links/1012_9.
These and other presentations are available for download on www.ppdm.org.
Dave McCurdy has been named president and CEO of the American Gas Association, succeeding Dave Parker.
Andrew Way is VP services for GE Oil & Gas.
AspenTech has appointed John Hague Senior VP and MD for MENA.
Bob Austin is president of the Geospatial Information and Technology Association (GITA). Libby Hanna and Dana Wood have also joined the staff.
CGGVeritas has appointed Ron Smaniotto US Sales Manager at its Hampson-Russell Software and Services division.
John Lindsay has been named Executive VP and COO of Helmerich and Payne.
Maryanne Maldonado has been promoted to VP and MD of ‘energy acceleration’ at the Houston Technology Center.
Gary Kohrt has joined Iconics as VP Marketing.
IDS has appointed Shannon Cameron as sales executive for North America.
French Petroleum Institute (now ‘IFP Energies Nouvelles’) president Olivier Appert has been re-elected VP of the European Zero Emissions Platform, an industry grouping for the development of Carbon Capture Storage in Europe.
Ikon Science has joined CDA as Associate Member. Richard Swarbrick has been appointed to the company board and is now global director geopressure. Swarbrick founded GeoPressure Technology, acquired by Ikon in 2006.
POSC Caesar Association has TechInvestLab as a new member.
Roger Read has joined Morgan Keegan as a senior equity research analyst following the oil services industry. He was formerly with Natixis Bleichroeder in Houston.
Gareth Johnson and Steven Clellend have re-joined Midland Valley.
John Hanko has rejoined Ryder Scott as a geologist. Marylena Garcia has joined the company as a petroleum engineer. She hails from Conoco Phillips.
Shell has appointed Dirk Smit, VP of exploration technologies, as chief scientist for geophysics; John Karanikas, chief subsurface engineer in unconventional oil, as chief scientist for reservoir engineering; and Vianney Koelman, team leader of in-well technology, as chief scientist for petrophysics.
Duncan Irving has joined Teradata as EMEA industry consultant for oil and gas. He was previously at the University of Manchester.
TerraSpark Geosciences has appointed Karen Sherlock director of product management. Sherlock hails from BHP Billiton.
Weatherford International has promoted Peter Fontana to Senior VP and COO. Fontana was previously with The Western Company of North America.
John Cusick is now Senior VP, energy sector equity research with Wunderlich Securities. Cusick was previously with Oppenheimer & Co.
Ray Leonard, president and CEO of Hyperdynamics Corp., has been named to the University of Arizona’s Geosciences Advisory Board for a five-year term.
Matrix Service Co. has promoted Kevin Cavanah to VP finance and CFO.
Jim Sledzik, President of Energy Ventures and formerly of WesternGeco, is to join Wireless Seismic’s board of directors and Gary Jones is chairman of the board.
Atos Origin has acquired Siemens’ IT Solutions and Services unit for €850 million. The combined operations’ 78,500 employees should generate €8.7 billion revenues worldwide for 2010.
Acorn Energy has sold its Coreworx unit to its employees and private investors including Golden Triangle Angelnet.
EMC Corp. has acquired Isilon Systems, now a division of EMC’s Information Infrastructure Products business. Isilon founder, president and CEO Sujal Patel reports to EMC’s Pat Gelsinger.
GE is to acquire Wellstream Holdings for $1.3 billion cash.
Energy Ventures has partnered with Chesapeake Energy in a $19.5 million investment in Wireless Seismic.
Fugro is to acquire Riise Underwater Engineering which will be renamed Fugro-RUE.
Geovariances has acquired the Australian geostatistical resource evaluation and consulting company Tenzing, setting up a Geovariances office in Brisbane.
Halliburton has paid the Nigerian Government $32.5 million in regard to improper payments to government officials by former subsidiary KBR.
Idox Group has acquired McLaren Software.
McGraw-Hill’s Platts unit is to buy energy market analyst Bentek Energy. Platts is also acquiring the Oil Price Information Service from United Communications Group.
Siemens has received authorization to engage in banking operations. Siemens Bank will support sales at the company’s operating units with loans and guarantees.
A Statoil presentation at the Calsep1 User Group meeting this month described how the company is unifying its multi-phase meter allocation solutions. Multi-phase meters are used to determine gas, oil and water flow rates upstream of the separator, but their use requires a significant calibration effort. This is achieved through simulation of phase densities and volume conversion factors. Statoil had three different MPM solutions working on different fields and decided to unify these under a single PVTsim-based DLL2 with different entry points for each field. The DLL was developed as a VB.NET plug-in to Excel. Statoil’s unified MPM solution is now being tested and has been extended for use on another field. The DLL has also been used to run PVTsim under control from Honeywell’s UniSim.
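The ‘single library, per-field entry point’ pattern Statoil describes can be sketched in a few lines. Everything below is invented for illustration – field names, property values and the stub calculations; the real solution wraps Calsep’s PVTsim, not these stubs:

```python
# Hypothetical sketch: one shared library, one characterization per field.

def _pvt_field_a(p_bar, t_k):
    # A real entry point would call a field-specific PVTsim characterization.
    return {"gas_density": 0.8 * p_bar / t_k}

def _pvt_field_b(p_bar, t_k):
    return {"gas_density": 0.9 * p_bar / t_k}

# Registry mapping field name to its entry point - adding a field
# (as Statoil did when extending the DLL) means adding one entry here.
FIELD_ENTRY_POINTS = {
    "field_a": _pvt_field_a,
    "field_b": _pvt_field_b,
}

def mpm_properties(field, pressure_bar, temperature_k):
    """Single public entry point dispatching to a field-specific calculation."""
    try:
        return FIELD_ENTRY_POINTS[field](pressure_bar, temperature_k)
    except KeyError:
        raise ValueError(f"no PVT characterization for field {field!r}")
```

The attraction of this design is that callers (Excel, UniSim or anything else) see one stable interface while per-field calibration detail stays behind the registry.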
Calsep has embarked on a major revamp to its PVT3sim flagship. The new PVTsim ‘Nova’ release is being developed to assure continuity for PVTsim. PVTsim allows reservoir engineers, flow assurance specialists and process engineers ‘to combine reliable fluid characterization procedures with robust and efficient regression algorithms to match fluid properties and experimental data.’ Data is then available for use in reservoir, pipeline and process simulators. Originally released in 1988, PVTsim is the PVT engine inside SPT Group’s ‘Olga’ dynamic flow simulator.
Nova adds new functionalities and models and will be easier to maintain and extend with new database content thanks to a code rewrite by ‘professional’ programmers and modern tools. Nova targets enhanced oil recovery, flow assurance and new thermodynamic models. An ‘open structure’ improves database communication and links from 3rd party software. Nova also allows clients to add their own algorithms, simplifies reporting with links to Excel and Word and adds industry standards reporting and compliance.
For more User Group presentations, including a flow assurance and wax deposition study on an offshore Norwegian field, on hydrate inhibitors and on the development of an equation of state model for a gas injection EOR project on KOC’s Sabriya field, visit www.calsep.com.
1 CALculation of SEParation processes.
2 Dynamically linked library.
3 Pressure, Volume, Temperature.
Prof. Rita Marcella, Dean of the Aberdeen Business School, speaking at the Opito1 safety and competency conference last month, stated that the prime driver behind health, safety and emergency response training was the risk of a major accident. Marcella questioned the efficacy of much of today’s training. Anecdotal evidence suggests that training standards are not always high and there is a need for greater personal responsibility for training, more realistic ER simulation and improved safety leadership. There are significant differences in the quality of training delivered around the world. To fix this, Marcella advocates a common, global standard to provide ‘consistency across all international locations, resulting in improved quality of training and more capable emergency response personnel.’ Such a standard would be a benchmark for organizations to assess the requirements of jobs and resources. The problem, though, is not a lack of HSE standards, rather that there are too many that impact an oil and gas company. Marcella believes that by working with the standards bodies, with industry and government, a global standards framework is achievable. This would ultimately benefit all with better workforce mobility, reduced training costs and the assurance of best-practice emergency response from better standards and information and technology sharing at times of crisis. More from www.opito.com.
1 Opito is an employer and trade union led organization providing skills, training and workforce development to the oil and gas industry.
The Houston Advanced Research Center (HARC) is seeking participants in an initiative to promote technologies for ‘low-impact’ oil and gas drilling. The environmentally friendly drilling (EFD) systems scorecard sets out to minimize the impact of drilling in sensitive areas.
EFD program manager Rich Haut said, ‘The goal is a common methodology that can be used across the USA to document the environmental and societal tradeoffs associated with energy development. Land owners, regulators and the general public can use the scorecard to objectively assess operators’ performance. Operators can compare their own operations with industry best practices.’ More from www.harc.edu.
Object Reservoir reports that Chesapeake, Southwestern, Seneca and Talisman have joined its Marcellus Shale joint industry project, bringing the number of partners to ten. The JIP sets out to establish best practices for shale well spacing and stimulation and remains open to new members for a limited time. More from www.objectreservoir.com.
Virtalis is to supply a Highly Immersive Visualization Environment (HIVE) to the University of the Western Cape, South Africa. Funding for the unit is provided by the University and BP. More from www.virtalis.com.
Epsis is to deliver a system for monitoring drilling operations to Petrolia for its drilling operations in Africa. More from www.epsis.no.
Allegro Development Corp.’s Allegro 8 platform has been selected by National Grid to manage its power, natural gas, LNG, liquids and renewables trading operations in the US. More from www.allegrodev.com.
CGGVeritas has created a joint venture with Petrovietnam Technical Services Corp.(PTSC) for 2D and 3D marine seismic operations in Vietnam. More from www.cggveritas.com.
Chevron Indonesia has awarded the contract for its Gendalo-Gehem natural gas development to Technip Indonesia, Worley Parsons Indonesia and Singgar Mulia.
CNL Software is the latest company to join MatrikonOPC’s Global Partner Network. CNL will integrate Matrikon’s range of OPC Servers into its IPSecurityCenter – a software-based control system integration platform. More from www.cnlsoftware.com and www.matrikonopc.com.
Cortex Business Solutions and Powervision have signed a new hub project with a ‘mid-sized natural gas-focused Canadian energy corporation based in Calgary’ to implement the Powervision Workflow Management software and automate 80% of each company’s invoicing process by on-boarding the hub’s suppliers onto the Cortex Trading Partner Network. More from www.cortex.net and www.powervision.com.
Petrobras has selected Emerson Process Management to provide process automation technologies and services for the Petrochemical Complex of Rio de Janeiro (Comperj) in Brazil. More from www.emerson.com.
FMC Technologies has signed a memorandum of understanding with Petrobras to develop future subsea technology solutions for its oil and gas projects offshore Brazil, for both pre-salt and mature oil and gas fields. FMC has also signed an agreement with Norske Shell for the manufacture and supply of subsea production equipment for the Ormen Lange development project in the North Sea for approximately $95 million, and won a $75 million contract with Statoil for the Vigdis North-East development. More from www.fmctechnologies.com.
Halliburton has been awarded a contract by ConocoPhillips for directional drilling, logging-while-drilling (LWD) and surface data logging (SDL) services to help develop the high temperature Jasmine discovery in the central North Sea. More from www.halliburton.com.
Hyperion Systems Engineering has delivered a Fluid Catalytic Cracker Unit operator training simulator to Preem Petroleum’s Lysekil refinery in Sweden. More from www.hyperion.com.
Hess has selected a suite of products and services from KSS Fuels to provide day-to-day fuel price management and optimization. Hess will utilize PriceNet’s pricing performance management and KSS Fuels consulting services. More from www.kssfuels.com.
WhiteStar Corp has named LandWorks as a reseller of the WhiteStar Exploration Cube product line. LandWorks will bundle the WhiteStar map products with its LandWorks GIS software. More from www.whitestar.com.
High accuracy GPS service provider Leica SmartNet Bulgaria has launched a partnership with IPOS to operate a control center and Metrisys (a Leica Geosystems distributor) for user support. More from www.leica-geosystems.com.
Sercel has sold a Unite cable-free acquisition system to Paragon Geophysical Services. The telemetry-based Unite system is used for large-channel-count surveys in challenging environments. By year end 2010, a total of 4,000 channels will have been delivered. More from www.sercel.com.
The US Pipeline and Hazardous Materials Safety Administration (PHMSA) has completed the migration of the 510,917-mile national pipeline mapping system (NPMS) to a database based on the Pipeline Open Data Standards organization’s data model, PODS. Using PODS means that the PHMSA can now track attributes in a ‘pipe-centric’ manner and view changes to a pipe segment throughout time. Operator performance can be separated from pipeline performance and changes in submissions identified. The results can be viewed on www.npms.phmsa.dot.gov.
The Open Group has announced a service-oriented ontology standard (SOOS) leveraging the W3C’s Web Ontology Language (OWL). SOOS targets business users and developers with a 90 page how-to guide for SOA and OWL deployment. More from www.theopengroup.org.
Laurent Liscia, Executive Director of the Oasis standards body, in his 2010 roundup reflects on ‘pushback’ from the ‘de jure’ European Standards Organization with regard to the validity of industry-led consortia such as Oasis. Fortunately, Oasis’ ‘indefatigable diplomacy’ has meant that the organization is now regarded by the European Commission in the same light as the W3C and IETF, and it is expected that upcoming procurement legislation will reflect such esteem, allowing Oasis standards to be referenced in public procurement. More from www.oasis.org.
ExxonMobil has expanded its LaBarge, Wyoming, carbon capture facility which now has a 365 million cu. ft./day capacity – equivalent to taking 1.5 million cars off the road. CO2 is captured from natural gas production and made available for enhanced oil recovery and other industrial users.
Qatari RasGas has deployed emission reduction technology supplied by GE at its Ras Laffan LNG Complex. GE’s dry low NOx (DLN) combustion technology is being used to reduce gas turbine emissions. RasGas MD Hamad Rashid Al Mohannadi said, ‘This retrofit on six gas turbines has halved NOx emissions. We are now retrofitting the rest of our gas turbines in compliance with the State environmental regulations.’ More from www.ge.com.
A report from the Carbon Sequestration Leadership Forum (CSLF) indicates ‘significant’ international progress on ‘advancing’ carbon capture and storage (CCS). However challenges remain, notably the ‘sheer scale’ of creating a CO2 transmission system in populated areas. There are 32 active or completed CCS projects and ‘significant’ investment in technology. More from www.cslforum.org.
Computer Sciences Corp. (CSC) has launched a ‘carbon managed service’ based on SAP’s ‘carbon impact on-demand’ solution. The service helps companies develop a ‘proactive and profitable’ carbon management process, capturing emissions and establishing an accurate and auditable carbon footprint. CSC also helps develop mitigation strategies. The carbon managed service is a component of CSC’s compliance and sustainability portfolio, delivered from CSC’s ‘trusted cloud.’ More from www.csc.com.
Petroleum Development Oman, a joint venture between the government of Oman, Shell, Total and Partex, is to use technology from UK-based Z+F UK and VRcontext of Belgium to ensure that its 3D CAD asset models are aligned with current ‘as-is’ plant data. Current plant data captured by on-site laser scan surveys is processed with Z+F’s LFM Server prior to integration with VRcontext’s ‘Walkinside’ virtual reality 3D model.
On-site laser scan surveys generate massive datasets that need to be reconciled with 3D CAD from engineering contractors. The Walkinside-LFM Server is used to blend ‘point cloud’ data with the 3D CAD model and highlight ‘clashes’ in proposed plant revamps. Pipes can then be re-routed around obstacles. Savings of ‘tens of millions’ of dollars in project execution are claimed, by preventing or minimizing unnecessary rework and on-site reassembly, and by avoiding associated plant shut-downs and lost revenues from asset operation.
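The ‘clash’ check described above reduces, in essence, to flagging scanned points that fall within a clearance radius of a proposed pipe run. A toy sketch of that geometry (real point clouds hold billions of points and rely on spatial indexing; all names and numbers here are illustrative):

```python
import math

def dist_point_segment(p, a, b):
    """Distance from 3D point p to line segment a-b (all tuples, metres)."""
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    # parameter of the closest point on the segment, clamped to [0, 1]
    t = max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3))
                          / sum(c * c for c in ab)))
    closest = tuple(a[i] + t * ab[i] for i in range(3))
    return math.dist(p, closest)

def clashes(points, run_start, run_end, clearance=0.5):
    """Return scanned points closer than `clearance` to the proposed run."""
    return [p for p in points
            if dist_point_segment(p, run_start, run_end) < clearance]
```

For example, a scan point 0.1 m from a proposed run clashes at a 0.5 m clearance, while one 5 m away does not – which is exactly the decision that drives re-routing.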
The Walkinside-LFM Server lets engineers adjust the displayed resolution of the 3D point cloud in real time to optimize data integration. New 3D CAD models can be generated from the point clouds using Z+F’s LFM Modeler and used as the starting point of new projects. The new CAD models support PDO’s transition from 2D to 3D design. Walkinside allows multiple stakeholders to work from the same time-stamped reference model that will evolve throughout the lifetime of the facility. More from www.zf-uk.com and www.vrcontext.com.
Attendees at the 2010 American Petroleum Institute Information Security Conference held in Houston last month heard from a succession of IT security vendors bent on putting the fear of, if not God, then of the hacker, terrorist, disgruntled and/or careless employee and other sources of IT insecurity into them. While the need for a security crackdown is clear, the debate is also opening up to embrace the emerging field of social networking inside the organization, bringing a boatload of new potential security risks.
Kevin Cearlock of the FBI’s Houston division reported that the focus of foreign intelligence has shifted from military secrets to critical technology and economic information. While China is the most aggressive country conducting espionage against US interests, political and military ‘allies’ are as active in technology/economic collection as the US’ traditional adversaries.
Economic espionage tradecraft works through visitors, trade delegations, joint ventures and traditional espionage techniques such as intercepts, hidden cameras, ‘dumpster diving’ and casual ‘overhear.’ The oil and gas industry is at risk of attack and there is no magical appliance or software that can guarantee protection. Many networks are misconfigured, easing penetration.
Anti-virus alerts, intrusion detection systems (IDS) and network log analysis all help, but no anti-virus program has every signature, IDS rules are usually too relaxed and administrators are inundated with false positives. Logging is usually turned off and, in any case, logs are rarely checked.
A big risk comes from the innocent user, a potential victim of phishing, malicious websites, trojans and application vulnerabilities such as SQL injection, buffer overflows and unpatched web servers. Both foreign intelligence services and economic adversaries may try to gain a foothold in critical infrastructure for strategic and tactical military advantage. Others may be trying to ‘exfiltrate’ bid data and other information on the quantity, value and location of oil discoveries, news briefings, internal reports and business data.
Mitigation revolves around user awareness and a good understanding of your network and traffic. Penetration tests can expose vulnerabilities. Networks should be segmented to isolate sensitive servers, which should be monitored closely under a ‘crown jewels’ policy. Strong passwords and two-factor authentication are a must, and unnecessary services and accounts should be removed. More from www.api.org.
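The SQL injection vulnerability cited above is worth a concrete illustration: parameterized queries keep user input out of the SQL text entirely. A minimal sketch (table and column names are invented for the example):

```python
import sqlite3

# Build a throwaway in-memory database with one row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wells (name TEXT, operator TEXT)")
conn.execute("INSERT INTO wells VALUES ('A-1', 'Acme')")

user_input = "A-1' OR '1'='1"   # a classic injection attempt

# Unsafe: string formatting lets the attacker's OR clause into the SQL,
# so the query matches every row in the table.
unsafe = conn.execute(
    f"SELECT * FROM wells WHERE name = '{user_input}'").fetchall()

# Safe: the ? placeholder treats the whole input as a literal value,
# so the query matches nothing.
safe = conn.execute(
    "SELECT * FROM wells WHERE name = ?", (user_input,)).fetchall()

print(len(unsafe), len(safe))   # prints: 1 0
```

The same placeholder discipline applies to any database driver; string-built SQL is never safe against hostile input.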
Empresa Nacional del Petróleo (ENAP), Chile’s national oil company, has selected AuraPortal’s business process management suite (BPMS) to support its fuel refining and distribution business throughout Chile. ENAP operates three refineries with a total capacity of 230,000 barrels per day and a network of oil and gas pipelines.
AuraPortal deployment was undertaken by partner BPMConsultancy, whose MD Jaime Krumel said, ‘ENAP will begin with the automation of its logistics management process for sea fuel transport, and targets improved management of the entire operating cycle of fuel transportation, in both national coastal trade and imports-exports.’ The ENAP sale follows on from a success with Mexico’s Pemex NOC (OITJ May 2010), an AuraPortal user since 2006.
Through its Sipetrol unit, created in 1990, ENAP now has operations in Peru, Ecuador, Colombia, Yemen, Iran and Egypt. More from www.auraportal.com.
San Diego-headquartered On-Ramp Wireless has teamed with Croatia-based Koncar INEM on a pilot project to deliver a wide-area wireless condition monitoring solution. Koncar will embed On-Ramp’s Ultra-Link Processing (ULP) technology into its pipeline monitoring offering. On-Ramp’s ULP system combines a pressure sensor with a Modbus gateway and claims a significant improvement in range over current technologies. ULP networks already cover large industrial campuses with ‘minimal infrastructure.’ Pressure sensors can be monitored up to 10 km distant from wireless access points.
Koncar INEM CEO Željko Tukša said, ‘We are one of the first companies to deploy ULP technology for condition monitoring. We look forward to at-scale delivery to the oil and gas and other process industries trying to address remote operations and improve safety.’
On-Ramp’s ULP claims extended battery life and ATEX certification for operation in explosive environments.
On-Ramp CEO Joaquin Silva added, ‘This pilot project is a first for our technology – delivering monitoring performance at low cost. We have already broken the adoption barrier for wireless device monitoring and condition monitoring is a key next step for our company.’
Operating in un-licensed spectrum, ULP uses signal processing to identify weak signals in high noise environments. More from www.onrampwireless.com and www.oilit.com/links/1012_2.
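The general idea behind recovering weak signals from noise is correlation against a known signature: even when each received sample is dominated by noise, the correlation peak at the right alignment stands far above the background. A toy illustration (On-Ramp’s actual processing is proprietary and far more sophisticated; the numbers here are invented):

```python
import random

random.seed(42)

# A known +/-1 'chip' sequence the receiver is listening for.
CODE_LEN = 256
code = [1 if random.random() > 0.5 else -1 for _ in range(CODE_LEN)]

# Received samples: Gaussian noise (std 2) with a weak copy of the code
# (amplitude 1, i.e. below the noise floor) buried at a hidden offset.
offset, n_samples = 100, 512
rx = [random.gauss(0.0, 2.0) for _ in range(n_samples)]
for i, c in enumerate(code):
    rx[offset + i] += c

# Correlate the received samples against the code at every candidate lag;
# the true alignment sums 256 matched chips, noise lags average to ~0.
scores = [sum(rx[lag + i] * code[i] for i in range(CODE_LEN))
          for lag in range(n_samples - CODE_LEN)]
best = max(range(len(scores)), key=scores.__getitem__)
print(best)  # the correlation peak recovers the hidden offset
```

The per-sample signal-to-noise ratio here is about -6 dB, yet the processing gain of the 256-chip correlation pulls the alignment out cleanly, which is the essence of operating below the noise in unlicensed spectrum.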
Netherlands-based operator Liander has commissioned the TNO R&D organization to develop a sensor network for condition monitoring of its gas pipelines. Program manager John Weda explained, ‘We are constantly seeking new solutions to manage our gas and electricity networks and predict where problems might arise. In the coming decades we plan to replace gas pipelines that are less resistant to subsidence. Sensor data and computer models will improve prediction and help prioritize intervention.’
TNO’s ‘STOOP-IJknet1’ will extend Liander’s current systems and risk models. TNO’s previous experience with sensor monitoring of dykes and viaducts has shown that it is important to monitor not only the pipes themselves but also the ground in which they are buried. The network monitors settlement, vibrations and groundwater movement. These all feed into computer models along with materials, corrosion and geological data. Liander is currently working with TNO on a proof of concept test. Operators Deltares and Kiwa Gas Technology are also involved in the IJknet project.
TNO project lead Wim van der Poel added, ‘Our knowledge of IT, predictive models, fracture mechanics, risk management, sensor technology and geology is highly applicable in this context. All the pipelines are subject to risks. By measuring in real time and processing the data straightaway, we believe we can make very accurate predictions.’ More from www.tno.nl.
1 STOOP is both the Dutch acronym for ‘Sensor technology applied to underground pipeline infrastructures’ and a reference to TNO researcher Ben Stoop who died last year.
Engineering design and information management specialist Aveva is to offer its clients the choice of Microsoft’s Windows Communication Foundation (WCF) as a component of its Aveva Global workshare and collaboration environment. WCF will be leveraged to enable global interaction between partners, suppliers and contractors. The Foundation is claimed to improve project manageability, reduce cost and risk and shorten timescales.
The WCF protocol provides a secure platform for data exchange and can be configured to a client’s individual security requirements. WCF provides security options such as authentication, encryption and a selection of suitable transport mechanisms to assure project information is shared securely.
Aveva’s Thierry Vermeersch said, ‘Effective collaboration is dependent on keeping the system agile, by transmitting only changes rather than full datasets. This mandates an equally agile security strategy that is flexible enough to meet the changing needs of customers’ evolving environments. We have listened to our customers and included the flexible security of WCF in Aveva Global.’ More from www.aveva.com and www.microsoft.com.