March 2008


Shell’s VoIP infrastructure

Microsoft and Nortel have been testing wall-to-wall IP-based communications for Royal Dutch Shell. A move to Microsoft’s 2007 communications toolset sees scale-up to ‘thousands of users.’

Back in 2006, Royal Dutch Shell announced a ‘multiyear migration’ to a ‘unified communications’ infrastructure built on technology from the ‘Innovative Communications Alliance’ (ICA), a Nortel/Microsoft joint venture. The first phase of the pilot tested two voice environments: Microsoft Office Communicator (MOC) 2005 and Nortel IP ‘hard phones.’ The trial also involved ‘presence awareness’ instant messaging (IM) and, for 200 testers, voice over IP (VoIP).

Legacy

Shell’s legacy telephony system is a traditional PBX or, more technically, a POTS (plain old telephone service) environment. Local equipment availability and cost have led to the deployment of hundreds of disparate systems with video and web conferencing tools from multiple sources.

Krebbers

Shell is now upgrading the pilot deployment with Microsoft Office Communications Server (OCS) 2007. Royal Dutch Shell Group IT Architect Johan Krebbers told Oil IT Journal, ‘Microsoft is now well underway to make OCS2007 the basis of enterprise IP telephony and real-time communications. We now have thousands of users of the system and will be moving to much larger numbers in 2009.’

Reduce travel

Krebbers explained, ‘Multiple technologies increase costs and VoIP means it no longer makes sense to run separate systems for voice and computing. Shell wants to reduce travel—we want to bring work to the people, rather than take people to the work.’

Presence awareness

A new ‘soft-phone’ function means users can ‘click-to-call’ from within Microsoft Office. Microsoft and Nortel endpoints operate under a single, Active Directory-based dial plan. Employees use PCs to initiate conversations and conferences. ‘Presence awareness’ shows colleagues’ availability and users can switch channels when, for instance, a ‘chat’ develops to the extent that arm waving is required. Krebbers noted, ‘Presence awareness is one of the most heavily used features.’ Shell is also to leverage the unified messaging functionality of Microsoft Exchange Server to collect voice messages, faxes, e-mail, appointments and contacts into a user’s Outlook inbox. Over the coming years, Shell hopes to progressively retire its PBXs and will host communications from three global data centers in Houston, Amsterdam and Kuala Lumpur. Where legislation prohibits VoIP, Nortel will support Shell with connections to legacy ‘POTS’ exchanges.

Pemex

In a similar deal last year, Pemex’s ‘GIT’ communications infrastructure unit contracted with Microsoft to implement Office Live Meeting and OCS 2007 to ‘provide a better meeting experience and reduce the need for employees to travel to remote locations for face-to-face meetings.’


Petrobras’ quality

Intervera’s DataVera Suite is now the Brazilian NOC’s ‘gold standard’ for well data quality. Intervera believes awareness of quality issues is growing.

Petrobras is to implement Intervera’s DataVera Suite to improve its well data quality. DataVera ‘HealthCheck’ and ‘Clean*’ have been selected as Petrobras’ ‘gold standard’ for well data quality profiling, reporting and cleaning.

Gregory

Intervera president and managing partner Paul Gregory said, ‘Awareness of data quality is gaining a lot of momentum across the oil and gas industry—especially with the industry-wide drive towards Master Data Management.’ DataVera comprises a growing repository of reusable, industry-specific rules created by Intervera’s E&P clients. HealthCheck lets users audit and validate their data in a matter of days. Clean standardizes data by deriving missing values and performing grid shifts, changing depths or units of measure.
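
As a simple illustration of the kind of rule-based cleanup described above, here is a minimal Python sketch of a depth unit-of-measure check. The field names, rule and records are invented for the example; they are not DataVera’s actual rules or API.

    # Minimal sketch of a rule-based depth unit check (illustrative only).
    FT_TO_M = 0.3048

    def clean_depth(record):
        """Standardize total depth to meters and flag missing values."""
        depth, unit = record.get("total_depth"), record.get("depth_uom")
        if depth is None:
            record["quality_flag"] = "MISSING_DEPTH"  # candidate for derivation
            return record
        if unit == "FT":
            record["total_depth"] = round(depth * FT_TO_M, 2)
            record["depth_uom"] = "M"
        return record

    wells = [
        {"uwi": "WELL-001", "total_depth": 8200.0, "depth_uom": "FT"},
        {"uwi": "WELL-002", "total_depth": None, "depth_uom": "M"},
    ]
    for w in wells:
        print(clean_depth(w))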

Roadmap

Gregory concluded, ‘While it’s impossible to prevent all data errors, DataVera can certainly assist in minimizing the impact of poor data quality and help to initiate a continuous and repeatable data quality roadmap within an organization.’ More from www.intervera.com.

*The HealthCheck/Clean combo is also known as the DataVera ‘Standardization Suite.’


Heat pumps, phlogiston and the world wide web

Oil IT Journal editor Neil McNaughton produces hot air from cold, puzzles over thermodynamics, seeks help on the internet and asks what ‘phlogistons’ we have hiding in our daily workflows.

At university, I knew a chap who, when strapped for cash, would go out and spend his last few pounds and shillings on a hat or some other such unnecessary paraphernalia. In the midst of the credit crunch, the subprimes and a dollar in free fall, we are doing likewise—acquiring, not a hat, but a holiday home in the Euro zone. While house hunting, I came across some rather intriguing notions about ways of heating and cooling your home in a particularly ‘ecological’ fashion.

Geothermal

The system is a variant on geothermal energy—taking heat from the ground. But in these ‘air geothermal’ systems the energy is taken from the air. In the winter, these heat pump-based ‘solutions’ take cold air from the outside and blow hot air into your living room. I must say I was perplexed. As a geophysicist, I like to take a ‘black box’ approach to such problems—I mean, whatever goes on inside the heat pump, calories, BTUs or whatever have to be added to raise temperature.

Cold air

Yes indeed, say the geothermologists—but these calories come from the cold air on the outside. The result is that the system blows even colder air out to the outside, extracting its calories and warming up the inside of your home. On one blog posting, someone working for an air geothermal company claimed that useful heat could be extracted from air as cold as -45°C!

Global warming

When I got home I sat myself down in front of our old fashioned fire in a comfy chair, poured myself a drink and got thinking. These systems are quite incredible—not only do they give you calories for free—but they actually cool the atmosphere down—an instant solution to global warming! I had to check this out even though I was getting a bit sleepy by then. I decided to do a simple experiment...

Refrigerator

We all have heat pumps in our refrigerators—and so with a few doors open, some plastic sheeting to duct cold air here and hot air there, I quickly turned our fridge into a geothermal heat engine. Next I ran around the house and turned every electrical appliance (including my wife’s bedside light) off. Then, with a Maglite, I shot down into the basement to check the electricity meter. There it was, incontrovertible evidence of energy saving—the meter was turning slowly around in the wrong direction! Electricity was being pumped out of the cold air and back into the grid. Just to check, I ran outside under the sheets with a handy thermometer - minus 50°. Wow! I thought, I didn’t know that our jam thermometer went down that low! And then I woke up...

Skeptical

My immediate reaction to hearing this tale of free energy was—no way! I studied physics a long time ago and vaguely remember some stuff (was it the first or second law of thermodynamics?) that says that you need a heat source and a cold heat sink to use thermal energy. I mean if there was free energy to be had from cold air, then there would be dirty great big air thermal heat pumps providing us with electricity wouldn’t there? I fondly imagined that a quick visit to Wikipedia would sort out the facts of the matter.

Wikipedia

I was wrong. The heat pump Wikipedia page kicks off with the thermodynamical impossibility of getting energy from cold air. But if you visit the air source heat pump page you get an unequivocal pitch for the merits of the technology (lifted from a manufacturer’s sales document). The French version of Wikipedia pitches right in with an explanation of how air heat pumps work and reports on tax breaks encouraging their installation. More publicly available information on the heat pump comes from blogs—usually rehashing the same sales pitch, although one or two express disappointment at their high heating bills despite heat pump help!

Monty Hall

Air geothermal heating and cooling is rather like the Monty Hall problem. Its logic seems to shift around every time you think about it. If someone knows where there is a definitive explanation—or more likely a debunking, I would like to hear it. But my failed attempt to find THE answer made me reflect on the reliability of the world wide web as an information source.

Phlogiston

Unfortunately, the preponderance of information on the web is unscientific chit chat. The web is a medieval world of charms, half-truths and snake oil—with air heat pump physics as a modern-day equivalent of phlogiston. This is compounded by the fact that while the chit chat is free, science often is not. Books and scientific publications are hidden from public view and require subscriptions to read. I would suggest that the most likely source of truth in a scientific context is a university or learned society. But the paradox is that universities have a business model to defend, and learned societies tend to jealously guard the intellectual property (even when this has been given to them for free). The result? As we see with the heat pump, on the web you are more likely to find bull from enthusiastic amateurs than gospel from the experts. There is a distinct weighting of the knowledge scales in favor of the unqualified hordes.

Corollary

There is a curious corollary to all this. No scientific community really works in isolation. For instance, if you are a geologist working on say, reserves, then you will need information from some domains that are not really in your bailiwick. A geologist may need information on economics or statistics to complete his or her study. Where is this to come from? Unfortunately, unless the geologist reads the Journal of Statistics (I made that up) his or her knowledge of things statistical may be brought in from the above public (dubious) sources. This then feeds back into the geological domain and leads to questionable science and circular reasoning along the lines of the air heat pump. I would submit that a lot of the current discussion on reserves reporting and ‘risk management’ falls into this category. But I may be wrong on all counts...

References on www.oilit.com/links/0802_1.


Oil IT Journal Interview—Robert Graham, BHP Billiton

BHP Billiton’s GIS Data Manager describes Google Earth Enterprise deployment and integration.

Speaking at the ESRI PUG, Graham presented BHP Billiton’s Map of Maps* ArcMap document indexer (developed by Xavier Berni of Geodynamics), a geographic front end to its 4,000 plus map catalogue, comparing it to the frontispiece of an atlas. MoM makes BHPB’s catalogue accessible from BHPB’s ‘Earth Search’ application, which combines Google Earth with ‘PetroSearch,’ BHPB’s implementation of the Verity content management system. After the PUG we met with Robert Graham in BHP Billiton’s Houston visualization center.

How does the consumer version of GE compare with the Enterprise edition?

Enterprise accesses both the Google data and the BHP Billiton server with our own imagery and petroleum data like wells, fields, and leases. You can see more high resolution data in most populated cities on the version of GE provided for public use. But the Enterprise server lets you add in your own content—which is what we have done with Earth Search. Where Google Earth for public use does not have high-res or up-to-date images, we can serve out our own imagery quickly and intuitively. Data integration uses the Keyhole Markup Language (KML).

What does KML bring?

KML lets you mash-up base maps with data from various sources—with volcanoes and earthquake data from the USGS, photographs from Panoramio (e.g. of Venezuelan refineries). You can also link any well data source through its UWI so that you can fire up other viewers or data sets to retrieve information about the well. Typically these would be in-house applications like GetKnowledge, PetroSearch—or Schlumberger’s eWellFile.
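
A minimal Python sketch of the kind of KML ‘mash-up’ described, writing a well placemark whose description links to another viewer through the UWI. The viewer URL, UWI and coordinates are invented for the example; this is not BHPB’s Earth Search code.

    # Sketch: build a KML placemark for a well, with a UWI-keyed link
    # to an in-house viewer (URL, UWI and coordinates are hypothetical).
    KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Placemark>
        <name>{name}</name>
        <description><![CDATA[
          <a href="http://wellviewer.example.com/well?uwi={uwi}">Open well file</a>
        ]]></description>
        <Point><coordinates>{lon},{lat},0</coordinates></Point>
      </Placemark>
    </kml>"""

    def well_to_kml(name, uwi, lon, lat):
        """Return a single-placemark KML document for one well."""
        return KML_TEMPLATE.format(name=name, uwi=uwi, lon=lon, lat=lat)

    with open("well.kml", "w") as f:
        f.write(well_to_kml("Example-1", "42-123-45678", -94.5, 29.7))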

What about vendor data?

BHPB has loaded its IHS Probe, WoodMac PathFinder, Fugro-Robertson Tellus and ESA Gom-Cubed vector data. Satellite images are provided by our retailer Spatial Energy. Some large feature classes like wells (800,000+) took nine days to rasterize and load into the Earth server through the Fusion interface. We also link well data to eWellFile, a DecisionPoint component that aggregates well information from Finder, OpenWorks and Enterprise Upstream. Since this goes through the web client, access respects user privileges. We also have a link to public Google text search from each feature.

How hard was it to integrate vendor data?

The pain is having to convert data that is not delivered as shapefiles. The IHS wells layer was hard as it contains so many wells. We had to divide the IHS data into smaller chunks to load. Once under 200,000 features the wells loaded in minutes rather than days. We use the IHS Spatial Layers Manager to spatialize data from OpenWorks, Finder, Stratabugs etc. from our multiple locations. The layer is rebuilt overnight.
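
A minimal sketch of the chunking approach just described, splitting a large well list into batches comfortably below 200,000 features before loading. The loader function is a placeholder; Fusion’s actual ingest mechanism is not shown.

    # Sketch: split a large feature list into batches that load quickly
    # (the loader below is a placeholder, not the real Fusion ingest).
    CHUNK_SIZE = 150_000  # keep comfortably under 200,000 features

    def chunked(features, size=CHUNK_SIZE):
        """Yield successive slices of the feature list."""
        for start in range(0, len(features), size):
            yield features[start:start + size]

    def load_batch(batch):
        print(f"loading {len(batch)} features")  # placeholder for the real load

    all_wells = [{"uwi": i} for i in range(800_000)]  # stand-in for the IHS layer
    for batch in chunked(all_wells):
        load_batch(batch)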

What is the Fusion server?

The Enterprise edition of GE comprises the Fusion Server (for data input) plus the Earth Server device. The globe is all pre-built and rasterized at 24 levels. It takes 30 hours to build, once per week. It took a significant effort to get here: three months of development to load, tune and link the layers to other sources. The Earth server now supports 250 concurrent users all over the world from one device.

What about high-end GIS?

Well, there is no geoprocessing in this environment, no complex queries. But for specific purposes you can build a KML file (we use Arc2Earth) for, say, a bid round with the relevant data. There are natural boundaries between the consumer oriented apps like GE and ArcGIS. Simple queries, printing and labeling should be done with a high-end GIS, but quick look viewing of large data volumes is easy.

How does the imagery in Enterprise GE compare with the public stuff?

That’s the strange thing! You only get the relatively low resolution data in the Enterprise edition, not the high-res public imagery. We do order higher resolution imagery for our areas of interest through the Energy Partner Program from Spatial Energy. We ordered some very high resolution satellite images from GeoEye for a geological field trip to the Grès d’Annot. At home, people are spoilt with access to a lot of ‘free’ high resolution data and assume that when you buy the PRO license the resolution and freshness of the images get better all over the world. Some users even imagine that the data they are seeing is ‘real time’! In fact, managing user expectations is an important aspect of a big deployment like this.

* Map of Maps is a free download from http://arcscripts.esri.com/details.asp?dbid=15451.


Weatherford launches ‘Production Office’

New software collection addresses surface to subsurface modeling and production surveillance.

Weatherford has just announced ‘Production Office’ (PO), a collection of production and well management packages for surface to subsurface modeling, surveillance, analysis and monitoring. The full-field package offers standard workflows to optimize artificial lift and enable production surveillance, reservoir monitoring and basic well testing. A modular approach means that PO can be customized to suit the needs of each client. By choosing only the modules needed, production companies are able to address current needs while building a plan to expand the system in the future.

Standards

The suite is based on open standards including web services and PRODML and OPC for accessing SCADA data. Connectivity with popular engineering and simulation tools ensures the package integrates with existing reservoir management workflows. A standard platform for all components minimizes training time. PO embeds proven applications from recent acquisitions including eProduction Solutions, Edinburgh Petroleum Services and Case Services—notably WellFlo, PanSystem, PanMesh, MatBal, ReO, ReOForecast, DynaLift, Well Service Manager and LOWIS software. Over 90,000 wells worldwide are optimized with software from Weatherford.

Underperforming

A typical use case provides notification of underperforming wells—replacing rote field surveillance with automated event detection and field data consolidation. Network management tools model how field component changes interact, enabling discovery of an optimum development scenario.
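
A minimal sketch of the event detection idea, flagging wells whose measured rate falls below a set fraction of their expected rate. The data structure, field names and threshold are assumptions for the example, not Weatherford’s algorithm.

    # Sketch: flag underperforming wells (illustrative data and threshold).
    THRESHOLD = 0.85  # notify when below 85% of expected rate

    def underperforming(wells, threshold=THRESHOLD):
        """Return (name, ratio) for wells producing below the threshold."""
        events = []
        for w in wells:
            ratio = w["measured_boed"] / w["expected_boed"]
            if ratio < threshold:
                events.append((w["name"], ratio))
        return events

    field = [
        {"name": "A-1", "expected_boed": 450.0, "measured_boed": 455.0},
        {"name": "A-2", "expected_boed": 600.0, "measured_boed": 480.0},
    ]
    for name, ratio in underperforming(field):
        print(f"notify: {name} at {ratio:.0%} of expected rate")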


Channel architecture reservoir analogs database announced

C&C Reservoirs extends ‘Digital Analogs’ repository with channel geometry and fill data.

C&C Reservoirs has added another component to its ‘Digital Analogs’ global field and reservoir database. C&C Reservoirs’ Channel Architecture Reservoir Database (CARD) is the first phase of an extension to C&CR’s analogs library to offer ‘comprehensive reservoir architectural data.’ The database includes published reviews of subsurface, outcrop and modern analog data and will provide a framework for the study of channel reservoir architectures worldwide. CARD will include information on reservoir dimensions, connectivity, compartmentalization and productivity.

Lithofacies

CARD data will cover channel dimensions, internal geometries, lithofacies, compartmentalization, stacking patterns and connectivity. Water flood and sweep characteristics will be characterized to enable study of oil and gas productivity.

EOR

CARD is designed to help companies involved in field development planning, optimizing enhanced oil recovery (EOR) programs, establishing well spacing and infill drilling programs. Deliverables include the analogs database, a detailed classification of channel types and a ‘web-based knowledge platform’ for search and analysis. The first phase of CARD will be completed by Q1, 2009.


High resolution SPOT imagery for Western Canada

Iunctus and geoLOGIC Systems offer detailed imagery to geoSCOUT users.

geoLOGIC Systems of Calgary is to provide users of its geoSCOUT E&P decision support system with access to up-to-date, high-resolution satellite imagery of the Western Canadian sedimentary basins. Through an agreement with Iunctus Geomatics, geoSCOUT users will get ‘point and click’ access to high resolution SPOT imagery of Western Canada’s oil provinces—North-East British Columbia, Alberta and Southern Saskatchewan.

Hood

geoLOGIC president David Hood said, ‘Satellite imaging is a powerful technology whose time has come. When we evaluated the SPOT High Resolution Imagery from Iunctus, it was clear the comprehensive coverage and data currency represented a major improvement over other services that have previously been available to our customers.’ High resolution satellite imagery is used in planning pipeline corridors, site access evaluations and environmental monitoring and assessments.

Johnson

Iunctus president Ryan Johnson added, ‘This is a tremendous opportunity for us to gain industry exposure. Also geoSCOUT users will be basing their decisions on the most current imagery available and will receive annual imagery updates of the entire area.’ Data from the French SPOT satellites is downloaded directly to the Iunctus ground station in Lethbridge before orthorectification, tonal balancing and mosaicking of the 1.2 million square kilometer dataset.


DO2 for Brigham, OpenInvoice 7.3 and ‘Kiosk’ for field workers

With 8,000 suppliers onboard, transitioning from paper to electronic invoicing is quick.

Brigham Exploration Company has selected DO2 Technologies’ (previously Digital Oilfield) ‘OpenInvoice’ electronic invoicing to automate its invoice reconciliation and approval process. Brigham is migrating its legacy paper-based process to OpenInvoice and is also to implement DO2’s PriceBook application for automated line item reconciliation against contract pricing.

Brown

Malcom Brown, VP and Controller with Brigham, said, ‘DO2 now has over 8,000 suppliers transacting, which means that the majority of our transactions now come in electronically. The ability to check invoices against contract pricing is also a big plus.’ PriceBook goes beyond invoice presentment, routing and approval to offer integration with back-office procurement systems from SAP, Oracle and Ariba. OpenInvoice is a software as a service (SaaS) solution, hosted by DO2 and delivered over the internet.

R 7.3

DO2 has also announced release 7.3 of OpenInvoice with a new ‘image module’ for tracking scanned paper invoices. Another new product, OpenInvoice Kiosk adds electronic approval of delivery tickets and goods receipts from an ATM-like touch screen in the field.


M2M announces Intelligent Data Acquisition telemetry service

New service provides offline data collection and upload for entry-level field data acquisition.

A new field data acquisition service from M2M Data Corp., ‘Intelligent Data Acquisition’ (IDA), targets entry level data collection. IDA software runs on an operator’s laptop or PDA and collects field data at the rig site. This can later be uploaded to M2M’s iSCADA remote monitoring service.

Economical

IDA is an economical alternative to satellite-based telemetry where real time data is not required. IDA includes M2M’s standard security and authentication processes. Data collection can be on a scheduled or ad hoc basis. Once data has been uploaded over an internet connection, remote access to the data is possible from an authenticated client.

Wallace

M2M Data CEO Don Wallace said, ‘IDA is a cost-effective way of deploying iSCADA without investing in satellite telemetry.’


Software, hardware short takes …

BG, dGB, Paradigm, Lynx, 3M Dynatel, Mercury, Geodynamic, Caesar, Leica, Neuralog, Quorum, Safe, Schlumberger.

BG Group is sponsoring dGB’s OpendTect package with enhancements to functionality, visualization and performance. Improvements include pre-stack displays and the efficient use of clusters. The software will be available as open source and some of the BG contribution will go towards the Madagascar plug-in project.

Paradigm’s ‘Common Reflection Angle Migration’ high-resolution subsurface imaging solution is now available for worldwide customer licensing, as well as use by Paradigm’s consulting group.

Lynx has just announced SeisMap, an ESRI ArcGIS add-on for display of seismic data. SeisMap was originally developed for ION (previously Input-Output).

3M Dynatel has announced an electronic marking system of RFID ‘marker balls’ that can transmit precise location and identity information of pipes, valves, bends and other buried assets. The balls can be combined with GIS and GPS systems to avoid excavation accidents.

Mercury Computer Systems has announced ‘Avizo Earth Edition’ for high-end 3D visualization. Avizo was previously marketed as Amira.

Geodynamic Solutions has announced Layer Wizard, an ArcGIS extension for identifying, loading and symbolizing spatial data from Geodatabases and shapefiles.

The new 6.1 release of Caesar Systems’ Petroleum Ventures & Risk (PetroVR) package includes ‘Monte Carlo on Scenarios’ for comparison of probabilistic analyses over multiple scenarios.

Leica Geosystems has released Image Web Server (IWS) 8.5. Previously an ER Mapper product, IWS is a high-speed server for high volume geospatial image data. IWS provides online processing of images, transforms and mosaics. New technology serves scale dependent images from Open Geospatial Consortium (OGC) compliant Web Map Services (WMS). IWS 8.5 provides over 3,200 predefined coordinate systems, 1,100 datum shifts, 50+ mathematical projections and local and global vertical datums.

At the February NAPE tradeshow, Neuralog was demonstrating new functionality in NeuraMap. NeuraMap, traditionally a map digitizing application, now performs volumetric and reserves calculations and automatically generates summary reports.

Quorum Business Solutions has released Quorum GIS 4.0, an ESRI ArcGIS 9.2-based version of its mapping solution. Quorum GIS extensions for ArcGIS Server and ArcGIS Desktop let users develop, manage and secure layers, maps and plats. Other tools assist with data capture workflows—including ‘rich’ textual and spatial search capability, data capture for metes and bounds, an aliquot (quartering) tool and batch polygon generation from legal descriptions.

Safe Software has announced FME 2008, which includes a new FME Server application. FME Server is an ‘enterprise-scale’ spatial extract, transform and load (ETL) engine that supports users inside and outside the organization. A scalable, service-oriented architecture (SOA) is used to centralize spatial data conversion and distribution tasks. FME Desktop now supports additional file and database formats including Oracle 11g and Microsoft SQL Server Spatial 2008. KML 2.2 and GeoJSON are also provided for web services connectivity.

Schlumberger’s new ‘WellWatcher’ fiber optic technology targets harsh, high-temperature wells. The distributed temperature acquisition system (DTS) provides ‘rugged reliability’ for permanent real-time monitoring.


Absoft and Arnlea team on oil country asset tracking

Barcode asset tracking system developed for CNR/KCA Deutag now on general release.

Aberdeen-based SAP consultancy Absoft has teamed with Arnlea Systems on a barcode asset tracking system. The system was developed in cooperation with drilling contractor KCA Deutag for use in CNR International’s North Sea operations. The system tracks assets and materials between onshore and offshore drilling sites.

Handheld

Handheld computers with bar code readers identify materials and capture transactions. Information is transmitted via a docking station into SAP for integration with logistics and maintenance management systems. Workflows have been developed for goods receipts, issuance, physical inventory management and returns processing. Following a successful pilot, the system is now being rolled out to all five of CNR’s operated North Sea platforms where KCA Deutag provides drilling and maintenance services.


Stratavia data center automation for Xcel Energy

Data Palette optimizes server provisioning and database administration across distributed resources.

Electricity and natural gas provider Xcel Energy has deployed Stratavia’s data center automation platform to streamline IT operations. Stratavia’s Data Palette helps IT professionals define and orchestrate standard operating procedures and reports. An ‘intelligent rules engine’ lets IT administrators combine automation with predictive analytics to move from ‘chaotic, reactive activities’ to a ‘higher level of automation intelligence.’

Carlson

Xcel CIO Michael Carlson said, ‘This technology allows us to optimize system performance across the entire energy grid, automating critical IT processes, such as server provisioning and database administration.’

Culverhouse

According to Stratavia president and CEO Thor Culverhouse, ‘Data Palette automates data center operations in critical and complex environments, mitigating IT disruptions. Xcel Energy is pioneering methods that set the standard for availability in their industry.’ Xcel was founded some 150 years ago and now operates 34,500 miles of natural gas pipelines. Its data center operations are outsourced to IBM’s Global Services.


ESRI Petroleum User Group 2008

With around 1800 registered, ESRI’s Petroleum User Group is on a roll. We report on how Google Earth is spreading the GIS ‘word,’ on ESRI’s own Arc Explorer spinning globe app, on mashups and a ‘magic pen’ for field workers. Company presentations include Chevron Angola’s ‘turtle’ GIS study, Shell on hydrology and bar coded well heads, Shell’s global GIS and Anadarko’s hardware tests. Pipeline GIS usage is rising—as witnessed by papers from NiSource and BP. We also report on refinery GIS, Python GIS programming and on ESRI’s geological mapping developments.

Clint Brown, ESRI’s head of software, believes GIS is evolving from its project focus to cut across the entire organization—even though integration is still ‘a goal.’ Brown sees GIS as ‘strategic IT, a window into the database,’ which implies that all feature collections (pipelines etc.), image data (rasters) and attribute data are georeferenced. According to a Gartner study, ‘GIS is now as ubiquitous as the database or as ERP.’

Google Earth

To date there have been 350 million downloads of Google Earth (GE). The GE phenomenon has brought ‘consumer mapping’ to a larger audience and has raised the bar for ease of use. GE’s KML (Keyhole Markup Language) lets users add properties to map layers. But ‘true’ GIS is a lot more than this. Brown’s examples included information deployment across multiple mapping applications, custom maps and mobile applications for field workers. A key trend in ArcGIS 9.3 is the ability to ‘mash up’ information. ArcGIS 9.3 exposes a REST API that ‘turns every web service into a URL for easy, programmable assembly.’ Everything you can do with a desktop GIS application can now be served leveraging the ESRI Web Application Development Framework.
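
A minimal sketch of what that REST-style access might look like from a script, querying a map service layer for features and getting JSON back. The server URL, service name, layer id and field names are hypothetical; the parameters follow the generic ArcGIS Server REST query pattern and should be checked against the target deployment.

    # Sketch: query a (hypothetical) ArcGIS Server map service layer over REST.
    import json
    import urllib.parse
    import urllib.request

    BASE = "http://gis.example.com/arcgis/rest/services/Wells/MapServer/0/query"

    params = urllib.parse.urlencode({
        "where": "OPERATOR = 'EXAMPLE'",  # attribute filter
        "outFields": "UWI,WELL_NAME",     # fields to return
        "f": "json",                      # ask for a JSON response
    })

    with urllib.request.urlopen(f"{BASE}?{params}") as resp:
        features = json.load(resp).get("features", [])

    for feat in features:
        print(feat["attributes"]["UWI"], feat["attributes"]["WELL_NAME"])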

Workflow management

The ESRI Job Tracking Extension (JTX) allows a workflow to be captured and distributed to colleagues. A demo showed JTX use in pipeline construction. The JTX Desktop application integrates with GIS and non-GIS applications and provides a list of job IDs with spatial references. A flow chart shows connected jobs that are stored in a central database. JTX manages preliminary CAD design, database setup and high consequence analysis (HCA) with ArcMap fired up in context and with instructions. The workflow then moves on to the regulatory approval stage with generation of a PDF document.

Mashups

Dale Hunter demoed ESRI’s ‘mashup’ offering (a.k.a. map services). A map service is created that generates KML. Manager is used to add queries (on wells, from a drop-down list) and then the .NET code is generated for the web map application. On the user’s browser a map appears with a tree view—layers can be turned on and off. Web map layers can be viewed in Microsoft Virtual Earth. REST is used to make calls back into ArcGIS Server to load layers from ArcGIS Online. Any Geodatabase along with its business logic can be published as a service. The PostgreSQL PostGIS extension is also supported. Mashups integrate GIS with SharePoint with, for instance, OSIsoft’s PI Web Parts linked to maps for real time data analysis and reporting. Brown claimed that it was these windows into real data (as opposed to just ‘a picture’) that distinguished web GIS from ‘consumer mapping.’

Mobile workforce

Mashups for iPhones or higher end devices let you take ‘smart technology’ to the field. ArcGIS Mobile is used in the pipeline industry. MXD files can be deployed on a field device and drop-down lists for pipeline types/materials are populated from the geodatabase to assure consistency. A field worker spots a new housing estate with a playground and can add a new feature on the spot. EnCana helped out with this workflow and demo.

Magic pen

Field workers with no fancy handhelds, or who prefer dealing with pen and paper, might like the ‘magic’ pen from Adapx and its companion Capturx for ArcGIS software. The device lets you annotate maps, adding pipelines, wells, whatever from a printed legend. Back in the office the pen docks and data is uploaded. Special paper with a discreet pseudo-random background of dots allows the video camera in the pen to ‘know’ exactly where on the sheet it is.

ArcGIS Explorer

ESRI’s answer to Google Earth is ArcGIS Explorer (AGE). AGE connects to the ESRI-backed Geography Network for soils, landmasses and high resolution US topographic maps from ArcGIS Online. AGE ‘does a better job of connecting to your internal base maps than Google Earth or Virtual Earth.’ On startup, AGE looks for the ESRI home server, or it can be pointed at a user’s home server. Explorer can spider the network looking for other data sources like shapefiles, KML and imagery. AGE provides server side caching, pre-rendering maps on the server and presenting them to AGE. It is possible to disconnect from the server and work on the cache. An Explorer SDK is available for extending and customizing. Any URL can be accessed, with drill-down through pop-ups to other URLs—for well information, core slab photos etc. Tasks have a ‘send to’ function for driving directions, weather finder or Wikipedia search. The Geonames web service (geonames.org/export) provides a lexicon of place names. ESRI plans to offer high quality international data from Digital Globe as a premium service. But while Google can afford to ‘dump’ this data, ‘we have to stay in business!’

Chevron Angola

Greg Slutz described Chevron’s ArcGIS-based map of its Malongo base in Angola. The mapping project includes an environmental facet as there are some 200 turtle nests along the beach. GIS is also applied to facilities, field outlines and bathymetry. Imagery has been acquired along the Cabinda coast for turtles and seismic planning. The mapping system provides access to a workforce spread between San Ramon, Houston and Cabinda. Chevron’s Project Development and Execution Process was used, a phased approach to deployment with standard tools and an executive review process. The system is built atop an ArcSDE database and data is loaded from IHS and other vendors. ArcGIS 9.2 has been deployed for its replication features. Maps are generated as PMFs or GeoPDFs. In the future Chevron is to offer GoCAD to GIS data conversion.

Shell hydrology

Keith Fraley’s presentation outlined Shell’s data cleanup project in support of hydrological studies in the Piceance basin oil shale play. The project involved a move from ‘convoluted’ well names and ‘Excel hell’ to a standardized hydrological database. A well identification program was initiated to clean up well headers in OpenWorks and environmental data in EarthSoft’s EQuIS. Data verification was performed with photos and field videos of each pad. DuraLabel vinyl ‘2D QR Barcode’ stickers were placed on each well. Now field crews visit wells with ArcGIS Explorer on Panasonic Toughbooks and rugged Bluetooth barcode readers. GPS navigation is performed with a free application, EarthBridge, which takes GPS and turns it into KML on the fly. Fraley liked ArcGIS Explorer and its API—AGE is ‘the only spinning earth application that does subsurface.’ There is also ‘good support for ESRI data—geodatabases and shapefiles.’
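
A minimal sketch of the ‘GPS to KML on the fly’ idea, converting the latitude and longitude fields of a NMEA $GPGGA sentence to decimal degrees and printing a KML coordinate. The sentence shown is made up, only the happy path is handled and this is not EarthBridge code.

    # Sketch: parse a NMEA $GPGGA fix and emit a KML coordinate string.
    def gga_to_lonlat(sentence):
        """Convert the lat/lon fields of a $GPGGA sentence to decimal degrees."""
        parts = sentence.split(",")
        lat = float(parts[2][:2]) + float(parts[2][2:]) / 60.0
        if parts[3] == "S":
            lat = -lat
        lon = float(parts[4][:3]) + float(parts[4][3:]) / 60.0
        if parts[5] == "W":
            lon = -lon
        return lon, lat

    gga = "$GPGGA,123519,3929.7485,N,10808.1230,W,1,08,0.9,1800.0,M,,,,*47"
    lon, lat = gga_to_lonlat(gga)
    print(f"<coordinates>{lon:.5f},{lat:.5f},0</coordinates>")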

NiSource Gas

NiSource’s GIS Portal supports a diverse user community in 17 states according to Debra Rohrer. It was developed with the ESRI GIS Portal Toolkit. The rationale behind NiSource’s Portal is that while ‘pipeline is our passion, others do endangered species, land etc. better than we can.’ GIS is considered a ‘breaker-down of silos.’ A demo showed drill down from the map of the pipeline system for a project—including drilling new wells, workovers and/or acquisitions, and bringing new production into the pipe network. Portal acceptance is now ‘very high.’

Python and GIS

Chad Cooper (Southwestern Energy) got a good turn out for his introduction to GIS programming in Python. Python is fast, reading a 100,000 line text file into memory easily. The ceODBC third-party module is used to connect to a database. The smtplib module offers easy emailing and Wget offers a command line interface for file, http or ftp retrieval. This can easily download data from sources such as www.archive.org/download (US quad data). Third party modules support Excel read/write, MySQL database access and geocoders for GE, Yahoo and more.
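
The kind of glue code described is possible with the Python standard library alone; here is a minimal sketch that reads a large text file and emails a one-line summary with smtplib. The file name, addresses and SMTP host are placeholders, and the database (ceODBC) and download (Wget) steps mentioned in the talk are left out.

    # Sketch: read a text file and email a one-line summary (stdlib only).
    import smtplib
    from email.message import EmailMessage

    with open("wells.txt") as f:   # placeholder input file
        lines = f.readlines()      # a 100,000 line file reads in well under a second

    msg = EmailMessage()
    msg["Subject"] = f"Nightly load: {len(lines)} records read"
    msg["From"] = "gis@example.com"
    msg["To"] = "team@example.com"
    msg.set_content("See subject line for the record count.")

    with smtplib.SMTP("mail.example.com") as smtp:  # placeholder SMTP host
        smtp.send_message(msg)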

Anadarko

Peter Moreau offered an under-the-hood look at Anadarko’s ArcGIS deployment (co-developed with Geodynamic Solutions). Early tests on a virtual machine failed. System design used Dave Peters’ capacity planning tool and ended up with two quad core, dual socket servers with 16GB RAM running 64 bit Windows Server 2003. Anadarko’s main use is web map publishing from ArcGIS Server Manager. Data packages include Geologic Data Systems, which comes packaged as ArcGIS projects. These were getting light use before this project. ArcGIS Server offers simple list-based access to data and publishes the full functionality of packaged products. A production surveillance application displays variance over the last two days of production—with easy map view and print to PDF.

GIS and the ASME B31.8S

John Lineham (Enbridge) described the book ‘American Society of Mechanical Engineers (ASME) B31.8S’ a.k.a. ‘Managing system integrity of gas pipelines’ as ‘great value at $110.’ The 2001 standard covers integrity management planning, detection, prevention and mitigation. Risk is defined as the product of the probability of failure and its consequences (to service, humans and environment). GIS is used for threat analysis, maintenance, communications (who to call), reporting (maps and documents) and data storage.

BP’s refinery GIS

Boris Kowalewski’s company, ViaSecure, has developed a GIS solution for BP’s German refining unit. A ‘comprehensive data model’ covers facilities, tasks, network analysis, cost and reporting and supports construction, monitoring, networks, warehouse logistics, safety, fire brigade activity and maintenance. The solution had to be ‘no hassle’ for refiners with no GIS experience. The system is built around ArcGIS Server and ArcGIS Desktop, extended with the Plant Data Model and standard tasks. The geodatabase runs on SQL Server (BP policy), serving ArcGIS, with login from a geobrowser via an RSA server.

Shell

Berik Davies presented Shell’s Global GIS, built atop SDE on Oracle and AIMS clients. Data is replicated from the Amsterdam master out to Shell’s worldwide units with NetApp’s ‘SnapMirror’ on a monthly basis. Usage monitoring with the Nagios open source monitoring software allows scripting for fancy monitoring of servers and SDE performance, with drill down into users, warnings etc. This information lets Shell go back to IT with documented service level agreement issues and helps pinpoint problems. SnapMirror operates at the block level so only changed blocks are replicated. In 2008 ArcGIS Server is to replace ArcIMS. Data KPIs are captured via crawlers and shown on a standard dashboard. Shell data quality KPIs include broken links, data with no CRS, duplicates, unused data and stale or deprecated data. Exprodat’s IQM is ‘pervasive in Shell.’

BP’s offshore pipelines

New Century Software, working for BP’s Gulf of Mexico unit, has extended the PODS base with offshore components. These have been returned to PODS and are now in the 3.2 model. Additions include physical inspection, cathodic protection and online video. BP’s Gulf of Mexico pipeline management system includes hot links to BP’s Deepwater Documentum EDMS for engineering documents and links between GIS and streaming video sources. This includes ROV surveys of subsea pipes. Sidescan sonar is displayed as a backdrop. Search is possible from IMS across map features and the database. The yearly Oceaneering/Fugro ROV surveys create a huge dataset. Data is time stamped for comparison with ‘as built.’ BP has a lot of reporting requirements to fulfill.

Geological mapping

Steve Grisé presented ESRI’s work on cartographic representations of geological maps, linking GeoSciML to the geodatabase. Challenges include the ‘complex and interpretive’ nature of geology, its multiple lists of descriptions and a ‘disconnect’ between map and data views. The GeoSciML prototype is designed to get GeoSciML content in and out of a geodatabase. XSLT is used to go from GeoSciML to the geodatabase, then ETL to get back out to XML for web services. A demo involved data from the USGS NGMDB and OneGeology (for mineral exploration and geohazards). A geological map was made showing the name of a formation and its dominant lithology—with data extracted from a labyrinth of GeoSciML formation identity data. Tools used in development include the Stylus Studio Enterprise XML editor and Microsoft Visio to browse the geodatabase.

This article is taken from a longer, illustrated report produced as part of The Data Room’s subscription-based Technology Watch Service. More from www.oilit.com/tech.


Folks, facts, orgs ...

Aveva, FIATECH, Circle Oil, Invensys, PPDM, Hubwoo, DEAL, Petris, Eurotech, OHM, Fugro, dGB, Honeywell, Ingrain, SpectrumData, Intellection, MVE, NGA, NetApp, OFS Portal, PGS, Woodside.

Aveva executive VP Derek Middlemas has been appointed to the FIATECH board of directors.

Adrian Burrows and Stuart Harker have joined UK-based Circle Oil. Both were previously with PGS.

Mohamed Saad Yahia heads up a new ‘Industrial Automation Engineering Centre of Excellence’ in Egypt, a joint venture between Invensys, ENPPI and GASCO.

Steve Cooper has joined the PPDM Association as chief communications officer. Cooper was previously CTO with IHS Energy.

cc-Hubwoo has changed its name to just ‘Hubwoo.’

Users of the UK’s DEAL data website will now have to pay a registration fee to download data, £250 for a single user and £2,500 for a corporation.

The UK’s ‘Capturing the Energy’ historical archive is now live on www.capturing-the-energy.org.uk. The archive currently covers the recently decommissioned Frigg gas field.

John Archer is now product manager of PetrisWINDS Enterprise. Archer hails from BEA Systems. Buddy Rhodes is moving to Calgary to kick off Petris’ new Canadian presence.

John Green has joined Eurotech as onsite IT support engineer for CSEM specialists OHM Ltd.

Michele Angel has joined Fugro-Jason as business manager for North America. Angel was previously with Marathon Oil.

Nabil Habayeb, president and CEO of GE Middle East and Africa, opened a new dedicated calibration, repair and training service center in Abu Dhabi this month at the CERT Technology Park. The center will be managed by CEO Bob Richards.

Honeywell’s UOP unit has opened a natural gas processing design center in Kuala Lumpur, Malaysia.

Amos Nur has left Stanford University to join Houston-based reservoir-rock analysis firm Ingrain.

IES is to establish a network of Petroleum Systems Centers (PSC) with Kristijan Kornpihl heading-up the Houston PSC, Harald Karg in Mexico and Oliver Swientek in Germany.

SpectrumData has just published a comparative review of tape and disk storage technology. The informative 27-page review is available from www.spectrumdata.com.au.

Intellection is setting up a new oil and gas lab in Abergele, North Wales.

Cecile Allanic has joined Midland Valley as consulting geologist. Allanic is a recent graduate of the Neuchatel Institute of Geology, Switzerland.

Neil McKay has joined NGA as project geophysicist. McKay comes from MANTEC.

Dan Warmenhoven is now NetApp’s CEO, while president and COO Tom Georgens joins the company’s board of directors.

Maggie Montaigne has joined OFS Portal’s board of managers. Montaigne is Halliburton’s ExxonMobil account VP.

Jon Erik Reinhardsen is now president and CEO of PGS, succeeding Svein Rennemo.

Jorge Machnizh is Paradigm’s president and COO succeeding John Gibson who retains the CEO role.

Woodside has joined dGB’s Seismic Stratigraphic Interpretation Consortium.

~

Tip of the month—gleaned from a vendor working on a major master data project. If you have multiple, inconsistent data sources spread across the organization, get a handle on your master data with a Google Appliance! This not only ‘googles’ text, but can also read into database tables and extract lists of aliases for synchronizing master data.
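
As a minimal illustration of the alias idea (and not of how the Google Appliance itself works), the Python sketch below matches well names from two sources on a crude normalized key to build an alias list; unmatched names are left for manual review. The names are invented.

    # Sketch: build an alias list by matching names from two sources on a
    # normalized key (illustrative only; names are invented).
    import re

    def norm(name):
        """Crude normalization: uppercase, strip punctuation and spaces."""
        return re.sub(r"[^A-Z0-9]", "", name.upper())

    source_a = ["Smith #1", "JONES 2-14", "Acme Fee A-1"]
    source_b = ["SMITH NO. 1", "Jones 2-14", "ACME FEE A1"]

    index_b = {norm(n): n for n in source_b}
    aliases = {a: index_b[norm(a)] for a in source_a if norm(a) in index_b}
    unmatched = [a for a in source_a if norm(a) not in index_b]

    print(aliases)    # {'JONES 2-14': 'Jones 2-14', 'Acme Fee A-1': 'ACME FEE A1'}
    print(unmatched)  # ['Smith #1'] -> needs manual review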


DAMA talk by BP and IPL on ‘self-service’ data modeling

‘Data Modeling as a Service’ manages ER/Studio access for BP’s data modeling community.

Christopher Bradley (IPL) and Ken Dunn (BP) presented a paper on ‘Data Modeling as a Service (DMaaS) at BP’ at the Data Management Association (DAMA) annual meet in San Diego this month. BP has a huge, dispersed IT estate, as witnessed by some statistics from its Digital and Communication Technology (DCT) division. BP has 250 data centers (these are consolidating to three ‘mega data centers’), 80,000 desktops, 6,000 servers, 7,000 applications including 33 SAP instances and 26 ‘major’ data warehouses from SAP and Kalido, all running across 30 petabytes of spinning disk.

Reasons to model

BP models big time: to capture business requirements, promote data reuse and assure quality, to bridge between business and technology and to assess the ‘fit’ of packaged solutions. Modeling helps BP identify and manage redundant data and provides ‘context’ for projects—facilitating systems integration.

Vision

BP’s vision is for data and information that are managed as an easily accessible shared corporate asset. This means integrated, enterprise-wide global data, a single view of customer and product master and ‘real-time’ processing where needed. The DCT unit has a role to play in coordinating initiatives that target data quality, visibility and ownership.

Decentralized

BP’s decentralized environment has led to a situation where there is no single subject area model and few modeling standards. Historically a project focused approach led to documentation getting lost. Continuity of resources has proven problematical as much work is out-sourced. Another driver for change was the need to avoid BP’s ARIS process modelers developing data models independently.

The solution

The solution, co-developed by BP and UK-based IPL was for ‘self service’ administration—with registration required for use of Embarcadero Technologies’ ER/Studio modeling tool. When a model is approved, an automated publishing process moves models to BP’s Data Modeling Environment SharePoint repository.

Community of Interest

A modeling ‘community of interest’ (COI) shares business cases and best practices, proposes ‘domain directives’ for data modeling at BP and avoids ‘re-inventing the wheel.’ The COI can also influence Embarcadero’s product development and handles certification of internal staff and suppliers.

2008 on ...

Looking forward, BP is working on a model validation service, with promotion of ‘approved’ master data models and industry standard models such as PODS*. The COI is also driving a ‘quality model culture’ and seeking to develop cross domain modeling governance. BP is also anticipating a ‘services-oriented architecture (SOA) world’ where ‘definition of data and results from services are vital.’ Other facets of BP’s ongoing modeling effort address reverse engineering of legacy models in business warehouse information ‘cubes’ and ‘business object universes.’

* Pipeline Open Data Standard—www.pods.org.


Denbury tracks well failures with Winds Enterprise

Petris Technologies’ package used in root cause analysis of stripper well failure.

Petris has just published an analysis showing how information management (IM) performed with its PetrisWINDS Operations Center (PWOC) can lead to cost efficiencies and reduced well failure. The PWOC well failure module provides processes and key performance metrics of failure histories, production rates, repair histories and chemical treatments. Denbury Resources has been using PWOC to track well failures for the last five years.

McMann

Denbury Operations Manager Bill McMann said, ‘The only way to mitigate well failures is to look for trends at the granular data level. Knowing the well bore configuration is only part of the equation, you also need other factors like rod failures, tubing wear, drilling depths, mud properties and environmental conditions at the time of failure.’

Similar

One Denbury analysis found a trend of unusual rod and tubing failures at a particular depth and direction. Simple changes were effected to bring the wells back on stream. Petris’ back-of-the-envelope economic calculations suggest a million dollar saving per year is possible on a 100 stripper well field. More from sales@petris.com.
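
A minimal sketch of the kind of granular trend-spotting described, counting rod and tubing failures by 500 ft depth bin to surface a cluster. The records and bin size are invented; this is not the PWOC failure module.

    # Sketch: count failures by depth bin to spot a cluster (illustrative data).
    from collections import Counter

    failures = [
        {"well": "S-01", "type": "rod", "depth_ft": 4120},
        {"well": "S-07", "type": "tubing", "depth_ft": 4275},
        {"well": "S-11", "type": "rod", "depth_ft": 4390},
        {"well": "S-02", "type": "rod", "depth_ft": 1850},
    ]

    BIN_FT = 500
    by_bin = Counter((f["depth_ft"] // BIN_FT) * BIN_FT for f in failures)

    for top, count in by_bin.most_common():
        print(f"{top}-{top + BIN_FT} ft: {count} failure(s)")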


Award for Odfjell’s PhoDoc drilling monitoring application

Web based tool leverages open source software for real time data integration, collaboration.

Norwegian Odfjell Drilling Technology has won an award from the European Multimedia Forum for ‘PhoDoc,’ its web-based remote real-time drilling monitoring solution. StatoilHydro and ConocoPhillips both use the package, which is also deployed in Odfjell Drilling’s operations.

PHP

PhoDoc embeds open source software including a MySQL database backend and a PHP middle tier. Two user interfaces are available, the Dashboard (developed with the PRADO framework for PHP5) and the Browser (which uses Flash). All communication between the Browser and the middle tier is done using XML.

SOA

To facilitate secure SOA, PhoDoc implements several protection mechanisms including the Secure Sockets Layer (HTTPS), hashing of sensitive data using SHA1 and ViewState, part of the PRADO framework. The application exposes detailed information about installations through a ‘project based collaboration interface.’ PhoDoc integrates with foreign real time data sources. For example, it is possible to integrate logistics information from SAP and view it in PhoDoc. PhoDoc can communicate with WITSML and PRODML data sources although these do not comprise the backbone of the system.
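
PhoDoc is a PHP application, but the hashing step is easy to illustrate in Python; the minimal sketch below produces a salted SHA1 digest of a sensitive value before it is stored or sent. The salt handling is simplified for the example and is not taken from PhoDoc.

    # Sketch: salted SHA1 digest of a sensitive value (illustrative only;
    # SHA1 is a one-way hash, not encryption).
    import hashlib
    import os

    def sha1_digest(secret, salt=None):
        """Return (salt, hex digest) for the given secret string."""
        salt = salt or os.urandom(8)
        return salt, hashlib.sha1(salt + secret.encode("utf-8")).hexdigest()

    salt, digest = sha1_digest("operator-password")
    print(salt.hex(), digest)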


P2 Energy Solutions hosts Highland’s financials

Petroleum Financial Software as a Service offering ‘perfect fit’ for growth-oriented start-up.

Fort Smith, AR-based Highland Oil & Gas has contracted with P2 Energy Solutions (P2ES) to provide a hosted petroleum financial management solution in support of its early stage natural gas exploration effort. Highland wanted to avoid a large investment in accounting, land systems and personnel and instead opted to outsource ‘non-core’ aspects of accounting and land administration.

PFI

P2ES’ Petroleum Financial (PFI) hosted solution helps oil and gas companies manage land and accounting operations without the typical costs and capital expenditures associated with running a back office. PFI provides web-based access to reporting, document management and business analysis tools.

Rugg

P2ES senior VP Jerry Rugg said, ‘Our broad range of services and large staff are a perfect fit for a growth-oriented company like Highland and we look forward to a long and successful relationship with their management team.’


Baker Oil Tools I-Well for Saudi Aramco

Zuluf field gets ‘Equalizer’ optimized natural gas lift intelligent well completion system.

Baker Oil Tools has successfully installed an optimized natural gas lift intelligent well completion system in the Zuluf field offshore Saudi Arabia. The system combines Baker’s Equalizer openhole completion with a cased-hole intelligent completion system. This controls gas injection into production tubing to enhance heavy oil lift.

HCM-A

The high-end completion includes ‘HCM-A’ remotely controlled, hydraulically actuated sliding sleeve valves facing the gas zone, used to control lift gas injection. The completion provides isolation, mitigation and control of water/gas production, and enhanced oil production and recovery with minimized well intervention.


API 2008 Basic Petroleum Data Book released

American Petroleum Institute online compendium of US and international statistics from 1947.

The American Petroleum Institute (API) has just published the 2008 edition of its Basic Petroleum Data Book—now in its 28th year. The Data Book is a compendium of US and international petroleum statistics beginning, in most instances, in 1947. The Data Book contains historical data on worldwide oil and natural gas reserves, exploration and drilling, production, refining, transportation, historical prices, product demand, imports, exports and environmental information. A glossary and a source/contact list are included in the 600-page volume.

API DATA

The Basic Petroleum Data Book is available electronically through API’s electronic statistical service, API DATA. The electronic version is updated continuously and files are posted as Excel spreadsheets. Subscribers to the electronic version receive unlimited access to all updates for an annual fee of $5,295 for non-API members, or $2,720 for members. Subscription to the two issues of the print edition costs $1,430 with a 20 percent discount for API members. More from http://apidata.api.org.


‘Visual KPI’ embeds OSIsoft’s PI System Analysis Framework

OEM agreement sees key performance indicators addition to plant and process infrastructure.

Transpara Corp. has just announced ‘Visual KPI for AF’ an operations intelligence add-on to OSIsoft’s Analysis Framework (AF), an enterprise-wide infrastructure for real time plant and process data modeling. AF is used to define a ‘consistent representation’ of an asset, providing attributes and context for PI System and non-PI System data. Visual KPI for AF adds web-based access to AF data, and links any AF element to key performance indicators (KPIs), scorecards and views in Visual KPI.

Saucier

Transpara CEO Mike Saucier said, ‘Visual KPI for AF needs no configuration or programming. AF brings order and structure to many kinds of data and Visual KPI for AF delivers this to users via the web. By adding a visual component to the monitoring, aggregation and analysis of AF data, Transpara maximizes an OSIsoft investment, extending the power of AF in asset-intensive verticals.’

OEM

Transpara has signed an OEM agreement with OSIsoft under which Visual KPI is to be embedded in the PI System. The agreement will enable KPI status and trends to be evaluated over time. Visual KPI allows users to create their own time based scorecards such as, ‘plot all KPIs that entered the ‘High’ state in the last 60 minutes.’
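
A minimal sketch of the time-based scorecard query quoted above, filtering KPI state-change events for those that entered the ‘High’ state in the last 60 minutes. The event structure and KPI names are assumptions; Visual KPI’s own API is not shown.

    # Sketch: KPIs that entered the 'High' state in the last 60 minutes
    # (event structure and names are illustrative).
    from datetime import datetime, timedelta

    now = datetime.now()
    events = [
        {"kpi": "Compressor discharge temp", "state": "High",
         "entered": now - timedelta(minutes=12)},
        {"kpi": "Separator level", "state": "Normal",
         "entered": now - timedelta(minutes=5)},
        {"kpi": "Flare volume", "state": "High",
         "entered": now - timedelta(hours=3)},
    ]

    cutoff = now - timedelta(minutes=60)
    recent_high = [e for e in events if e["state"] == "High" and e["entered"] >= cutoff]

    for e in recent_high:
        print(e["kpi"], "entered High at", e["entered"].strftime("%H:%M"))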


Done deals, acquisitions, mergers and more ...

TietoEnator, eLynx, Fugro, Aspentech, Energy Ventures, IHS, ION, Granherne, Kongsberg, P2ES.

TietoEnator (TE) has received a public tender offer from Cidron Services of €15.50 per share in cash. Cidron is owned by Nordic Capital Fund VI. The offer represents a 6% premium over the six month average of TE’s share price. TE chairman Matti Lehti said, ‘Our preliminary view is that this unsolicited offer does not reflect TE’s true value.’

~

Tulsa-based eLynx Technologies has selected Windstorm Corp. as a reseller and installer of its CompressorLynx and WellLynx products.

~

Fugro is to acquire 60% of Russian EM specialists Electro Magnetic Marine Exploration Technologies (EMMET). Fugro is to pay €10 million now and will have an option on the remaining shares.

~

Following its delisting from the NASDAQ (OITJ Feb 08), AspenTech has appointed KPMG as auditor replacing Deloitte & Touche. KPMG was chosen for its ‘experience working with global software companies’ according to AspenTech CFO Brad Miller.

~

Stavanger-based oil and gas technology VC Energy Ventures has closed its third fund with committed capital of $243 million. The company has also opened an office in Houston and plans to focus investments ‘on small companies with new technologies that address challenges such as the personnel shortage and the development of smaller, complex assets.’

~

IHS is acquiring Dolphin Software and Environmental Software Providers for a combined purchase price of $43.5 million.

~

Fletcher International has exercised its remaining right to purchase $35.0 million of ION Geophysical Corp.’s (previously Input-Output) Series D-3 Cumulative Convertible Preferred Stock, a newly designated series of preferred stock.

~

KBR’s consulting unit Granherne has been awarded a four-year frame agreement by Petoro AS to execute front-end studies and project reviews of Petoro’s Norwegian continental shelf projects.

~

Kongsberg Maritime is to acquire simulator specialist GlobalSim Inc. in a 100 million NOK cash transaction. The acquisition reinforces Kongsberg’s training simulator activity for maritime and offshore applications.

~

P2 Energy Solutions has signed a letter of intent to acquire 100% of the outstanding shares of Houston-based LandWorks Inc. including its wholly owned subsidiary Geodynamic Solutions.

~

WellPoint Systems has signed two non-binding letters of intent with Quorum Funding Corp. in respect of private placements to investment funds to the tune of approximately US$18 million.


Chevron’s Web 2.0/3D real time ‘dazzles’ SPAR attendees

LIDAR conference presentations on ‘as built’ virtual models and new 3D imaging standards from NIST.

Speaking at the 2008 SPAR conference, Kevyn Renner, strategy and planning leader, control and information systems with Chevron’s global refining unit, gave a ‘dazzling’ demonstration of laser scanning for 3D data capture. This is leveraged with Web 2.0 technology to serve up plant asset information in a ‘virtual environment that supports remote collaboration, real-time immersion and expert knowledge capture.’

Standards

At a well-attended session, Alan Lytle of NIST, chair of the ASTM E57 committee on 3D imaging systems, gave an update on the committee’s work on standard terminology, instrument performance evaluation, best practices and data exchange protocols. The E57 committee meets twice yearly to discuss 3D imaging systems including laser scanners (LIDAR) and optical range cameras. Other conference tracks covered high-dynamic-range and gigapixel imaging with 3D laser scanning, fusing laser scan and sonar data, and managing very large 3D databases. SPAR 2008 had over 700 attendees—30% up on last year. More from www.sparllc.com.


Shell, StatoilHydro deployments push SmartWell count past 300

High-end completion technology from WellDynamics for Agbami and an eight-year, 1 billion NOK frame agreement.

Shell/Halliburton joint venture WellDynamics has successfully deployed its ‘SmartWell’ intelligent well completion technology in Chevron’s Nigerian Agbami development. Agbami is operated by Chevron’s ‘Star’ deep water unit.

Dual zone

The dual-zone completion, a first for offshore West Africa, will balance production between zones and control water encroachment into the wellbore. The SmartWell is equipped with WellDynamics’ Direct Hydraulics and Accu-Pulse downhole control systems, a multi-position HVC interval control valve and downhole gauges, as well as a multi-phase downhole flowmeter and an automated surface control system.

Mathieson

WellDynamics president and CEO Derek Mathieson said, ‘We are confident that our SmartWell technology will prove reliable and fit-for-purpose, and will help Star meet its financial, technological and operational goals.’ The field development plan consists of 20 oil producers, 12 water injectors and six gas injector wells with flow control, permanent monitoring, or both. The single-wellbore completions are expected to produce from, or inject into, two distinct sand units.

StatoilHydro

In a separate announcement, WellDynamics has been awarded two frame agreements for work in the Norwegian North Sea by StatoilHydro. These cover the installation of electronic flowmeters and gauges in SmartWell completions and form part of a 1 billion NOK, eight-year contract. Some 300 wells worldwide have now been equipped with WellDynamics’ SmartWell systems. More from smartwell@welldynamics.com.


Control Dynamics selects IntelliMax for real time offshore data

Joint venture extends IntelliMax real time data platform to offshore industrial automation.

Control Dynamics International (CDI) and Sensys are to embed the IntelliMax real-time data management platform in offshore industrial automation systems. CDI supplies such systems to the energy industry while Sensys develops software for real-time data management. The initial application of the joint venture will be the control system for a hydraulic workover unit being designed and built by CDI for a major offshore production facility in the Gulf of Mexico.

Wilson

CDI president Van Wilson said, ‘We selected IntelliMax over other human-machine interfaces for its high performance and integrity, something our oil and gas clients require. IntelliMax comes bundled with a complete set of OPC-based interfaces, which can be readily used without additional development. The scalability of the product was also a factor in our decision as our clients require systems that can be easily expanded as needed over time.’
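As a rough illustration of what a ‘ready to use’ OPC-based interface looks like from a client application, the sketch below polls two tags with the open-source OpenOPC library on Windows. The server name and tag names are assumptions for the example; this is not IntelliMax or CDI code.

```python
# Minimal OPC DA polling sketch using the open-source OpenOPC library (Windows/DCOM).
# The server and tag names below are illustrative only, not IntelliMax identifiers.
import time
import OpenOPC

opc = OpenOPC.client()
opc.connect('Matrikon.OPC.Simulation.1')   # assumed local simulation server

TAGS = ['Random.Real8', 'Random.Int4']     # hypothetical tag names

try:
    for _ in range(10):                    # poll ten times, once per second
        for name, value, quality, timestamp in opc.read(TAGS):
            print(f'{timestamp}  {name} = {value}  ({quality})')
        time.sleep(1)
finally:
    opc.close()
```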

Qadir

Sohail Qadir, president of Sensys added, ‘IntelliMax is already in use in both upstream and downstream applications. Our partnership with CDI adds a system integration and consulting offering.’

Weidmuller

In a separate announcement, CDI has partnered with Weidmuller to bring Weidmuller’s connectivity and switching solutions to the US energy market.


Western Refining deploys KSS fuel price management suite

PriceNet and Visualizer deployed in ‘end-to-end’ retail fuel price management system.

El Paso-based Western Refining has selected software from KSS to support its product pricing effort. Western is to deploy KSS PriceNet and Visualizer tools to improve and simplify fuel pricing at its 150-plus locations across New Mexico, Arizona and Colorado.

End to end

PriceNet is an ‘end-to-end’ solution for retail fuels pricing. KSS Visualizer adds web-based data visualization to flag significant changes in key performance measures such as volume, running rates, margins and more. Visualizer supports all levels of the business, from individual stores to whole networks.
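A minimal sketch of the kind of ‘significant change’ flagging described above is shown below, assuming a simple percentage threshold against the prior period. The threshold, measure names and sample figures are invented for the example; this is not KSS logic or data.

```python
# Illustrative KPI change-flagging, not KSS code: flag any measure whose
# value moved more than `threshold_pct` percent versus the prior period.
def flag_significant_changes(previous, current, threshold_pct=5.0):
    flags = {}
    for measure, prior in previous.items():
        now = current.get(measure)
        if now is None or prior == 0:
            continue
        change_pct = (now - prior) / abs(prior) * 100.0
        if abs(change_pct) >= threshold_pct:
            flags[measure] = round(change_pct, 1)
    return flags

# Hypothetical store-level figures for one site, prior week vs. this week.
prior_week = {'volume_gal': 52000, 'margin_cents_per_gal': 14.2}
this_week  = {'volume_gal': 47500, 'margin_cents_per_gal': 14.4}

print(flag_significant_changes(prior_week, this_week))
# -> {'volume_gal': -8.7}  (margin moved less than 5%, so it is not flagged)
```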


BP proposes DAMA Energy special interest group

San Diego meet hears proposal to integrate energy data models from POSC/Energistics and PPDM.

At the annual meeting of the Data Management Association (DAMA) in San Diego this month, Mona Pomraning, E&P enterprise data architect with BP, proposed an Energy Data Management special interest group (SIG). The energy SIG is to address data issues facing energy companies working with outsourced, virtual teams, videoconferencing, intercontinental travel and multi-cultural teams. The old Houston DAMA chapter included many industries, with the energy sector representing a significant portion of the membership. Rather than reactivating the Houston chapter, Pomraning proposes the creation of an Energy SIG to give members a forum to address ‘vendor neutral issues and concerns.’ At the top of the list of topics is ‘integration of the various energy sector data models from PPDM, Energistics and others.’

Data warehouse

Other high level objectives for the SIG are integration of data modeling, data warehouse and master data management tools and third party applications—along with the thorny issue of the quality of deliverables from third party vendors selling data structures and content. Before joining BP, Pomraning was involved in data management of Boeing’s 777 airplane.


OSIsoft showcases real time WebParts at Microsoft DevCon

Paper by Prabal Acharyya highlights increased data visibility and PerformancePoint business intelligence.

Speaking at the Microsoft Office DevCon event in San Jose, CA last month, OSIsoft’s Prabal Acharyya presented a paper on ‘unlocking dynamic data in PI System with Microsoft Office 2007.’ The process industry needs data ‘visibility’ across ‘multiple systems and sites.’ Without the business intelligence (BI) ‘big picture,’ companies may miss opportunities to reduce costs and improve operations.

PI System

PI System, now integrated with Microsoft Office 2007, is the answer. The IT stack builds on data servers such as MS Dynamics, SAP, Siebel and PI. These feed Office clients leveraging the ‘open’ XML Office formats and the shiny new ‘Fluent*’ interface. Orchestration and data management are provided by SharePoint Server, Designer and Visual Studio 2008.
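To illustrate the ‘open XML Office formats’ point, the sketch below writes a few time-stamped process values to an Office 2007 .xlsx workbook with the openpyxl library. The tag name and values are invented for the example, and openpyxl stands in for whatever server-side tooling OSIsoft actually uses.

```python
# Sketch: exporting time-series process data to an Office Open XML workbook.
# Tag name and values are invented; openpyxl is used for illustration only.
from datetime import datetime, timedelta
from openpyxl import Workbook

# (timestamp, tag, value) triples at 10-minute intervals, hypothetical data
samples = [
    (datetime(2008, 3, 1, 8, 0) + timedelta(minutes=10 * i), 'FIC-101.PV', 42.0 + i)
    for i in range(6)
]

wb = Workbook()
ws = wb.active
ws.title = 'PI export'
ws.append(['Timestamp', 'Tag', 'Value'])   # header row
for ts, tag, value in samples:
    ws.append([ts, tag, value])

wb.save('pi_export.xlsx')                  # a plain Office 2007 XML file
```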

PerformancePoint

OSIsoft’s WebParts are used, along with Microsoft’s BI tool ‘PerformancePoint,’ to provide key performance indicators of a plant’s operations. A demo of a test bed deployment with Calgen showed data blending from SAP and PI—using PI real time WebParts. A ‘PI to Excel’ WebPart displays tree views of equipment schematics, tables and gauges.

* http://office.microsoft.com/en-us/help/HA101679411033.aspx.


Valero deploys ‘geo-clustering’ for Windows 2008 Server

Refiner distributes IT in resilient infrastructure to minimize risk of loss or downtime.

Valero Energy Corp., North America’s largest refiner, has several of its 17 refineries located in coastal areas prone to severe weather. To ensure a ‘resilient’ IT infrastructure in case of disruption and to protect mission-critical applications and communications, Valero has deployed Windows Server 2008 with failover clustering for 90 users in its IT infrastructure team. Microsoft Exchange Server 2007 has been deployed on a two-node failover cluster with a storage area network (SAN) for data storage. Each server has its own independent data and cluster disks. By activating the high-availability components of Windows Server 2008, Valero plans to improve server stability, reduce system recovery time from days to minutes and optimize hardware utilization and security.

Katrina

Following Hurricane Katrina, Valero has strengthened its infrastructure with ‘cluster continuous replication’ and is now looking to deploy geographically dispersed failover clustering (‘geo-clustering’).

Manager

To monitor and manage its systems and applications, Valero is deploying Microsoft Operations Manager 2005 and Microsoft Systems Management Server 2003. Valero expects to upgrade to Microsoft Operations Manager 2007 next year. Valero also plans to deploy read-only Active Directory domain controllers (RODC) to minimize ‘at risk’ information at remote locations.


Information Builders and Upstream Professionals team

WebFOCUS business intelligence to underpin new well lifecycle efficiency framework.

Business intelligence software specialist Information Builders (IB) has teamed with Houston-based consultants Upstream Professionals Inc. (UPI) to extend UPI’s well life-cycle efficiency framework with IB’s WebFOCUS platform. The combined offering constitutes a new well life-cycle management (WLCM) reporting tool for oil and gas and will include a web-based front-end for analyzing oil field asset metrics and key performance indicators.

Cohen

IB CEO and founder Gerald Cohen said, ‘UPI’s upstream experience coupled with the reporting and visualization capabilities of WebFOCUS will transform data and workflows into knowledge environments that foster operational efficiencies from pre-drill to production.’ The WLCM covers a broad spectrum of services including data integration, dashboards, data drill down, analytics and reporting. Links to accounting systems provide financial insights into daily operations. IB clients include Hunt, Petro-Canada, Valero and West Coast Energy.

