April 2008


Virtual Seismic Atlas

UK academic extends BHP Billiton’s getKnowledge application with data from CGGVeritas and Web 2.0 technology in online library of seismic image data. System to roll out at AAPG this month.

In a webinar last month, Rob Butler* described how the University of Leeds, along with Endeca and the Blue Fish Group, developed the ‘Virtual Seismic Atlas’ (VSA). The VSA is an online library of seismic image data classified according to a flexible taxonomy that leverages work done in the Turbidites Research Group. Partner Blue Fish developed the solution around Endeca’s ‘ProFind’ search and navigation platform. The VSA evolved from BHP Billiton’s ‘getKnowledge’ application and rolls in content from CGGVeritas, the Geological Society, TGS-Nopec and the BGS. Butler described the VSA as a ‘subsurface version of Google Earth.’

Flickr?

This is a bit of an exaggeration. The VSA is more like Flickr with seismic images tagged according to multiple taxonomies and relationships. Seismic data is stored as jpeg thumbnails within Documentum with access to higher resolution imagery on demand. VSA offers local and regional project views of documents and interpretations. The tool targets both researchers and folks looking for a particular seismic image in commercial libraries.

Related

Alongside the current images, the VSA offers ‘related documents,’ i.e. interpretations and documents showing similar deformation, sedimentology or other geological context. The VSA can store multiple ‘competing’ interpretations along with the base seismic image. Butler believes that as it is populated, the VSA will benefit academics by exposing students to the best imagery available. For oils the VSA will help in training interpreters. The VSA also has applications in disambiguating interpretations and evaluating uncertainty. Butler also expects the Turbidites Research Group to benefit from the showcase.

Q&A

In the website Q&A Butler was asked if the VSA would take contributed interpretations from the general public in Wikipedia fashion. The answer is no. Public access would require more development resources than are currently available. But the VSA team is working on functionality that would allow an individual to download a seismic image, re-interpret it and post it back as a ‘competing’ interpretation. The VSA provides serendipitous insights from the juxtaposition of seismic sections. The VSA is currently hosted by Leeds University’s IT Department. The Endeca solution is considered scalable—although currently ‘cost constrained.’

Planet Earth

Butler told Oil IT Journal, ‘Linking seismic data in the VSA to public GIS platforms such as Google Earth would make it even easier to find images globally and the team are planning this for the next stage of development.’ A tie-in with the Google Earth-based geology project, ‘Planet Earth,’ is planned. Test drive the VSA at www.seismicatlas.org.

* Butler is currently professor of tectonics at the University of Aberdeen.


BHI bags Gaffney

Baker Hughes has bought both Gaffney Cline and GeoMechanics International—boosting its reservoir engineering and management line of business.

Baker Hughes Incorporated (BHI) has acquired two consulting firms, Gaffney, Cline & Associates (GCA) and GeoMechanics International (GMI). GCA was founded in 1962 and now has principal offices in London, Houston, Moscow, Buenos Aires, Singapore and Sydney. The company advises on exploration, reservoir evaluation, field development, drilling and production, pipeline, refining and LNG projects throughout the world. GMI was spun off from Stanford University in 1996.

Deaton

BHI chairman and CEO Chad Deaton said, ‘These acquisitions extend our wellbore-related technology to encompass comprehensive reservoir and midstream solutions. Both firms will continue to operate as stand-alone consultancies while providing advice and services to enhance the reservoir-related offerings of Baker Hughes product line divisions.’

Meehan

BHI has also hired Nathan Meehan (previously Principal of CMG Petroleum Consulting) as VP Reservoir Technology and Consulting. Meehan will lead the company’s reservoir engineering, reservoir consulting, and hydrocarbon development activities, including the GCA and GMI businesses.


A data management 101

Oil IT Journal editor Neil McNaughton, back from the 2008 PNEC Data Integration Conference, decided that the world needs a data management backgrounder. He provides some definitions of sample, trace, meta and master data, and notes the impact of the data warehouse community on the upstream and the duality of master data management and data quality, both hot topics at PNEC.

One speaker at PNEC intimated that those who define meta data as ‘data about data’ merited a ‘punch on the nose.’ Since I have always considered this to be a rather elegant definition, and since the speaker failed to offer an alternative (to the definition, not the punishment), I thought that I would devote this editorial to a review of data, data management and the state of the industry.

PDM

It struck me that this might be a good thing because I noted a degree of obfuscation in some presentations—and I will try to explain why this has come about too. Our credentials for this I believe are reasonably good. Oil IT Journal started life back in 1996 as Petroleum Data Manager and we have covered very many data conferences on both sides of the pond since then. So here we go with Data Management 101. I would politely ask those of a pugilistic disposition to read to the end before donning their gloves.

Data

Much oil country data is a record of something happening against time: the amplitude of a seismic recording, say, or the volume of oil produced. Other records are made as a function of depth, like a well log. All of these can be plotted out on a graph—or traced. Hence they are called ‘trace’ or ‘raw’ data. Digitizing such data involves sampling the continuous trace at regular intervals (of time or depth). Hence a trace is made up of data samples.

Meta Data

Traces and samples are recorded in a wide range of more or less standard formats like the Log ASCII Standard (LAS) for wells and SEG-Y for seismic. These share a common approach to how the data is recorded, just as digital camera images are stored as jpegs or whatever. The record begins with a header that contains both master data and meta data. Wait a second for the master bit. Meta data is data about data (so punch me!). Meta data may be the sample rate (the number of feet, meters or milliseconds between samples) or the scale factor of the individual samples.
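For the programmers in the audience, here is a toy illustration of where such meta data lives and how it is read. This is a deliberately simplified, LAS-like header and a few lines of Python, not a real LAS parser—the well name and values are invented.

```python
# A deliberately simplified, LAS-like header: mnemonic.unit  value : description
HEADER = """\
WELL.      DISCOVERY 14-2  : WELL NAME (master data)
STRT.M     1670.000        : FIRST SAMPLE DEPTH
STOP.M     1999.875        : LAST SAMPLE DEPTH
STEP.M     0.125           : SAMPLE INTERVAL (meta data)
NULL.      -999.25         : MISSING VALUE FLAG (meta data)"""

def parse_header(text):
    """Return {mnemonic: (unit, value)} from the toy header above."""
    records = {}
    for line in text.splitlines():
        left, _, _ = line.partition(":")          # drop the description
        mnem_unit, _, value = left.partition(" ")
        mnem, _, unit = mnem_unit.partition(".")
        records[mnem] = (unit, value.strip())
    return records

hdr = parse_header(HEADER)
step_unit, step = hdr["STEP"]
print(f"Samples are {step} {step_unit} apart; {hdr['NULL'][1]} flags a missing sample.")
print(f"The whole record belongs to well: {hdr['WELL'][1]}")
```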

Master Data

Like I said, also contained in the header record is ‘master data.’ This is not data about data, but rather data that identifies the record as a whole. Master data could be a well name, a seismic shot number or a survey name. It is easy to see how a hierarchy of master data can be built up. From survey to line to shot to sample. Or from well to wellbore to log to sample. There is no reason that some data elements in the middle couldn’t be considered as both master and meta depending on your viewpoint. This is not an exact mathematical science. What is important about master data is that it has ‘currency’ outside of its immediate data object. Master data is what ties disparate data stores together. So the well name in a production test can be tied to the same well in the log database.
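To make the ‘currency’ point concrete, here is another small sketch (again with invented well names) showing master data doing its job as the key that ties two stores together.

```python
# Two disparate data stores, both keyed on the same piece of master data: the well name.
log_database = {
    "DISCOVERY 14-2": {"curves": ["GR", "RHOB", "NPHI"], "top_depth_m": 1670.0},
    "DISCOVERY 14-3": {"curves": ["GR", "DT"], "top_depth_m": 1480.0},
}

production_tests = [
    {"well": "DISCOVERY 14-2", "date": "2008-03-01", "oil_bopd": 4200},
    {"well": "DISCOVERY 14-3", "date": "2008-03-07", "oil_bopd": 1850},
]

# Because the well name has currency in both stores, a production test
# can be tied straight back to the logs recorded in the same well.
for test in production_tests:
    logs = log_database.get(test["well"], {})
    print(test["well"], "tested at", test["oil_bopd"], "bopd;",
          "available logs:", ", ".join(logs.get("curves", [])) or "none")
```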

A caveat

It is worth observing that folks recorded ‘header’ records before they were called meta or master data. And also that they were calling the whole header ‘meta data’ before the term ‘master data’ was even invented. The master data terminology is a retrofit to digital recording that goes back to the 1970s. I think it is a very good retrofit, but it does cause some to want to punch people because it requires some adjustments to prior usage. But this is the nature of progress.

Data mining

Part of the trouble comes from the fact that the whole ‘master data’ concept came, not from E&P but from the data warehouse community. Banks and large retailers have transaction systems that are unsuited to analysis and so build separate ‘warehouses’ that replicate information in the transactional systems. This has introduced concepts (data mart and master data) that upset the terminological applecart.

Newcomers

The data warehouse community has also brought new service providers touting their wares as a panacea for the upstream’s woes. At one level they are very welcome because E&P has grown up with a large number of data stores for well data, seismics, production and so on.

Quality

As I said above, master data is what ties different data stores together. If you have different well names in different systems or if individual systems contain different names for the same well then you are in trouble. You have a data quality problem. I state this obvious fact because master data management and data quality are actually two sides of the same coin. This was made clear at PNEC when the master data managers started jumping up and down during the data quality presentations and vice versa.
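Again for illustration only (the well names and the deliberately crude normalization rule below are invented), here is how a master data check and a data quality check turn out to be the same piece of code.

```python
import re
from collections import defaultdict

# Well names as they appear in three (invented) systems.
names_by_system = {
    "finder":     ["DISCOVERY 14-2", "DISCOVERY 14-3"],
    "production": ["discovery 14-2", "DISCOVERY 14-3 "],
    "logs":       ["DISCOVERY  14-2", "DISCOVERY 14-5"],
}

def normalize(name):
    """Deliberately crude canonicalization: upper case, collapse whitespace."""
    return re.sub(r"\s+", " ", name.upper()).strip()

# Group every raw spelling under its normalized key. More than one raw
# spelling per key is simultaneously a master data and a data quality issue.
variants = defaultdict(set)
for system, names in names_by_system.items():
    for name in names:
        variants[normalize(name)].add(f"{system}: '{name}'")

for key, spellings in sorted(variants.items()):
    if len(spellings) > 1:
        print(f"{key} appears as -> {sorted(spellings)}")
```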

Finance?

But the big question is, does the upstream, which has a considerable history of analyzing and managing its specialist data types, have much to learn, beyond some useful terminology, from the master data community? One answer to this is that it may have no choice. Your finance department may be driving the MDM project and need access to some well or production master data.

BP

As for MDM in a more geotechnical context, you may be interested in our piece on page 9 of this issue, a synopsis of a significant PNEC presentation by BP and SAIC on the deployment of Kalido’s master data management solution to tie together Finder and BP’s drilling, completion and production databases.

Obfuscation

I mentioned ‘obfuscation’ earlier on. To a consultant from outside of the business trying to peddle a horizontal solution into a technical vertical, it may seem neat to make things seem harder than they really are. In E&P, obfuscating the simple stuff (above) is a really bad idea. If you don’t believe me, check out some of the hard stuff like RP66, SEG-D and WITSML.


Rice University meet checks out GPU-based HPC for oil & gas

Academics plus Hess, CGGVeritas and CAPS report on cutting-edge high performance computing.

Over 100 attended an oil and gas high performance computing (HPC) workshop held at Rice University in Houston last month. Scott Morton kicked off the event with a presentation of Hess’ experience of seismic processing on graphics processing units (GPU). A decade ago, Hess’ geophysicists began investigating alternatives to PC clusters for seismic processing. Hess trialed digital signal processors, field programmable gate arrays (FPGA), the IBM Cell BE and GPUs.

NVIDIA

Current effort focuses on NVIDIA’s G80 architecture and the CUDA programming environment. Benchmarks show up to a 48 fold speed up in wave equation migration over a CPU architecture. While the speed increase for other algorithms varies, Hess is encouraged by the performance improvement GPUs bring and is now building a ‘substantial’ GPU-based cluster.

Utah University

Samuel Brown (University of Utah) has been testing multi-core architectures built on the Cell (the processor that powers the Sony PlayStation). The Utah tomography and modeling/migration consortium got ‘exclusive’ access to the PlayStation 3 for the study. Modeling trials were run on the EAGE’s Marmousi 2 dataset. Despite the complexity of coding for the specialized processor architecture of the Cell, the study found ‘superior scaling beyond two threads for the heterogeneous Cell processor over homogeneous Intel multi-core processors.’

CGGVeritas

Guillaume Thomas-Collignon shared CGGVeritas’ experience of accelerator technology applied to wave equation migration. In the GPU stakes, NVIDIA, with its volume technology and aggressive roadmap, is ‘way ahead’ of other accelerator technologies. NVIDIA GPUs gave good performance with code that was easy to learn. They are a convenient add-on to existing systems. On the downside, full performance was tricky to achieve and only realistic for very intensive computations. Current systems suffer from PCI Express bandwidth limitations, limited memory and relatively poor debugging and profiling tools. Thomas-Collignon’s wave equation test required ‘deep’ re-engineering of the code so that multiple GPU kernels could be used. But the result was ‘a 15 fold performance improvement*’.

Habanero

Vivek Sarkar reported on Rice University’s ‘Habanero’ project which began last year. Habanero is investigating how to bridge the gap between traditional sequential programming languages and today’s multi-core machines and parallel architectures. The goal is to ensure that future software rewrites are done on software platforms that enable application developers to reuse their investment across multiple generations of hardware. Habanero is building an open source test bed based on the X10 language (http://x10.sf.net) for use on multi-core architectures. Habanero is also working on a toolset for automated parallelization of legacy ‘dusty deck’ code.

Mellor-Crummey

John Mellor-Crummey’s (Rice) talk on HPC performance analysis addressed the fact that the gap between typical and peak performance of computer systems is growing with each new generation of processors. Performance analysis and tuning are now critical. Rice University’s HPCToolkit project is working on open-source, multi-platform tools for profile-based performance analysis of sequential and parallel applications. A GUI, the hpcviewer, is also in the works.

CAPS

Stéphane Bihan described how French GPU/FPGA programming specialist CAPS Enterprise is addressing the heterogeneity issue in multi-core applications. Bihan’s use case is reverse time migration, a particularly compute intensive task at the cutting edge of modern seismic techniques. GPUs and other hardware accelerators bring significant speed improvement, but there is no consensus on the programming model or architecture. Enter CAPS’ HMPP, a software layer that sets out to integrate the accelerators rather than porting the application to them. HMPP is a set of compiler directives, tools and software runtime that supports multi-core processor parallel programming in C and Fortran on Unix platforms. The directives produce ‘codelets’ suitable for hardware acceleration. The 2D seismic test ran on an NVIDIA Quadro FX5600 and showed an 8 fold speed up. Bihan is now working on a 3D seismic benchmark.

* We did not capture quite what was being compared to what.


BP shows off HIVE collaboration centers

Louisiana Immersive Technology Enterprise emulates BP’s high end visionarium.

BP’s Global Visualization Leader Jim Thomson addressed the TechSouth/Louisiana Governor’s Technology Awards Luncheon in Lafayette this month on the subject of ‘BP and big picture performance.’ Louisiana has a special interest in high-end visualization environments having built its own ‘Louisiana Immersive Technology Enterprise’ (LITE) center. Thomson reported that BP’s first Highly Immersive Visualization Environment (HIVE) was installed in 2001 and cost $1 million. But it saved BP $5 million in the first week of operations.

Growth

BP now has a family* of HIVEs. High end visualization provides ‘the foundation for continued growth.’ Thomson’s role covers upstream visualization technology which includes the HIVE program of collaborative visualization rooms. His mandate extends to the investigation of novel visualization tools and the dissemination of best practices within BP. BP is serious about innovating through visualization. Thomson explained, ‘BP has a number of vertical divisions, each addressing a specific domain. My position cuts across all of these verticals. BP uses visualization in nearly every facet of its business.’

Spears

TechSouth board member Mike Spears commented, ‘This technology was once only available to a handful of large companies. Now it’s available to anyone through LITE.’

* Speaking at the 2007 AAPG, Thomson reported that BP had 17 HIVEs in operation. As Oil IT Journal reported (April 2007), ‘the massive, centrally funded investment came from headquarters, “no one asked for them.”’


Saudi Aramco builds data infrastructure on Petris’ technology

‘Data Services’ technology extends PetrisWINDS Enterprise to seismic data management.

Omar Akbar’s presentation at the 2008 PNEC described Saudi Aramco’s authoritative data store and data services toolset (DSS). The system, developed around PetrisWINDS Enterprise, revolves around ‘smart business objects,’ i.e. data plus intelligence exposed as services. Aramco wants to be independent from vendors (including Petris!) so that it can independently define and change business rules and workflows.

Seismic

The system is used in seismic acquisition, processing and interpretation where a taxonomy has been defined. The object approach means that an interpretation can be stored as a part of the seismic section business object as dictated by the business rules. These have been set up so that they can be changed in a configuration file; no programming is required. DSS works alongside Documentum and OpenWorks.

Petris

DSS is being productized by Petris Technology as ‘Petris Data Services.’ This extends the footprint of PetrisWINDS Enterprise into the seismic domain. Conversely, Aramco expects the toolset to extend into well management.


EMGS builds ‘Aurora’ supercomputer around Fusion

Controlled source EM specialist builds 70 teraflop cluster around Ibrix Fusion file serving software.

Controlled source electromagnetic (CSEM) imaging specialist Electromagnetic Geoservices (EMGS) is boosting its high performance computing facility in Trondheim, Norway with the deployment of a new data center capable of performing 70 trillion operations per second. The new set up, named ‘Aurora,’ comprises 1,830 Dell PowerEdge dual-core servers and 106 terabytes of EMC CX3-40 storage.

Ibrix

Ibrix Fusion file serving software is deployed to manage the cluster’s storage array. Fusion provides cluster, grid, and enterprise computing environments with a scalable solution to growing data volumes.

Stranden

EMGS IT manager Stranden said, ‘Growth has placed a severe demand on our computing resources, increasing the number of jobs we process from 300 to over 15,000. Since 2005 Ibrix has demonstrated terrific performance in our high volume cluster and, most importantly, we have stable connections from all the nodes, enabling our always-on environment.’

TOP500

EMGS claims the world’s largest CSEM vessel fleet and has conducted over 300 CSEM surveys to date. For what it’s worth, 70 TF would put the EMGS calculator around the number 13 slot in the TOP500 list of supercomputers. But this list omits most industrial and commercial machines. More from Phil Haskell—info@ibrix.com.


Enigma Data Solutions teams with high-end hardware provider

Project Archive and Retrieval System (PARS) successfully tested on NetApp NearStore virtual tape.

Oil and gas data archiving specialists Enigma Data Solutions reports the successful testing of its geotechnical archiving solution, PARS, with NetApp’s NearStore Virtual Tape Library (VTL) appliances. This new development enables oil companies to manage data archiving more cost effectively by embracing modern storage media without losing the advantages of tape-based backup systems or compatibility with legacy systems.

Bowler

Enigma VP Tim Bowler said, ‘VTL allows seismic data formatted tapes to be stored on disk and accessed as if they were on a very fast physical tape. By combining the VTL with Enigma’s I/O layer and a small tape library, we provide an independent and cost effective information lifecycle management solution from archiving and tape management to the creation of secure tape copies of geotechnical data and projects for deep archive purposes.’

Mimic

NetApp’s VTL appliance mimics tape-based systems, saving space and eliminating the need to maintain and operate a robotic automated tape storage facility on-site. The local VTL appliance can be mirrored at a purpose built, off-site tape storage facility for long term storage. VTL also provides high-performance disk compression and de-duping capabilities. Testing was carried out at Enigma’s Houston facility. The results demonstrated successful archiving of data from projects written with major oil and gas applications including Landmark’s SeisWorks and OpenWorks and Schlumberger’s GeoFrame. More from gavin.keeler@netapp.com.


Multi million dollar extension of well log ASCII digitizing program

New underwriter for TGS-NOPEC program in record US onshore LAS/SmartRaster deal.

TGS-NOPEC’s Geological Products and Services unit (formerly A2D) has secured another multi-million dollar underwriting commitment to its US Log ASCII Standard (LAS) well log digitizing program. The unnamed client will receive approximately 400,000 LAS and four million smartRASTER well logs over three years.

Hobbs

TGS COO Robert Hobbs said, ‘This transaction is larger than the record deal we announced in 2007. The onshore US program now has six underwriters and commitments to nearly 850,000 wells covering much of the Rocky Mountains and the onshore Gulf Coast regions. This is in addition to our extensive Gulf of Mexico coverage.’

5 million

TGS’ well log database contains over five million well logs from key exploration areas worldwide. The data is available to clients via LogLine Plus. More from info@tgsnopec.com.


Software, hardware short takes …

News from Geovariances, Eurotech, FFA, AGM, Paradigm, OpenSpirit and INT.

Geostatistics boutique Geovariances has just released Isatis 8.0. New features include remote access via a ‘license borrowing’ function and a spare dongle. The on-line help now includes a beginner’s guide, case studies and technical references. Of interest to oil and gas users are the exponential cosine variogram model, said to be good for seismic modeling, and the new Petrel plug-in.

~

UK-based Eurotech has rolled out a new ‘deskside’ IBM 3592/TS1120 tape drive enclosure. The system houses one or two drives in a standard IBM chassis with dual redundant power supplies, allowing a future upgrade to rack mounting if necessary. 3592E05 capacity is up to 700GB native per cartridge.

~

Foster Findlay Associates (FFA) has released SVI Pro 2008, the latest version of its 3D seismic analysis package. SVI Pro 2008 runs on Windows and Linux and is claimed to speed 3D data set screening and to provide ‘robust’ delineation of complex geological entities. Also new is ‘CarbApp,’ an imaging methodology that addresses carbonate analysis.

~

Austin Geo Modeling (AGM) has announced a new release of Recon, its geological modeling and interpretation package. New Recon 3.0 features include direct connectivity to Landmark databases, interactive cross-section animation while interpreting, dynamic stratigraphic termination rules and more.

~

Paradigm has introduced four new workflow-based applications in the latest release of its GoCad reservoir modeling suite. The new modules support 2D and 3D restoration, finite element mesh (FEM) construction (a.k.a. the ‘tessellation’ workflow) and GoCad reservoir risk assessment (formerly Jacta). The FEM builds tetrahedral meshes for use in 3D restoration while the risk module leverages compute clusters to speed reserve estimation and risking.

~

OpenSpirit’s 3.1 release sees new data connector support for GeoFrame 4.4 and Kingdom 8.2, and a pre-release version of OpenWorks R5000. A new API eases data access and reporting for users familiar with JDBC or ADO.NET-based development. The developer kit also includes a new metamodel service and expanded unit and cartographic services.

~

INT’s new 3.0 release of INTViewer claims to be a ‘simple, affordable and extendable’ geoscience data viewer. The ‘workflow-based’ solution allows users to view live geoscience data sets and to share snapshots of their work as ‘INTSessionShots.’ The new tools make up an ‘open environment for large geoscience data sets.’ INT’s tools have to date targeted third party software developers. Increasingly INT is selling direct to end user organizations.


Companies team on data quality initiative

Intervera Data Vera quality toolset to integrate PetrisWINDS Enterprise data access technology.

Petris Technology is to team with Intervera Data Solutions of Calgary on a strategic partnership to combine Intervera’s E&P data quality solution, DataVera, with the PetrisWINDS Enterprise solution for consolidating data from disparate data sources. Initial trials have been completed using both technologies to monitor data quality in a number of industry data stores.

Gregory

Intervera President Paul Gregory said, ‘Companies now have an easy way to identify and fix problems so that they can focus on exploration decision making instead of questioning the data every time.’

DataVera

The DataVera suite leverages an industry-specific rules repository to find and fix data issues. The new deal makes it possible to profile data quality, standardize data, derive missing values and manage duplicates across PetrisWINDS-connected data stores.

Pritchett

Petris President Jim Pritchett added, ‘This partnership has far-reaching implications and opens up numerous opportunities for DataVera to play a key role in making quality data available to any application connected to PetrisWINDS Enterprise.’ The companies plan to continue enhancing other offerings through this strategic alliance. More from www.intervera.com.


Roxar announces FieldManager, FieldWatch

New Windows-based tools offer production data visualization and analysis.

Roxar has announced two new software tools, FieldManager and FieldWatch, to support visualization, analysis and interpretation of oil and gas production operations. FieldWatch is a Windows-based remote field monitoring system while FieldManager is a comprehensive production data management system with interpretation features. Both products leverage Roxar’s Dacqus data acquisition technology. FieldManager is a component of Roxar’s Integrated Reservoir and Production Management (iRPM) suite.

Hviding

Roxar CEO Gunnar Hviding said, ‘Companies aiming for a complete picture of a reservoir are often overwhelmed by raw data. A single field can generate up to one terabyte of data per day. The new software turns temperature, pressure and flow rate data into decision-making information, providing a complete solution from field to desktop.’

Irap RMS

The new tools offer connectivity to downhole gauges and meters and integration with Roxar’s reservoir modeling software Irap RMS and third party petroleum engineering applications. Data from onshore and offshore fields can be accessed at corporate headquarters or at any remote location.


PNEC Data Integration, Houston

The 12th international conference on petroleum data and information management broke all records with a final head count of 430. Data quality, master data management and unique identifiers remain at the top of the data managers’ wish list. We report on clean-up initiatives from ExxonMobil, Shell, Aera Energy, Petrobras and ENI—and from an enthusiastic panel discussion on the merits of metrics.

According to Robert Meith (Shell E&P Americas), seismic data vendors’ internal standards are not necessarily aligned with the Shell workspace. To assure consistency in data delivery, Shell drafted a 13 page document, ‘Guidelines for Seismic Data Formats and Delivery.’ The document provides details of roles, responsibilities, formats (SEG/UKOOA) and transmittals.

Petrobras

Raul Dias Damasceno described how Petrobras used to have multiple SeisWorks projects, many servers and poor project organization. Today there is a single project per basin stored on a filer provided and managed by the IT department. A move from 3DV to CMP files resulted in a 90% disk space saving. Petrobras has also been working on seismic amplitudes in the Santos basin, on mistie correction and on the SIRGAS (Geocentric Reference System for the Americas) cartographic reference system cleanup. Petrobras’ seismic data amounts to some 3 petabytes.

ENI

Mario Fiorani outlined ENI’s essentially in-house developed E&P data and document management system. This spans G&G, production, physical and electronic documents—along with a uniform GIS/Portal interface. Vendor-supplied components include DecisionPoint, Petrobank and the Windchill document management system. The system uses the business object concept as federator; well and other object names are defined in the corporate database and presented in controlled lists—’you never type in a well name!’ Measuring the value of such a system is hard—tangible savings rarely justify the cost of implementation. A better approach is to assess the value of productivity and production improvements. ENI is generally disappointed with vendors’ interoperability offerings.

Quality

The issue of data quality is being addressed bottom up, by the ‘quality’ community, and top down, by the master data managers. ExxonMobil (XOM) has been using a phased quality improvement approach to its well log curve repository, as John Ossage described. XOM is merging its Recall projects into one domain, removing duplicates and synching IDs. Data profiling involves locating and validating existing data and deciding where to put the clean-up effort. A composite ‘usability factor’ (UF) was established to facilitate the evaluation. Data quality tests are run as Unix cron jobs and the results presented as a table of UF metrics. UF metrics drive ‘proactive data management.’ Every couple of months ExxonMobil checks its legacy data to make sure there are no process busts. Answering a question on standard mnemonics, Ossage explained that although one might think that ExxonMobil had the kind of clout that would oblige its suppliers to conform, the reality is different. Richard Wylde (XOM) noted how hard it was to execute a sustainable long term quality initiative. Cleanup is too often justified by an initiating event—but sustainability needs a different paradigm. XOM’s engineers have joined the E&P quality initiative. Exprodat’s IQM toolset is used to color code quality and show a scorecard of trends by asset and data type. This initiative is helping to discourage engineers from hoarding data in local spreadsheets—still one of the biggest barriers to good data management.
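Ossage did not detail how the usability factor is computed. Purely as an illustration of the shape of such a metric, a composite score can be assembled from a handful of weighted checks along the following lines. The checks, weights and valid range below are invented for the example and are not ExxonMobil’s.

```python
# Invented checks and weights, purely to illustrate the shape of such a metric.
WEIGHTS = {"has_well_id": 0.3, "has_unit": 0.1, "completeness": 0.3, "in_range": 0.3}

def usability_factor(curve, valid_range=(0.0, 300.0)):
    """Score a log curve record between 0 and 1 from a few simple checks."""
    values = curve["values"]
    present = [v for v in values if v is not None]
    lo, hi = valid_range
    checks = {
        "has_well_id": 1.0 if curve.get("well_id") else 0.0,
        "has_unit":    1.0 if curve.get("unit") else 0.0,
        "completeness": len(present) / len(values) if values else 0.0,
        "in_range":    (sum(1 for v in present if lo <= v <= hi) / len(present)
                        if present else 0.0),
    }
    return sum(WEIGHTS[k] * checks[k] for k in checks), checks

gamma_ray = {"well_id": "DISCOVERY 14-2", "unit": "GAPI",
             "values": [45.0, 52.1, None, 310.0, 60.3]}
uf, detail = usability_factor(gamma_ray)
print(f"usability factor {uf:.2f}", detail)   # composite score plus the individual checks
```

Run from cron, the output of checks like these could be rolled up into the kind of UF scorecard described above.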

Aera Energy

Bob Bates presented Aera Energy’s enterprise architecture at last year’s PNEC. Since then, the Shell/Exxon joint venture ‘flipped the switch’ to find that it didn’t work! Only around 50% of data was accessible because of quality issues. A data governance program was quickly put into place with process owners, steering teams and data stewards. It’s working now as the data quality improves.

Shell

John Kievt described Shell’s search for the holy grail of E&P, a unique well identifier (UWI). A consistent descriptor is required to map between data sets. Mistakes are easily made and hard to fix later, and the risk of mistakes increases with the inclusion of meaningless numbers or well coordinates. Shell’s UWI is a unique, non-changing identity for the well lifecycle. The system provides a single authoritative source along with aliases used in other systems. Landmark’s Jeremy Eade demoed the system. The process begins in the well design workshop and the UWI is broadcast to engineering systems—along with sequential well bore numbers. A nightly synch pushes data out to other systems (GIS, subsurface projects). Post drill workflows add a definitive directional survey to the corporate database. Geomatics then checks it over before pushing the data out to other systems. One commentator remarked that this was all very well, but the UWI problem remains for the 5 million or so wells outside of Shell! In which context, the Energistics GWUI project is still due to complete ‘real soon now.’
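Implementation details were not given, but the core idea of a single authoritative identifier with per-system aliases can be sketched in a few lines of Python. The class below is illustrative only and is not Shell’s or Landmark’s design; the well names and system labels are invented.

```python
import uuid

class WellRegistry:
    """Toy 'single authoritative source': one immutable UWI plus per-system aliases."""

    def __init__(self):
        self._wells = {}       # uwi -> {"name": preferred name, "aliases": {system: alias}}
        self._by_alias = {}    # (system, alias) -> uwi

    def register(self, preferred_name):
        uwi = uuid.uuid4().hex[:12]        # meaningless on purpose, never reused
        self._wells[uwi] = {"name": preferred_name, "aliases": {}}
        return uwi

    def add_alias(self, uwi, system, alias):
        self._wells[uwi]["aliases"][system] = alias
        self._by_alias[(system, alias)] = uwi

    def resolve(self, system, alias):
        """Map a system-specific well name back to the authoritative UWI."""
        return self._by_alias.get((system, alias))

registry = WellRegistry()
uwi = registry.register("DISCOVERY 14-2")
registry.add_alias(uwi, "drilling", "NO DISCOVERY 14-2")
registry.add_alias(uwi, "production", "D14-2")
assert registry.resolve("production", "D14-2") == uwi
print("production alias D14-2 resolves to UWI", uwi)
```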

Search in E&P

Under the guise of a general paper on search in E&P, Andy James (Halliburton) offered a moderately commercial introduction to the new web interface to Landmark’s Petrobank. After a comparison of search technology from various vendors and a prompt from a questioner, James revealed that Landmark has selected Autonomy’s search engine to power Petrobank Explorer.

Metrics

A panel session looked at the thorny question of data metrics. Total asked, what is the ROI of a $10 million investment in data management? In general, such metrics are simply not collected. For Shell, the reserves issue was a ‘wake up call’ that led to more investment in ‘holistic’ IM, bringing databases together. This has reduced time spent looking for stuff by over 50%. Another speaker pointed out that metrics are not to be had for free. They may add 5% to the cost of a project and this money is not usually there. Furthermore the value of metrics is subjective—it could be a week of a data loader’s time, or a lost well. Some are uncomfortable with the idea that a digital well log system will help find more oil! Sometimes the metrics have to be dumbed-down to make them believable. Production data often gets more attention: ‘C-level people are more interested in production data than a gamma ray log.’

This article is taken from a longer report produced as part of The Data Room’s subscription-based Technology Watch Service. More from www.oilit.com/tech.


SPE Intelligent Energy, Amsterdam

The Society of Petroleum Engineers/Reed Expo ‘Intelligent Energy’ conference was a popular event. A high tech scene setter event included live hook-ups to ‘digital oilfields’ around the world. We report on presentations from BP’s Field of the Future project, Petrobras’ geDig, Saudi Aramco’s ‘Intelligent Fields’ and smart wells and Semantic Web work on Chevron’s i-Field.

The SPE/Reed Exhibitions Intelligent Energy conference in Amsterdam proved very successful with a reported 1500 attendees from 47 countries. A ‘scene setter’ session co-chaired by Satish Pai (Schlumberger) and Sjur Bjarte Talstad (StatoilHydro) provided a snazzy introduction with live hook-ups to remote locations from Aberdeen to Azerbaijan. Pai noted advances in technology for high performance computing (HPC), visualization and data storage—but cautioned that good data management was essential to support the technology and decision making.

ACE

A live link to Baku showcased BP’s ‘Advanced Collaboration Environment’ (ACE) and the ‘Field of the Future’ (FotF). ACE’s live video helps to build relationships between field personnel and head office. The FotF tool kit is leveraged to maximize production and minimize down time. Pan over to Shell’s Groningen mature gas giant, where the 40 year old field is expected to see a further 50 years of production thanks to gas compression and high-end monitoring. Twenty-five years ago there were 50 people working on the field; now there are only four.

Sleipner

StatoilHydro showed off the Sleipner field which doubles as a CO2 sequestration facility. Here, offshore personnel head count is kept to a minimum with remote support from the onshore operations center. The Schlumberger real time center provides voice, chat and desktop sharing on 20 workstations and four large visionarium screens.

GeDIG

Petrobras’ ‘GeDig’ collaborative decision environment has been deployed on the Carapela ‘digital oilfield’ (DO). Petrobras sees the DO concept as a solution to the production surveillance problem of being overwhelmed by real time data volumes. The DO is used to identify and diagnose problems as they occur and for ‘proactive’ monitoring of electric submersible pumping (ESP). Pai remarked that digital technology was making the world smaller.

Aramco

Amin Nasser reported that Saudi Aramco’s average recovery factor is 50%, a figure expected to increase to 70% over the next 20 years. Aramco’s ‘Intelligent Fields’ use surface and subsurface sensor data to continuously update Earth models. Nasser showed animation of real time pressure data and smart well technology used to minimize water cut. All of Aramco’s new fields are intelligent—with remote adjustable chokes, driver pumps and smart completions. The work environment is changing—with real time monitoring, decisions have to be made jointly and quickly.

Nano robots

Nasser’s vision for the future is one of wells with up to 50 laterals and, more controversially, of ‘nano robots,’ autonomous micro machines that go forth into the reservoir pores to map tortuosity, to deliver chemicals to targeted zones or to collect fluid properties. The robots go in one well and pop up in adjacent wells. Nasser did not say whether the robots are currently being produced in a manufacturing facility or in the fertile minds of Aramco’s engineers.

ExxonMobil

Jane Shyeh’s paper (ExxonMobil) on ‘right time decisions’ provided some context for those who wrongly equate ‘digital’ with ‘new.’ Exxon’s first Computerized Production Control system was implemented on its US assets in 1967! Today reservoir management leverages modeling and downhole control valves, electronic flow meters and Bayesian networks for adverse event detection. These techniques have allowed well work to be identified and effected up to two months earlier than before. Production has increased thanks to ‘active’ monitoring—chokes are aggressively managed and a large number of wells monitored per engineer.

Gould

Schlumberger CEO Andrew Gould cautioned that although the concept of real time optimization has been around for a decade or so, along with the expectation that recovery rates would be boosted, this has not really happened. Gould’s crystal ball sees more smart stuff like intelligent completions but also ‘arthroscopic’ drilling leveraging real time monitoring and knee surgery-like techniques. Instrumentation is likewise developing into a ‘high value area.’ A questioner from the aerospace vertical asked about the industry’s reliance on software and collaboration. Gould replied that the common data model is now capable of supporting a more collaborative environment but queried, ‘Do we have an industry wide collaborative attitude? I doubt it, it will be a long time before we get there. This is a more competitive industry with far more players [than aerospace].’

BP—FotF

David Feineman described BP’s ‘Field of the Future’ (FotF) program—a portmanteau concept that covers a multiplicity of BP initiatives. One such program is the integrated surveillance information system (ISIS). ISIS supplies real time sensor data from BP fields, delivering well performance data to remote offices. Feineman noted that BP’s staff productivity has risen by 25% but added enigmatically that ‘less experienced staff are less resistant to change.’ Dave Overton’s paper revealed that ISIS is now operational on 20 fields around the world providing event detection and notification, and data visualization of well schematics, flowlines, separators and trends. Bryn Stenhouse’s presentation on modeling and optimization in BP E&P described work in progress on optimization—a.k.a. ‘model-based decision making.’ Tests on the Prudhoe Bay super giant, with several thousand wells, were successful in tuning the model but the optimization advice was not acted upon.

SemWeb

Chevron has set up an educational program at the University of Southern California where professor Amol Bakshi is investigating applications of the W3C’s ‘semantic web’ technology to the ‘i-Field.’ SemWeb techniques (as leveraged in the ISO 15926 program—see page 11 of this issue) promise synchronization across data sets and a standard way of deploying machine readable taxonomies. Chevron is an early adopter of the technology with its Generic Modeling Environment and with other R&D work at the Norwegian EPSIS R&D center.

This article is taken from a longer report produced as part of The Data Room’s subscription-based Technology Watch Service. More from www.oilit.com/tech.


Folks, facts, orgs ...

This month’s movers and shakers hail from Aker Solutions, Erdas, Caesar Systems, CDA, Chevron, Energistics, Geotrace, PIDX, ING, Intervera, Grupo Pragma, Logica, Ingrain, Paradigm and Roxar.

Nils Arne Hatleskog has been appointed executive VP of Aker Solutions’ (formerly Aker Kvaerner) field development and maintenance, modifications and operations.

Leica Geosystems Geospatial Imaging unit has been renamed ‘Erdas.’

Jeff Nelson is now Client Services Manager with Caesar Systems. Nelson was previously with Decision Strategies.

Recent new members of the UK Common Data Access (CDA) data service include Idemitsu E&P and Lundin Britain.

Melody Meyer has been appointed president of Chevron Energy Technology replacing retiree Mark Puckett. Louie Ehrlich will become president of Chevron Information Technology and chief information officer replacing Gary Masada who is also retiring. Meyer is currently VP of Chevron’s Gulf of Mexico unit and Ehrlich is VP services and strategy and CIO for the global downstream unit.

Cherry Dodge (Shell) and Paul Koeller (Landmark) have been elected to the Energistics board of directors. Stewart Robinson (UK DTI/BERR) and Peter Breunig (Chevron) were re-elected to additional 3-year terms. Jana Schey has joined Energistics as director of program planning and management.

Scott Humphrey has joined Geotrace as marketing manager for Latin America. Humphrey was previously with GX Technology.

Steven Carter, co-founder and Principal of Eirô Consulting has been appointed Vice Chair of the PIDX Global Business Practices Workgroup.

Charles Hall is to head-up ING Wholesale Banking’s new ‘Structured Finance Natural Resources’ practice in Houston. Hall hails from Comerica’s upstream and midstream energy banking business.

Intervera Data Solutions has teamed with Argentinean Grupo Pragma Consultores to offer data quality solutions to the South American market.

Logica has hired Josh Strasner as senior VP sales. Strasner was previously with EDS’ oil and gas unit.

Ingrain has a new board chairman, Gary Jones, previously with WesternGeco and a new director, Paul Ching, previously with Shell. Other Ingrain appointments include Tom Guidish (CMO) and Barry Stewart (CFO). Ingrain derives reservoir rock properties using patented methodologies developed by Dr. Amos Nur and Henrique Tono of the Stanford Rock Physics Lab.

Paradigm has relocated its CIO Clay Miller to Kuala Lumpur, Malaysia. The move is part of Paradigm’s initiative ‘to relocate corporate governance globally.’ Paradigm also plans to leverage more direct access to the region’s IT Super Corridor.

Ordin Husa has been appointed MD of Roxar Software Solutions. He was previously director of sales and services in Roxar’s Flow Measurement division.

Correction

In our report last month from the 2008 PUG we wrongly described Anadarko’s use of pre-packaged ArcGIS projects from ‘geoLOGIC Data Systems.’ We should have said, ‘Geologic Data Systems, Inc. out of Denver.’


Safe Software FME Workbench users meet

Users from Shell, GE Energy and CH2M Hill report on deployment of popular geo ETL toolset.

The 2008 Safe Software FME User Conference, held last month in Vancouver, heard several presentations from oil and gas practitioners of GIS ‘feature manipulation.’ Safe’s FME is a geographically aware set of extract, transform and load (ETL) tools that can be used for data migration, project building and other GIS workflows.

Shell

Marc van Nes outlined some of the challenges of spatial data management in Shell. E&P data has a long lifetime during which it is used by many applications running on different platforms. Data management requires a flexible approach and a toolset like FME. 80% of all information has a spatial geo-reference—making geo information the ‘backbone’ for business. GIS data is gathered from many sources. Point data may be scraped from state web sites (van Nes recommends imacros from iopus.com). Image data is likewise available from USGS and other public sites. Such cultural data is combined with domain specific data from a variety of coordinate databases to create projects for the interpreter.

Migration

FME was used by Shell to migrate its legacy GenaMap data to ESRI, for data movement between various Oracle Spatial instances and for other geo processing tasks. Van Nes also reported use of Open GIS web feature services and the increasing use of CAD as a GIS data source. Looking to the future, van Nes anticipates a services-based infrastructure with data servers able to supply context-tuned data on request. The integration of GIS with 3D exploration systems like Petrel is also a pressing need. Summing up, van Nes described FME Workbench as a ‘valuable and versatile tool.’ FME plays a central role in Shell’s spatialization effort. With correct usage it is ‘almost self documenting,’ while its batch capabilities make it a workhorse.

GE Energy

John Snell introduced GE Energy’s ‘FME design patterns,’ which produce simple, maintainable GIS workspaces that minimize duplication. Snell’s talk was subtitled ‘avoiding the chamber of horrors!’ A design pattern is a ‘general solution to a common software problem.’ One example is the ‘central workflow’ pattern, which replaces complex point-to-point workflows with a single workflow that all sources enter at one point. Filters and transforms are then applied in a way that minimizes duplication. The concept gets more interesting when the central workflow is combined with GE’s external attribute schema map pattern, which simplifies and manages the mapping of attribute names between source and destination feature types.
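FME workspaces are assembled graphically rather than written as code, but the logic of the two patterns can be sketched in plain Python. The source names, attributes and sample records below are invented for the illustration.

```python
# Plain Python, not FME: a sketch of the 'central workflow' and
# 'external attribute schema map' patterns.

SCHEMA_MAP = {   # external map: source attribute name -> destination name
    "parcels":   {"PIN": "parcel_id", "OWNER_NM": "owner"},
    "pipelines": {"LINE_ID": "asset_id", "DIAM_MM": "diameter_mm"},
}

SAMPLE_DATA = {  # stand-in for the real readers
    "parcels":   [{"PIN": "123-45", "OWNER_NM": "ACME"}],
    "pipelines": [{"LINE_ID": "PL-7", "DIAM_MM": 406}],
}

def remap(source, feature):
    """Apply the external schema map so downstream steps see one vocabulary."""
    return {SCHEMA_MAP[source].get(attr, attr): value
            for attr, value in feature.items()}

def central_workflow(sources):
    """All sources enter at a single point; shared transforms are written once."""
    for source in sources:
        for feature in SAMPLE_DATA[source]:
            feature = remap(source, feature)
            feature["source"] = source           # a transform applied to everything
            yield feature

for feature in central_workflow(["parcels", "pipelines"]):
    print(feature)
```

The point of the pattern is that adding a new source only requires a reader and a schema map entry; the shared filters and transforms are written once.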

CH2M Hill

Jubal Harpster described how CH2M Hill uses the FME Server to implement enterprise data delivery. CH2M has been working to deliver data from the new FME Server along with web platforms such as Google Earth. The prototype FME Server was demoed streaming data from multiple back-end technologies through Google Earth Enterprise. Harpster also mentioned the GeoServer open source workflow project which has a ‘large and growing community of users.’ Other ETL tools from Talend and Camp2Camp also ran.


RESCUE moots XML formulation of model data standard

POSC/Energistics spin-off plans to update reservoir model exchange format.

The Rescue standards body—a spin-off from a POSC/Energistics initiative—supports inter-application data transfer of reservoir model geometry and parameters. Rescue was developed as a set of C++ libraries that read and write data to a subset of the POSC/Epicentre data model.

XML

As the industry moves to more XML-based data exchange, in particular with the arrival of WITSML and ProdML, the Rescue group is looking into the possibility of a new XML-based schema. The project has support from several of Rescue’s major users. A members-only meeting is planned for May 16 at Energistics following the WITSML SIG (which is public). Rescue does not plan to decommission the C++ API at the present time.


SAIC deploys master data management for BP E&P

Kalido MDM at center of ‘trusted source’ of web services provisioned reference data.

Speaking at the 2008 PNEC conference, BP’s Kelly Sherril recalled how the merger and acquisition activity from 1998 on brought with it a lot of data ‘baggage,’ some good, some not so good. BP had ‘subsets and islands’ of well master data located in Canada, Norway, BP, North America and elsewhere. To which BP was continually adding more data sets. Things came to a head when a major automation project was launched with the deployment of a commercial production surveillance tool. Initial estimates were for around one month of data preparation in toto. In reality this required 3 months per asset!

Trusted

The answer came in the form of a ‘trusted source’ of master data built by SAIC using Kalido’s master data management (MDM) solution. Now a ‘gold standard’ relational database distributes metadata using web services and Oracle replication to offer worldwide access to header information. BP prefers to buy rather than build and evaluated several commercial tools.

Kalido

At the time there was no MDM solution for E&P data (although this has changed since). The Kalido MDM solution was adapted to manage master data in Finder and in BP’s drilling, completion and production databases. Modeling the different views of the well in these data sources required attention. Each database has elements of master data, some of which may be the source of record. ETL* is used to publish master data out from each system of record. More from chris.johnson@kalido.com.

* Extract, transform and load.


X-Change Corp rolls out oil country RFID tags

New surface acoustic wave technology maximizes read range for oil field equipment tags.

X-Change Corporation unit AirGate Technologies is now rolling out surface acoustic wave (SAW) tag technology based on radio frequency identification (RFID) that is tailored for use on oil country tubular goods. SAW tags and readers are used for monitoring drill pipe and other down hole tools.

Inventory

Tags are inserted into drill pipe to establish pedigree, asset identity and inventory status in down-hole drilling environments. Tags and readers can survive the lifetime of the pipe or tool. SAW technology uses on-chip digital signal processing to broadcast a unique identifier when polled.

Hanafan

AirGate president Kathleen Hanafan explained, ‘The tags have been developed to withstand temperatures up to 300°C and pressures up to 20,000 psi. They also offer the long read range that is needed in the oil industry.’


Statoil awards $900 million well work contract

Halliburton and WellDynamics beneficiaries of deal that includes SmartWell completion technology.

StatoilHydro has awarded Halliburton and WellDynamics nearly $900 million in contracts to provide completion equipment and services, tubing conveyed perforating services and SmartWell completion technology for oil and gas fields on the Norwegian continental shelf. The deal includes multilateral completions, expandable completion systems and zonal isolation and control systems. Work is scheduled to begin in September 2008 and will last up to nine years if all option periods are exercised.

Mathieson

WellDynamics president and CEO Derek Mathieson said, ‘StatoilHydro has been an instrumental part in the evolution and application of our SmartWell technology. The award extends the range of products we currently supply to include our latest technology and paves the way for innovative solutions to tomorrow’s challenges.’ More from smartwell@welldynamics.com.


AnTech rolls out ‘DAQ>’ toolkit for coiled tubing operations

Low cost data acquisition and connectivity solution offers live, remote monitoring.

At the SPE/ICoTA Coiled Tubing and Well Intervention Conference in The Woodlands, Texas this month, UK-based AnTech announced ‘Daq>,’ a new data acquisition system designed for coiled tubing (CT) operations. The modular system transmits depth, temperature, pressure and fluid flow data in real-time. Daq> comprises three modules. Daq>W is a modular, battery-powered wireless mesh network of sensors. The low power sensors meet the ATEX requirements for use in hazardous areas. A hardwired Daq>H unit can be used to acquire data directly from the CT unit. The Daq>I links the well to the internet via satellite.

Miszewski

AnTech MD Toni Miszewski said, ‘Once the preserve of larger operators, now everyone can view live data during CT operations from anywhere in the world.’ Daq> uses ‘ZigBee’ low power, low cost wireless sensors.


Homeland security department tackles plant and cyber security

Vulnerability assessment tool for control systems, Cyber Storm and Green Scorpion tests.

Along with its work on Cyber Security (critical information infrastructure), the US Department of Homeland Security (DHS) is addressing the safety of process control systems and facilities. The US Computer Emergency Readiness Team’s (US-CERT) ‘Einstein’ program provides an early warning system of internet hacking and other ‘malicious activity.’ A ‘Trusted Internet Connections Initiative’ is to offer enhanced security for federal ‘.gov’ domains. DHS has also created a Control Systems Vulnerability Assessment Tool to help reduce vulnerabilities. Last month the nation’s largest cyber security exercise, Cyber Storm II, brought together public and private sector partners to test response to a simulated attack on critical sectors of the economy.

ESS Expo

At the 2008 ESS Expo tradeshow in Phoenix this month, the DHS was again in action with the ‘Green Scorpion’ exercise. Here terrorists gain access to a large chemical processing facility and are countered by emergency teams that implement ‘response strategies’ to stop the terrorists, free hostages, and prevent or mitigate damage to the facility and surrounding area. To spice things up, the exercise investigated how the incident might be complicated by the ‘simultaneous outbreak of a pandemic.’ Such exercises are now mandatory under the DHS Chemical Facility Anti-Terrorism Standards rule.


Amalto adds PIDX conformant documents to ‘b2box’

e-Business specialist extends reach to oil and gas operator-supplier transactions.

Franco-American e-business specialist Amalto Technologies is targeting the oil and gas vertical with the addition of a Petroleum Industry Data Exchange (PIDX) standards-based interface for upstream transactions. Amalto’s b2box software supports electronic business to business (B2B) transactions with trading partners and automates the exchange of business documents. The PIDX extensions will support e-business initiatives between large corporations and their supplier networks.

Rosetta Net

The PIDX b2box allows oil and gas service companies to transact with clients through PIDX documents over transport protocols such as RosettaNet and AS2. The PIDX b2box can handle invoice documents and attachments such as field tickets.

Foehn

Amalto CEO Jean-Pierre Foehn said, ‘Our PIDX-ready b2box allows midsize suppliers to enable B2B transactions with their trading partners quickly while providing back-office integration.’ Amalto’s petroleum industry customers include Total and BAPCO. Amalto has a strategic partnership with Ariba to enable supplier back-end integration through b2box with the more than 160,000 suppliers transacting within the Ariba Supplier Network.


IHS acquires UK decision support boutique JFA International

‘Global Window’ software for upstream business opportunities analysis at heart of deal.

IHS has acquired energy industry decision-support services company JFA International in a $4 million cash transaction. JFA provides strategic analysis to E&P companies. JFA’s flagship software tool ‘Global Window’ is a decision support tool that provides a framework through which oil and gas upstream business opportunities can be compared and characterized from a technical, commercial, economic and strategic perspective.

Stead

IHS chairman and CEO Jerre Stead said, ‘The acquisition of JFA International will position IHS to provide even greater insight to customers by further leveraging our core position and E&P data. It enables us to convert our critical information into strategic frameworks and analyses that drive customers’ investment decisions.’ JFA was founded in 2003 by Bob Johnson, David Finlayson and Graham Bliss. The ‘information-rich’ tool and technical and commercial methodology supports strategic planning and portfolio management. JFA has worked extensively with IHS data and the company sees ‘synergy and scope for growth’ in the IHS global market place.


Asset operator wish list discussed at SPAR laser scan tradeshow

Lidar-based asset management roundtable hears call from BP for better interoperability, standards.

The fifth Spar Point Research conference, SPAR 2008, on 3D laser scanning, mobile survey and Lidar-based asset management included a debate on an industry ‘wish list’. Laser scanning is an important tool in assessing the current state of a plant, providing a bridge between the physical and digital worlds. A Spar Point survey of asset owners found a ‘groundswell of demand’ for standards.

BP

Deborah Deats, Design and Document Team Leader at BP Texas City put interoperability at the top of her wish list, followed by data management. ‘My crew wants to be able to update a point cloud database with one new scan, without re-registering the whole file.’ Cody Leishman, laser scan specialist at Petro-Canada Oil Sands, agreed on the need for better data interoperability, calling for ‘better control of display and interference tools, more tools for interfacing with engineering models, and attribution of data.’ Chevron’s Kevyn Renner gave this year’s process industry keynote titled ‘3D and the Virtual World—Petroleum Refining Meets Web 2.0.’ More from www.sparllc.com.


FIATECH ‘CETI’ award to ISO 15926 work in progress

Bechtel, Fluor, Bentley Systems and NRX Global recognized for engineering and technology innovation.

At the 2008 Fiatech conference in New Orleans this month, a ‘CETI’ award was presented to Bechtel, Fluor, Bentley Systems and NRX Global for their ‘work in progress’ (WIP) on accelerating the deployment of ISO 15926. The CETI (Celebration of Engineering & Technology Innovation) awards recognize significant achievements in new and emerging technology research, development, deployment and implementation related to capital projects.

Teijgeler

At the same time, Hans Teijgeler has ended his authorship of ISO 15926-7 after 16 years of involvement in the standard. Teijgeler’s work on the authoritative www.infowebml.ws website is to continue, while Onno Paap of Fluor Corp. takes over authorship of 15926-7.

Semantic web

The ISO 15926 WIP is a flagship use of the World Wide Web Consortium’s (W3C) ‘semantic web’ technology and addresses computer and data interoperability in the process industry and large capital projects. The 4D (space-time) data model provides a historical record throughout the lifecycle of a facility and is expected to help with data handover from contractors to operators.
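
The flavor of the 4D approach can be illustrated with a few RDF triples, here using Python’s rdflib. The class and property names below are placeholders rather than the ISO 15926 reference data library; the point is that an equipment item is described through temporal parts, each bounded in time, giving the facility a lifecycle record.

```python
# Simplified illustration of the 4D (space-time) idea using RDF triples.
# Class and property names are placeholders, not ISO 15926 reference data.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF

EX = Namespace("http://example.org/plant#")
g = Graph()
g.bind("ex", EX)

g.add((EX.Pump101, RDF.type, EX.Pump))
g.add((EX.Pump101_installed, RDF.type, EX.TemporalPart))
g.add((EX.Pump101_installed, EX.temporalPartOf, EX.Pump101))
g.add((EX.Pump101_installed, EX.beginning, Literal("2008-01-15")))
g.add((EX.Pump101_installed, EX.installedIn, EX.Unit3))

print(g.serialize(format="turtle"))  # rdflib >= 6 returns a string
```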

Compliance

A recent addition to the InfoWeb website is a compliance test for adherence to ISO 15926-7, pending the release of an official version from ISO. Next month Oil IT Journal will provide more on ISO 15926 and semantic web technology in a report from the POSC-Caesar Association’s upcoming ‘Semantic Days’ conference in Stavanger.


IASC publishes XBRL IFRS accounting taxonomy

‘Near final’ version of machine readable international financial reporting standards announced.

The International Accounting Standards Committee (IASC) Foundation’s XBRL Team has released a ‘near final’ version of the International Financial Reporting Standards (IFRS) Taxonomy 2008. The taxonomy is a complete translation of the IFRSs into XBRL, an XML dialect used in financial reporting.
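
For readers unfamiliar with XBRL, the following Python sketch emits a minimal instance document. The xbrli context/unit/fact structure follows the XBRL 2.1 specification, but the ‘ifrs’ namespace URI and the Revenue concept shown are illustrative stand-ins, not taken from the IFRS Taxonomy 2008.

```python
# Minimal XBRL instance sketch: one context, one unit, one fact.
# The 'ifrs' namespace and concept name are placeholders, not the real taxonomy.
import xml.etree.ElementTree as ET

XBRLI = "http://www.xbrl.org/2003/instance"
IFRS = "http://example.org/ifrs-taxonomy-2008"   # placeholder namespace
ET.register_namespace("xbrli", XBRLI)
ET.register_namespace("ifrs", IFRS)

root = ET.Element(f"{{{XBRLI}}}xbrl")

context = ET.SubElement(root, f"{{{XBRLI}}}context", {"id": "FY2007"})
entity = ET.SubElement(context, f"{{{XBRLI}}}entity")
ET.SubElement(entity, f"{{{XBRLI}}}identifier",
              {"scheme": "http://example.org/companies"}).text = "EXAMPLE-OIL"
period = ET.SubElement(context, f"{{{XBRLI}}}period")
ET.SubElement(period, f"{{{XBRLI}}}startDate").text = "2007-01-01"
ET.SubElement(period, f"{{{XBRLI}}}endDate").text = "2007-12-31"

unit = ET.SubElement(root, f"{{{XBRLI}}}unit", {"id": "USD"})
ET.SubElement(unit, f"{{{XBRLI}}}measure").text = "iso4217:USD"

fact = ET.SubElement(root, f"{{{IFRS}}}Revenue",
                     {"contextRef": "FY2007", "unitRef": "USD", "decimals": "0"})
fact.text = "1234567890"

print(ET.tostring(root, encoding="unicode"))
```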

Review

The IFRS Taxonomy 2008 is the result of a complete review of past taxonomies and has undergone an extensive review by the XBRL Quality Review Team (XQRT), set up by the IASC Foundation last year. The XQRT comprises 20 experts from the preparer community, securities regulators, central banks, financial institutions and software companies.

Zalm

IASC chairman Gerrit Zalm said, ‘XBRL is rapidly becoming the format of choice for the electronic filing of financial information—particularly within jurisdictions reporting under IFRSs. However, this will require IFRS Taxonomy updates to be synchronised with publication of the IFRS bound volume and I congratulate the team on their achievements.’

Comments

Interested parties are invited to access IFRS Taxonomy 2008 and send comments by 30 May 2008. The ‘near final’ version of the Taxonomy is available on www.iasb.org/xbrl/taxo.asp. The final version is expected to be released at the end of June 2008 and will also be freely available.


WellDynamics OptoLog fiber DTS for Shell Canada

Distributed temperature measurement system monitors high temperature steam injection.

Halliburton-Shell joint venture WellDynamics reports that Shell Canada has successfully deployed its OptoLog DTS HT fiber optic system to profile temperatures in a steam assisted gravity drainage (SAGD) project on its Orion field. The distributed temperature sensing (DTS) technology, in operation since September 2007, is installed in 18 wells operating at average temperatures of 240ºC.

Real time

The fiber-based system is used to track the steam front in real time, allowing Shell to rapidly identify problems such as unexpected steam breakthrough, cooling, or deterioration of the efficiency of the steam front at particular intervals, so that decisions can be made in time to impact well production. The Shell Canada installation followed a successful test in 2006 for Japan Canada Oil Sands.
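
A toy Python example shows the sort of screening a DTS trace enables, flagging depths where temperature departs from the nominal steam profile. Data and thresholds below are invented; this is not WellDynamics’ OptoLog processing.

```python
# Screen a synthetic DTS temperature trace for intervals that depart from the
# expected steam temperature (possible breakthrough or cooling). Illustrative only.
import numpy as np

depth = np.arange(0.0, 900.0, 1.0)                 # measured depth along fiber, m
expected = np.full_like(depth, 240.0)              # nominal steam temperature, degC
measured = expected + np.random.normal(0.0, 2.0, depth.size)
measured[400:420] -= 35.0                          # simulate a cooled interval

def flag_intervals(depth, measured, expected, tol=15.0):
    """Return depths where |measured - expected| exceeds the tolerance."""
    mask = np.abs(measured - expected) > tol
    return depth[mask]

anomalies = flag_intervals(depth, measured, expected)
if anomalies.size:
    print(f"check interval around {anomalies.min():.0f}-{anomalies.max():.0f} m MD")
```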


Kappa launches ‘Rubis’ full field simulator where less is more

New lightweight full field simulator add-on to pressure transient toolset targets quick-look modeling.

Kappa Engineering has just rolled out a pre-commercial release of its new ‘Rubis’ full field, 3D, 3-phase reservoir fluid flow simulator. While other simulator developers move towards the massive ‘gigacell’ paradigm, requiring significant compute resources per run, Rubis takes a ‘keep it simple’ approach. The aim is to match production data with model forecasts quickly and frequently—so providing actionable information before it is too late to be useful.

Mainstream

Rubis brings high-end simulation to the mainstream with interactive creation of simple 3-phase, 3D numerical models. 2D unstructured gridding and flexible geometry options allow accurate modeling of gravity, phase contacts, and layering. Local 3D grid refinement allows for more detail in the region of a well bore. In fact grids can be shared with little modification between Rubis and Kappa’s ‘Saphir’ pressure test analysis tool, leveraging pre-existing calibration work.

Grids

Optimized unstructured grids are built automatically from maps and sections of the reservoir geometry. A spatial interpolation (kriging) option is available for complex geometries. Faults can be treated as barriers to flow or assigned finite or infinite conductivity. Non-Darcy flow is available, as well as double porosity and areal and/or vertical permeability anisotropy. Rubis shares the GUI of Kappa’s Ecrin suite and can interoperate with analyses made in Saphir or Topaze. Rubis implements a fully compositional formulation in its object-oriented kernel. The package is available free to users of Kappa’s software pending its commercial release in six months’ time. Kappa has also announced ‘Amethyste,’ a new well performance analysis package, for later this year.
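
For context, the following is a generic ordinary kriging sketch in Python with an exponential variogram, illustrating the kind of spatial interpolation option referred to above. Parameter values are arbitrary and this is not Kappa’s implementation.

```python
# Generic ordinary kriging with an exponential variogram; not Kappa's code.
import numpy as np

def ordinary_kriging(xy, z, targets, sill=1.0, range_=500.0, nugget=0.0):
    """Estimate values at target points from scattered samples (xy, z)."""
    def gamma(h):  # exponential variogram model
        return nugget + sill * (1.0 - np.exp(-3.0 * h / range_))

    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0                       # Lagrange multiplier row/column

    estimates = []
    for p in np.atleast_2d(targets):
        b = np.ones(n + 1)
        b[:n] = gamma(np.linalg.norm(xy - p, axis=1))
        weights = np.linalg.solve(A, b)
        estimates.append(weights[:n] @ z)
    return np.array(estimates)

# Toy usage: interpolate a porosity value between three control points.
xy = np.array([[0.0, 0.0], [1000.0, 0.0], [0.0, 1000.0]])
z = np.array([0.18, 0.22, 0.15])
print(ordinary_kriging(xy, z, np.array([[400.0, 300.0]])))
```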


DNV unit announces corrosion data management toolkit

CC Technologies and DCG team on pipeline integrity management solution ‘Corr MD.’

DNV unit CC Technologies, in cooperation with DCG Inc., has released ‘Corr MD,’ a pipeline corrosion data management toolkit for integrity management of onshore and offshore pipeline systems. Corr MD was developed following a CCT/NACE International study which found that corrosion is costing pipeline operators around $7 billion per year.

NACE

Corr MD was unveiled at the NACE International Corrosion 2008 conference held last month in New Orleans. The first release of the program focuses on internal corrosion management, but subsequent releases will target key integrity tools required for corrosion management, activity planning and budgeting.

Standards

User-defined reporting parameters, data assessment, record retention and automated task scheduling support compliance with 49 CFR 192 and 49 CFR 195 in the US and CSA Z662 in Canada. The tool also offers tiered security levels and image-storing capacity, and is compliant with the Pipeline Open Data Standard (PODS), Orbit+ Pipeline and other integrity management systems. Corr MD provides both guidance for collecting the data and directions for its proper usage and management. It manages information from a multitude of corrosion monitoring and test results, such as gas analysis, coupon monitoring, liquid composition, pig residues, inhibitor and biocide use, and operational conditions.
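
As an example of the kind of coupon-monitoring arithmetic such a toolkit manages, the following Python snippet applies the standard ASTM G1/NACE weight-loss formula CR = K·W/(A·T·D). It is illustrative only and says nothing about Corr MD’s internals.

```python
# Corrosion rate from coupon weight loss, per the standard ASTM G1 / NACE
# formula CR = K * W / (A * T * D). Example values are invented.
def corrosion_rate_mpy(weight_loss_g, area_cm2, exposure_hours, density_g_cm3):
    """Corrosion rate in mils per year (mpy) from coupon weight loss."""
    K_MPY = 3.45e6   # unit constant for mpy with g, cm2, hours, g/cm3
    return K_MPY * weight_loss_g / (area_cm2 * exposure_hours * density_g_cm3)

# Example: carbon steel coupon (density ~7.86 g/cm3), 30-day (720-hour) exposure.
print(f"{corrosion_rate_mpy(0.0250, 11.0, 720.0, 7.86):.2f} mpy")
```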


EDS, AT&T and Telekom beneficiaries of huge Shell outsource

Master service agreements cover network, telecommunications and IT. 3,000 jobs outsourced.

Shell has signed Master Service Agreements (MSAs) with AT&T (network and telecommunications), Deutsche Telekom’s T-Systems unit (hosting and storage) and EDS (end user computing and integration). The total value of the deal to the three suppliers will be in excess of $4.0 billion. The outsourcing of Shell’s IT and communications (ITC) services will kick off in July 2008. The agreement will see some 3,000 Shell employees and contractors around the world transfer to the suppliers. Shell will retain ‘strategic management’ ITC staff.

Matula

Shell CIO Alan Matula said, ‘This deal is a major strategic choice for Shell and will enhance our ability to respond to the growing demands of our businesses. It allows Shell to focus on Information Technology that drives our competitive position in the oil & gas market, while suppliers focus on improving essential IT capability.’

Khakhar

Business transformation advisory firm TPI of The Woodlands, Texas, advised Shell on the deal. TPI partner Elesh Khakhar commented, ‘By providing integrated services to more than 1,500 sites in over 100 countries, Shell’s approach combines all the advantages of decentralized service provision with the benefits and efficiency of a centralized governance structure. Shell will be able to exploit emerging commoditized services designed for the consumer market, such as email or internet phone services, and integrate them within their services when they become robust enough for commercial use.’

Thomas

EDS EMEA VP Bill Thomas added, ‘Our global presence and experience as integrator on large-scale IT outsourcing projects will help reduce management complexity in Shell’s IT environment.’

Comment

We led last month with a Microsoft announcement that also touched on Shell’s communications strategy. It’s not clear quite how all the new deals interrelate—but it sure would be nice to have a peek at the service level agreements that navigate from Shell, through TPI, EDS, T-Systems, AT&T, Microsoft and Nortel. That’s one heck of a phone call!

