April 2002


A2D’s Log Preservation

A2D Technologies has announced a new ‘Log Preservation Initiative’ to safeguard the asset represented by deteriorating Mylar well log films.

Paper logs created from Mylar film have played a critical role in worldwide exploration and production. But today, many original logs are deteriorating in large, unmanaged storage facilities.

Useless

Deteriorating films and evolving technology render many of these original logs practically useless in their current form. But, according to A2D, their storage costs the industry ‘millions of dollars annually.’ A2D’s Log Preservation Initiative sets out to recover and preserve the original log images by scanning them to a permanent digital archive.

A2D needs YOU!

A2D is calling for owners of such log data to come forward and provide access to their Mylar logs. These will be digitized by A2D at no cost to the company. Better still, companies will receive a digital, raster image file of each log provided – free and immediately available online, via A2D’s Log-Line data library, today and perpetually!

Kotowych

A2D president Dave Kotowych stated, “The operator’s original film represents the most pristine copy of a log. This is what we are endeavoring to capture and preserve. It is in everyone’s interest to preserve the integrity of one of the largest and most valuable data assets in the oil and gas industry. In the end, exploration geologists will have access to first generation data, the best available.”

Reverse decline.

The Log Preservation Initiative seeks to reverse the steady decline in log quality by sourcing the original Mylar logs, and making pristine digital copies available online. Similar to non-profit work performed in the motion picture industry, the A2D Log Preservation Initiative will digitally preserve original Mylar films for future generations of exploration geologists.

Data management

A2D will be leveraging its recently announced well log data management service to capture and serve the preserved data. A2D Senior VP Rod Starr said, “Our data management services will provide new opportunities for our clients by eliminating the burden of in-house log data maintenance and storage.”

ChevronTexaco

The new service allows companies to outsource management of their logs and offers seamless access to proprietary and commercially available logs in the A2D repository. ChevronTexaco is one client working with A2D to create and deploy log data standards for new wells and legacy data.


API ComProServ

The API is rejuvenating its e-business PIDX standard. A new XML-based framework will underpin complex transactions like well cementing.

The American Petroleum Institute’s e-business PIDX unit has just announced new XML-based standards for interoperability. The ComProServ specifications cover procurement processes surrounding configurable, complex products and services such as well testing and cementation.

Schema

The new standards will update and extend the older PIDX EDI specifications with configuration tools and technical data templates. They include XML schemas for twelve transactions, along with associated data elements and tags. The technology-neutral standards were designed and tested by 20 companies to ensure their usability. The new formats recognize the need for reuse of existing standards where applicable and rely on existing technology for communicating information.

RosettaNet

ComProServ uses the RosettaNet transport routing and packaging protocol (TRP) 2.0 framework (migration to ebXML is planned). The design allows for transporting data at a lower level of detail along with descriptive files. Currently, ComProServ deploys around 35 reusable components.
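By way of illustration, the sketch below shows the general shape of such an XML transaction for a configurable service, built with Python's standard library. The transaction and element names are hypothetical and do not reproduce the published ComProServ schemas.

```python
# Illustrative only: element names here are hypothetical, not the published
# ComProServ schema. The sketch shows the general shape of an XML transaction
# for a configurable service (a cementing job), built with the standard library.
import xml.etree.ElementTree as ET

order = ET.Element("ServiceOrderCreate")          # hypothetical transaction name
header = ET.SubElement(order, "Header")
ET.SubElement(header, "OrderNumber").text = "PO-2002-0417"
ET.SubElement(header, "BuyerParty").text = "Example Operating Co."
ET.SubElement(header, "SellerParty").text = "Example Cementing Services"

# Configurable line item: technical data travels with the commercial data.
item = ET.SubElement(order, "ServiceLineItem", attrib={"number": "1"})
ET.SubElement(item, "ServiceCode").text = "WELL-CEMENTING"   # hypothetical code list
config = ET.SubElement(item, "TechnicalData")
ET.SubElement(config, "CasingSize", attrib={"uom": "in"}).text = "9.625"
ET.SubElement(config, "SlurryVolume", attrib={"uom": "bbl"}).text = "850"

print(ET.tostring(order, encoding="unicode"))
```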


Gries - are we not wimps?

Oil IT Journal editor Neil McNaughton heard AAPG president Robbie Gries’ call for greater enforcement of the Society’s Code of Ethics - especially to protect the deal-making independent geologist. He wonders what separates ethics from a contractual framework. Should Associations police members’ deals?

One of the many sources of incomprehension between US and European geologists, beyond calling Devonian Pennsylvanian (or vice-versa) and figuring in feet instead of meters, is the royalty check. A European geologist’s reward for discovering a 100 million bbl oilfield may be a pat on the back, but an American, even early in his or her career, may be cashing a number of (probably small) checks in respect of stripper production in the boondocks.

Deals

Europeans know about corporate farm-ins and outs, but a significant part of the US scene is driven by deals. A geologist works up a prospect and then promotes it to a corporation which gets the land, drills the prospect and, if it is successful, rewards the ‘inventor’ with the famous royalty check. As you can imagine, the moment a geologist walks into Big Oil’s offices is a potentially dangerous time. What’s to stop the big guy stealing your idea without a by-your-leave?

Gries

AAPG president Robbie Gries speaking* at the opening session of the AAPG conference in Houston last month electrified the assembled ageing geologists, awardees, spouses and acolytes with a tale of double-dealing worthy of a Coen brothers movie. One Gries ‘case history’ involved an AAPG member, the VP of a major company, who agreed to review her deal. The company took a couple of months reviewing the project, all the while assuring Gries that they were looking seriously into the deal, asking her not to show it to anyone else and so on. In the end, they called to say that management had turned the deal down. End of story. Except no, in the interim they had run geophysical surveys, leased adjacent land and were starting their own project on the same play.

Despicable

Gries considered that the VP and his geologists ‘were clearly dishonest, unethical and despicable.’ Gries did not attempt to seek redress for this particular wrong. At other times in her career she was involved in litigation - and discovered that it is a costly and unpredictable process. To offer wronged geologists another form of redress, Gries is advocating that the AAPG should consider taking its own code of ethics more seriously.

Sanctions

The AAPG’s Code of Ethics offers a range of sanctions for misconduct, from private admonition through temporary suspension to expulsion. But... in the last 10 years, the AAPG has not brought a single ethical charge against a member. Gries reasons that ‘we are either exceptionally ethical or just not bothering to initiate a grievance.’ Gries’ suggestion is that the AAPG should either apply its ethical code or abandon it. If it chooses to enforce the Code, she asks an important question: is the membership ready to devote funds to defending the Association in its enforcement of the Code? An action which may well entail paying considerable monies to insurance companies and their lawyers!

SPE

Apache Corp.’s Larry Brown, in a guest editorial for the Journal of Petroleum Technology (March 2002), makes a similar case for bolstering the Society of Petroleum Engineers’ Guide for Professional Conduct. Brown argues that the SPE ‘should adopt a Code of Ethics and Professionalism ... [which is] ... broad, encompassing the core ethical and professional practice criteria of existing licensing organizations.’ He further advocates the setting up of ‘appropriate enforcement mechanisms’ and the devotion of a proportion of the SPE’s publication effort to discussion and education in these fields.

Lion’s den

Hearing Gries’ story of shady practices I was reminded of many stories of software developers who showed their embryonic ‘killer app’ to large corporations. They too had their business proposals rejected - but then saw the corporation launch an identical product shortly afterwards. I am not sure if my view of such cases reflects that commonly held by the programming community, but my impression is that anyone walking into a large corporation with the next Excel (or should that be Visicalc?) had better arm themselves with some watertight contractual documentation before stepping into the lion’s den. I don’t hear many calls for ethics in this context either. The assumption is rather that everyone will be as unethical as they can possibly be, the only restraint being such contractual engagements as cannot be wriggled out of after the fact.

It’s the law!

This raises an interesting question. Where do you draw the line between the law and ethics? What constraints on these deals should we expect to be contractual and what should derive from a Code of Ethics? I am not sure what the answer to this question is - it probably depends on a lot of things like local laws, circumstances and culture. But I think the question, when turned around, gives us some insight into the problem. If we separate from a Code of Ethics those issues which are normally dealt with in contract law, we have the beginnings of a natural division of labor.

Contract

If you can write a contract to cover the disclosure and non-exploitation of a geologist’s prospect that can be defended in a court of law, then there should be no need to ‘re-legislate’ for such activities in a societal Code. I’m not sure what geological or engineering activities really cannot be handled by contract. Doctors need ethics to help decide life and death matters like whether to treat a patient with no insurance - but geologists? Maybe the AAPG could help out by drafting some type contracts for typical deals - but my feeling is that ethical enforcement would be a potentially ruinous course for the Association.

*You can read a summary of Robbie Gries’ arguments in the April 2002 AAPG Explorer.


Resource Classification can be FUN

Reserve classification is a moving target. Reserves are the ‘bottom line’ numbers that many stakeholders, from geologists through governments to the investment community, hang their hats on. Yet classifications are a chimera! The recoverable reserves, be they from a single oil pool or from the whole world, are a complex function of technology, oil price and a host of other variables. Reserve numbers are ultimately a matter of interpretation. Sigurd Heiberg (Statoil) kindly provided Oil ITJ with a review copy of a paper (co-authored by Per Blystad and Erik Søndenå of the NPD) on ‘Resource Accounts, National and International Standardization.’ The paper offers a new look at reserves and their evaluation.

Petroleum reserves are hard to evaluate because they represent a ‘forecast of future production resulting from the effort invested in bringing such production about.’ If external circumstances lead to a change in the ‘effort’ – then the reserves will change too.

Stakeholders

Reserve stakeholders fall into three groups with different agendas. Governments may want to evaluate reserves with regard to tax take, or for the greater good. Oil companies may need to manage reserves to optimize production, cash flow or profits. The investment community requires ‘accurate’ reserve numbers to evaluate investment and lending opportunities.

Three categories

The 2000 SPE classification divides unproduced quantities into three categories: ‘reserves,’ ‘contingent resources’ and ‘prospective resources.’ Each category is divided into three bands – low, best estimate and high. Heiberg advocates a further sub-division of the three main categories according to project maturity. This means for instance that the ‘contingent’ category breaks down into three ‘development status’ subdivisions – ‘not viable,’ ‘on hold’ and ‘pending.’ This classification potentially allows a distinction to be made between reserve variability stemming from a project’s maturity, and variability resulting from uncertainty in the reserve estimation itself. Heiberg does note a potential source of confusion when comparing the new categories with older classifications – these sometimes allowed quantities associated with immature projects to be lumped in with uncertain probable and possible reserves.
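To make the structure concrete, the following sketch expresses the category, band and maturity subdivision described above as a simple data structure. The field and value names are ours, chosen for illustration; they are not an official SPE schema.

```python
# A minimal sketch of the classification structure described above. Names follow
# the article loosely and are not an official SPE or NPD schema.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Category(Enum):
    RESERVES = "reserves"
    CONTINGENT = "contingent resources"
    PROSPECTIVE = "prospective resources"

class ContingentStatus(Enum):                    # Heiberg's maturity subdivision
    NOT_VIABLE = "not viable"
    ON_HOLD = "on hold"
    PENDING = "pending"

@dataclass
class ReserveEstimate:
    category: Category
    low: float                                   # e.g. P90, million bbl
    best: float                                  # best estimate (P50)
    high: float                                  # e.g. P10
    status: Optional[ContingentStatus] = None    # only meaningful for CONTINGENT

volume = ReserveEstimate(Category.CONTINGENT, low=40.0, best=65.0, high=110.0,
                         status=ContingentStatus.PENDING)
print(volume)
```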

Uncertainty

The introduction of a measure of uncertainty in reserve estimates opens the door to a more sophisticated, statistically-based categorization. Heiberg suggests that by shifting reporting from ‘proved reserves’ to the ‘expected value’ of reserves, several issues with current reporting would be circumvented. The gross under-reporting of ‘proven-only’ returns would be eliminated and greater transparency in reporting changes in reserves over time could be achieved. But Heiberg steers clear of complete advocacy of such a radical change stating that ‘there are good reasons to continue to report proven reserves, as the concept reflects both a reasonably certain value statistically and also the quantities that have been confirmed by direct observation.’

FUN

The NPD forum for Forecasting and UNcertainty evaluation (FUN) has been adapting the SPE classification to the Norwegian context, with particular attention to establishing project status categories. Industry partners appear satisfied with the NPD adaptation and the whole of the Norwegian reserve base will be described using the new scheme from 2001 onwards. The new Norwegian scheme introduces probabilistic reserve computation. This allows stakeholders to move away from the single value reserve estimate to a more informative indication including an uncertainty range. Probabilistic estimates also allow for uncertainty ‘management’ – particularly in computing the overall uncertainty in aggregated reserves. They are also more amenable to sophisticated decision support.
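The aggregation point can be illustrated with a small Monte Carlo sketch. The field sizes below are invented and lognormal distributions are assumed purely for convenience; this is not the NPD/FUN methodology itself, only an illustration of why probabilistic aggregation is more informative than adding single deterministic numbers.

```python
# Sketch of probabilistic aggregation of field reserves (illustrative numbers,
# lognormal shapes assumed for convenience; not the NPD/FUN methodology).
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Three fields, each described by an arithmetic mean and a log-space sigma.
fields = [(50.0, 0.4), (120.0, 0.3), (30.0, 0.6)]   # million bbl

samples = sum(
    rng.lognormal(mean=np.log(m) - 0.5 * s**2, sigma=s, size=N)
    for m, s in fields
)

p90, p50, p10 = np.percentile(samples, [10, 50, 90])
print(f"Aggregate: P90={p90:.0f}, P50={p50:.0f}, P10={p10:.0f}, mean={samples.mean():.0f}")

# Summing each field's P90 separately understates the aggregate low case,
# because all fields are unlikely to come in at their low case simultaneously.
p90_sum = sum(np.percentile(rng.lognormal(np.log(m) - 0.5 * s**2, s, N), 10)
              for m, s in fields)
print(f"Sum of individual P90s: {p90_sum:.0f}")
```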

The informed investor

While ranges of uncertainty are grist to the mill of the financial modeler, they may be harder for the investment community to get its head around. US GAAP accounting principles roll reserve estimates into the depreciation calculation – here a single number is obviously required, but which one? The US SEC has a set of widely used guidelines for reserve statements. It is important that all companies report using the same methods.
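The depreciation point can be made with a little arithmetic. The sketch below applies the standard units-of-production convention (our reading of the GAAP mechanism, not spelled out in Heiberg's paper) with invented numbers, to show how the choice of reserve figure drives the charge.

```python
# Units-of-production depreciation, the standard convention referred to above.
# All numbers are invented for the example.
capitalized_cost = 200.0e6      # $ spent developing the field
annual_production = 4.0e6       # bbl produced this year

for label, reserves in [("proved (P90)", 40.0e6), ("expected (mean)", 65.0e6)]:
    depreciation = capitalized_cost * annual_production / reserves
    print(f"{label:>16}: depreciation charge = ${depreciation / 1e6:.1f}M")

# Using proved-only reserves depreciates the asset faster than using the
# expected value, which is why the choice of a single reported number matters.
```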

SEC

The aim of the SEC guidelines is not to give neutral expected values as required for business planning, but rather to express reasonably certain, conservative and low values. But other, more liberal reporting régimes exist, such as those practiced on the London Stock Exchange. Some work is in progress to harmonize these reporting requirements – notably through the International Accounting Standards Board (www.iasb.org.uk) – building on the SPE classification.

Unified scheme

Heiberg described a tentative unified scheme encompassing the SPE, FUN and Russian classifications. Ongoing work under the auspices of the UN Economic Commission for Europe may lead to a formal unification of the three schemes. Heiberg concluded by endorsing these international efforts at standardization. He also recommends using probabilistic reserve forecasting for management. Differences inherent in reserve reporting can be reduced by standardizing on mean value reporting, while offering proved reserves reporting as a supplement.


People

Landmark gets a new boss and Trade Ranger a new chairman from TotalFinaElf.

Andrew Lane has been named president and CEO of Landmark Graphics. Lane succeeds John Gibson, who is now president of Halliburton Energy Services.

Bertrand Deroubaix has been appointed chairman of Trade-Ranger. Deroubaix is on secondment from TotalFinaElf where he holds the position of VP e-procurement.

Jim Wortham has joined IHS Energy’s Data Logic Services unit as director of sales and marketing. Wortham was previously upstream business development manager for E&P information management solutions with Schlumberger Information Solutions.


GeoX 5.0 released

GeoKnowledge’s latest release of its decision support software adds user-friendly tax modeling and enhanced data browsing.

GeoKnowledge has released a new version of GeoX, its decision support suite for the upstream. GeoX Release 5.0 introduces enhanced data browsing and reporting, greater modeling flexibility, a greater range of probability distributions with a new distribution editor option, better security and a new ‘point and click’ fiscal regime modeling option.

Decision support

GeoKnowledge claims that current prospect evaluation tools focus on the interpretation environment, leaving users to rely on a separate tool for economic evaluations, with data loss and simplified assumptions in the hand-offs. The GeoX suite, by contrast, provides a complete decision-support solution. Analyses and documentation are stored in a common database facilitating knowledge capture across projects, disciplines and analysis tools. The GeoX family comprises gProspectR for prospect evaluation, gFullCycle for economic modeling and gPlayR for stochastic volumetrics. More from www.geoknowledge.com.


Digital Analogs 2.0

C&C Reservoirs’ new Digital Analogs database extends to 750 giant oil fields and 10,000 literature citations.

C&C Reservoirs, a Chevron spin-off, has released a new version of its Digital Analogs database of worldwide basin and play types. DA 2.0 offers web-based navigation, an enhanced search engine and access to data from 750 giant fields and 900 reservoirs. DA 2.0 provides geologists and reservoir engineers with exploration history, structure, trap mechanism, depositional facies, reservoir architecture, rock and fluid properties, reserves estimates, development strategies and reservoir performance.

Sun

C&C Reservoirs CEO Qing Sun said, “This new release greatly enhances an asset team’s ability to generate low-risk prospects and identify optimum field development strategies.”

Search engine

DA 2.0 includes over 15,000 graphics and 10,000 literature citations along with a powerful search engine and a database of over 180,000 items. Forty C&C Reservoirs E&P Synthesis reports are included, providing categorization and analysis of clastic, carbonate, fractured and deepwater reservoirs. More from www.ccreservoirs.com.


Landmark’s Digital Well Library

Landmark’s Digital Well Library service turns a mass of geological well information into an online asset. The new multi-pane scrolling browser is pretty neat too.

Landmark’s Digital Well Library (DWL) is a software and service offering which transforms well-related geological information into an accessible digital asset.

Service

The service side of the DWL involves the digitizing of a variety of information assets such as cores, tests and photo-micrographs. Almost every data type or information source is amenable to the DWL treatment, including documents and reports. The service components include scoping and site design, data collection and QC, digitizing and support.

Killer app?

Speaking at the Stavanger Landmark City Forum, Conoco’s Arjen Rolf presented the DWL and demoed the impressive new front end on a twin-head workstation. This deceptively simple tool leverages client-side technologies such as JavaScript along with Landmark’s WebOpenWorks (‘WOW’) browser to produce a veritable killer app for geological data browsing.

Scrolling

The DWL front end offers browser-based access to data in the library, with an intuitive multi-panel display. Log, cutting and core data can be displayed at different scales and variable-speed scrolling offers an intuitive display of data relationships. Clickable hotspots indicate the availability of more information, from technical reports to photomicrographs.

Hosting

The synchronized, multi-scale, multi-frame scrolling tool was developed by Landmark internally as a component of the Team Workspace (TWS) data aggregation and application management solution. DWL can be delivered to the corporate portal, as a stand-alone solution, or hosted from one of Landmark’s data centers.


CAT out of Wood Mac bag

Wood Mackenzie’s ‘Corporate Analysis Tool’ plug-in to Pathfinder and GEM offers improved data mining and ad-hoc query.

Wood Mackenzie’s Corporate Analysis Tool (CAT) is an Excel-based tool which offers rapid data mining of corporate information stored in Wood Mackenzie’s database. CAT is an add-on to Wood Mackenzie’s other software tools, PathFinder and GEM. CAT answers a range of typical competitor analysis queries and produces company information in the form of ranking tables, benchmarking reports and detailed field-by-field breakdowns.

Op Costs?

CAT offers rapid answers to questions such as, “How do my operating costs compare to those of my peer group?” - “Who are the key gas players in Asia?” or “Who are the top players in terms of value in Angola and what are their key assets?” In each case CAT provides numerical analysis and ranking of major competitors’ numbers. More from www.woodmac.co.uk.
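For readers wondering what such a query involves under the hood, here is a sketch of a peer-group ranking on a hypothetical field-level table. This is not the CAT product itself, which is Excel-based and runs against Wood Mackenzie's proprietary database; the column names and figures are invented.

```python
# Not the CAT product: a sketch of the kind of peer-group query it answers,
# using a hypothetical field-level table with invented numbers.
import pandas as pd

fields = pd.DataFrame({
    "company":  ["Alpha Oil", "Alpha Oil", "Beta Energy", "Gamma Gas", "Gamma Gas"],
    "country":  ["Angola", "Angola", "Angola", "Angola", "Nigeria"],
    "opex_usd_per_boe": [5.2, 6.1, 4.3, 7.0, 3.9],
    "net_production_kboed": [45, 30, 80, 25, 60],
})

# Production-weighted operating cost by company, ranked for the Angola peer group.
angola = fields[fields["country"] == "Angola"]
ranking = (
    angola.groupby("company")
    .apply(lambda g: (g["opex_usd_per_boe"] * g["net_production_kboed"]).sum()
           / g["net_production_kboed"].sum())
    .sort_values()
    .rename("weighted_opex_usd_per_boe")
)
print(ranking)
```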


New news from Petris

Petris is to incorporate a web-based upstream news service from Paris-based Exploservices in its Winds Now! enterprise portal.

Houston-based Petris Technology has signed a partnership agreement with Exploservices in France to market its upstream oil and gas industry news processing service, EPNews. The agreement covers the promotion and sales of EPNews, operated by Exploservices. The internet-based news service for information lookup, personal alert management and archive search will be integrated as an option of the Petris Winds Now! enterprise portal.

Thierrée

Exploservices CEO Bernard Thierrée said, “After three years of considerable progress in our domestic market, we are looking forward to expanding our services in the USA and Canada with Petris.”

Pritchett

The partnership brings more content to the solutions offered by Petris Technology to its clients, while strengthening its marketing activities in the US and Canada. Jim Pritchett, Petris CEO said, “Exploservices is a good complement to our growing body of software and services offered to the oil and gas industry. Its capabilities will enhance our users’ experience with Petris Winds Now!”

World-wide

Exploservices was founded in 1990 as an upstream consulting company. EPNews was launched in 1998, and provides its clients with a service that continuously processes published news to produce business activity indicators which are stored in a relational database. The service provides ‘quick-look’ abstracts for worldwide coverage of business activities such as alliances, partnerships, acquisitions and mergers. More from www.exploservices.com and www.petris.com.


Rockware RockWorks 2002

RockWorks 2002 now offers deviated borehole data management and expanded OpenGL graphics.

RockWorks 2002 promises ‘beautiful 3D OpenGL graphics.’ RockWorks is a low-cost solution for mapping, contouring and solid modeling. RockWorks functionality also extends to volumetrics, logs, cross-sections, fence diagrams and more.

Data management

New in the 2002 release is a deviated well data management function allowing capture of lithology descriptions, stratigraphic formation names, geochemistry and geophysics data, fracture measurements, water level/aquifer data, log symbols and patterns.

OpenGL

A new RockPlot3D OpenGL plotting window displays borehole logs, fence diagrams, stratigraphy models and solid models such as contaminant plumes and oil and gas reservoirs. Other new features include a formation top picking tool, improved multi-label point maps, and context-sensitive help.


Paradigm’s Reservoir Navigator

Paradigm’s Reservoir Navigator leverages disk cache technology to offer high-end visualization on ‘cheap’ hardware.

Paradigm’s new Reservoir Navigator (RN) offers high-end visualization à la Magic Earth on ‘cheap’ Sun Blade hardware. RN can display a number of ‘probes’ simultaneously. These can be blocks of multi-attribute seismic volumes, maps, logs or structural frameworks. RN caches 20-30 GB of data on disk for interpretation, to give a high level of interactivity even when swapping to the cache.

3D Propagator

RN integrates volume-based and line-based interpretation, and advanced earth-modeling technologies with mapping and geostatistics, structural framework creation and geological reconstruction. Paradigm has also incorporated auto-tracking technology derived from Stratimagic into its flagship VoxelGeo product. 3D Propagator will also be available in Reservoir Navigator along with ‘on-the-fly’ computing of seismic attributes and interactive fault ‘accentuation.’


Compaq, Schlumberger team

Compaq and Schlumberger are to jointly market global IT solutions including DeXa.Badge smartcards.

A new joint marketing arrangement between Schlumberger and Compaq builds on long-term cooperation between the companies in the oil and gas industry. Earlier this year, Schlumberger and Compaq delivered an integrated security solution based on Compaq products and the Schlumberger DeXa.Badge offering that incorporates SchlumbergerSema (SS) smart cards and readers.

Blackmore

Compaq sales VP Peter Blackmore said, “Schlumberger offers deep insight into many industries of mutual interest and offers world class solutions that integrate seamlessly with Compaq’s technology platforms and service expertise. The alliance of our two companies will offer global enterprises effective solutions with superb economy of deployment.”

Chevallier

SS VP Jean Chevallier concurred, “This alliance combines our domain expertise in key markets, project management, systems integration, secure IT services with Compaq’s industry-leading platforms to offer customers world-class IT solutions.”


AAPG 2002 - Houston

The 2002 AAPG Annual Conference and Exhibition held in Houston last month attracted 7,724 attendees - the largest gathering since 1985. Our main take-home was the increasing use of rock digitizing. This ranges from the small scale – calibrated photographs of cuttings or NMR scanning of cores – to full-scale 3D laser ‘sketching’ of rock outcrops. Such digital representations can be used to give point and click access to cuttings, or to allow for image processing and automated classification on cores. With virtual reality you can now even bring the outcrop to your geologists’ workstation or CAVE. What’s next? Maybe a new form of ‘interactivity’ using a haptic geological hammer!

If you were to take a snapshot of the average oil company’s geological info set you would find a hodge-podge of rocks, cores, photomicrographs and imagery scattered around the organization. How do you get a handle on such a heterogeneous mass of information and make it available to the hard-pressed geoscientist who no longer has the time to visit the core store or browse the library? Heck, there probably isn’t a library there anymore.

Digital rocks

Enter digital rocks - our catch-all term for a range of software products and services that move geological assets into the database or onto the workstation. The most striking example of digital rock-hounding comes from the work done by the University of Texas at Dallas. Our first illustration shows UTD researchers using laser ‘sketching’ technology to digitize a rock outcrop at a type locality in northern Spain. The second illustration shows the spooky laser image as captured - but barely does justice to the technology. UTD uses 3D photorealistic technologies, geospatial visualization, and geostatistical analysis to ‘greatly enhance 3D outcrop description and modeling.’

Into the CAVE

UTD has worked with Norsk Hydro to bring 3D imagery into the Cave. Lasers map out the geometry and photo imagery is draped over the 3D volume using software from GoCad and Inside Reality. Accurate rendition of such imagery is beyond the paper edition of Oil IT Journal - but you can get an idea of the potential from the UTD website on www.utdallas.edu/~xuxue/examples. Jerome Bellian’s team from the University of Texas at Austin has been working for ChevronTexaco on the application of LIght Detection And Ranging (LIDAR), a laser-based mapping tool, to collect stratigraphic information by outcrop scanning. Light-ranging data is co-rendered with laser intensity to generate 3D outcrop models with ‘near zero distortion in x, y and z space’. The intensity of the return signal helps to discriminate between different lithologies.

Outcrops

David Hodgetts’ team from the University of Liverpool has performed digital geological field mapping of South Africa’s Karoo turbidites using high-resolution differential GPS systems. Data collected was imported into reservoir modeling software in the field. Digital photogrammetry allows the collection of 3D geometric data from stereo photograph pairs or strips. Software allows the integration of GPS data with the detailed sedimentological logs of the field geologist, and with digital photogrammetry, to facilitate 3D reservoir model building.

Cutting description

At the other end of the rock spectrum are tools for capturing cutting descriptions digitally. Quality Assured Lithology (QAL) from UK-based HRH Ltd. leads the well site geologist through a standard sequence for describing cuttings in a consistent manner. A click on the Munsell color chart (licensed from the USGS) captures a standardized “grayish green.” QAL also handles Wentworth class and has a library of true-size photos of grains for angularity and sorting descriptions. Another aid to the well site geologist is real-time digital cuttings analysis from Australian Sautec Pty. Cuttings and cores go into the machine, where a calibrated light source illuminates them for photography; eight minutes later, cuttings and core data are available off-line. These go into a structured library of images all built with the same calibrated methodology. Sautec now plans to offer grain by grain analysis.

Westport Tech

Westport Technology Center uses modified medical Computer Tomography (CT) scanners and nuclear magnetic resonance imaging to see into unconsolidated GOM cores. Westport uses visualization software developed by Texaco and is planning to ‘go commercial’ with the technique. Westport’s analyses provide quick-look core inspection and monitoring of fluid displacement.

Automated core analysis

Once your rocks are digital there are a variety of interesting things you can do with them. Petro Image Llc. has developed a hardware and software solution to perform a variety of automated core analyses on digitally archived core images. Geomodeling Research Corp.’s SBED consortium, originally launched by Statoil and BP in 1986, generates ‘digital rocks’ at all scales. SBED then allows for 3D attribute generation, visualization and interpretation.

GEMINI

The Kansas Geological Survey’s tortuously-named Geo-Engineering Modeling through Internet INformatics (GEMINI) project combines online access to digital data with web application software to support collaborative petrophysical analysis and reservoir modeling. GEMINI components include log analysis, volumetrics and input to fluid flow simulators. The Survey is also working on a new CO2 Sequestration Atlas – the Midcontinent Interactive Digital Carbon Atlas and Relational Database (MIDCARB). MIDCARB is a digital spatial database involving a consortium of state geological surveys covering Illinois, Indiana, Kansas, Kentucky, and Ohio. MIDCARB is built around an Internet-enabled relational database and Geographical Information System. MIDCARB characterizes the quantity of CO2 available from a source supply, the security and safety of a sequestration site, the long-term effects on a reservoir and the cost of compression and transport of CO2. Future sequestration sites include oil and gas fields, coal beds, abandoned subsurface mines, unconventional oil and gas reservoirs, and deep saline aquifers.

Petroleum R&D

The US Department of Energy sees corporate investment in research and development as declining in the face of low oil prices and shareholder pressure. While some companies are moving back to a centrally managed R&D function, the government share of R&D funding has grown. Following the National Energy Plan announced last year, the DOE is ‘realigning’ its oil and gas R&D support program.

Upstream innovation

BP’s Wolfgang Schollnberger defended a ‘vibrant oil and gas industry’ that is focusing on technology development for the short and medium term. BP is researching in key areas such as cost reduction, increasing recovery, managing risk, understanding markets and improving HSE. One BP contribution is the ‘e-Field’, a look ‘deep into the reservoir and deep into the market.’ e-Field components include intelligent wells, ongoing reservoir monitoring and predictive modeling. BP may assign intellectual property rights (IPR) on its own ‘inventions’ to service companies – in the belief that this will accelerate their cost-effective implementation. BP claims this ‘open’ approach distinguishes it from its competitors and has ‘triggered a river of innovation flowing towards the company.’

Recon

Austin Geomodeling’s newly commercialized software provides 3D visualization and correlation of an ‘unlimited’ number of wells. Recon has been used to correlate and rebuild sequence stratigraphy over the largest oilfield in the world - the Saudi Ghawar field. Recon also handles horizontal logs in 3D and provides an interactive pick database. Recon V.2.1 is the first commercial release and it too is used in ChevronTexaco’s visualization center.

Portal moves in-house

PetroWeb has evolved from a ‘Portal’ into an in-house deployed data access and management tool. PetroWeb browses and manages data in native format irrespective of location. PetroWeb president Dave Noel told Oil IT Journal that this approach “represents a move from ‘just in case’ data à la PPDM/POSC to ‘just in time’ data delivery where and when you need the information.” PetroWeb is deployed by Unocal, Marathon and Oxy and leverages mapping technology from either Autodesk or ESRI.

Petrel 2002

Petrel 2002, the ‘biggest release ever’, will be out ‘real soon now’ and will offer full-blown seismic interpretation in both 2D and 3D. Petrel now includes dipmeter functionality developed by Oxy. A work process manager allows jobs to be re-run on demand and Landmark and GeoQuest projects can now be attached without the need for file import/export. Petrel 2002 has an Open Spirit link to OpenWorks and GeoFrame along with ‘any Oracle database.’ Petrel was previously reluctant to go head-on against the competition in the seismic interpretation arena. No more! Petrel figures that the 2002 release is competitive with software such as SMT’s Kingdom Suite - but the mid-term aim is to attack the SeisWorks/Charisma/IESX market with an offer that includes everything from seismic interpretation to production. All this in an integrated user interface that is loaded once only, allowing the interpreter to ‘go from A to Z in a single project environment’.

SMT

Petrel may be chasing a moving target as SMT is ‘putting more and more geology into EarthPack’ as president Tom Smith told Oil IT Journal. New functionality includes raster logs, faults, sections and production data along with AVI movies. SMT is also leveraging Open Spirit to integrate with third party environments and to cater for clients who are asking for ‘very, very large’ survey capability.

Massive basin model

IES has built a huge 3D model of the whole of the United Arab Emirates for ADCO Abu Dhabi. The model impresses with its graphic portrayal of oil migration and reservoir charge and has been used to risk evaluate new prospects. 3D and fluid component modeling are keys to IES modeling success. IES uses the ‘Qt’ cross-platform development kit from www.qt.no for portability.

Halbouty

Mike Halbouty, a sprightly 91 year old, gave the Heritage Session keynote on the East Texas field. East Texas was discovered by random drilling in 1930 and has since produced some 5 billion barrels with another 2 billion remaining. East Texas is characterized by its simplicity (a pinch-out of the Woodbine formation on a regional high) and size - 42 x 5 miles! Halbouty stated, “There was no valid geological reason for a discovery at Wilcox. No structure, no geomorphology, no direct indicators – in short no logical reason whatever for a positive result.” Will there ever be another East Texas? Halbouty believes so – in foothill provinces or against uplifts. “My firm conviction is that the next large accumulations will be found in subtle traps.”

This report is a short version of an in-depth report on the 2002 AAPG from The Data Room. For more information on The Data Room’s Technology Watch Service, please email tw@oilit.com.


University oil and gas projects funded

The Department of Energy has just announced new three-year grants totaling $4.5 million to academic institutions in the US. Projects spanning geochemistry, seismic analysis and rock mechanics are intended to improve competitiveness in the domestic oil industry. Commercial partners in the research include HARC, Paradigm Geophysical and Shell International.

The US Department of Energy’s National Petroleum Technology Office (NPTO - part of the National Energy Technology Laboratory) has awarded grants to six new upstream R&D projects designed to develop advanced diagnostics and imaging technologies for oil and gas.

Tennessee

NPTO has awarded the University of Tennessee and the USGS $537,263 to study the geologic, chemical, and thermal history of the southeastern Appalachian Basin to predict the locations of new oil-bearing formations that may have eluded previous exploration.

CSM

Colorado School of Mines, the Houston Advanced Research Center, Paradigm Geophysical, Inc. and Texas A&M University are the beneficiaries of a $750,000 grant to develop a seismic direct hydrocarbon detection methodology for application in the deepwater GOM.

SWRI

San Antonio-based Southwest Research Institute gets $858,000 to ‘investigate the relationship between changes in seismic signal strength to reservoir rock properties.’ According to NPTO, today’s seismic techniques ‘are unable to distinguish accurately between variations in the texture and porosity of reservoir rock and the concentration of hydrocarbons trapped within the rock.’

Wyoming

$750,000 of funding goes to a study to be performed by the University of Wyoming and New Mexico Institute of Mining and Technology. Here the influence of chemical characteristics and interactions of water, oil and rock on the tendency of oil droplets to ‘cling to the reservoir rock’ will be investigated. The focus will be on carbonate formations, where these characteristics are said to be poorly understood.

TEES – A&M

The Texas Engineering Experiment Station (TEES) at Texas A&M University along with The University of Texas receive $630,000 to design computer simulation techniques to trace fluid pathways through the reservoir using ‘partitioning tracers’. These chemicals are able to selectively follow a specific fluid in the reservoir and allow operators to distinguish between oil and water flowing through the reservoir.

MIT

MIT along with Shell International receives $941,000 to develop a reservoir flow model tailored to fractured carbonate reservoirs and tight gas sand formations. MIT and Shell will use seismic data to predict fracture distribution in 3D using innovative techniques involving rock stress and well log data.


Plexus and Tobin team on pipeline integrity

New Office of Pipeline Safety regulations are driving operators to review integrity in ‘high consequence areas’. A new software and service offering from Plexus Data Solutions and Tobin International helps companies analyze requirements and implement compliance.

Tobin International is to team up with Plexus Data Solutions to offer complete integrity assessment services to pipeline operators. The Tobin/Plexus Total Integrity Management Solution (TIMS) offering spans the analysis of essential data requirements and initial planning through to the implementation of a compliant integrity management program. The new offering brings the pipeline industry a set of tools for integrating operational data in a comprehensive GIS architecture.

Ellis

Tobin VP Steve Ellis said, “Our Total Integrity Management Solution allows pipeline companies to monitor, search, access and analyze all the information they need, from disparate sources such as design and construction drawings, facility information, operational, inspection and repair reports, engineering drawings, right-of-way documents and environmental imagery, while complying with both the new and pending government regulations that affect their pipeline operations.”

OPS

The Office of Pipeline Safety’s new rules include 49 CFR 195.452, governing pipeline integrity in high consequence areas (HCAs). These new initiatives will have a significant financial impact on the operations of regulated and non-regulated pipelines in both the liquids and gas arenas.
Compliance auditing of pipelines is already underway. TIMS addresses the new regulations for HCAs by providing reports of incidents, identifying risks, and by documenting such information. TIMS also provides an effective means of communications between the pipeline company and third parties who might be impacted by an incident, through a public awareness mailing service.

Odom

Plexus president Dave Odom said, “The Tobin/Plexus team can now offer a full service package to address every requirement of the new regulatory initiatives. Having only a few of the pieces to the puzzle does not make companies compliant! Every pipeline company or operator needs to create its own unique program that meets the new government regulations.”

Seminars

Tobin and Plexus will be offering in-house seminars to introduce the new, and pending, government compliance initiatives to pipeline companies. More from www.plexusds.com.


PPDM compliance revisited

The PPDM Association is calling for proposals from members to address the thorny issue of compliance with its standard data model.

The Public Petroleum Data Model Association (PPDM) has just issued a request for proposals (RFP) from its membership to provide ‘services and materials to measure and report on the level of conformance between a database or software product and the PPDM standard.’ Compliance with the PPDM data model has been debated since at least November 1997 (Oil ITJ Vol. 2 N° 11). The PPDM RFP defines compliance as ‘a level of conformance between a database or software product and the published PPDM standards.’ Measurement and awareness of compliance will help software developers ‘expand their market and increase sales to include customers who see value in having compliant software.’

Data Vendors

For data vendors, the provision of data and loaders using PPDM standard definitions is expected to reduce the cost of multiple data formats and add value through remote, compliant database access.

Reality

This initiative should help eliminate the tendency of the vendor community to cite ‘compliance’ with one or other of the industry standard data models without any indication of the degree, or even the reality, of such compliance.
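One conceivable way of measuring conformance is sketched below: compare an installed schema against a reference list of PPDM tables and columns and report the fraction found. The table and column names shown are placeholders rather than the actual model definitions, and the scoring method is ours, not anything prescribed in the RFP.

```python
# Sketch of one way compliance might be measured: compare an installed schema
# against a reference list of tables and columns. The names below are
# placeholders, not the actual PPDM model definitions.
import sqlite3

reference = {                       # hypothetical subset of a standard model
    "WELL": {"UWI", "WELL_NAME", "SPUD_DATE", "SURFACE_LATITUDE", "SURFACE_LONGITUDE"},
    "WELL_LOG": {"UWI", "CURVE_ID", "TOP_DEPTH", "BASE_DEPTH"},
}

def conformance(conn: sqlite3.Connection) -> float:
    """Fraction of reference columns present in the connected database."""
    found, total = 0, 0
    for table, columns in reference.items():
        existing = {row[1].upper() for row in
                    conn.execute(f"PRAGMA table_info({table})")}
        total += len(columns)
        found += len(columns & existing)
    return found / total

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE WELL (UWI TEXT, WELL_NAME TEXT, SPUD_DATE TEXT)")
print(f"Conformance score: {conformance(db):.0%}")   # 3 of 9 reference columns
```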

RFP

The RFP calls for ‘services and materials to measure and report the level of compliance, between a database or software product and the PPDM Association standards.’ To respond to the RFP, vendors must be PPDM members.


Ovation rolls-out Encore Solution

Ovation Data Services’ Encore Solution aims to provide near-line tape-based storage to relieve Network Attached Storage bottlenecks and to cut costs.

Ovation Data Services (ODS) is targeting its nearline storage ‘Encore Solution’ at companies whose seismic data is outgrowing the capacity of their Network Attached Storage (NAS). Ovation argues that NAS storage is less cost-effective than Hierarchical Storage Management (HSM) software solutions driving robotic tape storage.

Automated

ODS’ Encore Solution automatically migrates seismic data from NAS disk storage to tape. The initial cost of an Encore solution for migrating up to 6 TB of static seismic surveys is only $0.026/MB. Subsequent data can be added for as little as $0.001/MB.
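Taking the quoted per-megabyte prices at face value, and assuming 1 TB is counted as 1,000,000 MB, the arithmetic works out roughly as follows.

```python
# Rough arithmetic from the quoted prices, assuming 1 TB is counted as 1,000,000 MB.
INITIAL_RATE = 0.026      # $/MB for the first tranche (up to 6 TB)
FOLLOW_ON_RATE = 0.001    # $/MB for subsequent data

def encore_cost(total_tb: float) -> float:
    total_mb = total_tb * 1_000_000
    initial_mb = min(total_mb, 6_000_000)
    return initial_mb * INITIAL_RATE + max(total_mb - 6_000_000, 0) * FOLLOW_ON_RATE

for tb in (6, 20):
    print(f"{tb:>3} TB -> ${encore_cost(tb):,.0f}")
# 6 TB -> $156,000; 20 TB -> $170,000
```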

GUI

Ovation’s Encore Solution incorporates an end-user GUI and back office software to automatically manage the migration of seismic data according to configurable business rules. Encore allows disk storage usage to be optimized for current projects while older data is paged to the secure nearline system.

Simultaneous

Encore provides up to four simultaneous tape copies for off-site disaster recovery. Ovation uses a non-proprietary tape format for data independence. More from www.ovationdata.com.


Norwegian operators extend Diskos contract

The Norwegian operator consortium Diskos has renewed PetroData’s contract until the end of 2003.

The Diskos group of companies, operator of Norway’s national data store, has extended prime contractor PetroData’s contract until the end of 2003. PetroData has been running the online upstream data store since 1995. Most Norwegian operators use Diskos to store seismic and well data at one centrally managed location while letting individual companies compete on the use and interpretation of their data.

Post-stack

Diskos currently holds almost all post-stack seismic data from the Norwegian continental shelf. Diskos is also used as a distribution channel for data pertaining to a license, along with navigation and cultural data. Future extensions are planned which will extend Diskos’ scope to pre-stack data. PetroData manages data other than the national data set through a sophisticated entitlements management system.

Repository

Companies which deposit their well data with Diskos are deemed to have satisfied Norwegian government reporting requirements.

Hanesand

PetroData president Trond Hanesand said, “A revised business model and price reduction are expected to increase usage, and strengthen Diskos’ position as the Norwegian E&P industry’s hub for information sharing and e-collaboration. The likely deployment of Surf and Connect Web Edition this summer will significantly increase data availability to the Diskos membership.”


Wellogix’ Synchrion framework

Wellogix has released new technology - the ‘Synchrion Framework’ for dynamic data exchange with Enterprise Application Integration platforms.

Wellogix, Inc. has unveiled new technology to help integrate its Wellogix software suite with customers’ legacy applications. Wellogix’ Synchrion interoperability framework provides pre-built, configurable gateways through which Wellogix products can be plugged into a variety of Enterprise Application Integration (EAI) platforms including Vitria Technology’s BusinessWare, IBM’s MQSeries, TIBCO’s ActiveEnterprise, webMethods’ Integration Platform, and SeeBeyond’s Business Integration Suite.

Swivel-chair!

The Synchrion framework eliminates ‘swivel-chair’ data entry across multiple software environments. Instead of re-entering identical data sets into numerous applications, users of the Wellogix products can dynamically exchange data with any of their legacy applications, efficiently and accurately.
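The gateway idea can be illustrated with a purely conceptual sketch (this is not Wellogix code): a configurable field mapping turns one application's record into the message another system expects, so the same data is never keyed in twice. All names and fields below are hypothetical.

```python
# Purely conceptual sketch (not Wellogix code) of a configurable gateway: a field
# mapping translates one application's record into the message another expects.
from typing import Callable

# Field mapping for one hypothetical target system, e.g. an ERP purchasing module.
FIELD_MAP: dict[str, str] = {
    "well_name": "WellIdentifier",
    "afe_number": "CostObject",
    "service_code": "MaterialGroup",
    "estimated_cost": "NetValue",
}

def to_target_message(record: dict, field_map: dict[str, str],
                      transform: Callable[[str, object], object] = lambda k, v: v) -> dict:
    """Translate a source record into the target system's field names."""
    return {target: transform(source, record[source])
            for source, target in field_map.items() if source in record}

job = {"well_name": "Example 1-14", "afe_number": "AFE-4711",
       "service_code": "CEMENTING", "estimated_cost": 185_000}
print(to_target_message(job, FIELD_MAP))
```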

Livesay

Wellogix CEO Jeff Livesay said, “Data integration is typically an extremely time-consuming, and often frustrating experience. Many of our clients have had unpleasant integration experiences with other software applications, which led us to develop the Synchrion architecture to ease their integration challenges. Through the Synchrion framework, we’re enabling companies to leverage their use of our software products to provide easier data transfer, improving the complementary nature of our software with established products in the marketplace. The Synchrion architecture is the next step in our web services strategy that began with our Proxis knowledge management suite. As we continue to execute our strategy, we will be able to dramatically reduce integration time and costs for all of our customers.”


Shell deploys Jason’s 3DiQ software

Shell is to integrate core reservoir characterization technology from Jason into its global E&P software portfolio.

Shell International Exploration and Production B.V. has selected reservoir characterization technology from Fugro unit Jason Geosystems BV. Jason’s 3DiQ technology will be integrated into Shell’s E&P software suite - the Shell Software Suite Portfolio (SSSP) for Exploration and Production. Shell maintains the SSSP centrally and distributes it globally to E&P Operating Units.

3DiQ

Jason’s 3D Integrated Quantitative Reservoir Characterization (3DiQ) algorithms make up the core of Jason’s flagship Geoscience Workbench. Jason claims the 3DiQ software eliminates inconsistencies between geophysical, geological and petrophysical reservoir models. This is said to reduce uncertainty in reservoir parameters such as continuity, geometry, thickness and volumes. Shell has been using the Jason technology in its operating units to ‘de-risk’ field appraisal and development.

E-Plus

Along with the core inversion and modeling algorithms, Shell will deploy Jason’s ‘E-Plus’ front end. E-Plus, a component of the Geoscience Workbench, encapsulates the 3DiQ components and provides support for volume-based analysis and interpretation. E-Plus allows for external data to be rolled into the interpretation workflow and provides 3D visualization and body capture in time or depth, in section view and on the base map.

Training

Jason will be working with Shell to further integrate 3DiQ with the SSS Portfolio, and will be providing training to Shell staff. More from www.jason.nl.


IFP’s Luminus deployed by GXT

GX Technology is to offer 3D plane wave prestack depth migration using the French Petroleum Institute’s Luminus software.

GX Technology (GXT) is to offer seismic depth imaging services using the French Petroleum Institute’s (IFP) patented Luminus software. Luminus is claimed to be the industry’s first shot-based 3D plane wave prestack depth migration. The software is said to enhance imaging of complex subsurface structures in the marine environment, especially beneath salt.

Worldwide

GXT has signed an exclusive worldwide agreement with IFP for deployment of the Luminus package which will join GXT’s other advanced imaging technologies - Optimus for shot record wave equation PSDM, Primus for amplitude-preserving Kirchhoff PSDM, and PrimeTime for curved ray prestack time migration.

Lambert

GXT president Mick Lambert said, “Luminus gives our clients an edge in depth imaging in the marine environment, and represents another major step in the continued expansion of our seismic imaging services.”

Friès

IFP senior VP Gérard Friès said, “We selected GXT as the exclusive marketer of Luminus services because it is a leading provider of prestack imaging solutions and will give our technology the widest possible application. We have a very close working relationship with GXT, and this is already paying off in our ongoing enhancement of the software.”

Speedup

Compute-intensive PSDM traditionally can take days of computer time. Luminus produces interpretable prestack depth volumes ‘in hours,’ allowing for QC of results early in the migration cycle. If required, modeling can be restarted - saving ‘weeks or months’ of time on iterations. More from www.gxt.com and www.ifp.fr.


Peloton’s new WellView is faster and secure

A new version of Peloton’s WellView information management solution promises improved performance and enhanced networked security.

Calgary-based Peloton has released version 7.0 of its well information system, WellView. WellView is a well information management system for well planning, drilling, completion, testing and workovers. Originally developed by Merak, WellView was spun off into Peloton when Schlumberger acquired the company. Peloton says that WellView 7.0 is easier to use, faster and field work-ready, with a new remote data synchronization and communication capability.

Faster

The new release speeds up performance and rationalizes the user interface. User feedback has led to enhancements in data entry and a shortened learning curve. WellView is now equipped with a powerful and robust two-way remote data transfer system which can work over cell phones, satellite phones, WAN, LAN, VPN and ISP connections. Data is encrypted and compressed for security and speed and approved users can now retrieve wells from the server on request.
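The compress-then-encrypt step is generic. The sketch below illustrates the idea with standard tools (zlib and the third-party ‘cryptography’ package); it says nothing about Peloton's actual implementation or cipher choices.

```python
# Generic illustration (not Peloton's implementation) of the compress-then-encrypt
# step used when moving field data over slow, untrusted links.
# Requires the third-party 'cryptography' package for the Fernet cipher.
import zlib
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, provisioned per user or site
cipher = Fernet(key)

report = b'{"well": "Example 1-14", "depth_m": 2417.5, "rop_m_per_hr": 14.2}' * 50

compressed = zlib.compress(report, level=9)   # shrink first: ciphertext won't compress
token = cipher.encrypt(compressed)            # then encrypt for the wire

# The receiving side reverses the two steps.
restored = zlib.decompress(cipher.decrypt(token))
assert restored == report
print(f"{len(report)} bytes -> {len(token)} bytes on the wire")
```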

Functionality

New functions include mud additive costs, and bit and rate of penetration parameters. A new security administration application assists with the implementation of WellView security in Oracle and SQLServer databases. A printed User’s Guide is now available. WellView was recently integrated into the Pason Hub for use on the rigsite (see OITJ Vol.7 N° 3). More from www.peloton.com.


Unocal goes for Digital Oilfield

Unocal has chosen Digital Oilfield’s e-procurement solutions for invoicing and contract management.

Unocal has selected Digital Oilfield’s integrated invoicing and contract management solution. Unocal is to use Digital Oilfield’s OpenInvoice and OpenContract Internet-based solutions to automate its invoice and field ticket reconciliation, approval and data capture processes. Unocal is implementing the Digital Oilfield software in its North America operations initially, with the potential to later roll the technology out to its overseas locations.

Pownell

Unocal project manager Tim Pownell said, “Unocal has an aggressive program under way to deploy e-business technologies that enhance our business performance.” Digital Oilfield president and CEO Rod Munro (co-founder of GIS specialist Munro Garrett) added, “We’re delighted to assist Unocal in meeting its e-procurement objectives. With the Digital Oilfield solution, Unocal will have a greater understanding of its upstream spend, while streamlining the process of submitting, validating, routing, coding and approving invoices.”

80% saving

Using OpenInvoice and OpenContract, E&P companies and their vendors can collaboratively create and process field tickets and invoices to ‘eliminate 80% of the internal processing time and cost associated with invoice approval.’ Digital Oilfield’s workflow-improvement tools are designed for upstream operations. The software is said to ‘free-up professional staff’s time and give managers rapid access to spending and operating information.’ More from www.digitaloilfield.com.


MMS back online after Indian ‘attack’

The US Minerals Management Service’s computers are back on line after a three month interruption. The systems were shut down when a federal judge questioned Department of Interior IT security.

Internet access (including email) to the US Minerals Management Service (MMS) has been restored following a three-month nationwide shut down. A bizarre sequence of events was sparked off last December when a lawsuit from American Indians against the federal government caused a judge to close public access to the Department of the Interior’s Internet systems.

Security

At issue was the degree to which the DOI assured the security and protection of third-party data on its servers. The shut down interrupted the collection, processing and distribution of production data from federal offshore waters by the MMS and IHS Energy.

Back online

The MMS systems are now back online and companies can submit production and royalty reports to the MMS e-commerce service provider, Peregrine Systems, Inc, via the Internet. More on electronic reporting from www.mrm.mms.gov.


9/11 spurs OGIS interoperability

The Open GIS Consortium has just demonstrated multi-vendor GIS interoperability through web services. Development of the interoperability spec was spurred by the events of 9/11.

As last month’s Oil IT Journal showed, the oil and gas industry is a major consumer of geographic data and systems. One problem confronting industry users of geographical information systems (GIS) is merging data from different sources, vendors and formats. The current, pragmatic answer to this problem is to ‘standardize’ on a single vendor solution, but in the longer term, a more elegant and politically correct solution may emerge from an Open GIS Consortium (OGIS) initiative that is currently underway.

Web Services

The OGIS initiative sets out to provide a standard way of ‘exposing’ geographic data so that potential users can inspect and integrate foreign data sets programmatically. The initiative will leverage emerging Web Services standards. The OGIS Web Services (OWS) are intended to let machines talk to machines and seek out relevant geographic sources, to know when they are updated or modified and to blend all such information together in an intelligent and seamless manner.
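To give a flavor of what a standardized web interface to geographic data looks like, here is a minimal GetMap request against a Web Map Service. The parameter names follow the published OGC WMS interface; the server URL and layer name are placeholders.

```python
# A minimal sketch of the kind of standardized request OGC web services accept.
# Parameter names follow the published Web Map Service (WMS) interface; the
# server URL and layer name are placeholders.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "pipelines",                    # hypothetical layer name
    "SRS": "EPSG:4326",                       # lat/long coordinate system
    "BBOX": "-95.8,29.4,-94.9,30.1",          # minx,miny,maxx,maxy (Houston area)
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

request_url = "https://example.com/wms?" + urlencode(params)
print(request_url)
# Because the interface is standard, the same request works against any
# conformant server, whatever GIS software runs behind it.
```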

9/11

The events of September 11 appear to have spurred the OWS members – which include the EPA, NASA and the USGS – to move rapidly to a working demonstration of OWS version 1.1 which addresses interoperability challenges defined by officials in New York City. The OWS 1.1 scenario challenged participating technology developers and integrators to address critical disaster management needs. The demonstration showed users discovering, accessing, superimposing, and portraying satellite and aerial imagery, vector data, and scientific data stored on servers in Europe, North America, and Australia.

Metadata

New draft specifications for metadata and services were used to implement registries that enabled discovery of data and geoprocessing services. Interfaces based on the OGIS’ draft Sensor Web Specifications enabled discovery of and real-time access to measurements from meteorological, water quality, air quality, and seismic sensors. The OGC Web Coverage Service was demonstrated accessing a variety of imagery including visible, hyperspectral and radar. Attendees witnessed the first public demonstration of a Coverage Portrayal service which, acting in this case as a middleware web service, accessed complex coverage data to produce simple pictures for display in a web browser. Commercial participants in the test included ESRI, Intergraph, Laser-Scan and SAIC.

Phase 2

The next phase of the initiative, OWS 1.2, begins in May 2002 and will focus on extending engineering specifications developed in OWS 1.1 and other initiatives including OpenGIS Specifications for OGC Common Architecture, Web Mapping, Imagery Exploitation and Sensor Web. More from www.ogis.org.


SGI Visual Area Networking rollout

At its new Houston ‘Energy Summit’ SGI rolled out its new technology for Visual Area Networking. VAN is slated (by SGI!) to become the visual equivalent of the world wide web.

SGI held its first Energy Summit in Houston last month, showcasing its high-end visualization technologies to the oil and gas industry. SGI rolled out its big guns for the occasion – with a top-level briefing from SGI president Bob Bishop and an endorsement for its technology from former DOE secretary Bill Richardson.

New breed

Bishop claims, “A new breed of graphic supercomputers is having a dramatic effect in reducing the complexity of decision making in the oil industry.” Richardson, in his keynote address, lauded SGI’s “remarkable commitment to continuous technological improvement [which] has helped the industry reliably meet the energy needs of appreciative nations around the globe.”

Bartling

SGI Energy Solutions director Bill Bartling explained, “The oil and gas industry was among the first to adopt immersive visualization as a decision-making tool, turning seismic data into 3D images that teams of engineers and scientists walk through and manipulate in real time. This has resulted in improved success rates for the oil companies, who also produce more oil from the same wells and has extended oilfield life. It has also improved shareholder value and at the same time reduced environmental impact and decreased reliance on imported oil.”

VAN

The object of the SGI hype is its new Visual Area Networking (VAN) technology, which SGI hopes will become the equivalent of the World Wide Web for users of graphic-intensive computing. VAN will ‘allow creators and consumers of dynamic, interactive visual information to be at different locations remote from their complex data management and advanced visualization systems.’

Collaboration

With VAN individual users and geographically dispersed teams can collaborate using any mix of client devices. By remotely interacting with a visualization supercomputer, a geologist in Houston, a geophysicist in Saudi Arabia, and a drilling engineer on a platform in the North Sea can ‘work together to maximize the likelihood of striking oil.’

