Working with data can be a thankless task, especially if you have your nose to the grindstone, trying to persuade your co-workers that consistent naming conventions really matter, or that tagging information with metadata on units of measure or projection systems can help avoid costly disasters like wells drilled in the wrong place. But readers of Oil IT Journal and its predecessor Petroleum Data Manager know all this. Surely this is not going to be another ‘data quality’ editorial...
Well, in a way that is exactly what it is going to be. Recent events have pretty well forced me to write on what is admittedly a delicate matter. I will try to be diplomatic. As any casual reader of the financial pages will have noticed, Shell was forced into an embarrassing re-evaluation of its reserves last month, revealing that it had overbooked reserves by a whopping 20%. This wiped some $15 billion off its market capitalization—a loss to shareholders someplace between a Parmalat and an Enron! How could this happen?
I have to come clean and admit that I have no idea how it happened to Shell—nor to El Paso, which has just reduced its proved reserves base by 41% and taken a $1 billion ‘non-cash’ charge. A partial explanation was offered in a Financial Times (FT) article covering Shell’s discussions with the Nigerian government about its reserves numbers.
On February 2nd, the FT, citing Nigerian government officials, revealed that Nigeria accounted for a third of Shell’s ‘surprise enforced cut’ in proved reserves. Actually Shell’s reserves position in Nigeria, and indeed that of other operators, was already the subject of a tax dispute before the latest reserves revision. In particular, Nigerian presidential advisor on petroleum and energy matters, Edmund Daukoru was quoted as saying, ‘The basic deciding factor in the dispute with the companies was the quality of data used to justify the new reserves’.
To hear the words ‘data quality’ in connection with a tax dispute and a billion dollar reserves write-down stopped me in my tracks. OK, we are in no position to judge the merits of the case. But the connection from data—i.e. reserve calculations—to the corporate bottom line could hardly be made more starkly. In the Nigerian dispute—affecting eight oil multinationals and amounting to some $580 million—the argument revolves around tax breaks designed to encourage exploration—by rewarding successful explorers. Again according to the FT, companies get tax breaks (and bottom line benefits) if their annual reserves increase more quickly than their production.
In other contexts (unrelated to the Shell debacle), reserves numbers are used to determine amortization of capital expenditure. Early in the field’s life, amortization rates, and hence taxes, are calculated from the initial estimates of reserves. Again, the reserve numbers go straight to the bottom line.
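For readers who like to see the arithmetic, here is a minimal sketch of the unit-of-production logic described above (a Python illustration with purely invented numbers—none come from the article, and real amortization rules are considerably more involved):

```python
def unit_of_production_amortization(capex, reserves_estimate, production):
    """Period amortization charge under the unit-of-production method:
    capitalized cost is written off in proportion to reserves produced."""
    return capex * (production / reserves_estimate)

# Illustrative numbers only: $500M capex, an initial estimate of
# 100 MMbbl of reserves, and 5 MMbbl produced this year.
charge = unit_of_production_amortization(500e6, 100e6, 5e6)

# A downward reserves revision (100 -> 80 MMbbl) raises the charge for
# exactly the same production—the reserve number goes straight to the
# bottom line.
charge_revised = unit_of_production_amortization(500e6, 80e6, 5e6)
```

The point of the sketch is the second call: shrink the denominator and the charge (and hence taxes and reported earnings) moves immediately.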
For an in-depth analysis of reserve restatement issues I would recommend a paper by IHS Energy’s Kenneth Chew*. Chew explains the intricacies of reserves reporting under the US Financial Accounting Standards Board (FASB) standards, as used by the Securities and Exchange Commission (SEC). According to Chew, new technology has complicated matters for the SEC—and hence for investors. Advances in seismic, reservoir simulation etc. make for ‘far greater precision in establishing in-place and recoverable hydrocarbon volumes than was possible a quarter of a century ago’ (when the FASB regulations were established).
One issue stems from the inherent conservatism of the SEC’s reporting rules—which exclude ‘probable’ reserves. Chew argues that this is a ‘serious loss of valuable information’. Chew goes on to explain how ExxonMobil gets around this difficulty by reporting its resource base and annual resource base additions in a Financial and Operating Review. At year-end 2002, ExxonMobil’s SEC proved reserves amounted to only 29% of the company’s ‘discovered resource base’.
All these considerations are at once close to, and a million miles away from, a great hobby horse of the upstream software business—portfolio management. How many times have we heard the economists explain that things have to change in the boardroom, that the old ‘one reserve number’ approach should be abandoned in favor of a statistical approach? Well it turns out that those folks in the boardroom already have to juggle a lot more than one reserve number! There’s one for the local government—for the local tax situation—another for SEC reporting and a third for the annual report, for consumption by the investment community.
It’s the data, stupid!
The point of all this? Data managers, application developers, geoscientists and engineers are not, as it sometimes feels, engaged in some kind of back-room twiddling. These numbers generate, not just the ‘reserves’, but a goodly chunk of the corporate bottom line. Quality data, quality software and rigorous procedures are at the forefront of corporate integrity.
* Reserves Reporting. What’s it all about? K. Chew, IHS Energy.
Mike Wiley, Baker Hughes Inc. (BHI) chairman and CEO, reporting on a ‘solid fourth quarter’, expects 2004 to be ‘a year of revenue and earnings growth’. BHI does not envisage a significant upturn in activity in either the North Sea or the Gulf of Mexico. But Russia, the Caspian, Latin America and the Middle East are all expected to improve. ‘All our divisions are planning improved revenue and profitability in what we expect to be a stable pricing environment’ Wiley concluded.
Halliburton president Dave Lesar described Landmark Graphics’ 2003 operating income as ‘18% up on 2002—the highest margin since Landmark was acquired (in 1996)’. The Energy Services Group (ESG), ‘benefited from increased oilfield activity in the United States and Canada, and from improvement in international markets. Looking ahead to 2004, customer spending is expected to accelerate over the course of the year, although our first quarter results are expected to be affected by normal seasonal softness.’
TGS Nopec reported net revenues for Q4 2003 up 29% to $41 million. President Hank Hamilton described the quarter as ‘fantastic—a record quarter’. Net revenue for the year was up 10% on 2002 to $140 million. The record last quarter came from many diverse customers, markets and projects—according to Hamilton, a very broad signal that the market is improving. The company is projecting a 5% increase in net revenues in 2004. All in all a very positive outlook.
Aspen Technology reported a return to profitability and higher software license sales for its fiscal 2004 second quarter. Total revenues for the quarter were $80.4 million, with software license revenues of $37.7 million. Looking forward, President Dave McQuillan sees ‘economic indicators steadily improving, we are beginning to see pockets of strength in our customer base and we are moving aggressively to capitalize on these opportunities.’ Aspen closed nine transactions of $1 million or greater—signing significant transactions with Saudi Aramco and Anadarko.
The French Petroleum Institute is forecasting record worldwide E&P spend of over $115 billion for 2004 (excluding China and the CIS). The record level is due to growing demand, high oil and gas prices and a ‘new optimism’ especially in the North American market.
Landmark Graphics has signed a five-year technology agreement with Integrated Trade Systems, Inc. (ITS) to supply Pemex with a range of Landmark’s software.
John Gibson, Halliburton Energy Services president, said, ‘The Landmark agreement expands upon the 50-year strong relationship that Halliburton has developed with Pemex and we are proud to continue this great partnership.’
The deal includes the provision of Landmark service and support to Pemex E&P specialists. ITS is a wholly-owned Pemex unit, incorporated in 1994 in the state of Delaware. ITS provides international procurement services to Pemex in Mexico.
Pemex is an eclectic purchaser of upstream software. Last year, Schlumberger won a $60 million contract for the provision of upstream information management and services (Oil ITJ Vol. 8 N° 3). In 2002, Mexican President Vicente Fox opened a Paradigm-equipped geophysical data processing center for the state oil company (Oil ITJ Vol. 7 N° 12).
Oil ITJ—When was the plot hatched for a management buyout (MBO) of Tigress?
Sullivan—We have been trying to do a buyout for the last two years, but the deal was held up while PGS negotiated Chapter 11. The idea began when PetroBank was sold to Landmark back in 2000. PGS subsequently had the option of selling the Tigress unit to the trade, or doing an MBO. Financially speaking, the MBO proves that we’re back. We have no debt and are making money. In our last tenders, for contracts in China, Indonesia and Turkmenistan, we won three out of four. These wins were on equivalent functionality, with a significant price advantage of around 30%.
Oil ITJ—What are your plans now?
Sullivan—We are taking Tigress back to its roots as a software company with a renewed marketing focus. We have moved back to our original premises in Marlow, UK and are expanding in Tunisia, and Siberia, with a new office in Tyumen.
Oil ITJ—So a software company is a marketing operation!
Sullivan—When you consider the cost of developing a major software package like Tigress you see why there are so few software companies around and how important marketing is. Since the first installation at Shell back in 1991, around 35 million has been spent on Tigress development. Given that most investors would expect 25-30% gross returns, you can see the pressure on the marketing end of the operation. Today, the cost of developing an independent suite around its own database would be prohibitive. In fact it was probably being owned by PGS that saved Tigress from being acquired by a third party.
Oil ITJ—How do you justify Tigress’ position in a software market which is dominated by two major vendors?
Sullivan—Tigress can be categorized as a maverick, a little fish in a big pond. A lot of other companies have niche applications that integrate with OpenWorks or GeoFrame, but the original intent of ‘integration by design’ is at once Tigress’ problem and its strength. Tigress is used all around the world by forty companies—ENI has a lot of licenses. We have seen significant growth in the asset team market. Three years ago we rolled out our first Linux product, with applications and database working at full performance. This version has proved popular with government departments involved in license rounds. Data can be made available to work up prospects for sale or drilling.
Oil ITJ—And how are you going to develop the product line?
Sullivan—Our ‘non compete’ agreement with Landmark has now expired and our relationship with PGS is good. We have launched ‘HubCentral’, a new-generation data-management system leveraging ISA’s GeoBrowse* and a major challenge to legacy systems and tape libraries. We have modernized much of our product line, guided by technical evaluations performed by oil companies. To mark Tigress’ first ten years in Russia we have launched GeoTig, our first all-Russian interpretation system, developed in co-operation with GeoLeader and aimed at the growing Russian market. We are excited about our work with GeoLeader—more than 80 Russian specialists are involved! Deliveries to launch customers will begin in March 2004.
Oil ITJ—What’s your main selling point?
Sullivan—The conventional software sales market is mature and Landmark and Schlumberger won! But with the amount of consolidation that has taken place in the last few years, we believe that there is an opening for fully functional software costing a few hundred thousand dollars for a company-wide deployment. Data integration is now recognized as mission critical.
Oil ITJ—What are your objectives now?
Sullivan—To be profitable! This year the market is much improved—the general level of inquiries is an order of magnitude up over last year. Our MBO timing was very good! We were profitable in 2003. Turnover was up 30% gross and 10% net thanks to cost cutting and rationalizing of technology.
Oil ITJ—What’s your new technology?
Sullivan—We have a new version of Oracle, the Linux port and some key new code. Tigress 64 is a new ultra-fast 64 bit version of our core Tigress product using Red Hat Enterprise Server and Oracle 9. We support the AMD Athlon and Opteron chipsets. These systems will be delivered to launch customers from March 2004. The release of Tigress 64 marks our fifth year in the Linux market place. On the application front our large clients have been very supportive. ENI has provided its state of the art petrophysical software which is now embedded in Tigress. This tool is benefiting from a ‘fast track’ development and deployment throughout the ENI/AGIP group—we all benefit from this two way information flow. Tigress also supports ad hoc data exchange via Ties, an XML flatfile format with ‘21st century’ audit trails. There is also an Open Spirit version of Ties.
Oil ITJ—Where are your markets?
Sullivan—We work on interpretation projects with several seismic processing contractors including CGG. We also cooperate on project room work with experts in North Sea rounds. Next month we are opening the Petroleum Centre in Aberdeen. This is a joint venture with Ingen and Bureau-Plus. The Petroleum Centre will offer oil companies serviced office accommodation along with state of the art software and skilled support professionals. The Aberdeen location particularly targets companies engaged in the DTI’s prospect license scheme. Software offered will include Tigress PC Edition and Ingen’s RAVE economics tool. Other Petroleum Centres will be opening in Tyumen and Marlow. A Centre in Luanda opens in April in co-operation with Terra Angola and another is planned in Houston later this year. We also offer our tools in ASP mode via our internet portal.
Oil ITJ—How much did you pay PGS for Tigress?
Sullivan—Sorry, I can’t tell you but I can say that it was less than the $170 million that Landmark paid for PetroBank! PGS could have probably got more cash from a trade sale, but the MBO, which includes a small but significant deferred consideration, offers a greater chance of success in the long term.
* See Oil ITJ Vol. 8 N° 11.
Advanced Geotechnology Inc. (AGI), Nexen Inc. and PTAC Petroleum Technology Alliance Canada are kicking off a Joint Industry Project (JIP) to develop RocksBank, a new rock mechanical and petrophysical properties database. The RocksBank database will be a companion program to AGI’s StabView well planning software.
The new database will let operators, service companies and researchers manage, analyze, search for and store laboratory and log-derived formation data. The database will be delivered to JIP participants with a comprehensive dataset on rocks from several thousand sources.
Non-proprietary rock mechanical and acoustic property data for many reservoir and caprock formations from North America and other petroleum basins in the world will be compiled, quality-rated and organized for rapid review, analysis and application. Proprietary data can also be stored, analyzed and compared to data in the open literature. RocksBank will run on a stand-alone PC or workstation, a local area network, an intranet, and the internet.
Target applications for the JIP include seismic velocity modeling, seal integrity assessments, petrophysical analysis, drilling program optimization based on rock strength and hydraulic and acid fracturing evaluation.
Statoil has selected Roxar’s Irap RMS as its main tool for reservoir modeling in a five-year deal. Following a year-long tender process, the multi-million-dollar agreement will allow Statoil’s geologists, geophysicists and reservoir engineers unlimited global access to Irap RMS.
Irap RMS will enable the Statoil teams to work within 2D mapping, 3D geological modeling and well planning, on all available hardware platforms running on Windows, Linux and Unix.
Sjur Talstad, Statoil senior VP for E&P Technology said, ‘This is a milestone agreement. By partnering with Roxar, Statoil has set out its clear intention of continuing to improve recovery from its producing fields as well as meeting its requirements for efficient workflow—making better 3D reservoir models in less time. With our new operatorships it is essential that we continue to invest in and apply the very best sub-surface reservoir modeling technology.’
The latest release of Petrosys’ mapping software now runs on Linux, includes a direct connection to SMT’s Kingdom suite and an upgraded Petrosys–SeisWorks import option.
Petrosys’ latest release contains 120 bug fixes and 77 enhancements. The package now runs on four platforms— Solaris, IRIX, RedHat Linux and Microsoft Windows.
An Open Database Connectivity (ODBC) connection to SMT’s Kingdom suite allows for the import of interpreted seismic and well data to Petrosys, direct display of well and seismic data, gridding of well and interpreted seismic data and transparent query and reporting of data in SMT.
OpenSpirit Corporation and ETL Solutions Ltd. have signed an agreement making OpenSpirit the exclusive worldwide seller of ETL’s Transformation Manager in the upstream oil and gas industry. The agreement also provides for the embedding of Transformation Manager’s data transformation capability into OpenSpirit’s middleware architecture, increasing data type extensibility for data stores supported by OpenSpirit.
Transformation Manager provides an intelligent development and test environment to rapidly build ‘maps’ between different data models, whether xml, database, java classes or flat files/APIs. An integral code generator automatically produces portable Java code. Transformation Manager ‘understands’ and leverages meta-data in the code generation process.
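To make the idea concrete, here is a hedged sketch of the kind of declarative source-to-target ‘map’ such a tool builds and then generates code from. This is illustrative Python (not Transformation Manager’s actual generated Java, and not ETL Solutions’ API); all field names and the unit conversion are hypothetical:

```python
# A source record as it might come from one data model.
source_record = {"WELL_NAME": "A-1", "TD_FT": 9500, "SPUD": "2003-06-01"}

# Declarative map: each target field names a source field plus an
# optional conversion function (here, feet to meters).
FIELD_MAP = {
    "name":      ("WELL_NAME", None),
    "td_m":      ("TD_FT", lambda ft: round(ft * 0.3048, 1)),
    "spud_date": ("SPUD", None),
}

def transform(record, field_map):
    """Apply the map to produce a record in the target data model."""
    out = {}
    for target, (source, convert) in field_map.items():
        value = record[source]
        out[target] = convert(value) if convert else value
    return out

well = transform(source_record, FIELD_MAP)
```

The attraction of the approach is that the map is data, not code: a code generator can walk a structure like `FIELD_MAP` (informed by metadata on both models) and emit the transformation automatically.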
Open Spirit president Dan Piette said, “This agreement underscores our commitment to provide access to all data from any platform to any software. This new option brings our user base a best-in-class data transformation technology.”
EnCana’s International New Ventures Exploration (INVE) has reported successful use of Malibu’s offshore well information management system. EnCana’s previous well data collection system was fragmented and it was proving hard to add a new site. EnCana chose Malibu’s WellCore application, which was customized to EnCana’s requirements in a three-week time frame.
Ron MacDonald, well engineering team leader with EnCana said, “Within days of our initial meeting, Malibu designed a solution that could be tailored to the specific needs of each one of our rigs while providing data consistency essential for accurate analysis.”
Following the first deployment off Canada’s East Coast, WellCore has been installed in Aberdeen and is now being tailored to an Africa onshore environment. These customizations help EnCana capture data that is sensitive to local conditions yet standardized enough so that data from one rig can be compared to data from another.
Paul Stringer, Contingent Business Analyst at EnCana added, “Malibu was the first company to come along with the right offshore solution. We’re happy to have a solution that is tailored to our business processes and needs.”
Key Wellcore functionalities for EnCana are: multiple measurement formats and security protocols, reduced time for localization, once-and-for-all data entry and data visibility across all sites.
Visual solutions developer TGS has merged with Indeed Visual Concepts GmbH, the developer of Amira. Indeed will become a wholly owned subsidiary of TGS. Amira and other TGS visualization products are embedded in many oil and gas vendor solutions – notably Seismic Micro Technology’s Kingdom Suite.
TGS CEO Jean-Bernard Cazeaux said, “By leveraging TGS’s global sales and marketing resources with Indeed’s extensive technology experience, our market penetration will be accelerated.” TGS also markets the Open Inventor 3D graphics toolkit.
The US Department of Energy reports that the President’s 2005 budget includes $729 million for fossil energy programs. The lion’s share ($447 million) is allocated to coal research, with $177 million for the strategic petroleum reserve.
The new energy policy promotes enhanced oil and gas recovery and improved exploration technology. However, the Administration recognizes that if the program is to produce beneficial results, it must be more tightly focused than in prior years. Consequently, Fossil Energy’s budget request of $15 million reflects a reorientation of the program toward areas where there is a clear national interest rather than solely a corporate benefit.
One example is the use of carbon dioxide (CO2) injection to enhance the recovery of oil from existing fields. According to the DOE, the private sector has not applied this technique to its fullest potential due to insufficient supplies of economical CO2.
Fossil Energy will also refocus much of the program on marginal ‘stripper’ wells and reservoirs. These aging fields account for 40 percent of US domestic production, and contain billions of barrels of oil that could be recovered with improved technology.
The budget also includes $125 million for ‘other’ program activities, including $106 million for headquarters and field office salaries.
The US Minerals Management Service (MMS) presented its eWell electronic permitting and reporting system to industry this month. eWell is an internet-based system built by the MMS to improve operational transactions between offshore continental shelf (OCS) block operators, partners and MMS district offices.
Application to drill
The MMS has re-engineered its Application to Drill, Rig Movement and Well Activity Report forms for use on a secure internet site where users can submit information electronically in lieu of the current paper submissions.
The eWell system is intended to improve and streamline current MMS business practices by offering improved access to MMS data sources for data validation, automated email notification of application status and approvals.
Steve Zelikovitz described how Exxon and Mobil managed their merger and created the new standardized global IT infrastructure. During the merger the watchwords were ‘KTBR’—keep the business running—and ‘standardize’, on either Exxon or Mobil solutions. New ‘solutions’ were not an option. Some $100 million per year savings in computing expense were achieved thanks to reduced complexity. The ‘absence of standards has created a mess – we are still mining the landfill for data nuggets’. Repositories abound because applications have driven data architecture (or rather the lack thereof). Cheap data storage has created exponential growth in data volumes – ‘we are awash in data’. In 2000 the company experienced 30-40% per annum data growth, costing $600-700k per TB to manage. An archival/deletion strategy was implemented. So far a total of 70TB of data has been archived or deleted—‘increasing the needle to hay ratio’.
The DTI stores key documents in its internal ‘Matrix’ DMS (developed around Tower Software’s TRIM product), as Stewart Robinson explained. The system stores documents as digitally signed, legally admissible evidence. PON 9, a new Petroleum Operations Notice, has been written to fit in with the idea of a National Archive. The oil and gas division of the DTI seems to be a trailblazer in e-government. Data reporting is to be done through a web browser with metadata in Dublin Core. The UK Oil Portal will only accept XML and PDF digitally signed documents. Logging on to the repository enforces XML-based cataloging—‘you can’t deposit a document without cataloging it’.
Malcolm Fleming presented the DEAL Data Registry, which is intended to relieve companies of the legal burden of data retention, and the National Hydrocarbons Data Archive, a new research-oriented subset of UK data managed by the BGS. Two trials have been completed: one on the DEAL Data Registry (cores) and one on data archival for the Hutton Field. The DEAL Data Registry (DDR) was launched in September 2003 with funding from UKOOA, DTI and CDA. The DDR will catalogue well cores, cuttings, reports and logs, and 2D and 3D seismic surveys. A separate initiative, the National Hydrocarbons Data Archive (NHDA), will contain a select subset of license data—small enough to be economically manageable, and large enough to be useful.
According to Landmark’s Laura Schwinn, industry has to do more with fewer people: the twin challenges of a declining workforce and increasingly complex reservoirs are forcing productivity gains. In 1960 there were 1.6 million oil and gas workers; by 2020 there will be a mere 100,000, implying that 7% annual productivity growth will be required to keep pace. Schwinn’s infomercial touched on Flare’s catalogue and Landmark’s DecisionSpace Portal. Schwinn cited a couple of data management war stories—one from the North Sea involved a well collision due to a mis-identified well trajectory.
Eldar Bjørge provided an update on Statoil’s ongoing data improvement effort. Statoil’s data management system tags approved data with quality control and context information (QCC) before storage in the corporate ‘results’ database (CDB). Bjørge warns that it is hard to strike a balance between capturing many attributes and keeping capture easy. Compliance metrics show the system is working: in June 2003 some 5% of picks conformed to the standard; by December 2003 this had risen to 92%—‘quite an amazing change’. Field and prestack seismic data is stored in an offline tape archive. Raw data is stored in PetroBank and interpretations in the CDB. Statoil has also implemented ‘SPV’—a low-cost tool for seismic volume archival on an IBM TSM robot. The system is now being rolled out to Statoil’s Global Exploration unit.
Ibrahim Al-Ghamdi explained how Saudi Aramco is digitizing legacy datasets for improved accessibility and applying quality methods to improve the data capture process. Al-Ghamdi’s presentation emphasized Saudi Aramco’s focus on data management and data quality. Aramco’s paper seismic archive is being scanned to TIFF in a ‘push for accessibility’ and to eliminate ‘muda’ or waste. Aramco is assigning significant resources to data cleanup—by addressing the sources of data capture problems. Al-Ghamdi insists that ‘fix a data point and you have just fixed that point—fix a process and you have eliminated future problems’. Al-Ghamdi advocates dual data entry. This may be expensive, but is recommended as increasing data quality and to identify problems with individual data clerks who can be coached or moved on to other tasks. Quality stems from the identification of people errors and process errors – one should ‘map and challenge’ processes constantly.
Glenn Mansfield (Flare) and Pete Paragreen (Centrica) presented Flare’s ‘Raptor’ system built to capture operations and production data from Centrica’s gas storage facility. Centrica Storage operates the North Sea Rough Field which holds around 76% of the UK’s storage capacity—around 10% of peak demand. Raptor is a knowledge-information-data (KID) store with embedded workflow. Users log on to see what they have to do next or to drill down for complex queries. A leak management reporting and tracking system was embedded in Raptor as part of the UK HSE/UKOOA drive to reduce hydrocarbon release.
Thierry Gregorius reported on Shell’s global ESRI-based GIS infrastructure, SAP Enterprise Portal and a new Microsoft .NET development standard. Gregorius cited Waldo Tobler’s law—‘everything is related to everything else, but near things are more related than distant things’. Shell’s GIS professionals form a GIS Technical Advisory Panel (TAP) and are involved throughout the E&P data lifecycle. Example usage includes permit maps, spill prediction, environmental protection and forecasts of exploration success. Raster images get attention: satellite imagery can be superimposed on a structural geological interpretation. Many of Shell’s explorationists have become expert GIS users. Shell’s rationalization and new worldwide organization, steered by HQ, has brought global standards and a global infrastructure. Three IT super centers (USA, Holland and Malaysia) support Windows 2000 desktops with Microsoft .NET as the development standard. This is ‘quite a change in mindset’ for many in Shell. GIS layers are hyperlinked to documents in the DMS through the metadata. A Shell global UID system was described as ‘work in progress’. A GIS-enabled web front end lets users mine data from multiple databases—in geology, reservoir engineering etc. This works across the SAP Portal and public data sources. Shell is in the process of designing its spatial data infrastructure to offer ‘joined-up’ GIS. Gregorius reports that the IT infrastructure is easy—but the combined catalogues are hard to deploy.
Helen DeBeer explained how EnCana is linking structured data in corporate databases to unstructured data in document management systems. The idea is simple – to use data from structured databases to make pick lists which are used to classify documents. EnCana uses this technique to link its Seitel EDM seismic data management system (structured data) with unstructured data in Open Text’s LiveLink Document Management System and in-house developed Oracle datastores. EnCana has built such systems for tracking exploration opportunities, for managing IT projects and to build a seismic survey/navigation and inventory database. In all cases the same philosophy is used. Metadata management is the key, adding industrial strength search to LiveLink’s limited capabilities.
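A hedged sketch of the pick-list technique DeBeer describes, with hypothetical table and field names: the structured database supplies a controlled vocabulary, and documents are only accepted into the DMS with a value drawn from that list, so unstructured content can always be joined back to a structured record.

```python
# Rows as they might come from a structured seismic master database.
# All identifiers here are invented for illustration.
surveys = [
    {"survey_id": "NS-2001-3D", "area": "North Sea"},
    {"survey_id": "GOM-1999-2D", "area": "Gulf of Mexico"},
]

# The pick list: the controlled vocabulary offered to users when they
# classify a document in the document management system.
pick_list = {row["survey_id"] for row in surveys}

def classify(document_metadata):
    """Accept a document only if its survey tag is on the pick list,
    guaranteeing it can be joined to the structured database later."""
    tag = document_metadata.get("survey_id")
    if tag not in pick_list:
        raise ValueError(f"unknown survey id: {tag!r}")
    return tag

doc = {"title": "Acquisition report", "survey_id": "NS-2001-3D"}
linked_to = classify(doc)
```

The value of the constraint is that free-text classification never drifts away from the structured metadata, which is what makes the ‘industrial strength search’ possible.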
ENI has built web portals for technical users (Landmark’s Team WorkSpace) and knowledge workers (SAP Enterprise Portal). Antonio Carlini described how the portals hide infrastructure complexity from users and offer worldwide access to IT facilities in Milan and Houston. Screen sharing between two remote workers has proved very popular. A ‘semantic search engine’ powered by Invention Machine’s Cobrain has proved ‘very powerful’. Like Statoil, ENI uses Schlumberger’s results DB to store project summaries. The portal has been linked to an external Finder database and a warehouse outside of Milan. Some 21 ‘workflows’ have been developed and are to be deployed ‘massively’. The keys to project success: leave data where it is, no ‘revolutions’ and a modular architecture.
Another SMi ‘regular’, Duncan McKay, updated attendees on ConocoPhillips’ digital data room preparation. ConocoPhillips is involved in a major post merger disposal program focusing on non core assets (low revenue, high G&A maintenance). The North Sea unit continues to develop its scanning workflows and was able to create a dataset of 26,000 files, 14GB of data in a record 2 ½ months for a recent disposal. The intent is to produce a polished product, ‘it’s a sales job after all’. The digital data room is a facet of ConocoPhillips’ CROP initiative. The campaign for reduced paper is not ‘don’t use paper’ just ‘don’t store paper’.
Database consultant Niall Young has been involved in two UK DTI-sponsored projects: Vantage, an offshore ‘passport’ and digital training record, and EEMS, a system for reporting environmental emissions. The LOGIC Vantage People on Board (POB) system experienced poor take-up. The Environmental Emissions Monitoring System (EEMS) is a 12-year-old system to which all companies are required to submit returns. It is to be web-enabled and linked with the DTI Oil Portal as part of the new digital signature initiative, using (or planning to use) SOAP, web services, XML Schemas and Oracle’s XML database extension ‘XDB’.
This article has been abstracted from an 11 page report produced as part of The Data Room’s Technology Watch report service. If you would like to evaluate this service, please contact firstname.lastname@example.org.
* Things never change.
Thierry Pilenko has been appointed chairman and CEO of Veritas DGC. Pilenko was previously MD of SchlumbergerSema.
Kuwait-based Petroleum Services Company (PSC) will own and operate AspenTech’s new Middle East unit ATME.
Phil Beale is director of Ødegaard’s new office in Kuala Lumpur.
Shengyu Wu has been named Director of Geoscience for C&C Reservoirs. Wu was previously VP of Exploration with Total’s US unit.
Mike Fleming has joined Geotrace as Area Manager, Middle East, based in Cairo. He was previously with Western Geophysical.
UKOOA will publish a handbook on the disposal of exploration data later this year and PILOT will publish a handbook on the archiving process.
Trade-Ranger has appointed Amadeo Cazzalini as Integration & WebMethods Portal Administrator.
Baker Hughes has appointed James Clark as President and Chief Operating Officer.
Energypromotion.net has just released a report on Information Technology for Energy Managers.
The EU Oil and Gas Directory has just been launched by UK-based First Point Assessment and Norwegian Achilles JQS.
Landmark’s UK data hosting center, which includes Common Data Access (CDA) UKCS Well Data, has received ISO 9001:2000 certification following a four-day audit by Det Norske Veritas.
Speaking at CERA Week, Shell MD Malcolm Brinded said that gas will overtake oil as the primary fuel by the mid-2020s.
Informative Graphics has just released a free TIFF viewer—see www.infograph.com.
Statoil now considers Stavanger-based Petec’s Dynaflodrill (DfD) mission-critical for its underbalanced drilling (UBD) program on the Gullfaks field. DfD, part of Petec’s Drillbench suite, is used for the design of underbalanced operations and enables engineers to investigate static and dynamic problems related to UBD operations.
Statoil UBD operations manager Johan Eck-Olsen said, ‘Following a successful qualification period, where the accuracy of the model was tested by comparing simulated results with actual UBD well data, Dynaflodrill has been a very valuable tool in planning and training for the UBD drilling campaign on Gullfaks.’
UBD reduces lost circulation, minimizes the risk of differential sticking and increases bit life and penetration rates. It can also avoid formation damage, increase well productivity and may eliminate the need for stimulation.
Statoil is to perform Norway’s first UBD operation on the Gullfaks C-5 well later this year. UBD was selected to overcome the frequent well control problems encountered in previous Gullfaks wells because of the low margins between pore and fracture pressures.
Planning and preparing for an offshore UBD operation is more complicated than for a traditional overbalanced well. Statoil has a strong focus on HSE and, as this is the first UBD operation in Norway, has also had to convince the Norwegian Petroleum Directorate that the operations can be performed with an acceptable level of safety.
Several Drillbench applications have been used in the project. Steadyflodrill and Dynaflodrill were used to design the UBD program and to train team members. The dynamic surge and swab module in Presmod was used to design the tripping procedures to avoid exceeding the operational pressure window during tripping.
A new web interface to CGG’s PetroVision provides secure remote access to data stored in PetroVision from anywhere in the world. Users can select, view, obtain or request delivery of any data to which they have been granted access.
Data types including wells and seismic lines are accessible through a GIS interface. Office Automation documents (Word, Excel and Acrobat) can be managed from the interface and filtered, sorted, searched and downloaded to the desktop.
Large files can be retrieved by authorized users via FTP where access rights exist. Viewers are provided for GIF, JPEG and TIFF graphics and for industry-specific formats including LAS, LIS, DLIS, SEGY and RODE. Files can be emailed to authorized recipients. PetroVision is used by Total and by several clients in Russia.
In the introduction, Mike Economides makes a bold claim for data mining in oil and gas. For Economides, mining of the huge datasets generated by exploration and production is a new way of ‘matching real with predicted data’, alongside traditional numerical analysis and numerical simulation. Oil and Gas Data Mining* deals mainly with three techniques: neural networks (NN), self-organizing maps (SOM) and genetic algorithms. The presentation is pitched at a level that lets non-specialists grasp the basic concepts and is written in a clear style with a straightforward presentation of the math.
The treatment of mainstream data mining, with SQL and OLAP, is a bit skimpy, a shame in view of the plethora of tools available for slicing and dicing hypercubes of data. But the book scores with excellent case studies. The first covers the use of multiple linear regression, SOM and NN to establish a permeability model over a large Middle East oilfield. The second compares techniques for selecting wells to stimulate on the Wattenberg field in Colorado. Five different artificial intelligence methods were compared with a type curve analysis. Intriguingly, the study shows that the different methodologies threw up different stimulation candidates.
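To give non-specialist readers a flavor of one of the techniques the book covers, here is a minimal self-organizing map in Python/NumPy. This is an illustrative sketch of the general SOM algorithm, not code from the book; the grid size, decay schedules and synthetic ‘log response’ data are all our own assumptions.

```python
import numpy as np

def train_som(data, grid=(5, 5), epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organizing map on the rows of `data`."""
    rng = np.random.default_rng(seed)
    gx, gy = grid
    nodes = rng.random((gx * gy, data.shape[1]))           # node weight vectors
    coords = np.array([(i, j) for i in range(gx) for j in range(gy)], float)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                        # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5            # shrinking neighborhood
        for x in data:
            bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))   # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)    # grid distance to BMU
            h = np.exp(-d2 / (2 * sigma ** 2))                # neighborhood function
            nodes += lr * h[:, None] * (x - nodes)            # pull nodes toward sample
    return nodes

# Cluster synthetic two-attribute 'log responses' into facies-like groups
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(m, 0.05, (30, 2)) for m in (0.2, 0.8)])
som = train_som(data)
```

After training, each input sample maps to its best-matching node, and nearby nodes on the grid represent similar samples, which is what makes SOMs useful for grouping well log responses into facies.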
A third way?
The book is a great introduction to a variety of new techniques, but is data mining really a ‘third way’ for analysis? This reviewer sees these techniques as contributing to a bewildering armory of methods from analysis through simulation to statistics. Most approaches will likely combine elements of some or all of these into the estimation process.
* Data Mining Applications in the Petroleum Industry. Zangl and Hannerer, Round Oak 2003. ISBN 0-9677248-1-3.
Advanced Visual Systems has helped IGM Ltd. develop GeoExpress, a new 3D decision support graphics package. IGM was formed in 1999 to develop solutions for the analysis and presentation of complex datasets from a variety of sources. GeoExpress is its first commercial product.
According to Frank Arnott, CTO with IGM, ‘AVS/Express has enabled us to provide our users with an exceptional degree of insight to the complexities of earth science data. Its powerful development methodology, rich visualization techniques and dynamic performance on the PC platform have provided IGM with a solid foundation for our product strategy.’
GeoExpress supports interactive visualization, research and project analysis of geological, geochemical and geophysical data. Users can interactively explore a wide range of domain-specific data formats on all Microsoft Windows platforms. AVS/Express enables the development of highly functional data readers and writers, allowing seamless data handling and eliminating the need for the creation and management of secondary databases.
Spotfire has announced a global reseller agreement with Landmark Graphics for its DecisionSite data analysis and visualization package (see Oil ITJ Vol 8 N° 12 for review of Spotfire DecisionSite use in the upstream). The deal allows Landmark to embed Spotfire’s analytic application within its DecisionSpace Decision Management System (DMS).
Spotfire has also announced a new product for oil and gas professionals. DecisionSite MapConnect is an extension to ESRI ArcMap that lets users merge GIS and tabular data, verify data quality, identify important trends, and provide ‘compelling’ visualizations.
John Lyle, team leader, geological and geophysical interpretation services for ChevronTexaco Energy Technology Company said, ‘MapConnect speeds up our identification and high grading of prospective areas, by letting us interactively filter results in several attribute and spatial dimensions simultaneously.’
DecisionSite MapConnect combines ESRI map data with E&P project data so analysts can uncover hidden relationships, trends and patterns. DecisionSite and MapConnect can be rapidly configured to support applications that help exploration teams decide how to allocate capital among prospective wells and oilfield assets, including prospect assessment and validation, environmental impact analysis, and pipeline security and safety management.
Gerrit Louwaars, senior geoscientist global exploration, Shell, added, ‘MapConnect opens new ways of analyzing and visualizing spatially distributed data often faster than you can think, leaving its potential entirely to the user’s
Writing in Trade-Ranger’s 2003 year-end report, president John Wilson outlined plans to improve the quality and efficiency of current offerings and to extend services and solutions with eInvoicing and eSourcing applications.
Jean-Pierre Foehn, VP Operations, added that the company has signed 650 new suppliers to the marketplace, bringing the total to nearly 2,000. In December, document transactions rose to more than 55,600, for a 2003 total of 425,000, more than half of which were purchase orders. By the end of 2003, the number of stock keeping units (SKUs) in the database had risen to over 600,000.
All of the company’s major releases in 2004 will be officially branded under the Trade-Ranger Universal Environment (TRUE) including flagship applications TRUE Procurement (TRIP) and TRUE Order Management (TROM).
Sebastian Gass, VP of Technology, added, ‘We now handle about 1,600 invoices per month through TROM and over 550 through ERP-to-ERP integration. We forecast a total invoice volume of 5.25 million per year by 2006. The potential cost savings on the buy-side could be as much as $8.00 per invoice, or nearly $50 million per year.’
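The quoted savings figure can be sanity-checked with simple arithmetic, taking the forecast volume and per-invoice saving above at face value: 5.25 million invoices at $8.00 each comes to $42 million a year, the same order of magnitude as the ‘nearly $50 million’ quoted.

```python
invoices_per_year = 5_250_000      # Trade-Ranger's forecast volume by 2006
saving_per_invoice = 8.00          # upper-bound buy-side saving, in dollars
annual_saving = invoices_per_year * saving_per_invoice
print(f"${annual_saving:,.0f}")    # $42,000,000
```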
Repsol YPF has implemented the first Trade-Ranger solution in Latin America in its Brazilian subsidiary. The deployment is Repsol’s first eProcurement project that leverages Trade-Ranger’s hosted applications.
Repsol YPF Brazil’s Purchasing and Contracts Department procures materials, assets and services (except raw materials), including tanks, gauge systems and other equipment for service stations, IT materials and office supplies, issuing on average some 450 purchase orders per month worth $3 million.
Houston-based business integration specialist ZettaWorks is to offer a decision support service to companies considering deploying an Enterprise Application Integration (EAI) solution. ZettaWorks’ five-step process helps clients determine the EAI solution that best meets their needs and budget while reducing the risks of integration and solution choice.
ZettaWorks claims to be independent of EAI vendors and offers ‘unbiased’ assessment of clients’ current architectures, suggests vendor-neutral solutions, validates vendor proof of concepts and costs.
ZettaWorks VP of Technology, Eric Roch said, ‘We have leveraged the experience gained through our partnerships with leading EAI vendors to develop an extensive knowledge base of selection criteria, comparative product features and strengths. The result is a selection process driven by our customers’ business objectives and ROI and based upon proven technology that matches our business needs and technical architecture.’
‘The EAI market segment has all of the classic characteristics of emerging technology: media hype, wild vendor claims, and soon-to-be obsolete products. Yet there are mature technologies available that have solid ROIs and are backed by reputable vendors.’
A toolkit has been developed to lead clients to the most appropriate solution based on their specific needs. Components of the toolkit include documentation templates, evaluation matrices and an ROI calculator. ZettaWorks’ offering is available as a five-step product selection process or as a package tailored to a client’s needs.
EPG Companies has installed its E-Wave radio telemetry system to monitor 24 Pemex remote natural gas production sites in rural Villahermosa, Mexico. The system provides continuous, real-time production data and site status monitoring. Data includes American Gas Association (AGA-3) calculations from the flow computer PLC.
E-Wave is a wireless SCADA device that uses 900MHz frequency hopping, spread spectrum radio technology. Any number of E-Wave transmitters can work side by side with little chance of interference with each other or with other sources.
The economical, labor-saving telemetry system also performs shutdowns and generates reports from a single master location, and uses an open architecture protocol to accommodate future upgrades and expansion. There have been no system problems related to the telemetry equipment, and additional stations and systems, including over 250 RTUs, have been added since the original installation. EPG telemetry systems are designed to provide complete control over any field operation, from monitoring vital signals, levels and temperatures to performing control functions and complex calculations.
Chevron Technology Ventures, along with other venture capital firms and institutional investors, is to put up $6.5 million in Series B funding for MetaCarta, the geographic document search specialist. MetaCarta combines textual place name search with a geographic information system (GIS) to provide an innovative way of accessing large document databases (see Oil ITJ Vol. 8 N° 3).
MetaCarta’s flagship product, Geographic Text Search (GTS), uses natural language processing to identify implied and explicit references to geographic locations within documents. The solution tags documents with the appropriate latitude and longitude coordinates and then lets users search for tagged documents through a graphical user interface, combining geographic location and keywords.
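The underlying idea of geographic text search can be sketched in a few lines of Python. This is emphatically not MetaCarta’s implementation (which handles implied and ambiguous place references via natural language processing); the gazetteer, function names and bounding-box query below are all illustrative assumptions.

```python
# Hypothetical gazetteer: place name -> (latitude, longitude)
GAZETTEER = {"houston": (29.76, -95.37), "aberdeen": (57.15, -2.09), "lagos": (6.45, 3.39)}

def geotag(text):
    """Return coordinates for each gazetteer place name found in the text."""
    words = text.lower().replace(",", " ").split()
    return [GAZETTEER[w] for w in words if w in GAZETTEER]

def search(docs, keyword, bbox):
    """Return docs containing `keyword` with a tag inside `bbox` = (lat0, lat1, lon0, lon1)."""
    lat0, lat1, lon0, lon1 = bbox
    hits = []
    for doc in docs:
        in_box = any(lat0 <= la <= lat1 and lon0 <= lo <= lon1 for la, lo in geotag(doc))
        if in_box and keyword.lower() in doc.lower():
            hits.append(doc)
    return hits

docs = ["Seismic survey planned offshore Lagos", "New data center in Houston opens"]
print(search(docs, "seismic", (0, 10, 0, 10)))  # ['Seismic survey planned offshore Lagos']
```

Combining the spatial filter with the keyword filter is what distinguishes this from a plain full-text search: a query for ‘seismic’ in the Gulf of Guinea ignores otherwise-matching documents about Houston.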
Independently of the equity investment, ChevronTexaco is a MetaCarta customer and uses the product at several locations around the world. MetaCarta reports other users in the intelligence and defence sectors. The new funding will be used to expand the product line, and to explore new market segments.
MetaCarta founder John Frank said, ‘We continue to grow upon early successes. Our company’s vision, and our employees’ effort and expertise continues to drive our growing recognition and product acceptance within government and energy markets. This latest investment is a testament to our organization’s ability to deliver valuable solutions in support of mission critical customer needs.’
Chevron Technology Ventures LLC, a unit of ChevronTexaco Corporation, is involved with identifying and investing in new and emerging technologies that have the potential to enhance the performance of ChevronTexaco businesses and lead to new growth opportunities. Based in San Ramon, Calif., ChevronTexaco is the second largest U.S.-based energy company and the fifth largest in the world, based on market capitalization. Co-investors in this round of funding are Sevin Rosen Funds, Solstice Capital and Chisholm Private Capital Partners.
Total has awarded Aberdeen-based SAP consultancy, Absoft, a contract for an SAP Business Information Warehouse (BW) deployment in its Indonesia, Congo, Gabon, Angola and Nigeria units.
Total’s Christian Placines said, ‘We are implementing BW alongside SAP in all our operations. Within two years, we will be in a position to consolidate reporting across all units against standard data structures. This will provide a springboard to generate additional savings through easier consolidation and analysis in all areas’.
Development of the module for 2,000 users in these countries is being carried out in Total’s head offices in Paris where Absoft already has a large team working on other SAP projects. Absoft has been working with Total since 1991.
BW supplies the infrastructure typical of data warehouses, but also includes preconfigured data extractors, analysis and reporting tools, and business process models. Other BW features include Business Application Programming Interfaces that enable connections to non-R/3 applications, preconfigured business content, an integrated OLAP processor, automated data extraction and loading routines, a metadata repository, administrative tools, multiple language support and Business Explorer, a web-based user interface. SAP Business Warehouse is an integral component of the company’s mySAP Business Intelligence product group.
Placines added, ‘We chose Absoft for this project because of its experience of both BW and SAP for the offshore industry. The company has repeatedly proved itself on major projects around the world and we can be confident that it will deliver on time and within budget.’ Absoft is involved in two SAP deployments for Total—UNISUP for larger subsidiaries and SALSA for smaller international affiliates.
Retail gas supplier Centrica has implemented its ‘Essential’ portal using Plumtree’s Enterprise Web Suite. The Essential portal allows Centrica’s businesses to access vital product and customer information and serves as a platform for knowledge sharing and collaboration. Centrica plans to extend Essential to its global business operations and its mobile workforce over the next couple of years. Estimated savings of $3.7 million per year should result from the replacement of paper-based information distribution in Centrica’s call centers.
The American Productivity and Quality Center (APQC) has just published a guide to ‘Continuous Improvement in the Energy Industry’. The guide is designed to help companies navigate the current business landscape of increasing globalization, corporate downsizing and outsourcing, and to help them evolve, improve and stay ahead of the competition.
Using its own benchmarking methodology, the APQC has researched practices critical to the corporate world and produced a catalog of case studies from the world’s leading organizations. These include British Petroleum, PECO, Halliburton, Chevron, Florida Power and Light, and Edison International.
The CD-ROM includes 34 profiles and executive summaries of the benchmarking study’s participants, key findings, research methodology and study focus. The study targets companies interested in the energy industry’s continuous improvement efforts and covers industry-wide trends and best practices across a spectrum of continuous improvement arenas, from knowledge management and performance measures to competitive intelligence and online training.
Calgary-based electroBusiness (EB) is providing data validation services for ConocoPhillips Canada (CPC). eBvalidation is a managed solution that enables CPC’s vendors to verify the accuracy of the accounting codes used on CPC’s invoices. Once verified, invoices can proceed directly to the payment approval process.
Judy Smith, e-Business manager with CPC said, ‘EB provides an efficient means for our vendors to comply with our electronic document requirements. Data validation has greatly reduced the back-and-forth communication and re-work required whenever incorrect data is submitted.’
EB has also launched a secure electronic document exchange service to complement its e-business offering. eBgateway uses SOAP-based web services to communicate with the e-Business Utility host. A Secure Sockets Layer (SSL) protocol encrypts all data with a 128-bit key.
The e-Business Utility uses Lightweight Directory Access Protocol (LDAP) to authenticate users and manage access rights and document routing. LDAP is based on the OSI X.500 directory architecture standard, simplified and adapted to TCP/IP protocol. The major software companies have embraced LDAP.
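To illustrate the role LDAP plays here, the sketch below mimics a directory keyed by LDAP-style distinguished names, with a simple-bind-style credential check and a document-routing lookup. This is a toy in-memory stand-in of our own devising, not electroBusiness code; a real deployment would bind to an actual LDAP server over TCP/IP.

```python
# Toy directory keyed by LDAP-style distinguished names (illustrative only).
DIRECTORY = {
    "uid=jsmith,ou=vendors,dc=example,dc=com": {
        "password": "s3cret",
        "routes": ["invoices"],           # document types this user may receive
    },
}

def authenticate(dn, password):
    """Check credentials for `dn`, analogous to an LDAP simple bind."""
    entry = DIRECTORY.get(dn)
    return entry is not None and entry["password"] == password

def may_route(dn, doc_type):
    """Check whether documents of `doc_type` may be routed to `dn`."""
    return doc_type in DIRECTORY.get(dn, {}).get("routes", [])

dn = "uid=jsmith,ou=vendors,dc=example,dc=com"
print(authenticate(dn, "s3cret"), may_route(dn, "invoices"))
```

The value of standardizing on LDAP is that the same directory answers both questions, who a user is and what they may receive, for every application in the exchange.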
Cal Fairbanks, CEO electroBusiness, added, ‘We are providing a highly effective solution for any scenario where information must be shared securely across company boundaries or locations. This has broad application in industry and in any business activity that requires communication with individuals and groups from different organizations. We are scoring a double-win with our client’s business leaders and IT department heads alike, because we are enabling them to easily conduct business with their partners while at the same time ensuring that high security standards are upheld.’
High-end supercomputer, storage and visualization specialist SGI reported on recent oil and gas deployments in its second fiscal quarter results. British Gas Exploration has acquired a four-processor Silicon Graphics Onyx4 UltimateVision graphics supercomputer with 12GB of system memory and four graphics pipes. The system was chosen in part for its ability to accelerate complex subsurface visualizations using Inside Reality software.
Schlumberger has also acquired an SGI box for its Cambridge (UK) Research unit. The system, another Onyx4, will be used to develop 3D well planning and seismic volume interpretation software.
SGI’s results for the quarter ending December 26, 2003 showed GAAP revenue at $237.9 million (up from $218.0 million for the previous quarter) and gross margin of 46.7% (up from 43.4%).
SGI chairman and CEO Bob Bishop said, ‘This was a quarter of significant progress for the company. The completion of our bond exchange offer, coupled with solid operational performance and the recently announced Altix 350 mid-range server has investors, customers and partners taking a fresh look at the company. We’re confident, focused and ready to deliver value to all of our stakeholders.’