Speaking at the Trade-Ranger 2004 Community Conference last month, Total’s CIO, Philippe Chalon, revealed that 59% of Total’s IT budget goes on applications. Of this, 14% goes on hardware, 14% on software, 29% on OPEX and the remaining 43% on ‘integration.’
Chalon is not happy with this state of affairs, and points out that such excessive integration spend is not only a loss to end users, but brings little bottom-line benefit to application vendors—in particular, SAP.
One problem is that software generally does not cover all of the user’s business needs and requires customizing before roll-out. Another issue is that there is no stable interoperability platform available today. Chalon does not believe in a ‘single vendor solution,’ and prefers a ‘platform for best of breed applications’. Standards are key—these can be technical or just catalogs. Security is a big issue for Total, as is performance. A one second latency in a satellite link can be detrimental to integration and application connectivity.
Short cycle releases
Compounding these issues is the fact that the vendor’s business model is one of ‘short cycle releases and constant change’. This is where the absence of a solid IT architecture is felt most. ‘For things to change, you need a solid architecture at every level.’
New business model
Chalon advocates a change in the IT business model—suggesting that suppliers should include the cost of integrating their applications into client IT systems. Such issues need to be addressed for each new release, and should include any required data migration. Software installation and integration of all pre-approved connected applications should also be a part of the service package.
More for SAP
Chalon pointed out that this approach would mean that companies like Total would be paying less to the systems integrators, but more to application vendors.
Chalon is wary of SAP which is ‘very powerful’ and comparable to IBM twenty years ago, or to Microsoft today. Chalon pleaded with the SAP representative Usman Sheikh to ‘stop growing your perimeter, you can’t be the best in all fields’. Sheikh refused to apologize for SAP’s growth or the company’s aggressive release cycle, but promised to ‘ease the pain’ of upgrades.
See our report from the Trade Ranger conference in this issue.
Marathon has awarded SAIC an outsourcing contract to provide IT infrastructure services, upstream applications development and maintenance. In a seven year, ‘multimillion dollar’ agreement, SAIC assumes responsibility for service desk and client-side support for 12,000 users and 10,000 desktops. Global network, security and infrastructure support are also in the deal.
Marathon CIO Tom Sneed said, ‘SAIC offers the expertise and a broad knowledge base that will let us focus on our core businesses.’ SAIC’s Randy Walker added, ‘We look forward to applying SAIC’s oil industry experience, as well as our delivery expertise and methodologies to help Marathon achieve its business objectives.’
Last year, Marathon awarded EDS an eight-year, $63 million outsourcing services agreement to improve its computing capabilities and consolidate its server environment.
Before this month’s editorial, a word about our sponsors. First, a big thanks to our renewing sponsors for the 2004/5 period. In alphabetical order, these are:
Foster Findlay Associates
Venture Information Management
Next, a special welcome to our new sponsors,
By supporting our website and online editions, these companies demonstrate their commitment to oil and gas information technology. Next time you are on www.oilit.com make sure you visit their websites too.
As you are probably aware, our parent company, The Data Room, has been watching oil and gas information technology evolve for nearly a decade. Our coverage spans upstream, pipeline, GIS, construction and also ‘generic,’ horizontal IT—both in the commercial arena and in the standards sector.
This year has been a particularly enriching experience for us. We have renewed our contacts with the plant data management sector—see our report from the USPI-NL meeting on page 5 of this issue. E-commerce has likewise come back into the spotlight with a report from the Trade Ranger Community Conference (page 8). Earlier this year we also attended the World Wide Web Consortium plenary meeting and reported on the activity of the Semantic Web special interest group.
At the same time, our regular coverage of industry trade shows such as the AAPG (and later this year, the SEG and SPE), along with specialized vendor and data management shows, gives us a ‘unique’ viewpoint on the industry. ‘Unique’ is a word we always remove from press releases on the grounds that it is unlikely ever to be true. But our technology coverage is demonstrably unique. Why? Because no other organization attends such a broad cross section of the oil vertical’s conferences. Even internal company attendees usually come from one particular segment; people rarely look over the silo wall.
Out of depth?
Now I use the phrase ‘unique viewpoint’ advisedly. We don’t claim omniscience. We do not pretend to have in-depth knowledge of every field. Indeed, whether we are at the W3C, visiting with the Trade Ranger community, or trying to make sense of ISO 15926, 13584 and PI-STEP, we are nearly always out of our depth.
At first—and I mean the first year or two of watching technology—this was a scary experience. I remember arriving at the old Plant Information Management show back in 1999 and wondering just what I had let myself in for! It seemed incomprehensible and off-topic, and it took a considerable effort of mind to stick with the esoteric jargon.
Now I look forward to arriving in a new community and trying to find out what’s going on. To take the temperature of a community. To see if they are heavy duty IT techies (like for instance the business object aficionados of yore), or if they are domain specialists—to whom a ‘standard’ may just be a Microsoft Word document packed with information.
One thing that has forced us into the deep end is the continued emphasis that the industry puts on ‘breaking down the silo barriers’. Talk of the e-field, of simultaneous computer modeling of subsurface and facilities, of GIS as an enabler and other horizontal solutions all means that coverage should be as wide as possible.
But getting back to that unique if somewhat blurry viewpoint, I’d like to try and tie together a few loose ends from our coverage of the last few months—if not of the last few years. To a degree, the ‘techies vs. domain specialists’ dichotomy reflects a differing focus on container or content. For instance, are we to standardize the data model, or the well name list? It can be hard to understand just what each community is trying to achieve—often I’m not sure they have really decided themselves!
Which leads me to the commonality I have observed in just about all of the meetings I have attended for the last year or so—the commonality of the catalog. Everyone, from geologists through e-marketers to construction engineers, wants to use standard lists of names. There has been a distinct shift in emphasis from container to content.
But is it good enough to embed a standard catalog in a proprietary tool? Or even in an ASCII file, Lotus Notes or a Microsoft Word document? For cross-silo information sharing, we should be looking to the catalog container as well. Somewhere in the W3C’s plethora of sub-domains (OWL, RDF and other XML technologies) there must be a way to do this. But even the W3C has a problem with its own silos. There are just too many ways to skin the ‘standard container’ cat! This is a great frustration for the subject matter hoppers that we are. If only we could point those domain specialists in the direction of a single, straightforward cataloging technology. Along with units of measure, naturally.
Oil ITJ—How did you get started in the cluster business?
Hutchings—We sold our first cluster, already running on Linux, to the US government in 1997. Our first commercial sale was to Brookhaven National Labs. The company started in 1997 as a board manufacturer. Then our customers asked to have boards assembled, and the cluster business was born. Clusters are not just a bunch of PCs; managing them requires special skills. Our ClusterWorx package provides a complete management solution for clusters including system admin, application management, job scheduling, resource management and prioritization. All of which greatly increases levels of use. Cluster management has been the key to successful migration from big iron. For oil and gas this means better price performance for clients like GX Technology and Shell’s Rijswijk R&D unit.
Oil ITJ—What about companies like CGG which just uses ‘commodity’ Dell boxes for their clusters?
Hutchings—Companies need to take a step back and see how much managing clusters is costing them. For most computational users, the value is created through a relationship with an independent software vendor (like Schlumberger, Landmark). We can then think about creating the best infrastructure to run clients’ apps.
Oil ITJ—So clusters are not commodity hardware?
Hutchings—You should ask those who buy clusters. Both ExxonMobil and ChevronTexaco let their system administrators run MPI and were disappointed with the results. They should have bought a managed system.
Oil ITJ—What of those companies who have made ‘skunk work’ clusters – buying boxes on eBay?
Hutchings—Yes, DaimlerChrysler made a grid out of 12 laptops. But how does that help their business? Today even early adopters should be concerned with total cost of ownership, productivity etc. We test our systems and code in Salt Lake City before delivery, so they run out of the box. One system, the sixth fastest in the world, with 2,800 Opterons and a high speed interconnect, was installed at Los Alamos and went from delivery to doing science in six days.
Oil ITJ—How do you count the number of processors in a cluster?
Hutchings—It’s defined by the scalability of an application. At 80-90 CPUs most systems max out as CPUs don’t have enough to do.
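Hutchings’ rule of thumb is essentially Amdahl’s law: if any fraction of an application stays serial, adding CPUs eventually buys almost nothing. A back-of-envelope Python sketch (the 99% parallel figure is our illustrative assumption, not Hutchings’):

```python
def speedup(n_cpus, parallel_fraction):
    """Amdahl's law: overall speedup on n_cpus when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cpus)

# Even with 99% of the work parallel, returns diminish quickly:
for n in (16, 64, 128):
    print(f"{n:4d} CPUs -> {speedup(n, 0.99):5.1f}x")
```

With 99% of the work parallel, 64 CPUs deliver roughly a 39x speedup but 128 CPUs only about 56x; the serial 1% caps the curve regardless of cluster size, which is why most applications ‘max out’ well before the hardware does.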
Oil ITJ—What about the new ‘on-demand’ computing paradigm?
Hutchings—That’s what we are doing at Los Alamos. Some jobs run on 2,800 CPUs, some on 16. There is a shift in focus from the number of CPUs/$ to productivity/$. Interconnect is also crucial. Oil and gas often uses Gigabit or high-speed Ethernet but there are much faster options: Quadrics or Infiniband. Multicast software upgrades allow you to radically change configurations according to workload. This resource/node allocation lets you do your re-engineering at night and GigaViz interactivity during the day. The new pre-stack focus represents great potential for cluster use. Fraunhofer is also delivering cluster-based visualization with PV-4D.
Oil ITJ—What happened to the Itanium?
Hutchings—The AMD Opteron, which has great memory access, has grabbed the Itanium’s market. Intel has reacted and is now offering a 64 bit Xeon. The Itanium may take off in 12-24 months. Incidentally, there will be a 2,200 node Xeon 64 at the Department of Defense in September.
Oil ITJ—Where’s cluster-based computing heading now?
Hutchings—The world is moving back towards central computing and data servers, with distributed clients. The same applies to visualization, where the key is configurability, multi-case provision, scheduling and ease of upgrade.
UK-based Geotrace has been awarded a prestigious contract for the processing of Statoil’s benchmark ‘Q’ survey over the Visund field, Norway. WesternGeco will be shooting the survey using its proprietary Q-Marine technology, with ten 5,000 meter cables in a 675 meter swath. WesternGeco’s Q technology uses single sensor acquisition to record extremely high resolution data.
640 km. sq.
Geotrace is also to re-process and merge two previous data sets along with the new survey. The survey is part of Statoil’s plans to optimize existing fields. The goal of the program is to identify where Statoil should drill to tap additional reserves, and to map existing discoveries in order to ensure profitable tail production.
The 640 square kilometer survey will also establish a baseline for future monitoring of fluid movement in the reservoir. The Visund 4D processing was awarded to Geotrace based on its previous processing and capabilities in the field of time-lapse technology. Unlike more conventional 4D targets, the Visund field is well into its production cycle. The license holders Statoil, Norsk Hydro, Petoro, ConocoPhillips and TFE are to use the time lapse data to ‘monitor changes of physical parameters within the field.’
TGS-Nopec Geophysical Company has concluded its acquisition, for $11.25 million in cash, of the energy division of Houston-based NuTec Sciences. The acquisition provides TGS-Nopec with specialized pre-stack depth imaging technology—to be used to add value to TGS-Nopec’s spec data and to provide imaging services to oil and gas companies.
NuTec has 25 employees and operates what is claimed to be one of the world’s fastest supercomputers. TGS-Nopec will continue NuTec’s third-party seismic processing services and Prima software licensing business which generated an operating profit of $2.5 million from turnover of $10.6 million in 2003.
Included in the deal is NuTec’s Prima seismic processing and integrated multi-volume visualization tool. The visualization module is a commercial application that is currently licensed to over 50 oil and gas companies.
Houston-based Quantum Earth is offering an OpenSpirit link for Matlab (and Mathematica), providing hands-on developers with access to a range of E&P data stores. The bi-directional links have been developed with the Java version of the OpenSpirit development kit and extensive use of the Java environment available within Matlab.
Data items retrieved from OpenWorks, GeoFrame or other supported databases are available in the Matlab workspace as global variables and structures for study and computation before being written back to the database. Currently the link supports the following objects: seismic, horizons, grids, faults, logs and others. Session and project management are also supported and the Quantum link is aware of OpenSpirit data selection, change and cursor events. A phased release envisages support for all items in the OpenSpirit 2.6 table model later in the year.
The 60 or so attendees clever enough to locate the OpendTect breakfast meeting in the labyrinth of the Paris exhibition witnessed growing support for the ‘open source’ seismic interpretation system from de Groot-Bril (dGB). dGB claims over 3,000 downloads of the software in the past seven months.
OpendTect and its source code are free for research and educational purposes. Commercial use is subject to a ‘modest’ maintenance fee. OpendTect now supports multi-volume seismic attribute and neural network analysis, spectral decomposition, reservoir characterization and other applications. It also features multi-platform distributed computing on any combination of Linux, Solaris, Irix and Mac-OS/X. Windows (2000/NT/XP) is currently supported in stand-alone mode.
Statoil reported that its company-wide license for OpendTect is routinely used for fluid migration path studies (chimney cubes) and fault cubes. The software is also popular with Statoil interpreters for seismic facies analysis and for noise suppression by dip-steered median filtering. Statoil has also integrated proprietary technologies for dip steering and neural network-based seismic analysis into the environment. Maersk Oil also showed a fluid migration path study where chimney cube results were linked to seismic facies.
dGB’s GDI inversion technology is being integrated into OpendTect, and plug-ins from vendors including ArkCls and ERM.S are also being developed. A joint industry project, with support from Statoil, BG Group and the Dutch taxpayer, is under way to develop a sequence stratigraphic module.
ArkCls showed how its IDEAL data exchange server (an OpenSpirit clone) is used to access vendor data stores. The company presented its Seismic Spectral Blueing module, which is claimed to maximize resolution by shaping the seismic spectrum to the reflectivity log. The blueing module and ArkCls’ seismic colored inversion are both available as OpendTect plug-ins. Paris-based GeoSole showed how OpendTect could be used to analyze ground penetrating radar data for geotechnical studies, with an example from a survey over the Roland Garros tennis courts!
ERM.S’ Geodesis project uses random geometry theory and morphological segmentation to provide seismic pattern recognition and attribute processing solutions. OpendTect will be used to deliver a Geodesis toolbox for clients.
OpendTect has hit a sweet spot with universities around the world. Several research initiatives are underway investigating coherency (Curtin University), fracture detection (Oklahoma), automatic seismic classification (Tromso), 4D seismic applications (Rio de Janeiro) and others.
USPI-NL, the Dutch Process and Power Industry Association, works on standards for the construction and rehabilitation of oil and gas production facilities, refineries and other plants. On such projects, IT spend can be massive: witness the $8 million IT budget for the Sakhalin revamp project (below). But owner operators consider such spending justified if it helps in project handover from design to construction and finally to operations. Just as upstream focus is shifting from the model to the catalog, the construction business has all but abandoned the complexity of data modeling and is instead focusing on standard reference data for parts. This appears to be coming to fruition in the ISO 15926 register.
Dalip Sud (Shell Global Solutions) welcomed the imminent approval of the ISO 15926 Register, a set of ‘textbook’ references covering activities, equipment and properties. There are currently 12,000 entities – about 25% of the whole, and Shell is ready to ‘test drive’. Integration with internal and external partners should now be possible. USPI-NL has been working on a futuristic ‘next generation scenario’ where everyone, owner operators, engineering prime contractors and vendors, would benefit from referring to a standard. Savings of around 1% of CAPEX have been suggested and 20-25% of IT costs. Standards should reduce legal disputes and generally ‘add value to the supply chain.’
Jean-Jacques Rey (ABB) described how Sakhalin Energy is modifying the MTI platform to support year round operations. The project has a $10 billion price tag of which $8 million is allocated to information management. The main objective is to populate Shell’s information management system with data from a multiplicity of subcontractors. The major part of the project involved data cleansing and checking.
An Information Handover Guide (IHOG) was built as a Microsoft Access database containing ‘classes,’ i.e. functional classifications of physical plant objects. Shell’s ESPIR spare parts module was also used. 30,000 technical documents were stored in the repository, which leveraged Intergraph’s INTools and OpenText’s LiveLink. In conclusion, Rey stated, ‘The quality and value of life cycle information (LCI) for oil, gas and petrochemical projects is growing significantly. But the global cost and effort associated have not been reduced.’ To aid this effort, Rey advocated rapid deployment of the ISO 15926-4 Register along with ‘mutually beneficial business rules for its use.’
Reinoud Slot (iBanx) believes that plant owners don’t know what should (and what should not) be kept in the ‘as built’ model. Discrepancies between ‘as designed’ and ‘as built’ mean that there is a lack of trust in information systems. An ‘as-built’ best practice was developed for BP to help decide what should be kept, why it should be kept and for whom. Partners in the project include Total, AGIP and BP. STEP is used for consistent naming, but its 2,000 document types have been slimmed down to 130. Slot acknowledged that the cost benefit of this kind of work may be hard to justify, but pointed out the high potential cost of error—such as when the piling for a 30,000 tonne storage tank just missed a 6kV electricity cable.
Wolfgang Wilkes of Fern University introduced the ISO parts libraries (PLIB), a set of standards for e-procurement and technical data exchange. PLIB provides a data model for existing data dictionaries. There are many of these including UNSPSC, GHS (hazardous chemicals), PIDX (oil industry), IEDC (electrical), SEMI (electronics) etc. The OIDDI (open and interoperable domain dictionaries initiative) standardizes dictionary usage.
USPI-NL took part in the EU funded study* of a ‘next generation scenario’ for the plant supply chain. The study recognized challenges to the oil, gas and petrochemical industry from excess capacity, global competition, environmental concerns and a ‘graying’ industry. The internet is facilitating globalization and round-the-clock working, but also exposing incumbents to fierce competition. In the long term, demand will mop up some of the excess capacity, but the shrinking EPC marketplace will mean that companies will have to be ‘more innovative and cost efficient.’
*ICT Challenges in the Plant Supply Chain, Netherlands Ministry of Economic Affairs.
Baker Hughes Inc. has sold its Petroleum WorkBench product line to Paradigm. The products will be incorporated into new offerings from Paradigm to support ‘informed production optimization decisions.’ Petroleum WorkBench includes the SimBest black oil simulator, the Comp5 compositional simulator, the Interpret well test tool and WBpvt for fluid characterization.
Paradigm CEO Eldad Weiss said, ‘We are committed to bridging the gap between geoscience and reservoir engineering to enable the digital oil field. Combining these new tools with Paradigm’s advanced visualization, reservoir characterization and well planning solutions will add value for our customers and for Petroleum WorkBench and Interpret users.’
The deal was backed by Paradigm’s principal investor Fox Paine, whose MD Troy Thacker said, ‘We are happy to support Paradigm in this technology acquisition, which enables the company and its management to create exciting production optimization opportunities and expand their market reach.’
Commenting on the recent hike in the oil price, Total’s VP of E&P Christophe de Margerie said, ‘We all underestimated India and China. We were wrong!’ There has been too much focus on western stock levels and too little on international growth. Worldwide, BOE/capita/year is increasing ahead of population growth. In 2000, consumption was 9 bn. tonnes; in 2030 it will be 15 bn. tonnes. But the reserves are there: Saudi Arabia still claims 260 bn. bbl. Forecasting future demand is tricky: Asia consumes 0.7 units/head, while the EU consumes 3.2 and N. America 8.3! Since Asian growth is racing ahead, the big question is, ‘which consumption model is Asia going to align itself with?’
Schlumberger president Andrew Gould also believes in a durable price rise. For the first time, the problem is supplying sustained demand growth. Spend is shifting away from new production and towards ‘decline rate’. ‘We know little about decline rate and have no clear pattern of what it is.’ There has been a remarkable ‘decline’ too in finding and development costs, thanks to new technology like smart wells and intelligent completions. Geophysics used to be a language spoken only by explorationists; now 3D goes beyond exploration and is even used to optimize topside design. Knowledge management is an issue with the large-scale retirement of the workforce. But for Gould, ‘production decline’ is where it’s at: we should ‘manage decline through technology.’
For Pete Carragher (BP), tomorrow’s fields will be deeper, hotter and harder to image. Access to and timing of discoveries will be ‘unpredictable’. IHS Energy data shows steady decline in field size since 1970, with poor replacement for oil, although the situation is better for gas. Average depth to top pay is rising. Carragher cited a recent survey of geoscience skills in the oil industry which showed a growing need for non technical workers and IT. According to the survey, the oil industry does not need sedimentologists, structural geologists, core and many other traditional skills. Biostratigraphy, petrology and remote sensing are likewise ‘obsolete,’ although Carragher is not so sure.
On the exhibit floor, software vendors are partnering with big computer companies to offer more or less ‘shrink-wrapped’ solutions, integrating software, hardware, storage and networking. Schlumberger has partnered with HP on the port of its software to Linux and the companies plan to deliver a ‘plug & play’ upstream computing environment. Landmark has likewise partnered with IBM to offer a ‘Rapid Prospect Generation Engine’ and ‘on demand’ computing for infrastructure free deployment.
But the biggest buzz of the show was the amazing growth in the use of Linux clusters for high end visualization on commercial off-the-shelf (COTS) hardware. Fraunhofer Institute’s PV-4D offers rapid access to terabytes of stack or pre-stack data with high speed rendering and stereo display of multi-volume data. Cluster management has been developed in association with Linux Networx.

Paradigm’s Reservoir Navigator leverages disk caching to make visualization possible on a laptop, with 1:20 or 1:40 memory to data ratios. This compares favorably with established techniques as used in Voxelgeo and Magic Earth, which store all data in memory.

HP’s Project Sepia, building on high performance computing work done for the US Department of Energy, uses COTS technology for simulation and seismic visualization. Sepia promises to be ‘10 times faster for 25% of the price of current solutions’.

Schlumberger’s GigaViz also leverages clusters so that a 40GB volume can be interpreted from a laptop. All graphics processing is done on the cluster; only pixels that change are sent to the client. An oil company might install a big graphics processing cluster and give geoscientists low end machines. A demo with a 1,400 km. sq. dataset showed that GigaViz does Magic Earth-like probes with fast pan and zoom.

IBM was showing off its grid computing, which promises a ‘self configurable’ environment. IBM’s Grid@petroleum offering includes guaranteed service levels, storage and computing on demand.
What do you do with all this compute power? Well, you could try the IFP’s Cougar reservoir simulator driver. If you think reservoir simulation is resource hungry, try running multiple simulations for sensitivity analysis. Cougar plots total field oil production against time in days, with a tornado plot showing sensitivity. Cougar selects Eclipse parameters using ‘experimental design,’ a statistical method for eliminating bias.
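The idea behind the tornado plot can be sketched in a few lines of Python. This is a generic one-at-a-time sensitivity screen with a toy production function standing in for an Eclipse run; the parameter names and ranges are illustrative, not Cougar’s:

```python
# Toy stand-in for a reservoir simulator run: cumulative oil as a
# function of three uncertain parameters (illustrative only).
def simulate(porosity, perm_md, aquifer_strength):
    return 1000.0 * porosity * (perm_md ** 0.5) * (1.0 + 0.2 * aquifer_strength)

# Low/base/high values for each parameter (hypothetical ranges).
ranges = {
    "porosity":         (0.15, 0.20, 0.25),
    "perm_md":          (50.0, 100.0, 200.0),
    "aquifer_strength": (0.0, 0.5, 1.0),
}

base = {k: v[1] for k, v in ranges.items()}
base_oil = simulate(**base)

# One-at-a-time sensitivity: vary each parameter to its low/high value
# while holding the others at base, then rank by swing (tornado order).
swings = []
for name, (lo, mid, hi) in ranges.items():
    low = simulate(**{**base, name: lo})
    high = simulate(**{**base, name: hi})
    swings.append((abs(high - low), name, low, high))

for swing, name, low, high in sorted(swings, reverse=True):
    print(f"{name:18s} swing={swing:10.1f}  [{low:.1f} .. {high:.1f}]")
```

Sorting by swing is what orders the bars of a tornado plot; a full experimental design varies several parameters at once to separate main effects from interactions, but the ranking principle is the same.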
Tanks ’n tubes
Dave Hale’s (Landmark) work on seismic meshes has now extended into reservoir engineering. The seismic image is segmented into a mesh of ‘tanks,’ polygonal reservoir blocks, and ‘tubes’ indicating transmissibility. Reservoir properties, such as tank pore volumes and tube transmissibilities, can be adjusted to test the impact of different seismic interpretations on fluid flow. Numerical experiments suggest that properties obtained from such coarse models can be used to constrain more detailed models.
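The tank-and-tube idea lends itself to a back-of-envelope sketch. The following Python toy (our illustration, not Hale’s code) moves fluid between three tanks through tubes at a rate proportional to transmissibility times the difference in fill fraction; total volume is conserved while the tanks equilibrate:

```python
# Minimal 'tanks and tubes' coarse model: tanks hold fluid volume; tubes
# move fluid between tanks at a rate proportional to transmissibility
# times the 'pressure' (fill fraction) difference. Explicit time
# stepping, illustrative values only.
pore_volume = {"A": 100.0, "B": 80.0, "C": 120.0}   # tank pore volumes
volume = {"A": 90.0, "B": 10.0, "C": 20.0}          # current fluid volumes
tubes = [("A", "B", 2.0), ("B", "C", 1.0)]          # (tank, tank, transmissibility)

dt = 0.1
for _ in range(500):
    flow = {k: 0.0 for k in volume}
    for a, b, trans in tubes:
        # pressure proxy: fraction of pore volume filled
        dp = volume[a] / pore_volume[a] - volume[b] / pore_volume[b]
        q = trans * dp
        flow[a] -= q
        flow[b] += q
    for k in volume:
        volume[k] += dt * flow[k]

print({k: round(v, 1) for k, v in volume.items()},
      "total:", round(sum(volume.values()), 1))
```

Doubling a tube’s transmissibility speeds equilibration without changing the final distribution, which is exactly the kind of what-if a coarse model of this sort makes cheap to test.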
Philippe Baldy told of the lessons learned during Total’s successive mergers. Geoscience data used to be stored across four different systems. Fortunately, Total, Fina and Elf all used Schlumberger’s LogDB with well reference data in Finder, although customizations did differ. Total’s LogDB now holds 120,000 references and nearly one terabyte of data, and is believed to be one of the world’s largest. Total’s physical data was gathered and indexed, totaling 676,000 reports and 2.4 million ‘other’ items. Migration took 11 man-years of effort over a 14 month period. Baldy warns, ‘The best QC does not mean error-free! The final QC stamp will be given by users.’ Total now has an acquisition and divestment (A&D)-friendly approach to data management which avoids duplication of effort between HQ and subsidiaries. Total is planning a fully-integrated system, based on Finder, and a new Geoscience Web Portal.
Piantanida’s presentation stressed ENI’s shared earth model, which is ‘distributed’ across people, subsidiaries and the HQ. 3D data sharing spans the application workspace—GeoFrame, OpenWorks, Tigress, FlowMap, Petrel and Eclipse. Remote access is supported over an ASP link using Citrix MetaFrame and MIT’s ThinAnywhere. ENI is experimenting with a new Grid computing tool from Softricity. Best practices are stored in the technical ‘Know-how’ Portal as workflows for reservoir model building. The project directory uses Lotus Domino, Accenture’s E&P OnLine and SAP Portal 5.0. Workflow management includes description of tasks, best practices and database access, with drill down to specific tasks such as facies identification.
INT is branching out into end-user software with the commercial release of INTViewer for pre-stack and attribute seismic visualization. INTViewer offers a ‘simple workflow’ for pre-stack data manipulation, ‘replacing an entire suite of vendor tools.’ A new wellbore schematics tool has been developed with ChevronTexaco, leveraging WITSML.
IPRES’ IPResource is a new reserves database for company reporting, covering annual reports, SEC reporting and key performance indicators. The tool is built atop Business Objects. IPRES software is used by ChevronTexaco, ConocoPhillips, Marathon, Norsk Hydro and Statoil.
Neuro Genetic Solutions (NGS) was showing its neural networks and ‘committee machines’. A (human) domain specialist performs initial log interpretation which ‘teaches’ the neural net. Committee machines link multiple neural networks together to enhance results. NGS comes from the same academic stable as recent Schlumberger acquisition, Decision Team.
Scandpower’s MEPO history matching is now commercial. MEPO runs as a front end to the reservoir simulator to test multiple hypotheses by optimizing history matching prior to full-scale modeling. MEPO uses Bayesian analysis to check the validity of a match. Scandpower is currently running pilot studies on a 22-CPU HP cluster running Linux.
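The principle of Bayesian checking of a history match can be shown in miniature. This generic sketch (not MEPO’s algorithm) scores candidate decline rates against an observed production history with a Gaussian likelihood and a flat prior:

```python
import math

# Observed production history (toy data) and a simple decline-curve model.
observed = [100.0, 82.0, 67.0, 55.0, 45.0]          # rate per period
def model(decline, periods=5, q0=100.0):
    return [q0 * math.exp(-decline * t) for t in range(periods)]

def log_likelihood(decline, sigma=3.0):
    """Gaussian log-likelihood of the observed history given a decline rate."""
    return sum(-0.5 * ((o - m) / sigma) ** 2
               for o, m in zip(observed, model(decline)))

# Brute-force posterior over a grid of decline rates (flat prior).
grid = [d / 1000.0 for d in range(100, 300)]
weights = [math.exp(log_likelihood(d)) for d in grid]
total = sum(weights)
posterior = [w / total for w in weights]
best = grid[max(range(len(grid)), key=posterior.__getitem__)]
print("most probable decline rate:", best)
```

The posterior does more than pick a best match: its spread quantifies how many competing matches the data cannot distinguish, which is the sense in which a Bayesian check validates (or undermines) a history match.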
Weatherford’s ‘Clarion’ 4D permanent in-well seismic monitoring is being trialed in the Izaute gas storage field in southwestern France. Time-lapse (4D) seismic, VSPs and microseismics are recorded by a permanent five-station, three-component array and used to map gas-water contact variations.
EpiSEM, Rainaud’s (IFP) ‘knowledge-driven shared earth models’ capture interpretation metadata in ‘geo-ontologies’ and ‘abstract descriptors.’ Annotated interpretations can be shared between applications and stored for re-use.

Tchistiakov (TNO-NITG) described an internet database of fluid flow simulations from over 32,500 models of shallow marine reservoirs. The project quantifies the influence of sedimentology, structure and up-lift on reservoir quality.

Taner (RSI) is using joint time-frequency analysis with unsupervised neural networks to produce seismic lithology maps. Automatic event recognition and classification simulates human hearing perception.

Kayser (Schlumberger) showed how Inside Reality can visualize the internal structure of cores. Densely spaced microfocus computer tomography (µCT) images are digitized for display in the Cave.

According to Naess, Statoil’s experience on Heidrun has underlined the importance of the single common database for subsurface data. ‘Firmly defined’ work processes underpin remote drilling operations from Statoil’s new Onshore Support Center in Stjørdal.

Naylor described how Shell’s new operating model has E&P organized into a global business, with ‘unambiguous, single point accountability for performance, portfolio and resources.’ This model contrasts with the autonomous asset-based approach. Technology is the key, including 3D VR and Real Time Operations Centers ‘leveraging global expertise’ to enable rapid decision making in remote locations.

Baker (SPE chair) believes that society meetings are important, ‘people are gregarious; they like to come together.’ The development of virtual meetings is likely to impact professional societies’ revenue but ‘if we are to stay relevant in these days of restricted travel, we have to become a player in the virtual or distance-meeting realm.’

Stinson (Data Modeling) demonstrated automatic velocity analysis computed at every CMP and time location.
The Auto Imager has been tested with real and synthetic data. A 32 node Linux cluster ran the SEG/EAGE 3D Salt Model in 11 hours (60,000 velocity profiles). Results are ‘as good as or better than’ the human picked velocities.
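Stinson’s Auto Imager is proprietary, but the basic idea behind automatic velocity picking (choosing, at each CMP and time, the trial velocity that maximizes semblance across offsets) can be sketched as follows. This is an illustrative reconstruction, not Data Modeling’s algorithm; the gather, offsets and velocity scan are synthetic.

```python
import numpy as np

def nmo_semblance(gather, offsets, t0, velocities, dt):
    """Semblance at zero-offset time t0 for each trial velocity.

    gather: (nsamples, ntraces) array; offsets in m; t0, dt in s.
    """
    scores = []
    for v in velocities:
        amps = []
        for j, x in enumerate(offsets):
            # hyperbolic moveout: t(x) = sqrt(t0^2 + x^2 / v^2)
            t = np.sqrt(t0 ** 2 + (x / v) ** 2)
            i = int(round(t / dt))
            if i < gather.shape[0]:
                amps.append(gather[i, j])
        amps = np.array(amps)
        # semblance = (sum of amplitudes)^2 / (N * sum of squared amplitudes)
        denom = len(amps) * np.sum(amps ** 2)
        scores.append((amps.sum() ** 2) / denom if denom > 0 else 0.0)
    return np.array(scores)

# Synthetic gather: one reflector at t0 = 0.4 s with true velocity 2000 m/s
dt, nsamp = 0.004, 300
offsets = np.array([100.0, 400.0, 700.0, 1000.0])
gather = np.zeros((nsamp, len(offsets)))
for j, x in enumerate(offsets):
    t = np.sqrt(0.4 ** 2 + (x / 2000.0) ** 2)
    gather[int(round(t / dt)), j] = 1.0

vels = np.arange(1500.0, 2600.0, 100.0)
best = vels[np.argmax(nmo_semblance(gather, offsets, 0.4, vels, dt))]
```

A production picker repeats this scan at every CMP and time sample, which is why the 32-node cluster run above amounts to tens of thousands of velocity profiles.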
This report is abstracted from an illustrated, 38 page report produced as part of The Data Room’s Technology Watch Reporting Service. For more information on this subscription-based service please email email@example.com.
C&C Reservoirs has hired Bob Trice as Manager of E&P Solutions. Trice was previously with Shell and Enterprise.
Ouahiba Ghazli is to head-up a new Lynx/DPTS tape transcription joint venture in Algeria.
Pieter vanderMade has resigned as MD of Fugro-Jason. He is succeeded by Eric Adams, manager for the Americas.
Phil Longorio has been named CEO of Well Dynamics. Longorio was previously VP of Halliburton unit Sperry-Sun.
Claus Kampmann is the new chairman of the TGS-NOPEC board. Kampmann spent most of his career with Schlumberger’s Oilfield Services unit.
John Weigant has joined Geotrace as manager of depth migration. Weigant was previously with Amerada Hess, NuTec and CoreLab.
Malibu Engineering & Software has appointed Cecil Shewchuk as CEO.
Advanced Technology and Systems has integrated its SCSI/SATA RAID system with Ultera’s MirageVL Virtual Tape Controller. A 2 terabyte VTL solution costs $9,999.
E-commerce in oil and gas has had a rough ride since the heady days of the dot com boom. UpstreamInfo, PetroCosm and other oil and gas e-ventures have failed spectacularly. So it was particularly interesting to attend the 2004 Trade-Ranger Community Conference last month, which drew about 100 members. Since Trade-Ranger was established, considerable changes have taken place. Merger and attrition have reduced the numbers of buyers. Those remaining are predominantly EU-based. The general focus of e-procurement has also shifted slightly downstream, where buyers and sellers of chemicals and finished products will likely offer a more attractive target for e-commerce.
Baker Hughes’ (BH) suppliers are faced with ‘a large number of e-procurement initiatives’ according to Steve Sidney. Integrating such solutions is expensive. While Baker Hughes (BH) is enthusiastic about e-procurement for order fulfillment, it is less keen on catalog-based ordering. Catalogs are an integral part of the BH proprietary offering. While there is ‘potential’ for integration with customer ordering processes, ‘timing and readiness of customers remains unclear’.
Joop van Dierendonck said that Dow Chemicals was ‘really going for’ e-business with a major integration of Dow eMart with Trade-Ranger. Dow’s e-procurement strategy aims at reducing costs and errors by integrating procurement with ERP. A year ago, Dow’s eConnect project started using MRO purchasing, which handles 80% of worldwide transactions. Purchase orders are sent to suppliers by AutoFax (65%) or EDI (35%). The current objective is to move to ‘e-execution’ via Trade-Ranger, to ‘discourage EDI and promote XML’. Last year, 73% of Dow’s 5 million invoices were paper. This year this will be reduced to 32%. Invoicing staff is now down to 8 people. Dow now wants to do the same on the buy side, eliminating manual entry and hundreds of jobs. Dow is working with Trade-Ranger on a multi-generation plan for current and future technologies and software vendors.
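Dow’s ‘discourage EDI and promote XML’ push boils down to exchanging self-describing documents instead of fixed-position flat records. The sketch below builds a minimal XML purchase order; the element names are invented for illustration and are not the schema Trade-Ranger actually used.

```python
import xml.etree.ElementTree as ET

def purchase_order_xml(po_number, supplier, lines):
    """Build a minimal purchase-order document.

    Element and attribute names here are illustrative only; a real
    marketplace exchange would follow an agreed industry schema.
    """
    po = ET.Element("PurchaseOrder", number=po_number)
    ET.SubElement(po, "Supplier").text = supplier
    items = ET.SubElement(po, "Items")
    for sku, qty, price in lines:
        ET.SubElement(items, "Item", sku=sku, quantity=str(qty),
                      unitPrice=f"{price:.2f}")
    return ET.tostring(po, encoding="unicode")

doc = purchase_order_xml("PO-1001", "Acme Chemicals",
                         [("CHM-17", 40, 12.50), ("CHM-23", 5, 199.00)])
```

Unlike an EDI flat file, the receiving system can validate and route such a document without prior knowledge of field positions, which is what makes supplier on-boarding cheaper.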
Rosanna Di Leo reports that today, e-auctions, with a $1.1bn commitment, amount to 20% of Total’s overall spend. e-RFX (e-calls for tender etc.) contrasts with the traditional approach of cover letter, appendices, follow-ups, mails and telephone conferences. Total’s branded e-RFX solution, e-Novation, comprises a browser front-end to documentation, follow-up, RFQ, evaluation, communications etc., ‘all in one box’. e-auctions put established suppliers ‘back into the melting pot.’ This leads to greater competition, visibility and traceability of prices. Total’s refiners were initially reluctant to do this kind of purchasing, but they came around, and e-Sourcing has grown from zero in 2000 to 650 e-RFXs in 2003 with €74 million at auction. Total has a clear group policy and good management support for this. Overall, Total purchases around €23bn per year, so the e-procured portion is only 4% of what is potentially e-purchasable. ‘We have a long way to go’. Di Leo compares this situation to the introduction of email in 1992, when there were similar doubts. Internet access and short training is all that is needed.
‘Procurement on steroids’ is how Stephanie Sherrod describes Shell Oil Products’ (US) web-based purchasing. Purchase to pay (P2P) enables strategic sourcing, enhances compliance and improves spend analysis and commodity reporting. Shell’s iNeed is a P2P solution integrated with SAP and Trade-Ranger. By setting up vendors for automated clearing house (ACH) payment, Shell has reduced the cost of paper checks and wires. Trade-Ranger’s XML-based connectivity links Shell’s SAP-based invoicing with suppliers including Dell, McJunkin and Wesco. An estimated $100 million/year of low value purchasing has been eliminated.
TR president John Wilson sees a conflict in the medium term between TR’s ‘independent’ hub and the expanding ‘supplier relationship management’ offerings from software behemoth SAP. The plenary discussion addressed this and related issues (see this month’s lead) without a clear outcome. SAP’s Usman Sheikh acknowledges that SAP now goes well beyond ERP. NetWeaver has brought ‘unprecedented flexibility’ including a fully integrated SRM solution. But NetWeaver also heralds a ‘more open’ environment for competition. Sheikh said, ‘We have retained our proprietary rights for too long, our management is ready to open up. NetWeaver is the first step.’
Tulsa-based independent energy group Williams is to deploy the entire P2ES Enterprise Upstream (EU) suite of financial applications in its upstream division. Williams’ production from the Piceance, San Juan, Powder River and Arkoma basins totals approximately 450 million cu. ft./day from an estimated 2.4 trillion cu. ft. of reserves.
P2ES VP Trent Derr said, ‘Williams’ E&P business will benefit from our agile, integrated solution without the rigidity and overhead imposed by other solutions. EU is tailored to the energy industry and will support the sophistication of Williams’ E&P business with lower up-front and ongoing total cost of ownership.’
EU is a web-based transactional processing software suite built on Oracle technology. EU provides production volume and asset management solutions implemented at midsize and large independent oil and gas companies as well as major oil companies’ E&P divisions.
Statoil has awarded a contract to Iron Mountain for the storage and handling of its seismic tapes and other magnetic media. Under the terms of the contract, Statoil is to deploy Iron Mountain’s eSearch package to provide on-line access to its physical and electronic assets.
eSearch resulted from the merger of Iron Mountain’s OpenRSO inventory management system with Schlumberger’s AssetDB. The tool is now marketed to the upstream oil business by Schlumberger Information Services.
Iron Mountain is also to transcribe some 250,000 of Statoil’s legacy seismic data tapes and cartridges, over a three year period. Iron Mountain has been working for Statoil for the past fifteen years.
Venture Information Management has been chosen by the UK oil industry data portal Common Data Access (CDA) to help achieve its strategic objective of improving the completeness and quality of well data stored within the CDA DataStore.
CDA has been engaged in discussions with its membership, the UK Department of Trade and Industry (DTI), release agents and other service providers. These have led to a better understanding of key requirements, workflows and use of existing standards and processes. The Venture study will make recommendations for each CDA data type on matters such as methods and rules for data collation, data comparison and manipulation.
The study will also investigate roles, responsibilities and associated processes and what software tools might help achieve CDA’s objectives of data completeness.
TGS (now part of Mercury Computer Systems) has announced version 5 of VolumeViz, its volume rendering technology for very large data sets. VolumeViz is used by many major oil companies and upstream software vendors including Landmark, Schlumberger, SMT, Jason, Paradigm and Roxar. VolumeViz, which supports 100GB plus datasets, comes as a C++ library with Qt support for the GUI. VolumeViz is ‘thread safe’ and can be multi-threaded across multiple processors and 32 or 64 bit graphics engines.
Naamen Keskes, image processing adviser with Total said, ‘We selected TGS’ volume rendering technology after an extensive evaluation of 3D visualization solutions. By integrating VolumeViz 5 software technology into our Sismage workstation geoscientists are able to manage, visualize and process extremely large 3D seismic datasets. TGS has also helped us to re-engineer our product development initiatives.’
VolumeViz integrates with existing OpenGL applications, extending Open Inventor’s C++ and Java interfaces with volume rendering techniques. The scalable solution can be used on systems ranging from notebooks to immersive VR environments.
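Support for 100GB-plus datasets implies out-of-core techniques: the volume is decomposed into fixed-size bricks so that only the bricks needed for the current view are resident in memory. The sketch below shows just the bricking step, in Python for brevity (the real library is C++ and also builds a multi-resolution level-of-detail hierarchy per brick).

```python
import numpy as np

def brick_volume(vol, brick=(64, 64, 64)):
    """Split a 3D volume into fixed-size bricks keyed by their origin.

    A minimal sketch of the decomposition an out-of-core renderer
    performs; edge bricks are simply smaller than the nominal size.
    """
    bricks = {}
    nz, ny, nx = vol.shape
    bz, by, bx = brick
    for z in range(0, nz, bz):
        for y in range(0, ny, by):
            for x in range(0, nx, bx):
                bricks[(z, y, x)] = vol[z:z + bz, y:y + by, x:x + bx]
    return bricks

# A small test volume: 128 x 128 x 96 samples of 8-bit amplitudes
vol = np.zeros((128, 128, 96), dtype=np.uint8)
b = brick_volume(vol)
```

In a real renderer the bricks would live on disk and be paged in on demand as the view frustum moves, which is what keeps memory use independent of total survey size.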
Petris Technology Inc. is to utilize BEA’s WebLogic Platform to develop an energy industry intranet portal. The portal will be available as an add-on to the PetrisWINDS Enterprise platform, providing clients with a single point of access to data assets and collaboration tools.
Petris senior VP Jeff Pferd said, ‘WebLogic will bring full portal functionality to PetrisWINDS, enabling customers to deploy a true web-services environment. This architecture will give flexibility today while setting a firm foundation for future extensibility.’
BEA’s WebLogic allows multiple portal initiatives to be federated across the enterprise, bringing workflow and data management together in one place.
To provide simple yet secure access, both the WebLogic Platform and the PetrisWINDS Enterprise systems are designed to support single sign-on (SSO) capability to protect information and to ‘streamline user adherence to good security practices.’
Schlumberger Information Solutions (SIS) and Aspen Technology have entered into a five-year alliance to provide operators with a seamless view of their assets, from reservoirs through to processing facilities, enabling them to plan, manage and optimize oil and gas production.
A fully integrated model will link the reservoir, wells, surface infrastructure, process facilities and economic conditions into a single computer simulation environment that provides a common platform for field planning and real-time reservoir management. Aspen AssetBuilder serves as the platform for the joint solution, which will also leverage AspenTech’s HySys simulator, SIS’ Eclipse reservoir simulator, the PipeSim production systems analysis package and the Merak Peep economics package.
SIS president Peter Goode said, ‘Even small increases in oilfield productivity can result in major economic benefits. A system that supports integrated asset modeling will deliver significant business value and is key to dynamic production enhancement and enablement of the i-field.’
Sense Technology has just announced SenseXL. The web-based application exposes rigsite data from Sense’s SiteCom to remote users with appropriate access rights. SenseXL leverages the WITSML standard. The application can be used to view data from other WITSML sources. SenseXL can receive real-time data directly from one or more SiteCom systems or WITSML compliant sources.
The viewer offers charting, query and reporting options. The chart view provides charting of both real-time and historical data. The query view provides WITSML-based add, delete and update for any WITSML data. Reporting likewise works with Sense’s own systems or other WITSML datastores.
SenseXL supports import and export of rig data from other sources and can be used to read data from pre-WITSML days, store the data in the SiteCom Central database, and then put the data to use in WITSML compliant analysis and process improvement tools.
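WITSML is an XML vocabulary, so reading log data from a WITSML-compliant store ultimately reduces to XML parsing. The fragment below is schematic and heavily simplified (real WITSML documents carry namespaces, units and per-curve metadata omitted here) and is not SenseXL code.

```python
import xml.etree.ElementTree as ET

# A simplified, WITSML-style log fragment; mnemonics and values are
# made up for illustration.
SAMPLE = """\
<logs>
  <log uidWell="W-1" uidWellbore="WB-1">
    <logData>
      <mnemonicList>DEPTH,ROP,WOB</mnemonicList>
      <data>1500.0,24.1,8.2</data>
      <data>1500.5,22.8,8.4</data>
    </logData>
  </log>
</logs>"""

def parse_log(xml_text):
    """Return (curve mnemonics, data rows) from a WITSML-style log."""
    root = ET.fromstring(xml_text)
    log_data = root.find("./log/logData")
    mnemonics = log_data.findtext("mnemonicList").split(",")
    rows = [[float(v) for v in d.text.split(",")]
            for d in log_data.findall("data")]
    return mnemonics, rows

mnemonics, rows = parse_log(SAMPLE)
```

Because the payload is self-describing, a viewer like SenseXL can chart or query data from any conformant source without source-specific decoders, which is the interoperability argument for the standard.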
QinetiQ has developed an Autostereo 3D Display Wall, described as a ‘breakthrough in visualization technology for the oil and gas industry.’ Autostereo offers more than 25 times the information content of, and is 10 times brighter than, conventional stereo visualization images.
The QinetiQ display provides natural 3D images that do not require the viewer to wear glasses. Groups of users can move and collaborate freely in front of the display. Each viewer sees continuous and horizontally correct perspectives of the 3D images over a large viewing zone. Applications of the Autostereo 3D Display Wall in oil and gas exploration, well planning and production include enhanced spatial and multidimensional analysis, planning, monitoring, simulation, communications and decision-making.
Chris Slinger, CTO with QinetiQ, said, ‘From their very first viewing experience, people who have known only conventional immersive visualization systems are amazed at the fidelity and impact of these 3D images.’ An OpenGL interface will allow seamless integration with existing interpretation and visualization applications.
MRO Software Inc. has just announced a new industry solution for the oil and gas markets leveraging Maximo, its asset management solution. Maximo Oil and Gas (MOG) extends Maximo with key partner solutions and licensed best practices. The solution is designed to help companies meet production targets while reducing costs and maintaining safety and environmental standards.
MOG embeds interfaces with Primavera Project Manager and Intergraph’s SmartPlant. These bi-directional integrations allow customers to combine Maximo with Primavera’s scheduling and Intergraph’s engineering. MOG includes support for failure codes, asset specifications, and integration with industry standard engineering systems and project management systems.
MRO Software VP Johan Arts said, ‘We’ve enhanced the functionality our clients need to manage assets such as offshore rigs, refineries, pipelines, transportation assets and production facilities.’ MOG will ship later this year.
Fugro Robertson Inc (FRI), part of Fugro’s Geoscience division, has just concluded the purchase of the assets and business of C&M Storage, based in Schulenburg, Texas. C&M offers data management and storage services for cores and cuttings, paper records, samples, tape and microfiche. C&M Storage claims over 60 oil and gas company clients in the Houston area and generates turnover of approximately US$2 million per annum.
The company will be integrated into FRI, but trade as Fugro C&M Storage. The acquisition forms part of the expansion of the Fugro Robertson Data Solutions business line in the United States. All current key employees have agreed to continue their employment. The assets include more than 250,000 sq ft of custom storage facilities and specialized sample handling and preparation activities. FRI plans to expand the scope and number of activities offered through the business by building on Fugro Robertson’s service lines. These include areas such as data conditioning, electronic data distribution, provision of bespoke web portals and associated services.
M2M’s iSCADA real-time compressor monitoring and control service platform continuously collects data, confirming operability and alerting operators when problems occur. Over the Internet, operators can view key operating parameters without the need to travel to the site. Monitored data can range from simple on/off status to hundreds of data points, allowing historical trending that may reveal long-term productivity and maintenance issues.
New demand for light-duty compressors is being driven by the expansion of marginal and unconventional natural gas resources in the US. Smaller and more widely separated compressors are required to move lower production volumes from the field. M2M’s new basic packages and lightweight communications make remote monitoring for small units an economic proposition.
M2M COO Don Wallace said, ‘Our R&D has led to distributed software, lighter and faster remote field devices, and more efficient and cost effective bundled communications packages.’ An M2M ‘Ultralight’ SAT-SCADA starts at $952 and includes a satellite modem, IP communications gateway and power supply. A monthly service for alarm alerts and remote start/stop is as low as $23.95.
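The alarm-alert service described above amounts to comparing monitored points against configured limits and flagging excursions. A minimal sketch, with made-up point names and thresholds:

```python
def check_alarms(readings, limits):
    """Flag monitored points that fall outside configured limits.

    A minimal illustration of SCADA-style threshold alerting; points
    without configured limits are treated as unbounded.
    """
    alarms = []
    for point, value in readings.items():
        lo, hi = limits.get(point, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            alarms.append((point, value))
    return alarms

# Hypothetical compressor readings and limits
readings = {"suction_psi": 42.0, "discharge_psi": 910.0, "rpm": 1450.0}
limits = {"suction_psi": (30.0, 60.0), "discharge_psi": (0.0, 850.0)}
alarms = check_alarms(readings, limits)
```

In a hosted service the check runs server-side against each poll of the field device, with excursions dispatched as alerts and logged for the historical trend analysis mentioned above.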
Occidental Petroleum of Qatar (OPQL) has awarded Roxar a three-year extension to its frame agreement for the provision of downhole instrument systems for permanent temperature and pressure monitoring. Since the contract began, Roxar has delivered 35 downhole pressure and temperature monitoring systems to OPQL. The equipment is being used to assist OPQL in the reservoir management of their offshore Idd El Shargi and North Dome developments.
Shell continues to outsource its IT with the award of a contract to SAIC for its ‘smart fields’ systems integration and business consulting. ‘Smart fields’ involves the application of ‘measure, model, control’ techniques developed for the process industry to oil and gas production. SAIC was selected because of its systems integration experience in other industries.
SAIC VP Randy Walker said, ‘Smart fields offer similar challenges to the US Future Combat Systems Program, where SAIC is a lead systems integrator. Shell and SAIC will pool their technologies to make Shell’s future operations more efficient, and to provide Shell with accurate, real time information for better decision making.’
‘Smart’ pilot projects have demonstrated improved production rates and recovery. The concept is now ready to transition to operational applications and technologies. SAIC has been working with the Shell Smart Fields team for the past two years and has been involved in several other digital oilfield initiatives.
UK-based environmental consulting group RPS has acquired upstream consulting house Cambrian. Cambrian Consultants, along with its American and Asian subsidiaries, is now part of the RPS Group. Alongside its consulting business, Cambrian develops upstream software: Wotan for CGM publishing and InToto for wellsite data management and composite log creation. RPS plans to keep the Cambrian management team.
The Cambrian acquisition strengthens RPS’ Energy Division, which was established following the acquisition of Hydrosearch Associates, Troy-Ikoda and Australian BBG. The Energy Group was created to address the ‘daunting challenge’ of satisfying world energy demands without ‘damaging health, blighting local environments and threatening vital natural systems.’ RPS plans to expand its presence in the energy sector significantly over the next few years. In 2003, RPS Group made £21 million profit on turnover of £125 million.
ExxonMobil has selected Siebel Oil and Gas 7.7 as the customer relationship management (CRM) support tool for its Lubricants business.
Siebel Oil and Gas (SOG) coordinates interactions with customers across multiple communication channels. ExxonMobil VP John Lyon said, ‘The Siebel Oil and Gas team offered out-of-the-box support systems capable of delivering a comprehensive solution that will bring tangible benefits to both ExxonMobil and our customers.’ The solution integrates ‘customer-facing’ activities with back-office and several existing operational systems.
Last year ExxonMobil used Siebel’s eService solution to underpin its Signum oil analysis service. Signum lets customers access oil analysis over the web to monitor equipment condition in real time. The Signum project was implemented by a joint team from ExxonMobil and Siebel Professional Services.
Deloitte Petroleum Services has just launched a new GIS-based product for the oil and gas industry. PetroView Live (PVL) is a web-based mapping service that serves PetroView data across entire organizations, enabling everyone to investigate upstream oil and gas operations, create maps and access corporate information through a single interface.
PVL comes in two flavors, as a hosted ASP-based internet service, or as an intranet deployment within the company’s firewall. The intranet version targets organizations with an existing ArcIMS license or with strict information security policies. Both options include initial client configuration, dataset updates on a monthly basis and telephone helpline support.
PVL democratizes GIS information access across the organization. A typical user might have a basic mapping requirement for a presentation or report, or just need a quick look at licensing, drilling activity, production or infrastructure information.
Presets allow for rapid access to specific assets or regions and the user-interface can be modified to comply with corporate branding policies, such as color schemes, logos and symbology. Users can add their own data files such as seismic or prospects layers to visualize alongside the standard PetroView data. External documents can be incorporated into PVL using the built-in document and web linking tools.