Microsoft’s booth occupied a strategic place at the Supercomputing ‘05 conference held in Seattle this month, heralding the official roll-out of Windows Compute Cluster Server (WCCS) 2003, Microsoft’s foray into scientific computing. The technology is said to offer scientists and engineers straightforward setup of ‘affordable’ clusters of 64-bit systems. Installation scripts and automated processes ‘lead novice administrators’ through setup tasks.
Addressing the 7,000 attendees, Microsoft chairman and chief software architect Bill Gates traced the company’s transition to a 64-bit architecture, which is ‘very important for the large data sets in the technical and scientific realms.’ Gates addressed the ‘challenge of parallelism,’ noting that microprocessor clock speeds ‘won’t be increasing at the rates they did in the past.’ A 6-8 GHz limit is here ‘for some time.’
Gates noted that many scientific computing problems can be parallelized in a very straightforward way, and anticipates supercomputers of all sizes, including one that will cost less than $10,000, sit at your desk or in your department, and be ‘very, very accessible.’ Gates’ attack on the technical computing market builds on Microsoft’s installed base in data acquisition. ‘If you’re a scientist who wants to analyze a lot of information, how can we make all the steps involved far more efficient?’ For many problems, changing data formats and software tools takes many man-hours. ‘A breakthrough here would have incredible leverage, beyond the computational modeling aspect.’
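Gates’ ‘straightforward parallelism’ point can be illustrated with a toy sketch (ours, not Microsoft’s): a numerical integration whose domain is split into independent chunks, each handled by a separate worker, much as chunks would be farmed out to cluster nodes.

```python
# Toy illustration (our sketch, not Microsoft code): an 'embarrassingly
# parallel' numerical integration. Each worker sums an independent slice
# of the domain; on a real cluster the slices would go to separate nodes
# (e.g. as MPI ranks) rather than to local threads.
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(bounds):
    """Midpoint-rule sum of x**2 over [lo, hi) with n steps."""
    lo, hi, n = bounds
    h = (hi - lo) / n
    return sum(((lo + (i + 0.5) * h) ** 2) * h for i in range(n))

def parallel_integrate(workers=4, steps_per_chunk=10_000):
    # Split [0, 1) into `workers` independent sub-intervals.
    edges = [i / workers for i in range(workers + 1)]
    chunks = [(edges[i], edges[i + 1], steps_per_chunk)
              for i in range(workers)]
    # No communication between chunks -- the hallmark of easy parallelism.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))

print(parallel_integrate())  # close to the exact integral, 1/3
```

The point is the decomposition, not the thread pool: because no chunk depends on another, the same code scales from one desktop to Gates’ sub-$10,000 departmental supercomputer.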
Gates cited the University of Washington Neptune project that has deployed geophysical sensors over spreading plate boundaries in the Pacific. These stream ‘overwhelming amounts’ of real-time XML data tagged with contextual metadata. Gates stressed that the XML revolution has changed the software industry. Microsoft now uses XML in its SQL database, in Office file formats and in Windows itself. XML is likewise used to describe a cluster’s capabilities, to invoke services and to track execution.
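What ‘XML data tagged with contextual metadata’ means in practice can be sketched in a few lines. The element and attribute names below are our invention for illustration, not the Neptune project’s actual schema.

```python
# Illustrative only: a made-up schema for a sensor reading carrying its
# own contextual metadata (sensor id, position, depth), in the spirit of
# the Neptune-style XML streams described above.
import xml.etree.ElementTree as ET

def make_reading(sensor_id, lat, lon, depth_m, value, units):
    reading = ET.Element("reading", {
        "sensor": sensor_id,
        "lat": str(lat), "lon": str(lon), "depth_m": str(depth_m),
    })
    ET.SubElement(reading, "value", {"units": units}).text = str(value)
    return ET.tostring(reading, encoding="unicode")

def parse_reading(xml_text):
    # A consumer can recover both the measurement and its context
    # without out-of-band documentation -- the tagging is the point.
    root = ET.fromstring(xml_text)
    value = root.find("value")
    return {
        "sensor": root.get("sensor"),
        "lat": float(root.get("lat")),
        "value": float(value.text),
        "units": value.get("units"),
    }

doc = make_reading("JDF-017", 47.95, -129.1, 2250, 3.2, "degC")
print(doc)
```

Because each reading is self-describing, downstream tools can route, filter or archive the stream without prior agreement on a binary format, which is what makes the approach attractive for ‘overwhelming’ data rates from heterogeneous sensors.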
WCCS 2003 is said to ‘remove the administrative barriers found in HPC,’ offering a Windows ‘look and feel’ for cluster management. MS MPI, an implementation of the MPI-2 standard, runs over Gigabit Ethernet and InfiniBand.
Visual Studio 2005 has been enhanced with parallel debugging capabilities supporting MS MPI. On the downside, maximum RAM per node is capped at 32 GB and only four processors per node are supported. And unlike other 64-bit incarnations of Windows, there is currently no support for the Itanium.
Labrador Technologies (LTI, formerly Sterne Stackhouse) has announced the expiry of its non-compete agreement regarding the Canadian Petro-Lab disposal. Petro-Lab was acquired by PriceWaterhouse unit QByte – which later was bought by IBM Canada and divested to PE Energy Solutions earlier this year (OITJ Vol. 10 N° 7).
Speaking to shareholders LTI CEO Ron Sterne announced that LTI now plans to ‘aggressively re-enter the Calgary oil and gas software market.’ Labrador will re-establish its historical relationships with Calgary’s oil and gas Data Centers, leveraging its flagship Labrador data management package. LTI is also working with industry and market experts to identify and develop critical Data-Flow Management software components. Design and development of these products is to start forthwith.
A lot is riding on the success of this new software. At present, LTI has ‘no ongoing source of revenue.’ The company has enough cash to fund operations through to April 2006 and is actively seeking alternative sources of revenue. LTI warns however that ‘future operations will depend on the successful development and marketing of its data retrieval technology.’
Not sure what to make of Bill Gates’ talk at SC 2005 (see this month’s lead). It’s not every day that the great Gates speaks to the scientific computing community—but there again, he didn’t have to travel far. Back in 2003, Microsoft hosted an event in Houston on high performance computing (HPC) in oil and gas. We were skeptical then (OITJ Vol. 8 N° 7) and I guess that I’m still skeptical. For one thing, the roll-out in December 2005 of an HPC offering would suggest that the 2003 event had an element of fear, uncertainty and doubt (FUD).
Fortunately for all of us, Microsoft’s marketing department is not going to have a free run at the HPC field, where Linux’s lead looks unassailable for the moment, not least because of its reputation for code integrity—see below. But there are other forces at work in HPC. The really interesting work is being done on esoteric hardware such as the graphics processing unit (GPU), used first by SMT for reservoir simulation, and now by FineTooth for seismic processing (see page 7 of this issue). These developments have led to various damage limitation exercises in the form of across-the-board SEG ‘sponsorship’ from Microsoft and Intel—and, from the latter, a joint presentation with NVIDIA to ‘explain’ how the GPU is not really a threat!
Microsoft’s offerings took a beating in a recent talk by Prof. Les Hatton (Kingston University, London). Hatton’s specialty, ‘Forensic Software Engineering,’ includes evaluating software reliability by counting ‘defects’ in executables, both applications and operating systems. Code quality is measured in faults per thousand lines of executable (machine language) code (KXLOC). The ‘state of the art’ is represented by the NASA Shuttle software, with a fault rate of 0.1 per KXLOC. Windows 2000 is thought to have a fault rate in the 2-4 per KXLOC range, while Linux fares relatively well at 0.5 faults per KXLOC.
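To put the faults-per-KXLOC figures in perspective, a back-of-envelope calculation. The 40-million-line code-base size below is our illustrative assumption, not a number from Hatton’s talk.

```python
# Rough arithmetic on Hatton's fault-rate figures. The code-base size
# (40 million XLOC) is an illustrative assumption, not from the talk.
def expected_faults(xloc_millions, faults_per_kxloc):
    """Latent faults for a code base of `xloc_millions` million XLOC."""
    return xloc_millions * 1_000 * faults_per_kxloc

size = 40  # assumed executable size, millions of XLOC
windows_low = expected_faults(size, 2)
windows_high = expected_faults(size, 4)
linux = expected_faults(size, 0.5)
shuttle_grade = expected_faults(size, 0.1)
print(windows_low, windows_high, linux, shuttle_grade)
```

For the same assumed size, 2-4 faults per KXLOC implies 80,000-160,000 latent faults, against 20,000 at Linux’s 0.5 and 4,000 at Shuttle-grade 0.1: the ratios, not the absolute counts, are the message.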
OS of choice?
Hatton has little time for Windows as an operating system, citing an MTBF for 2000/XP of around 100 hours against over 50,000 hours for Linux. Hatton’s recommendation is that if you want a reliable and secure operating system, ‘don’t use Windows.’
Out of memory
Hatton says a lot more about code and compiler quality. It would have been interesting to hear more from Bill Gates on such topics. Windows got a bad rap in the past, but it has improved. My own system doesn’t crash spectacularly like it used to. But I can still run ‘out of memory’ in a 20MB document—despite the 2GB of RAM I recently acquired ‘just to be on the safe side’!
For computational reliability though, it’s those folks who are using Microsoft Excel who are really living dangerously. Hatton cites a University of Hawaii study that found over 90% of spreadsheets have errors. Surely a warning to our financial and engineering brethren.
Che Zan Yassin described Petronas’ knowledge management (KM) initiatives spanning upstream and downstream (petrochemicals). These include a magazine, ‘The Pipeline,’ a website portal for the domestic Petroleum Management Unit, a KM familiarization program, communities of practice (COP) and lessons learned reviews (LLR). These share failures as well as successes. Techniques learned from Rolls-Royce have led to interviews with experts by KM facilitators—these are recorded and put online. Training is crucial since 42% of Petronas employees have under three years’ experience. KM success is evaluated by LLRs on the projects themselves, by tracking website click-through and video usage, and from feedback from monitoring and coaching. Lotus Notes is the basis for most current KM work although other tools are being evaluated.
ADNOC’s E&P Information System (EXPRIS) was presented by Mohamad Abu El-Ezz. ADNOC wanted a common data model for the upstream and selected Schlumberger’s Finder back in 1994, a ‘pioneering project,’ believed to be one of the first to achieve such broad coverage in a single database. EXPRIS’ design principles are to ‘honor business rules, honor the earth and honor reality.’ As Finder matured, there was less need for customization. In 1999, ADNOC moved to a standard version of Finder, reducing customization and interfacing with Business Objects, OFM and FinderWeb. To date over 90% of ADNOC data has been captured and preserved. Data preparation time has been reduced from 4-6 months to weeks.
Paul Helm (HP), a geophysicist by training, has been ‘moving downstream’ and into the digital oilfield. Here a single SCADA system can generate 10,000 alerts per day. These are impossible to handle as such, so neural net or stochastic processing is performed on the data to predict failure etc. Helm says this is an established technique but ‘we are only now starting to get the value.’ In one case, an EU gas producer was paying $15 million/year in tax on back-allocated gas volumes. The simple expedient of a $300k meter eliminated the discrepancy, and the tax bill! One of HP’s supermajor clients avoided replication by leaving its data in the Historian. Helm advises that such solutions are ‘brittle’ and recommended an online transient data store, built around HP’s ‘Zero Latency Enterprise’ (ZLE) technology, a joint venture with CapGemini. ZLE hubs store enterprise information which is then accessible through web services publish and subscribe mechanisms. ZLE leverages EAI middleware including J2EE, Tuxedo and CORBA. HP itself is a keen consumer of KM and actually mandates use of knowledge systems by management. ‘Brute force is the only way to ensure take-up.’ This comes in the form of the year-end appraisal, where ‘metrics drive behavior.’ A neat KM example involves background scanning of corporate email to produce a ‘knowledge map’ showing, for example, an RFID ‘cluster’ of people who make up an informal social network. Helm notes also that various SCADA protocols are increasingly being replaced with wireless IP.
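As a sketch of the kind of statistical screening Helm describes (a toy method of our own, not HP’s actual processing): a rolling z-score filter that passes only readings far from the recent mean, collapsing thousands of raw alerts to a handful of genuine anomalies.

```python
# Toy stochastic alert filter (illustrative only; not HP's method):
# flag a reading only when it deviates from the trailing window mean
# by more than `k` standard deviations.
from collections import deque
from statistics import mean, stdev

def anomalies(stream, window=20, k=4.0):
    recent = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(stream):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(x - mu) > k * sigma:
                flagged.append(i)
        recent.append(x)
    return flagged

# A steady (synthetic) signal with one injected spike at index 150:
# the filter reduces 200 raw readings to a single alert.
signal = [100.0 + 0.5 * ((i * 7) % 3 - 1) for i in range(200)]
signal[150] = 130.0
print(anomalies(signal))
```

Real digital-oilfield deployments would of course tune the window and threshold per tag, and Helm’s point stands: the screening itself is routine statistics, and the value lies in acting on the few alerts that survive it.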
Femi Adeyemi described how Shell has bought in to the digital oilfield as a key component of its operational excellence model. Shell is ‘going global,’ with process, workflows, apps, data management and is looking to standardize infrastructure. A big effort is put into information quality with a virtual activity group for data management.
Zinhom Wazir showed how ADCO has enhanced the interface between Schlumberger’s Finder and Eclipse to support consistent terminology. ADCO noted a lack of ‘standard operating procedures’ (SOP) for the model building process. The Finder-Eclipse interface covers well data, PVT, SCAL and 3D geological models. A separate production injection database (PIES) is also used to build Eclipse models. ADCO plans to extend the interface to include real-time data, Maximo etc.
Sugimin Harsono told how, following thirty years of Total operations in the Mahakam delta, the data situation was getting out of hand. Multiple databases meant it was not easy to know which one was right. Data known to be bad was not deleted. Data management was perceived as ‘dull’ and there was little corporate awareness of the problems. Total initiated a ‘triple C’ approach (communication, cooperation and consultation) in its Target 3000 data revamp. Total’s production data management system (PDMS) is migrating from an in-house solution to a Schlumberger-based package. This leverages FieldView, Finder, OFM and other components. A FieldView to Finder (F2F) link was written for the project. The hub of the system is Schlumberger’s DecisionPoint, used for reporting, data loading (with ‘portlets’) and to configure personal workspaces. The PDMS is a component of Total’s Operational Excellence Information Systems that will help monitor processes, identify causes of failure and deliver continuous improvement.
OITJ—What’s happening in Paradigm?
Gibson—We are making tremendous changes. Paradigm’s shareholders offered me an opportunity to turn the company into a platform for value creation. But for a $100 million company to successfully compete with two ‘billion dollar’ vendors we have had to rethink our strategy.
OITJ—What are the major changes?
Gibson—We are to restrict our activity to the ‘upstream’ part of the upstream—ring fencing geology, geophysics, drilling and petrophysics where, with Geolog, we are world leaders. The next decade is going to be all about rock mechanics—and here, petrophysics is the key.
OITJ—How do you ring-fence upstream upstream?
Gibson—Think of what feeds into a seismically-conditioned reservoir model: rock properties, attributes—all of which are correctly positioned in space with accurate depth migration, signal processing and petrophysical analysis. We don’t have to do fluid flow simulation or production accounting. It’s already a big niche.
OITJ—Is this a return to the silos?
Gibson—It is something I’ve been preaching for some time. People don’t want dumbed-down seismic processing technology. In fact geoscientists are going to have to be smarter in the next decade to solve tomorrow’s problems and we are going to give them research quality tools. We also want to componentize our leading edge technologies. VoxelGeo for instance is the tops in volume rendering but it is a monolithic application. We want to componentize the good bits of VoxelGeo and other technologies and serve them inside new workflows—a move to a model of software components and re-use.
OITJ—Will this finally bridge the processing and interpretation divide?
Gibson—This is a ‘done deal’ for Paradigm. This is what GeoDepth is all about. But our users are not pushing us hard enough here.
OITJ—How will this affect pricing?
Gibson—We need to realign our pricing to drive our development model. Our private ownership helps here, reducing the impact of prices on our development effort. But I have to say that companies are over-focused on license management tools like FlexLM and GlobeTrotter. They are wasting everybody’s time. The last thing the industry needs is more licensing management software!
OITJ—So how do you counter the penny pinchers?
Gibson—I don’t know! But somehow we have to optimize use, not cost!
OITJ—What are your plans for Paradigm’s Epos middleware?
Gibson—Epos is a best-of-breed enabler that circumvents our competitors’ proprietary data models. We are working on an Epos dev kit and API. Maybe with a move to open standards.
OITJ—You mean there will be an ‘Open Epos’?
Gibson—Maybe. We might even outsource the development and maintenance of an Open Epos.
OITJ—And what about data management?
Gibson—We want to be infrastructure independent but we don’t want to have to cater for multi-supplier datastores. We are committed to providing a ten fold performance differential in our applications so we need an ‘intimate relationship’ with our data. We are not a horizontal DM/IT company (except for HPC applications).
OITJ—Can a $100 million company offer worldwide support?
Gibson—We are everywhere! We have very broad coverage for our size with representation in Russia, Mexico, Canada, Europe and Nigeria. We also just issued an announcement about our beefed-up customer support which leverages web based tools.
OITJ—What’s your strategy on WITSML?
Gibson—Talk to us in three months!
Skyline Software has embedded its 3D visualization technology in Intergraph’s GeoMedia product line, offering users 3D visualization of geospatial information including terrain, imagery and feature data.
Maurer Technology Inc. has just released V 4.0 of its WellCon Well-Control software for evaluating well-control procedures. A new kill sheet generator provides a quick analysis of the well-control process.
New ‘machine learning’ functionality in MetaCarta’s geographic intelligence package leverages ‘computational linguistics’ to identify geographic references in text. MetaCarta now ‘learns’ to recognize linguistic patterns pointing to geographic references. Such learnings are packaged into Geographic Data Modules—knowledge bases used to ‘disambiguate’ geographic references and assign latitude/longitude coordinates.
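The ‘disambiguation’ idea can be sketched with a toy gazetteer lookup (our sketch, with invented place data; it has nothing to do with MetaCarta’s actual Geographic Data Modules): pick the candidate location whose contextual cue words appear near the place name.

```python
# Toy geographic disambiguation (illustrative only): score each
# gazetteer candidate by how many of its cue words occur in the
# surrounding text, and return the best candidate's coordinates.
GAZETTEER = {
    "paris": [
        {"coords": (48.857, 2.352), "cues": {"france", "seine", "eiffel"}},
        {"coords": (33.661, -95.556), "cues": {"texas", "lamar", "county"}},
    ],
}

def disambiguate(name, context_words):
    context = {w.lower() for w in context_words}
    candidates = GAZETTEER.get(name.lower(), [])
    # Ties fall back to the first-listed (assumed most common) sense.
    best = max(candidates, key=lambda c: len(c["cues"] & context),
               default=None)
    return best["coords"] if best else None

print(disambiguate("Paris",
                   "drilling permit filed in Lamar county Texas".split()))
# → (33.661, -95.556)
```

Production systems learn such cue patterns statistically from large corpora rather than hand-coding them, which is presumably what MetaCarta’s ‘machine learning’ release automates.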
Safe Software has updated its Feature Manipulation Engine (FME) geographic data translator to import and export data from PostGIS and PostgreSQL open source GIS packages. The new functions are available in FME 2006 and through application extenders for ESRI ArcGIS, Autodesk MapGuide, and Intergraph GeoMedia.
de Groot Bril has released OpendTect 2.2.0 including a semi-automatic horizon tracker. The attribute engine has been re-engineered for speed, crossplotting and other enhancements.
EMS Pipeline Services and Quorum Business Solutions are to share ownership of the PGAS natural gas measurement system.
OSIsoft has released a 64 bit version of its PI System Server, the engine driving its Real-time Performance Management (RtPM) Platform.
FFA has productized its FaultTrends fault extraction technology in V2.3 of its SEA 3D package. This includes structurally oriented noise cancellation filters, ‘CookieCutter’ filtering with SeisMath and VoxelMath and highlighting of fault trends, a workflow previously only available through FFA Services.
The association is working to promote the Geography Markup Language (GML) to the upstream. First target is a GML-based seismic schema.
Decision Dynamics has licensed its Wellcore package to an unnamed ‘large independent E&P company’. The $CDN750k sale covers 115 seats.
Tecplot has released a new version of its reservoir simulation post processor. Tecplot RS 6.0 includes new ‘non-neighbor connection’ views of fault-related flow and new histogram plots of grid properties. Streamline simulator output is now supported. A network license costs $4,000 per year.
Tibco has acquired master data management technology from Velosel Corp. and is to offer ‘product master data management and product-centric process integration.’
A new study from UtiliPoint benchmarks natural gas software applications and IT architectures supporting the natural gas value chain ‘from wellhead to burner tip.’
Internet SCADA specialist M2M Corp.’s new communications infrastructure, M2M Connect, has been enhanced to support an extended range of satellite and terrestrial wireless networks. M2M’s iSCADA service now offers a private access point name (APN) service (M2Mops.com) with Cingular Wireless, which will allow M2M to provide a wider range of GPRS services and customized tariffs.
Killetsoft has released GeoDLL, a Windows library for geodetic transformations.
Norsk Hydro has bought an SGI Altix 3700 Bx2, with 40 Intel Itanium 2 processors and 96GB memory for its R&D center in Bergen. The system runs Paradigm’s GeoDepth tools for 3D tomography and migration velocity analysis.
Hydro principal geophysicist Jan Pajchel said, ‘Tomography involves very large matrix calculations. A shared memory architecture removes the need to divide our data into sub-volumes which tend to produce instability in the solution. The more you divide the data, the harder it is to match it up again for an accurate image. With the Gulf of Mexico data on the Altix, we are producing cubes of some 30 by 20 by 15 kilometers. Our data sets range from 300 to 500GB. The Altix’ large memory saves time, allowing migration analysis to be performed simultaneously on several slices. This is a very important machine for geophysicists today.’
Another Altix has been acquired for Hydro’s Porsgrunn R&D center, a 3700 model with 128GB memory and 96 Itanium 2 processors running Linux. Hydro is working with Statoil on a new ‘VR Safety’ system to train personnel in handling emergency situations, such as an oil or gas leak or fire. The Silicon Graphics Prism visualization system is connected to a TAN Holospace VR system with screens on the walls and floor.
Hydro principal engineer Eirik Manger said, ‘The Prism is the only system that can drive the Holospace with the large datasets that we need. The shared multiprocessor system lets us run both relatively small as well as large jobs faster.’ Virtual reality (VR) simulations include large-scale geometry, displaying walk-throughs of existing installations, design reviews of new onshore and offshore facilities, and general presentation of complex experimental and computational data.
The Research Centre in Porsgrunn uses the SGI Altix system to run large computational fluid dynamics problems and other calculations in datasets ranging up to 20 to 30GB at present. Data is currently visualized in the Holospace using COVISE software from VISENSO.
Abu Dhabi-based Zakum Development Company (ZADCO) is another keen SGI user. A combination of SGI hardware and software and consulting services from Landmark, has cut ZADCO’s simulation run times significantly. ZADCO has used its four Altix 350 systems, with a total of 56 Intel Itanium 2 processors, and an immersive VR center driven by a 16-processor Onyx 3900.
A new version of CGG’s depth imaging suite GeoVista promises faster turn-around for high-resolution prestack depth migration of large datasets. GeoVista III offers seismic processors automated workflows for managing the large residual moveout datasets generated during dense auto picking of image gathers. GeoVista includes automated editing, geostatistical filtering and QC for data input to VelTracer for 3D depth tomographic inversion. Volumetric 3D depth tomography on large datasets leverages the massive parallel processing capabilities of 64-bit Linux clusters. GeoVista III technology has already been used to process in excess of 35,000 km2 of high-resolution 3D Pre-SDM in the Gulf of Mexico, producing ‘improved depth imaging results more rapidly than traditional low-resolution approaches, even in areas of complex structure’ according to CGG.
Transform Software has signed its first client, Apache Corp., for its 3D ‘full wave’ seismic interpretation system. Apache’s interpreters will be able to concurrently analyze many different types of E&P data. The agreement provides Apache with early access to Transform’s technology. In return, Transform will gain access to Apache project data and expertise.
Apache VP E&P Technology Mike Bahorich said, ‘Our E&P technology group promotes the application of emerging technology from companies like Transform, whose approach to multi-dimensional seismic interpretation is both innovative and practical.’
Transform announced a technology alliance with Input/Output (I/O). I/O will gain access to Transform software for use in processing and interpretation services.
Decisioneering’s Crystal Ball (CB) is a Microsoft Excel plug-in that offers Monte Carlo modeling of spreadsheet data. Monte Carlo (MC) techniques offer a simple way of propagating errors (or ‘risk’) through an economic or scientific model. One enthusiastic CB user is William Standifird from geopressure specialists Knowledge Systems (KS). KS develops compaction, reservoir and depositional history models—all of which require calibration and statistical modeling. Uncertainty is omnipresent, which is why KS uses MC modeling on ‘unstable’ parameters. These are used to evaluate prospect viability, including seal efficiency and drilling feasibility—some planned well paths are technically impossible to drill. BP now does regular ‘drillability’ assessments—important in a $100 million well! KS offers two modeling approaches: hard-coded ‘perfect’ solutions using bespoke algorithms where CB is too slow, and ‘imperfect’ but more flexible solutions leveraging CB as a ‘fit for purpose’ uncertainty evaluation technology.
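The MC idea is simple enough to sketch in a few lines (a generic toy economic model of our own devising, not KS’s algorithms or Crystal Ball itself): sample each uncertain input from its distribution, run the model, repeat many times, and read the risk off the output distribution.

```python
# Minimal Monte Carlo error propagation. The model and distributions
# are invented for illustration (triangular, as commonly used when only
# low/likely/high estimates are available).
import random

def simulate(n=20_000, seed=42):
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n):
        price = rng.triangular(40, 80, 60)    # $/unit: low, high, mode
        volume = rng.triangular(8, 12, 10)    # units sold
        cost = rng.triangular(300, 500, 400)  # fixed cost, $
        outcomes.append(price * volume - cost)
    outcomes.sort()
    # Percentiles of the outcome distribution quantify the 'risk'.
    p10, p50, p90 = (outcomes[int(n * q)] for q in (0.10, 0.50, 0.90))
    return p10, p50, p90

p10, p50, p90 = simulate()
print(f"P10 {p10:.0f}  P50 {p50:.0f}  P90 {p90:.0f}")
```

A spreadsheet plug-in like CB does exactly this behind the scenes, substituting sampled values into input cells and collecting the output cell over thousands of recalculations; KS’s ‘perfect’ bespoke solutions exist precisely because that recalculation loop can be too slow for heavy models.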
Oscar Bravo Mendoza stated that in Ecopetrol, which has 150 CB licenses, ‘no major decisions are taken without CB.’ Ecopetrol measures risk to be in a better negotiating position with respect to banks and joint venture partners. Despite perceptions, Colombia is not a high-risk country, as witnessed by its score in the Emerging Markets Bond Index Global. Colombia is not in the same league as Chavez’ Venezuela! Plots of project complexity against uncertainty, or risk likelihood against risk impact, are used and shown to insurance companies to negotiate rate reductions. MC modeling can be combined with decision trees to show a project’s upside potential.
Pat Leach* (Decision Strategies) presented a case history of FPSO design which determined how much NPV would be lost if the facility’s water handling capacity was too low. Leach used Murtha’s stochastic production profile generator to model water breakthrough. While some cases showed ‘significant’ production loss, others deferred production. In the end a compromise was reached with a compact water processing installation that left room for extra capacity to be added if needed.
Steve Hoye’s presentation enumerated the many pitfalls of the spreadsheet as a modeling tool. As a Decisioneering trainer Hoye has seen it all and has a few horror stories to relate on spreadsheet worst practices. In general it is the spreadsheet rather than the stochastics that gets the average user into trouble. Spreadsheets get unmanageable. Hoye recommends using color and space as aids to spreadsheet readability. Complex stuff needs breaking down into separate worksheets or workbooks. Range names should be used rather than cell references. Astonishingly, some 40% of CB trainees don’t know about this best practice. Other tips include protecting your data cells, using templates and sharing assumptions across the business to avoid multiple hypotheses. Finally, don’t put reports in models, use Latin Hypercube sampling, avoid the ‘killer formula’ and avoid macros if possible as they reduce transparency.
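Latin Hypercube sampling, which Hoye recommends, stratifies each input’s range so that far fewer trials cover the distribution evenly than plain random sampling. A minimal sketch (generic, not Crystal Ball’s implementation):

```python
# Minimal Latin Hypercube sampler: each of n equal strata of [0, 1)
# contributes exactly one sample per variable, and the strata are
# shuffled independently per variable to decorrelate the inputs.
import random

def latin_hypercube(n, dims, seed=0):
    rng = random.Random(seed)
    columns = []
    for _ in range(dims):
        # One uniform draw inside each of the n strata.
        strata = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)
        columns.append(strata)
    return list(zip(*columns))  # n points, each with `dims` coordinates

points = latin_hypercube(10, 2)
print(points[0])
```

The uniform samples would then be mapped through each input’s inverse CDF. The guarantee, checked below, is that every decile of every variable is hit exactly once, which is why a few hundred LHS trials often match many thousands of naive MC trials.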
For those of you who may be daunted by doing your own CB analysis, Decisioneering, in association with Vose Consulting, is ready to help. Vose now offers a range of services including model building and audits, risk analysis and on-site Crystal Ball training.
SEG President Craig Beasley (Western Geco) stated that world population growth suggests that we will need 50% more energy in 2025 than now. Today, 60% of world energy is from hydrocarbons. For the next few decades ‘nobody expects alternatives to fill the gap’ so there is a heavy responsibility on the oil industry. ‘We’ve delivered in the past, but tomorrow...?’
R&D spend reached a high in the 1990s before dropping off in 2000. Beasley noted a particular lack of investment in seismics and a lack of R&D ‘diversity’ in both universities and in-house programs. ‘Contractors can’t afford to fund research – although this is changing as I speak (I hope)’. What can the SEG do? Beasley showed a map of his travels as president to underscore the global nature of the organization. The SEG plans to expand its ‘Forum’ meetings, the distinguished lecture program and boost international meetings. The SEG has some 23,750 members, half outside of the US. Membership trends show good renewal of young members since 2000, due to active student recruitment. Beasley believes that ‘demographics are self correcting’. A more serious problem is the poor perception of industry in the student community.
Beasley noted some paradoxical recent headlines: ‘OPEC is to increase production,’ even though OPEC is ‘capped out’ right now. ‘Environmentalists protest wind farm in Scotland.’ And reports that oil company activity is ‘constrained by lack of people,’ while the industry shed 4% of its staff last year. Beasley’s take on ‘peak oil’ is that by the 2006-7 horizon, projects coming on stream should replace depletion, pushing the ‘peak’ back to 2010. The key question is the development of an alternative energy source. Beasley doubts that the NYMEX free market is enough to bring about the necessary change.
This year’s The Leading Edge Forum, ‘sponsored by’ Intel, Microsoft and BP, debated the energy supply challenge, asking if it was ‘real or imagined?’ The debate was with an absent Matt Simmons, whose book, ‘Twilight in the Desert,’ appeared earlier this year. Schlumberger CEO Andrew Gould confirmed a ‘crunch’ for oil and gas but anticipated that shale gas and other non-conventional resources ‘will contribute to replacing conventional reserves.’ Gould noted that ‘most prospective new areas present high political and technical risk.’ Oil company ‘excess’ profits are due to an inability to re-invest. On the shortage of personnel, graduate supply/demand ‘depends where you look.’ There is an excess in India, China and the Far East, Venezuela and Mexico. But in the short term, ‘We need a massive cooperation effort in the industry to bring on the technology to fill the gap.’ This includes the digital enablement of oilfield operations: 4D seismics, real-time operations (drilling and production) and intelligent completions. Production optimization is ‘very people intensive’ but hiring from each other is a ‘zero sum game.’ Gould forecast that 4D seismic activity will ‘double in the next few years,’ but admitted a pet hate: customers who split acquisition and processing contracts. 4D seismics has a ‘very short shelf life.’ In the Q&A Gould opined that, ‘Operators don’t realize how technically competent the service sector has gotten.’ This met with such enthusiastic applause that Gould backtracked, ‘But I’m not starting a revolution here!’
Tim Cejka was harder on Matt Simmons. ExxonMobil forecasts global energy demand to rise by over 50% by 2030, with oil and gas playing an increasing role. Exxon considers its $600 million per year investment in proprietary R&D essential. In geophysics this targets seismic imaging, and electromagnetics for remote detection. While service company R&D spend is up to $1.7 billion, this doesn’t let companies differentiate: ‘Off the shelf technology will not suffice.’ Geophysical advances have been ‘dramatic.’ ExxonMobil is working on proprietary anisotropic wave equation migration to enhance sub-salt imagery. Proprietary plate tectonic software is used to model paleo heat flow conditions. An internal SHAPES project compares reservoir architectures with natural examples from geomorphology and sedimentology. A global climate and energy project at Stanford represents a $100 million investment over 10 years, shared with Schlumberger, GE and Toyota. Cejka noted a spectacular fall in finding costs—down from $2 in 1994 to 50 cents today. In a Simmons-esque lapse, Cejka noted that there are more rigs drilling for gas in the US than ever, yet production still declines. The rig market is tightening and the geophysical market is ‘stretched.’ Parts of the world are closed to the majors and there are threats to contract sanctity, and competition from national oil companies that is ‘not necessarily economically driven.’
Landmark was offering a sneak peek at Geoprobe’s Fault Net Manager which niftily ‘snaps’ horizons to termination objects (faults) to create a sealed fault network. Geoprobe’s Well Seismic Fusion plug-in lets you move around the top of the reservoir, viewing gathers in a separate window. Results are captured to a ‘spreadsheet’ of geobodies which can be filtered and sorted by attribute before viewing in 3D for well planning etc. Landmark is to bring all its OpenWorks seismic data into Oracle (except trace files and 3D horizons!). SeisWorks files will disappear and CRS management will be ‘transparent’ across projects.
Similar functionality was on display from Genetek’s Earthworks which displays gathers in a pop-up window overlaying the seismic display. This window into the prestack dataset can be dragged around to look for similar anomalies. Autopick can be done on stack or on gather, with best fit of moveout in AVO space. Earthworks runs on DEC/HP VMS—making it something of an anachronism to many. But Genetek president Mark Sun isn’t going to bow to peer pressure. ‘Why let the IT department dictate your infrastructure? VMS is the best operating system there is and it does a great job for us and our users.’
Schlumberger unveiled more of its data management strategy around Petrel and its Seabed ‘open data model.’ End users will be able to extend OpenSpirit data access to foreign data sources. Multi-user enhancements include secondary project workflows and reference projects. OpenSpirit write-back to OpenWorks and GeoFrame is extended to large data volumes. Seismic data sharing keeps bulk data in one place on disk. A new ‘Petrel Assist’ initiative will let Petrel run everything against the database. This can be ‘Oracle, Seabed, or any ADO/.NET technology.’
Jimmy Cain (Cain and Barnes) warned of the pitfalls awaiting users of survey data. There are ‘13 different feet and 2 different meters in use’ throughout the world. Latitude, longitude and direction are ‘not unique’ (google ‘geo4lay.pdf’). Google Earth has ‘variable quality of geo-referencing.’ High resolution imagery is great but ‘where is it?’ EnSoCo’s Jo Connor enumerated more geodetic do’s and don’ts. Mismatching NAD 27 and NAD 83 gives a 400 ft (which foot?) error for Alaska. Connor asked: is your software geodetically aware? Even with good 3D loading information, mistakes interpreting 3D indices are ‘possible and frequent.’
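Cain’s ‘13 different feet’ is no joke. The two commonest, the US survey foot (1200/3937 m) and the international foot (0.3048 m exactly), differ by only two parts per million, but on state plane coordinates running to millions of feet that is a shift of several metres. The worked coordinate below is an illustrative assumption; the two definitions are exact.

```python
# The two commonest 'feet' (both exact by definition); mixing them up
# on a large state-plane coordinate produces a real position blunder.
US_SURVEY_FOOT = 1200 / 3937   # metres
INTERNATIONAL_FOOT = 0.3048    # metres

northing_ft = 10_000_000       # an assumed state-plane-scale coordinate
shift_m = northing_ft * (US_SURVEY_FOOT - INTERNATIONAL_FOOT)
print(f"{shift_m:.2f} m")      # roughly a 20 ft position error
```

Which is exactly why ‘which foot?’ matters in Connor’s 400 ft NAD 27/NAD 83 example: datum shifts and unit definitions compound silently unless the software is, as she puts it, geodetically aware.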
Ebb Pye’s Visualization Theatre was the venue for a presentation of Shell’s 123DI interpretation and visualization system running on Linux. 123DI has been extended to display CAD/CAM type models of production facilities—here an FPSO. The idea is to extend the prototype VR display to use on smart fields.
The SEG continues to be a showcase for high performance computing—for number crunching, visualization and storage. Sun and ModViz are working on sharing graphics computation across multiple machines. HP is offering ‘PolyServe’ technology used to run large jobs across multiple OSs and hardware. Everyone is offering ‘computing on demand’ (see page 12 of this issue). But, so far, only FineTooth is actually offering computing on GPUs (see below) although just about everyone is trialing this technology.
IBM was showing its ‘Wireless Oilfield’ connecting operations to corporate systems. ‘Intrinsically safe’ field terminals from IS-Mobile check RFID tags and support real time dialog with operators. We had a sense of déjà vu, checking back to last year’s SEG we noted a similar demo. But while last year the infrastructure was ‘ICE 2 and web services’ this time it’s ‘IBM’s MQ Series’. Whatever. Halliburton is offering a Geographix hardware and software bundle for $799 per month. SGI was showing a spectacular, 9 megapixel Sony 4000 Cinema Projector. This uses the new Digital Cinema Standard and ‘Media Fusion’ of multi mode visual streams from supercomputer, workstations or Windows.
Finally, some numbers we spotted on the Saudi Aramco booth. At the Dhahran EXPEC Technical Computing Center Aramco has 98 SGI CPUs, 5980 PCs, 192 IBM, 3.6 TB tape, 1.2 PB disk and 10 TFLOPS compute. Elsewhere on the booth, Aramco’s drilling department forecasts a dramatic activity hike. In 2004, 57 rigs were in operation, this year 95 are planned and in 2006, 115. The historical record is 55 in 1981.
This article is taken from a longer illustrated report in The Data Room’s Technology Watch series. More from email@example.com.
Following SMT’s use of Graphics Processing Units (GPU) to perform fluid flow modeling (OITJ Vol. 10 N° 8), Boston-based FineTooth was showing seismic processing and data compression on a bespoke NVidia-based system from Panta Systems of Cupertino, CA at the SEG. The hardware included 16 high-end NVidia Quadro 4500 graphics cards with Infiniband interconnect and storage for a 110GB prestack dataset. FineTooth’s software is being developed to decompress on the fly for visualization of such massive prestack datasets. Later work will extend the use of GPUs to seismic processing. The potential is significant. Today’s Intel Xeon processors have a theoretical compute capacity of 6 GFLOPS, the Quadro, 165 GFLOPS.
FineTooth CEO Diderich Buch told Oil IT Journal, ‘Current interpretation software doesn’t understand prestack data, in part because the data volumes are such a challenge. We plan to let interpretive processors interact with prestack data by leveraging GPU-based processing. This will provide access to multiple versions of processed data using data compression to provide rapid access. Our real time compression technology targets QC and parameter selection, bringing processing and interpretation together.’ FineTooth is targeting the major processing houses and ‘any company doing prestack interpretation’. Buch is also CEO of Norwegian Hue Space, a technology supplier for Schlumberger’s GigaViz package.
Houston-based Cal Dive has acquired Helix RDS of Aberdeen for $31 million. Helix has 180 employees and offers reservoir and well services. Cal Dive plans to use the acquisition to offer its production contracting model in the North Sea and elsewhere.
David Archer is to stand down as CEO of the Petrotechnical Open Standards Consortium, following a divergence of strategy views with the POSC board. A search is on for a replacement. CTO Alan Doniger is interim CEO.
Petrosys has appointed Franck Lemaire as its EU business development manager based in Paris. Lemaire was previously with Dynamic Graphics.
Calgary-based Divestco has acquired Laser Software Ltd. developers of the LandRite land management package. Divestco also acquired Focus Integrated Solutions, which provides management and technical services in CRM and ERP.
Schlumberger Limited is to relocate its US corporate office from New York to Houston in the second half of 2006.
BEA Systems has acquired Plumtree, the portal company, in a $204 million deal. BEA’s WebLogic unit is to offer ‘blended’ open source and proprietary solutions built around the Apache Tomcat Java application server.
Andy Barden is manager of BJ Process and Pipeline Services’ new Houston-located ‘global business development group.’
Energy Solutions has named Roderick J. Hayslett as CFO.
Quorum Business Solutions has hired Joseph “Troy” Turcott to lead its Pipeline Integrity effort.
Norsk Hydro’s VC arm, Technology Ventures, has acquired a NOK 20 million stake in Zurich-based Spectraseis Technologie AG.
Input/Output has appointed Jim Hollis VP New Ventures in its new FireFly division. Firefly, rolled out at the SEG, is a wireless seismic data acquisition system.
Data storage specialist Iron Mountain has acquired the Australian operations of Pickfords Records Management. Bob Miller is to manage the new division.
In two internal promotions, TietoEnator has named Pentti Heikkinen as President and CEO and Matti Lehti as Chairman.
Knowledge Reservoir and Concessions International are to team on a study of undeveloped Asian discoveries.
Chevron’s CTTV VC investment arm and Contango Capital have taken a stake in real-time wireless provider Moblize.
Denby Auble has joined Geotrace as ‘manager of innovations.’ Before joining Geotrace, Auble was with Western Geophysical.
Veritas DGC has appointed Vincent Thielen as VP, Business Development and Dennis Baldwin as VP, Corporate Controller. Thielen’s was an internal appointment. Baldwin joined Veritas last June from Universal Compression Holdings. Veritas also announced the retirement of Director and Vice Chairman, Stephen J. Ludlow. Ludlow began his career with Digicon in 1971 as a member of a marine seismic acquisition crew.
Deloitte Petroleum Services has appointed Edward Woodhouse as GIS developer to support and develop both the MapInfo and ArcGIS versions of PetroView.
Colin Howard (Zeehelden Geoservices) is to represent 3DGeo in the EU and Africa.
SpectrumData has appointed Simon Inglis as General Manager.
Ex-Seitel CEO Paul Frame received a five year jail sentence for taking $750,000 from the company.
Halliburton’s Energy Services Group (ESG) posted record revenue of $2.6 billion in the third quarter of 2005, a $489 million or 23% increase over the third quarter of 2004. ESG also posted record operating income of $566 million, up $152 million or 37% from the same period in the prior year.
An IPO brought UK-based potential field specialists Getech £3.2 million, valuing the company at £10.8 million.
Reporting on a $26 million operating loss for Q1 2006, SGI CEO Bob Bishop said, ‘We achieved several important goals in the first quarter. We grew revenue from core products year over year, exceeded our margin targets, secured new financing and are on-track with our restructuring plans.’ SGI completed a $100 million ‘asset based credit facility’ from Wells Fargo Foothill and Ableco Finance.
Kelman Technologies announced a $47,000 net loss for Q3 2005, compared with Q3 2004. Canadian and US seismic processing showed significant improvement while the international and data divisions lagged.
Input/Output reported Q3 2005 net income of $1.4 million, on revenues of $82.7 million, compared to a net loss of $5.0 million on revenues of $80.9 million for the same period a year ago. I/O CEO Bob Peebler said, ‘The gradual improvement we anticipated throughout the year remains on track and will continue into Q3 despite the challenges that hurricanes Katrina and Rita presented to our operations.’
Computer Modelling Group’s (CMG) second quarter results for its 2006 fiscal year showed revenues of $3.8 million (CDN) up from $3.1 million for the same period last year. CMG president Ken Dedeluk said, ‘Underlying factors supporting increased demand for reservoir simulation exist and we are positioned for future growth opportunities.’
Aspen Technology’s Q1 2006 saw total revenue of $60.1 million (40% from software). CEO Mark Fusco said, ‘In the past three quarters, we have improved our services margins, eliminated our convertible debt, and created an infrastructure that can deliver improved performance over the long-term.’
MRO Software reported record Q4 2005 revenues at $55.4 million, up 11%, with software license revenues of $20.9 million. Fourth quarter clients include BP Oil, CNOOC, Kuwait Drilling Company, and the US Department of Energy.
ADI pioneers Douglas, Peaceman and Rachford.
A meeting at Rice University, Houston celebrated the 50th anniversary of the publication of a seminal paper* on the alternating direction implicit method (ADI) for solving differential equations. The authors, Jim Douglas, Don Peaceman and Henry Rachford, working for the Humble Oil Co (later Exxon) were among the first to use computers to model oil reservoirs.
Their method’s ingenuity lay in the way it adapted a multidimensional problem to a sequence of one dimensional steps that fitted into the extremely limited memory of the IBM calculator. As described in Paul Davis’ excellent review of ADI development**, the method involved punching intermediate data onto cards backwards and upside down. These were turned over and read in backwards, reversing the direction of computation.
Bill Watts (Exxon) described ADI as the best available method until the late 1960s, when strongly implicit procedures, LSOR and Krylov subspace methods ‘changed the world of solving matrix equations’. Now, Krylov, ILU, nested factorization, line SOR and multigrid methods are used. But for today’s large, complex reservoir models, ‘We are still feeling our way,’ seeking a balance between compute efficiency, algorithm stability and accuracy.
Henry Rachford, who now works for BG spin-off Advantica, told Oil IT Journal more about the hardware used. ‘The IBM Card Programmed Calculator (CPC) was a cobbled-together collection of existing IBM equipment including an accounting machine printer and card reader (I think it was a 407***), and a relative of the 605 calculating punch, the system’s plug-wired ‘CPU’. It had three storage units which we called ‘iceboxes,’ because once stored, data couldn’t be retrieved in under 400ms, so it got pretty ‘cold’. Each stored a collection of 16 signed 10-decimal digit numbers. Three iceboxes stored 48 10-digit numbers, which we formatted as a 2-digit exponent and 8-digit mantissa. The 407 provided storage for an additional 8 internal 10-digit decimal numbers for a total of 56. This limited the first test of ADI on the CPC to a 14 x 14 grid. Don Peaceman and I did all the coding which turned these accounting machines into card-instruction-interpreters for performing somewhat restricted floating point calculations including log and exponentials, but no trig. Insofar as I know our particular version of this computing system was never made available outside our office.’
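The ingenuity Davis describes — trading one multidimensional solve for a sequence of one dimensional sweeps — can be sketched in a few lines of modern code. Below is an illustrative Peaceman-Rachford step for the 2D heat equation (not the 1955 code): each half time-step is implicit in one direction only, so only one dimensional systems are ever solved. Dense solves stand in for the tridiagonal (Thomas) algorithm the splitting makes possible, and the test grid is the 14 x 14 of the original CPC run.

```python
import numpy as np

# Peaceman-Rachford ADI for u_t = u_xx + u_yy with zero Dirichlet
# boundaries -- an illustrative sketch, not the original implementation.
def d2(k):
    # 1D second-difference operator on k interior points
    return -2.0 * np.eye(k) + np.eye(k, k=1) + np.eye(k, k=-1)

def adi_step(u, r):
    # One full time step; r = dt / (2 h^2), u holds interior grid values
    n, m = u.shape
    ax = np.eye(n) - r * d2(n)     # implicit operator along x
    ay = np.eye(m) - r * d2(m)     # implicit operator along y
    # Half-step 1: implicit in x (rows), explicit in y (columns)
    u_half = np.linalg.solve(ax, u + r * (u @ d2(m)))
    # Half-step 2: implicit in y, explicit in x
    return np.linalg.solve(ay, (u_half + r * (d2(n) @ u_half)).T).T

# The first CPC test of ADI ran on a 14 x 14 grid
u = np.zeros((14, 14))
u[5:9, 5:9] = 1.0                  # a hot patch in the interior
u_next = adi_step(u, 0.25)
```

The splitting is unconditionally stable: the heat energy decays at every step regardless of the time step chosen, which is what made the method viable on such limited hardware.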
* Journal of the Society for Industrial and Applied Mathematics, Vol. 3 (1955), pages 28-41.
** SIAM News, Vol. 26, No. 4, July 1993.
Following the renewed certification of Landmark’s customer support center under the Support Center Practices* (SCP) program, Beverly Stafford, Director of Global Support and Training, told Oil IT Journal how it had achieved its 4th Global Certificate. ‘The SCP program helps us improve customer satisfaction by measuring response and resolution times. The program helps us fine tune customer support with automated techniques.’
Landmark uses an automatic call distribution (ACD) package, ‘Apropos**’ to ensure clients are routed to qualified analysts—a.k.a. ‘skills-based routing’. A call menu asks users for a personal identification number and details on the product and nature of their problem and the software pulls up appropriate subject matter experts with time zone availability and the appropriate language skills.
This is a heavy duty solution mobilizing some 180 Landmark employees in its technical applications centers (TAC) in North America, Latin America, Asia Pacific and EAME. The Apropos package manages call inflow and ensures round the clock coverage. The software also monitors response times which, along with other metrics, are shared with clients to demonstrate service level agreements.
Customer support dovetails with the research department so that bug reports (Landmark prefers to speak of incidents) get passed on to its PeopleSoft toolkit – used to track incidents internally. The system handles 80-100,000 calls per year.
* The SCP program is managed by Services Strategies – www.servicestrategies.com.
** More from www.apropos.com.
Stone Bond Technologies (SBT) has just launched its new Enterprise Enabler (EE) Server 2006 integration tool for monitoring IT processes and data workflows. EE combines SBT’s Enterprise Application Integration (EAI) and Extract Transform Load (ETL) packages in a single environment. EE provides ‘connected’ data management, real-time decision support and governance over the changing technical environment.
SBT CIO Pam Szabo said, ‘EE has many-to-many interface-building capabilities to handle mixed relational and hierarchical formats. Our AppComm technology provides high speed native connectivity to proprietary formats, eliminating the need for adapters or central data staging. EE Server 2006 is the ‘Swiss Army knife’ of systems integration.’
Earlier work by SBT leveraged EE connectivity to extract production data from PDVSA’s OSIsoft PI SCADA database, said to be the largest PI database in the world.
Ikon Science has released a new pore pressure module for well planning. RokDoc PPC (pore pressure prediction calculator), was developed in collaboration with Shell International E&P’s Houston R&D center. RokDoc-PPC models and predicts oil or gas reservoir pressure. Shell uses the technology for both technical training and global E&P project delivery.
Shell petrophysicist Mark Kittridge said, ‘Our role in technology development is in identifying best practices for use by Shell operating companies worldwide. Our collaboration with Ikon Science leveraged their development expertise and the RokDoc platform to rapidly deploy a new design concept and technology worldwide.’
Ikon MD Martyn Millwood Hargrave added, ‘This work is a win for Shell in ease of use and deployment and a win for Ikon in developing a leading product that we can roll out to an industry that is hungry for new and better technologies.’ RokDoc investors include Tullow Oil and Shell Technology Ventures.
Houston-based Geomodeling Technology Corp. has released V5.0 of its VisualVoxAt (VVA) seismic volume interpretation package. The package now includes ‘real-time’ cross-plotting to compare attributes and data sets. Cross plotting supports seismic section, volume, horizon, interval, strata-grid and well data displays. Users can digitize polygons on cross-plots and identify corresponding features in 2D or 3D seismic, well or map views. An AOI can be selected in seismic or map view and the data studied in the cross-plot window.
Spectral decomposition now includes Continuous Wavelet Transform (CWT) algorithms. Other new features include improved detection of thin beds, lateral discontinuities and subtle anomalies. CWT and Fast Fourier Transform methods can be applied to both 3D and 2D data sets for more comprehensive analysis.
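A CWT-based spectral decomposition of the kind VVA now offers can be sketched as follows — a minimal complex Morlet transform of a synthetic trace, not Geomodeling’s implementation. Each output row is the trace’s response at one analysis frequency, so a thin bed or tuning anomaly shows up as a localized bright spot in the time-frequency map.

```python
import numpy as np

# Minimal continuous wavelet transform (complex Morlet) of a seismic
# trace -- an illustrative sketch, not Geomodeling's implementation.
def cwt_morlet(trace, dt, freqs, w0=6.0):
    n = len(trace)
    tau = np.arange(-(n // 2), n - n // 2) * dt   # wavelet time axis
    out = np.empty((len(freqs), n))
    for k, f in enumerate(freqs):
        s = w0 / (2 * np.pi * f)                  # scale for centre frequency f
        psi = np.exp(1j * w0 * tau / s) * np.exp(-(tau / s) ** 2 / 2) / np.sqrt(s)
        # correlate the trace with the conjugate wavelet at this scale
        out[k] = np.abs(np.convolve(trace, np.conj(psi[::-1]), mode='same')) * dt
    return out

dt = 0.002                                        # 2 ms sampling
t = np.arange(512) * dt
trace = np.sin(2 * np.pi * 30.0 * t)              # synthetic 30 Hz 'trace'
freqs = np.arange(10.0, 61.0, 5.0)                # 10-60 Hz analysis band
scalogram = cwt_morlet(trace, dt, freqs)
```

For the pure 30 Hz input, the scalogram’s energy peaks at the 30 Hz analysis frequency, which is the property spectral decomposition exploits to map bed thickness.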
Other enhancements to the software include improved data management displays for more efficient handling of well log data and formation tops. Seismic waveform correlations and facies classification can be performed and grid balancing and mistie analysis aids jump correlation.
Speaking at OpenText’s LiveLinkUp event in Orlando this month, Sasol’s Eric Slaghuis described deployment of Open Text’s Internal Controls Solution (ICS) to ensure compliance with Section 404 of the U.S.’s Sarbanes-Oxley Act (SOX). SOX compliance is mandatory for non-U.S. firms trading on the New York Stock Exchange. South African Sasol deployed OpenText ICS as an extension of its existing Livelink ECM solution, Sasol’s collaboration and content management platform (OITJ Vol. 9 N° 4).
Slaghuis said, ‘We established an early timeline for Sarbanes-Oxley requirements to ensure we would be ready well in advance of our compliance deadline. Much depended on getting the right software in place. We evaluated several corporate governance offerings, but chose ICS to fit into our existing IT environment to speed deployment. We’re pleased to say that we were successful in documenting all financial reporting controls well in time. This enabled us to initiate a robust self-assessment business process in order to prepare for our first filing in 2006.’
ICS integrates with Livelink ECM’s notifications and assignments to inform compliance officers of questionable findings, and launch the required processes to address and resolve potential risks, issues and control tests.
Morgantown, WV-based Intelligent Solutions (IS) has released a new package for production data analysis, IPDA. The software integrates decline curve analysis, type curve matching and reservoir simulation through history matching to converge on a set of reservoir parameters that satisfy all input data.
No data, no problem!
IS president Shahab Mohaghegh, who is professor of Petroleum Engineering at West Virginia University, said the new package is particularly suited to partial data sets, ‘Even if you have no pressure data, IPDA will work from monthly production data. IS’s fuzzy pattern recognition maps the results of an analysis over the entire field, showing 3D volumes of permeability, depletion and remaining reserves.’
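Decline curve analysis from monthly rate data alone — the no-pressure-data case Mohaghegh describes — classically starts from an Arps model. A minimal sketch, fitting an exponential decline q(t) = qi·exp(−Dt) by least squares on log rates (illustrative only, not IPDA’s algorithm; the well data are synthetic):

```python
import numpy as np

# Arps exponential decline q(t) = qi * exp(-D t) fitted to monthly
# rates by linear least squares on log(q) -- an illustrative sketch,
# not IPDA's actual algorithm.
def fit_exponential_decline(t_months, rates):
    # log(q) = log(qi) - D t is linear, so a degree-1 polyfit suffices
    slope, intercept = np.polyfit(t_months, np.log(rates), 1)
    return np.exp(intercept), -slope              # initial rate qi, decline D

t = np.arange(24.0)                               # two years of monthly data
q = 1000.0 * np.exp(-0.05 * t)                    # synthetic well: qi=1000, D=0.05/month
qi, D = fit_exponential_decline(t, q)
```

From qi and D, remaining reserves to an economic limit rate follow analytically — the kind of result IPDA cross-checks against type curves and simulation.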
IS will soon be adding ‘Intelligent Surrogate Modeling and Analysis’, ISMA to its product line, bringing artificial intelligence to the smart field and removing the bottle-necks that exist between high frequency data streams and slower analytical techniques like reservoir simulation. ISMA will include real-time reservoir simulation, surveillance, well integrity monitoring, decision support and process optimization.
Calgary-based e-business software house Digital Oilfield has just introduced OpenInvoice Remote (OIR), a new application that provides users with the ability to download invoices from Digital’s hosted, Internet-based invoice processing system and work on them offline. OIR lets workers review and approve invoices on a stand-alone PC before uploading them to the online system.
Digital Oilfield president Rod Munro said, ‘Field users don’t have access to the Internet on a continuous basis. OIR puts them in charge of when they fit the invoice approval process into their day.’ OIR is a component of OpenInvoice Suite, an electronic invoicing solution for the oil and gas industry that automates paper-driven processes between suppliers and operating companies.
Digital Oilfield has just signed Calgary-based Pengrowth Energy Trust as its latest OpenInvoice client. Pengrowth CIO Clay Radu said, ‘We chose OpenInvoice following six months of discovery and analysis. Now all incoming invoices, whether paper or electronic are processed the same way. We are handling less paper and reducing the staff time it takes to handle invoices.’
Knowledge Systems (KS) has released Pressworks, a relational database for geopressure data that integrates with its Drillworks geopressure and geomechanics application. Pressworks brings together diverse pressure-related data from many sources of well and reservoir information.
KS CEO Jim Bridges said, ‘By making geopressure information available to engineers involved in the planning and real-time management of drilling operations, the Pressworks database streamlines the workflow and enhances asset team efficiency. Pressworks provides a corporate repository for pressure data, eliminates cumbersome paper plots and text files to improve data usability and reliability, while improving multi-disciplinary collaboration and well planning.’
Murphy Exploration and Production Company has licensed P2 Energy Solutions’ (P2ES) Tobin SuperBase products, including its land survey file, well header file and culture file. SuperBase is a suite of digital mapping databases targeting the energy industry. The databases are used in a wide variety of applications including energy-related ERP and work with most mapping, CAD and GIS software. P2ES claims SuperBase to be the most comprehensive, continuous high-resolution map base data coverage available for the United States.
Ron Chassaniol, Business Development Manager for Murphy, said, ‘SuperBase gives us access to comprehensive high-quality data that will help us expand our domestic exploration.’ P2ES is adding data in new areas as high-resolution satellite and aerial imagery becomes available. Key features of the program include the refinement, expansion and enhancement of its data products through application of mapping sciences and new technology and the addition of a comprehensive metadata set. P2ES also integrates private and public survey data and offers other services including database synchronization, digitizing and format enhancements and digital reporting of changes made.
Outsourced ‘computing on demand’ seems to be taking off as witnessed by recent announcements from several IT hardware vendors. HP’s Flexible Computing Services offering has been taken up by Schlumberger to provide reservoir simulation customers with the extra compute cycles and improved performance they need to run large numbers of reservoir simulations. The ‘public utility’ computing model, trialed with HP’s EU clients, offers peak shaving for other heavy users of CPU power such as seismic processors and software developers.
Sun Microsystems’ partner Virtual Compute Corporation (VCC), has just signed an agreement with Paradigm for the provision of CPU cycles on the Sun Grid Compute Utility. The deal offers Paradigm quick access to thousands of Sun Fire V20z Opteron-based servers. VCC resells Sun’s Grid to the oil and gas vertical. Sun has 2,000 CPUs in New York and another 3,000 in London serving the financial services sector and is working with energy companies to establish a presence in Houston.
Appro partnered with CyrusOne to open its Compute on Demand Center (CODC) in Houston this month. The CODC is claimed to be one of the few ‘top tier’ data centers in the country engineered specifically to address grid computing.
Finally, IBM’s ‘deep computing on demand’ is still available from its Houston-based data center opened last year. IBM grid clients include SINOPEC, El Paso, PGS, Paradigm (again) and Landmark.
Shell is to deploy ModViz’ Virtual Graphics Platform (VGP) to accelerate its 123DI seismic interpretation application. VGP allows OpenGL-based applications such as 123DI to leverage multiple graphic processors in a single workstation or cluster to enhance 3D graphic performance. 123DI is utilized throughout Shell on the desktop, in collaborative ‘Team Rooms’ and Visualization Centers.
Tom Coull, ModViz CEO, said ‘Seismic interpretation produces some of the most demanding 3D visualization needs in any industry which is why our VGP technology has attracted so much attention. VGP allows users to interact with very large 3D data sets without the need for data pre-processing or decimation.’ VGP is an OpenGL-based computing platform for ‘supercomputing level’ visualization of large data sets on clusters of commodity-based 3D graphics computing nodes.
Intervera Data Solutions is to use the OpenSpirit integration platform to offer their Data HealthCheck users increased integration with third-party geoscience applications and data stores. OpenSpirit now supports over 40 client applications from 30 developers, providing access to ‘a dozen’ data stores.
Intervera president Paul Gregory said, ‘E&P technology managers see Data HealthCheck as a smart investment because it helps them reduce the overall business risk caused by poor data across their networks. OpenSpirit, the leader in vendor-neutral integration, enables applications to interoperate and access data from diverse projects.’ Intervera’s data quality profiling will leverage OpenSpirit to access and QC data from multiple vendors. OpenSpirit CEO Dan Piette added, ‘Intervera inspires end-user confidence through simple, smart solutions that provide effortless data quality assurance.’
ExxonMobil Research and Engineering Company has signed a ‘perpetual license agreement’ with Invensys’ SimSci-Esscor unit for the deployment of its dynamic simulation software, DYNSIM, in ExxonMobil’s worldwide downstream affiliates. ExxonMobil has also nominated Invensys as its preferred operator training simulator provider and extended its long-term license agreement for the provision of SimSci-Esscor’s process simulation, heat transfer, and flow network modeling software.
SimSci-Esscor VP Alastair Fraser said, ‘We are pleased that ExxonMobil has extended its relationship with Invensys as its provider of steady state and dynamic process simulation solutions leveraging the SIM4ME common modeling environment. We are also pleased that ExxonMobil has adopted our solutions for process design, operator training and real-time optimization. All of which will allow ExxonMobil to reap the benefits of a consistent modeling framework throughout the plant lifecycle.’