Sun Microsystems made a significant move this month by embracing the Open Source (free) Linux operating system on its ‘low-end’ servers. Sun is to commit new resources to the ongoing development of the Open Source operating system. Sun will ship a full implementation of Linux on a new line of general-purpose servers aimed at workgroups and remote offices.
Intel x86 hardware
New single and multiprocessor systems, to be announced mid-year, will abandon Sun’s proprietary SPARC architecture in favor of Intel’s x86 processors. Sun’s line of Cobalt Linux appliances (such as pre-configured web servers) will be expanded. Cobalt entry-level prices are around $1,000 and an installed base of over 100,000 units is claimed.
Sun plans to participate more aggressively in the Linux developer community by contributing key components of its Solaris operating system, and by releasing tools to help ensure compatibility between Solaris and Linux.
Solaris on high-end
Sun’s high-end machines will remain on Solaris – for the time being – but with significant new Linux leanings. A Linux Compatibility Toolkit (LinCAT) will ensure that Linux applications run on Sun Fire servers. Sun’s upcoming Solaris 9 environment will provide additional built-in Linux commands, utilities and interfaces.
Sun ONE technologies will be offered on Linux, including the iPlanet directory and web servers, Forte for Java development tools, the Java/XML platform, Project JXTA, StarOffice, Chili!Soft ASP support, and the Sun Grid Engine. Sun, along with IBM, is already one of the largest providers of intellectual property to the Open Source development effort with contributions to OpenOffice.org, GNOME.org, Mozilla.org, Apache.org, NetBeans.org, X.org, WBEMsource Initiative, the University of Michigan NFS version 4 Linux port, the Grid Engine Project, and Project JXTA.
Sun’s Linux announcement was well received by the markets. Merrill Lynch commented that the move would ‘help Sun fight off increasing competition.’ In this special Open Source issue of Oil IT Journal we explore the extraordinary growth of the free software movement.
In a move reminiscent of Landmark’s acquisition of Magic Earth last year, Schlumberger Information Solutions (SIS) has bought 3D virtual reality (VR) specialist Inside Reality (IR). IR’s technology was developed in conjunction with Norsk Hydro and Christian Michelsen Research, a Norwegian institute.
IR’s ‘Inside Earth’ VR system offers an immersive environment for well planning, geosteering and seismic interpretation. IR goes further than Magic Earth in the use of a fully immersive VR paradigm. Users interact with the reservoir through natural hand and body movements, such as walking, pointing and grabbing. Hydro claims a 90% reduction in turnaround time for planning of horizontal wells and ‘significant increases’ in oil production over conventionally-planned wells.
Hydro VP Jens Hagen said, “We plan most of our complex wells within IR which has had a major impact on the economics of field development. This has fueled the rapid uptake of the technology, with over 150 users in Hydro.” SIS will deploy IR in its iCenter VR visualization centers worldwide. IR connects to project databases via the OpenSpirit integration framework.
The dot-com debacle has taught us that the Internet is not a one-way bet on instant wealth. But it has proved a great way of driving costs out of software development. How much would you expect to pay for an operating system today? Or software to run your web server, or mail application? The correct answer is nothing. All these are available for free thanks to the Open Source movement.
Let’s get the politics of Open Source out of the way first. There are two camps of software developers. One, the Open Source brigade, believes that software grows incrementally - that all software today builds on a mass of pre-existing work, most of which is in the public domain. To the Open Source camp, the act of patenting and selling such software is evil! On the other hand, the commercial world sees software as a source of competitive advantage and guards jealously what are sometimes quite trivial inventions. The most glaring example in recent times is Amazon’s infamous patent on its ‘one-click’ ordering ‘invention’. The politics of Open Source are much more complex than this. If you want to learn more, read the excellent book Rebel Code*.
While the politics of Open Source are exciting, they would be trivial if the Open Source movement, aided by the Internet (which it collectively almost invented) had not come up with an unstoppably good way of developing quality software. Let me explain with an analogy. Suppose that you could harness all the effort that folks around the world put into doing their morning crossword puzzles. Imagine the enthusiasm and effort that a massive world-wide crossword puzzle would generate! If you could only think up an appropriate ‘puzzle’, the internet offers you unlimited free ‘labor’.
The ‘perfect’ Operating System
One of the ‘puzzles’ that crystallized the Open Source community was the building of the perfect operating system. In the 1970s, work done on what was collectively known as Unix pretty well laid down the rules for what a computer operating system is and how it should behave - its ‘interface’. The basic notions of good software design were established then - software should be modular - a good program does little and does it well (in stark contrast to the way most commercial applications work). But better than this, the Unix movement achieved a large consensus on what these modular building blocks should be. Anyone who has used one of the Unix shells will appreciate how elegant this design is, and how the system of files and pipes makes it easy to leverage the power of modularity. The basic Unix specification was crystallized in Posix - a blueprint of the Unix interface - but without the code that made it work.
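To see this pipe-and-filter modularity in action, consider a classic shell one-liner (a generic illustration, not from the Unix specification itself) that chains half a dozen single-purpose tools into a word-frequency counter:

```shell
# Create a small sample file (a stand-in for any text document).
printf 'The cat and the hat and the bat\n' > input.txt

# Report the most frequent words, built entirely from small
# single-purpose tools connected by pipes.
tr -cs 'A-Za-z' '\n' < input.txt |  # split text into one word per line
tr 'A-Z' 'a-z' |                    # normalize to lower case
sort |                              # bring identical words together
uniq -c |                           # count each distinct word
sort -rn |                          # order by descending count
head -10                            # keep the ten most frequent
```

No stage knows anything about the others; each reads a stream, does one small job well, and writes a stream. Any stage can be replaced or reused independently - exactly the design discipline described above.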
Posix - the code-less Unix - was the ‘puzzle’ that set the Open Source movement in motion. Posix, and a Finn called Linus Torvalds, who started work on what was to become Linux over a decade ago. Torvalds began posting the components of Linux on the Internet, and a community of like-minded puzzle-solvers soon grew up around him. What made the whole program so successful was the common purpose of the puzzlers and the modular design of Unix. It was possible to develop, publish and test code designed to do one particular thing, without side effects cropping up in other parts of the system. (If you have ever programmed with non-modular systems - such as those using Fortran ‘common’ storage - or even if you write your programs with lots of ‘global’ variables, you will know about these pitfalls).
The public inspection that the Internet brought to Torvalds’ ‘work in progress’ has led to impeccable design. If your code is buggy, hundreds of ‘puzzlers’ pounce and fix it - all for the love of their art! This built-in quality control has led to whole chunks of programming expertise migrating to the open source paradigm. Most of the code that underpins the internet is Open Source - notably the highly successful Apache web server. Many key tools used in modern software development (compilers and scripting languages) are equally developed under constant public scrutiny. Amazingly, the software that is used in encryption is largely Open Source. The rationale here is that the chance of a hacker benefiting from intimate knowledge of the open technology is actually less of a risk than flawed software being developed in secrecy!
Open Source software set a real cat amongst the pigeons in the commercial software world. But some savvy operators have learned to cohabit with, and profit from, the movement. IBM was one of the first, throwing out its proprietary web server and rolling Apache into its WebSphere offering. Many others have followed - most recently Sun, as witnessed by this month’s lead. Noting the market success of Linux, many of the original Open Source developers have watered down their anti-commercial stance and founded Linux-related startups - notably Red Hat, which offers a ‘professional’ version of Linux. In the other corner, Microsoft has blown hot and cold about opening up Windows code to public inspection, on the one hand mooting a move to totally closed networking to squeeze out Linux and on the other, offering a peek at the Windows source to major clients.
As Linux goes commercial there is a risk that it will encounter the same sort of competing implementations that caused its Unix progenitors so much strife. Doubts have been expressed on the long-term viability of the Open Source paradigm - with coders ‘growing up’ and tempted into ‘real jobs’. Personally, I reckon that there will be a good supply of puzzlers waiting to code for free for a while. The impact on the IT business, on corporations and end-users is harder to figure.
*Rebel Code by Glyn Moody. Penguin Books 2002. ISBN 014 029804 5.
With the availability of broadband communications, a new delivery model for spatial data has emerged. By packaging and integrating location-based services, the power of the Geographical Information System (GIS) can be delivered over the internet. The technology that underpins these new capabilities is Application Service Provision - ASP.
ASP allows end users to access sophisticated software running on a remote server from any web browser. A good example of ASP is Microsoft’s Hotmail - a fast, easy to learn and effective way to manage email with little or no software to install and manage on your local machine.
Web-based technologies from leading GIS vendors along with third party ‘thin client’ (Internet Browser) solutions from Citrix and Tarantella make spatial ASP a reality. Spatial ASP solutions include www.MapQuest.com, National Geographic’s MapMachine and www.Expedia.com. In industry, thin client solutions serve GIS-based applications to geographically dispersed field engineers and offices. Software installation and maintenance are centralized, easing upkeep and avoiding the problems of conflicting and ‘unauthorized’ versions of data.
Autodesk has just launched its Location Services division, with three core components relating to the provision of spatial GIS services - Mapguide Commerce, OnSite and LocationLogic. ESRI leads in spatial ASP with its flagship Internet Map Server (IMS). GE Smallworld’s Internet Application Server (IAS) is designed to ASP-enable Smallworld desktop products such as Model.it. A leader in Asset and Facilities Management GIS and mapping, Intergraph has extended its GIS products to the internet with the GeoMedia products and services along with its wireless location services IntelliWhere unit. MapInfo delivers its cutting-edge MapXtreme application into the telecommunications and business intelligence markets.
As with any major software deployment a variety of issues must be considered including internal or outsourced provision, service level agreements, quality of service and security. While these are far from trivial concerns, by far the most important component of the spatial GIS is content! Without content to populate databases, or to conduct analysis, spatial ASP holds little intrinsic value. This point is well understood by data product vendors who play a key role with both spatial ASP vendors and end users.
Despite the considerable merits and advantages of spatial ASP there are obstacles to overcome before its true benefits are realized. Traditional software licensing agreements are not aligned with the ASP business model. ASP GIS will spawn a large number of new users throughout the organization - unless such growth is stunted by traditional licensing models. ASP demands a true pay-as-you-go licensing model that generates income for the ASP, the data vendor, the ISP and other stakeholders.
Growing support from the GIS industry demonstrates the strategic interest of the ASP model for consumers and vendors. Service providers continue to find new and innovative ways to build and integrate various location-based solutions and to deliver cost-effective products to a variety of industries. There is still work to be done to propel spatial ASP beyond the early adopter community. A more favorable licensing model and more analytical application functionality will help deliver the ASP value. But the value of spatial ASP is now a reality.
This is a shortened version of an article that originally appeared in online GIS resource Directions Magazine - www.directionsmag.com.
Gnome Office aims to produce a productivity suite composed of entirely free software. Gnumeric is the spreadsheet/math intended ‘to replace commercial spreadsheets.’ We tried AbiWord - pretty neat stuff! More from www.gnome.org.
Gstat performs multidimensional geostatistical modeling, prediction and simulation in one, two, or three dimensions. Options include kriging and cokriging. Gstat cooperates with GRASS GIS (see below). Download your own trial from www.gstat.org.
GMT is an open source collection of around 60 UNIX tools for manipulating 2 and 3 dimensional data sets. GMT functions include filtering, trend fitting, gridding and projections. More from
The Geographic Resources Analysis Support System (GRASS - www.geog.uni-hannover.de/grass) is an open source Geographical Information System (GIS) with raster, topological vector, image processing, and graphics production functionality.
Seismic Unix (SU) is free software for seismic processing from the Colorado School of Mines. SU is distributed with the full source code, so that users can alter and extend its capabilities. Download your code from www.cwp.mines.edu
Open CasCade is an Open Source tool for 3D modeling. Its fields of application range from mechanical CAD and analysis to architectural engineering and geographical information systems (GIS). More from www.opencascade.org and www.opencascade.com.
If you fancy checking out the most-used web server in the world - download Apache - either the source code or a Windows binary from www.apache.org.
MySQL - a free database - is typically deployed alongside PHP - a web scripting environment - notably on the www.oilit.com website. More from www.php.org and www.mysql.org.
IHS Energy Group has released Drilling Wire on the Web (DWW), tracking all current U.S. active drilling and completions within hours of release. DWW delivers information culled from IHS Energy’s nationwide network of scouts and news sources offering subscribers information on new permits, staked locations, drilling wells and new completions.
DWW was previously available in hard copy. The online version permits interactive search on criteria such as completion date, operator, depth and location. A personalized report generator lets users define an area of interest (AOI) such as the drilling progress of wells that meet certain criteria. Once the AOI has been defined, users can be notified by email of new activity, eliminating the need for continuous monitoring.
News on demand
Also included with the subscription is the Energy News on Demand service, which offers in-depth editorial news, lease-sales data, regulatory activity, legislation, and oil and gas prices updated every 15 minutes.
DWW product manager Steve Trammel said, “Customers who previously received our hard copy reports are enthusiastic about the new Web-based product. All employees now have immediate access to critical information, much of it updated daily. DWW offers a powerful, discriminating search engine, eliminating the hassle of keeping up with hard copy reports. It doesn't take long for customers to realize the value and cost savings of our new, Web-based product.” More from www.ihsenergy.com.
The Network of Excellence in Training (NExT), a joint venture between Schlumberger, Heriot-Watt, Texas A&M and Oklahoma universities, has announced its new Petroleum Economics ‘e-Learning’ computer-based training. Petroleum Economics Interactive (PEI) is an 8-hour e-Learning series designed to give a solid grounding in economic concepts relevant to the upstream petroleum industry. The subject matter is presented using real-life scenarios and fictional characters involved in a typical exploration and production investment decision.
PEI’s learning modules include Cash Flow Basics, Revenue, Expenditure, Fiscal Systems, Risk Analysis and Investment Analysis. The series culminates in an interactive challenge that tests the student’s understanding of the material. A simulated board of directors meeting requires the student to apply concepts learned throughout the series to make the correct recommendation to the board. PEI is available for deployment on company networks or on CDs.
NExT CEO Claude Roulet said, “This latest addition to the NExT Interactive e-Learning catalog follows the successful launch of our ‘E&P Interactive’ series which gives an introduction to the upstream oil business. These training programs combine excellent content developed by the NExT virtual faculty, and world-class digital animations from TechnoMedia. Like E&P Interactive, which has already been adopted by several E&P operators as a substitute for new-hire classroom training, PEI delivers solid value for both non-economics professionals in the industry and support staff in a wide range of specialties.” More from www.next.ie.
PetroMod 7.0 offers a completely updated 1D package, revised input editing across the whole suite - with linked spreadsheets and graphics - and a new suite of 2D and 3D viewers. These include flash calculators so that all reservoir and accumulation properties and volumetrics can be analyzed under both subsurface and surface conditions.
The new release, version 7.0, boasts a range of simulation methods such as Darcy, flow path (ray tracing) or a hybrid process. PetroMod claims to be the only system with fully PVT-controlled modeling of n-component/3-phase relationships over the entire migration process. These include simple ratio, symmetrical black oil and flash calculations, as well as gas diffusion modeling, in both 2D and 3D.
IES’ hybrid migration modeling technology provides a tight link between modeling methods during simulation, and enables detailed, high-resolution models to be comprehensively processed. Hybrid modeling improves calculation of hydrocarbon leakage in reservoirs and scale-independent processing, as hydrocarbon column heights are independent of cell sizes in the model.
IES offers an ‘open system’ allowing geometric and parametric data to be exchanged with other modeling packages. Binary data linkage is also offered to industry leading packages such as SeisWorks and OpenWorks, and to GeoQuest’s IESX, Charisma and GeoFrame. PetroMod 7.0 is available on Unix, Linux and Windows NT/2000.
The Petris Winds Internet Dashboard leverages SmartMoney’s award winning Map of the Market to visualize key performance indicators derived from upstream data assets.
Complex data sets
The web-based tool provides a graphical representation of complex E&P data sets. Early adopters have found that the Dashboard helps analyze the health of a company’s assets, projects or facility performance.
The Dashboard enables real-time visualization of patterns, trends, anomalies and data inconsistencies. Petris has enhanced the SmartMoney technology, originally developed for the financial services sector, by allowing users to create a map using a dynamic link to a database or an Excel template. This feature gives the user broad flexibility in the use of the Dashboard and provides a standard for exchanging Dashboard views, or ‘maps’.
Rene Calderon, Petris VP of software development explained, “The Dashboard creates a hierarchy of data, allowing one to ‘drill down’ into various levels of data presented. Users can then visualize a map of their data where numeric values are shown as rectangles of various sizes, colors and position. By visualizing vital asset information at a glance, managers become more efficient, productive and responsive.” The Internet Dashboard offers views of corporate data from different time periods. Top-performing assets, or problem data sources can be highlighted by entering the asset’s name.
The original work on adapting SmartMoney was done last year for Burlington Resources, which uses Petris’ technology in its corporate decision support system. More from www.petris.com.
Drilling software specialist Knowledge Systems Inc. (KSI) has just released Drillworks/Geostress for real-time wellbore stability analysis. A Design/Analysis Mode enables the user to perform calculations to investigate the sensitivity of mud weight to wellbore parameters.
KSI VP Technology Terrel Miller said, “Geostress provides wellbore stability analysis while drilling is underway and helps in planning mud programs, wellbore trajectories and minimizes surprises associated with wellbore instability. Models developed in the planning stage can be updated while drilling.”
WITS data link
Geostress integrates KSI’s Drillworks/Predict overpressure package and benefits from real-time MWD/LWD data in the industry-standard WITS format.
Originally announced in June 2000 (Oil ITJ Vol. 5 N° 6), Advanced Geotechnology’s (AGI) StabView 2.0 is due for release in March this year. StabView is AGI’s multi-zone borehole stability, lost circulation, fracturing and sand production analysis software.
Joint industry project
StabView 2.0 was developed with financial support from Chevron, Shell, Petrobras, Nexen, Baker Hughes, Weatherford, PTAC, IRAP, the Drilling Engineering Association and others.
Version 2.0 adds comprehensive multi-zone modeling with advanced analysis and design capabilities for well planning. Version 1.5 will be maintained for single-zone ‘quick-look’ analyses and training purposes.
Burlington Resources has begun pilot testing Wellogix’ Internet software to improve communication within project teams and to enhance engineers’ working environments by providing instant access to information. Initially the new tool will be deployed in drilling and completion services.
Burlington project manager David Radar said, “We expect that Wellogix’ software will create value for Burlington by increasing the productivity of our people and improving the way we do business. High data visibility, improved accuracy and dependable data transfer, are part of Burlington’s goals as we continue to improve our business practices.”
Wellogix has received ‘SysTrust’ third-party security assurance for its WorkFlowNavigator suite following an audit by Ernst & Young. Wellogix maintains effective control over its applications by providing them in ASP-mode. More from www.wellogix.com.
Duncan McKay gave a follow-up paper on the fate of the Saga North Sea document assets that Conoco UK acquired in 2000. Hays RSO is now image-enabled and is used by Conoco’s explorationists as a ‘one stop shop’ for locating documents. Conoco is looking to migrate to Open Text’s LiveLink to add full-text search to its document management, even though the software is expensive and does not allow for floating licenses. McKay’s experience of acquisitions and mergers has led him to advocate planning for data rooms - even to the extent of setting one aside permanently. Data is a critical asset for Conoco: while public data can be managed in a shared environment like CDA, good management of proprietary data is a source of competitive advantage.
Erik van Kuijk (Shell) commented that the ‘missing link’ in document and information management is a ‘catalogue of catalogues’ allowing for a centralized search, adding that Shell is currently prototyping such a meta catalogue. Van Kuijk believes that the basic metadata can be captured in 42 attributes. Shell plans to share this categorization with other oil companies in the near future, through CDA and POSC.
Eldar Bjorge gave an update on Statoil’s upstream IS/IT strategy ‘SCORE’ project, which has been running since 1998. SCORE is currently extending to the drilling arena with the NaviBoB project. SCORE has resulted in Statoil’s choice of OpenWorks as project data store, and Slegge for the Corporate Data Store (CDS). Statoil’s objective is a ‘robust data management environment.’ To achieve this, some 40 standards are in daily use within Statoil. These cover around 100 data types in 10 disciplines, with standard definitions of the most important data types and standard data flows. Data management processes are organized through the web portal. The aim is to populate the CDS with ‘used, quality controlled data.’ Population is performed at project milestones and data is flagged and documented as it enters the CDS. New projects can then kick off with validated data and interpretations. Statoil’s Slegge (also known as the Schlumberger EPDS or the ‘iStore’) currently holds around 30 data types. Schlumberger is also developing an EPDS browser, ‘iSurf’, providing table and map browsing of data in the iStore.
Alan Smith (Paras) recognizes three types of data repository: the ‘bank vault’ – single company; the ‘marketing’ e-commerce data sales portal; and the ‘club’ – members only, à la CDA or DISKOS. Smith drew on Paras’ experience – much of it in South America – to analyze the benefits of the different repositories. He concludes that though costs can be reduced by sharing resources and cleaning up data jointly, such benefits may not always be realized. In the ‘clubs’, many members still duplicate data, and few have destroyed their own copies because of the expense. Moreover, the clubs have proved to be slow movers. Smith’s vision of the future is of minimal data duplication: contractors and data ‘publishers’ will offer online access to client companies’ workstations, achieved through web services (UDDI, XML and SOAP). Smith asked what is holding up web services, suggesting that the desire to ‘touch and hold’ one’s own data may be a factor. On bandwidth, Smith doubted whether capacity would grow quickly enough to match the rapidly expanding demands of upstream bulk data delivery.
Paul van der Kooy told how Shell Brunei’s data managers lacked feedback from users as to the poor quality of its corporate data. Shell decided to implement a data quality visibility program that monitored data quality by recording requirements, urgency and other metrics. A data table shows targets and the degree to which these were met or underperformed. Color coding of this information allowed for visual analysis of Data Quality Performance Indicators (DQPI). Another data issue in Shell Brunei was that the ‘last step’ of interpretation – populating the corporate database was often forgotten. The old system involved cutting and pasting of log data to produce a summary report for the corporate database - a complex and error prone process. The solution was to move the CDB to the heart of the workflow. Recall now constitutes the core of the work process and is used to generate accurate and timely TRAPIS reports. The CDB ‘is not dead’ – but is enhanced by bi-directional data flow to and from Recall.
Jill Lewis (Troika) and Sias Oosthuizen (FileTek) presented FileTek’s StorHouse RM, a combination of relational database and hierarchical storage management system. FileTek has been working with data transcription specialists Troika to develop a seismic archival system capable of storing and recovering seismic data on a trace-by-trace basis. A paper by Richard Summers of startup Infoarchitectural Dynamic Technology described futuristic databases whose structure would learn and evolve from queries. Watch out for the Internet-enabled ‘True Spatial’ database currently in development as Interspace, which will roll out next year and provide 2D and 3D mapping using this technology on a ‘cellular Linux’ database.
According to Nick Larcombe, acquisitions and mergers are making our business move at a greater speed than the data lifecycle. BHP’s Technical Information Systems Strategy Projects ran from 1995-1997 and covered the subsurface ‘value chain’ through to asset disposal. The projects looked at technical applications, data support, processes and organization. The data management long term goals derived from this were ‘to improve BHP’s E&P data management by providing on-line access in each regional office to essential technical data relevant to that office, validated, completed and up to date’. BHP Algeria has been a test bed for these techniques which have been deployed with help from Venture Information Management. A project steering group carried out a data management survey of the asset team using a questionnaire to determine the workflow and processes. For example what data went into Finder, and what data ended up on diskettes in someone’s desk drawer! Interviews were conducted and recorded. A results matrix of Data Type against Storage Location was established. 87 data management ‘issues’ were identified and grouped into categories. The study provided a powerful tool to identify bottlenecks, a business case for improvement projects, a holistic assessment of the impact of changes and a base line for measuring future improvements. Less tangible benefits included improved relationships with the asset team. Larcombe concluded that culture is much more important than standard software in improving the overall work process.
Paul Maton (POSC) stressed the value of information sharing – citing the Open Source movement. A residual problem is the lack of a common understanding of the semantics and context of data and information. While data management has driven down the time spent looking for data, data quality remains a problem. In 2001 POSC (based on work done by Shell) released a new ‘very simplified’ version of Epicentre and the Software Integration Platform (SIP). They ‘cleaned out lots of inheritance complications’ and have published a draft specification for comment. POSC is also working with BGS and DEAL on a PipelineML. Maton presented what was described as an ‘emerging 2002 program’ as follows:
- Continue XML work
- Continue practical well log standards
- Pilot Web Services
- External collaboration
Maton concluded with a plug for the DTI/IBM/POSC web services initiative.
According to Alessandro Allodi, NAM has some 1,200 different information sources. To manage and access these in a coherent way, metadata (‘information about information’) is required. Allodi has developed a sophisticated and adaptable metadata management system to leverage metadata gathered in NAM’s ‘DIANA’ system. NAM’s data management vision is that ‘all the information that a business professional needs must be available at any time, any place in the appropriate format and with a known quality level.’ Central to this is the creation of the Information Asset Register – the metadata warehouse. This combines the best of object oriented and relational data modeling to make for ‘stability in a world of changing data and business models’. The development, using IBM’s WebSphere, Enterprise Java Beans (EJB) and XML data exchange, took a mere two months to complete – but much longer to get buy-in.
Jan Roelofsen, (IHS Energy - Petroconsultants) wants us to ‘make more of geological data.’ Roelofsen’s goal is to provide information management tools to help the basin analyst understand the timing and migration of petroleum in the context of the structural evolution of the reservoir. This is done with technology developed by Petroconsultants on behalf of ENI (AGIP) since 1998. The project involved creating an information system capable of replacing drafted petroleum systems charts (such as that due to Kingston 1980). The resulting product, Petroconsultants’ Basin Analysis and Sedimentary Environments (BASE) is an Access database coupled to a GIS ‘play map’ and a query builder to generate Basin Evolution Diagrams. Ages are managed through equivalence tables using a fine timescale with ~1my steps.
Trade-Ranger, which claims to be the world’s largest online marketplace for goods and services traded in the oil and petrochemical industries, is to offer ‘strategic sourcing’ through Frankfurt-based Portum, a provider of e-sourcing solutions.
Under the terms of the contract, Portum will be the exclusive provider of online auctions to Trade-Ranger. Trade-Ranger members will be able to arrange ‘do-it-yourself’ auctions using the Portum technology.
Portum will also provide RFX (Request For Quotation/Proposal/Information) technology and professional services directly to Trade-Ranger members. These solutions will be tailored to members’ specific requirements to ensure proper support and implementation.
Portum CEO Gerald Heydenreich said, ‘The sector has embraced the concept of e-procurement and e-sourcing. Cooperation with an established marketplace of this size shows how strategic partnerships can enhance the value proposition to members and deliver true bottom-line benefits.’
Trade-Ranger CEO Claire Farley added, “This is a challenging proposition but one we’re confident that Portum can handle. Working together, we can help the industry make valuable and efficient use of online sourcing tools.” Procurement allocation by Trade-Ranger’s 20 buyer members represents more than 25 percent of annual global spend in the energy and petrochemical industries.
OpenSpirit has just released two components to integrate upstream data with ESRI’s ArcView. The OpenSpirit Shapefile utility scans projects in any OpenSpirit-supported data store and outputs ESRI Shapefiles for a variety of upstream data objects. Well locations, 3D seismic survey outlines etc. can be extracted. Regular batch updates keep the Shapefiles current with the project data.
ArcView, or any other application capable of reading Shapefiles, can be used to spatially view data contained in multiple project sets, reducing the time it takes to locate data of interest.
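The Shapefile format OpenSpirit writes to is ESRI’s openly published binary layout, which is why any Shapefile-aware application can consume the output. As a rough illustration (not OpenSpirit’s code), a minimal Point-type .shp file can be written and its 100-byte header parsed using nothing more than Python’s struct module:

```python
import struct

def read_shp_header(data: bytes) -> dict:
    """Parse the 100-byte header of an ESRI Shapefile (.shp)."""
    file_code = struct.unpack(">i", data[0:4])[0]        # big-endian, always 9994
    version = struct.unpack("<i", data[28:32])[0]        # little-endian, always 1000
    shape_type = struct.unpack("<i", data[32:36])[0]     # e.g. 1 = Point, 5 = Polygon
    xmin, ymin, xmax, ymax = struct.unpack("<4d", data[36:68])
    return {"file_code": file_code, "version": version,
            "shape_type": shape_type, "bbox": (xmin, ymin, xmax, ymax)}

def make_point_shp(points) -> bytes:
    """Build a minimal Point-type .shp file in memory (illustrative only)."""
    records = b""
    for i, (x, y) in enumerate(points, start=1):
        content = struct.pack("<i2d", 1, x, y)           # shape type 1, then X, Y
        # record header: record number + content length in 16-bit words (big-endian)
        records += struct.pack(">2i", i, len(content) // 2) + content
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    header = struct.pack(">i5i", 9994, 0, 0, 0, 0, 0)    # file code + unused fields
    header += struct.pack(">i", (100 + len(records)) // 2)  # file length in words
    header += struct.pack("<2i", 1000, 1)                # version, shape type Point
    header += struct.pack("<4d", min(xs), min(ys), max(xs), max(ys))  # X/Y bbox
    header += struct.pack("<4d", 0, 0, 0, 0)             # Z and M ranges unused
    return header + records
```

A production tool would of course also emit the companion .shx index and .dbf attribute files; the sketch shows only the geometry layout.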
The OpenSpirit ArcView Extension works with the Shapefiles generated by the Shapefile utility and turns ArcView into an OpenSpirit-enabled application. A user can connect to an OpenSpirit session and broadcast cursor tracking locations and object selection events to other OpenSpirit-enabled applications.
OpenSpirit now ships ESRI-enabled well, section and 3D viewers ‘free’ with its runtime environment. ArcView can now be used as a geographic data selection front-end. The OpenSpirit viewer can locate, visualize and QC the data prior to loading into OpenSpirit-enabled applications.
The annual meeting of oil and gas leaders hosted by Cambridge Energy Research Associates – CERA Week – was a subdued affair this year. CERA MD James Rosenfield reminded participants that they were ‘at the epicenter of worldwide changes.’
Industry turmoil subsequent to Enron’s failure was discussed. Panelists discussing ‘The New World for Trading and Risk Management’ said it was too soon after the Enron bankruptcy to determine what the next ‘new idea’ would be, suggesting that successful strategies for growth in energy trading should include ‘sound business and financial practices, risk mediation, and consolidation.’ One speaker noted a change in business strategy from growth at all costs to ‘steady, reliable earnings.’ The real winners were companies with a consistent long term strategy.
Gas demand to double
During the natural gas session, one speaker opined that demand was set to double by 2020 and that gas was ‘the ideal bridge between a growing economy and the environment.’ The future of gas looks strong, with real opportunities for the industry worldwide, according to both international and domestic providers.
Stephen Butler, chairman and CEO of KPMG said that the accounting profession must change metrics to ‘stay current with the new ways businesses find value.’ Mary Tolan, Managing Partner for Accenture, questioned whether what we are experiencing is indeed a ‘new world’ or merely the downside of a regular business cycle. Algerian Minister of Energy and Mines Chakib Khelil noted ‘blatant living conditions disparities’ as a major risk factor.
The new release of Hampton Data Services’ GeoScope, version 3.0, leverages Java technology to improve usability and database connectivity. GeoScope offers document archiving and retrieval over Intranets and the Internet on a broad range of platforms and operating systems. GeoScope combines the search capability of a database with the intuitive searching of objects within a graphical interface (typically a map) to recover appropriate documents for any given search criteria.
PDM asked Hampton’s IT Manager Gary Marshall-Stevens how the Java technology had been deployed. Stevens said, “Currently the Java client is a 22MB download that requires the JDK to be installed. We are looking to run Geoscope as an applet - without any client installation in the near future. In the server there are two components - both written in Java - the GeoFile server and the GeoIndex server.”
“We try to integrate with our clients’ IT environments so that, for instance, our text searching, which normally runs on Microsoft’s Internet Index Server, can adapt to use Unix tools such as InfoSeek. We try to be modular. Our future plans include a move from Microsoft Active Server Pages to a cross-platform tool - Java Server Pages. We have been working with Canadian clients on tuning GeoScope for use with the PPDM data model. The PPDM data exchange XML work has helped us enhance GeoScope’s scalability significantly.”
Writing in the latest edition of the in-house ProMax newsletter, Rick Williamson reveals Landmark’s strategy for Linux deployment in its new SeisSpace processing application. Williamson expects Linux to have a major impact on the E&P industry - especially for compute intensive applications like seismic processing. SeisSpace has been designed to take advantage of distributed parallel systems such as clustered Linux systems with parallel input and output.
A benchmark test performed by Landmark on a marine survey shows a near four-fold improvement in parallel processing with SeisSpace over ProMax 2003.3. Both SeisSpace and a hybrid processing solution consisting of embedded ProMax code running within SeisSpace are anticipated to have near linear scalability up to and beyond 16 processing nodes.
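Landmark’s scaling figures cannot be reproduced from the newsletter alone, but the shape of the claim - ‘near linear up to and beyond 16 nodes’ - is what Amdahl’s law predicts whenever the serial fraction of a job is small, as it is when seismic traces can be processed independently. A generic sketch (the 2% serial fraction below is an assumption for illustration, not Landmark’s measurement):

```python
def speedup(nodes: int, serial_fraction: float) -> float:
    """Amdahl's-law speedup: only the parallel part of the job
    (1 - serial_fraction) benefits from extra processing nodes."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / nodes)

# A perfectly parallel job scales linearly ...
print(speedup(16, 0.00))   # 16x on 16 nodes
# ... while even a small serial component bends the curve.
print(speedup(16, 0.02))   # roughly 12x on 16 nodes
```

The same arithmetic explains why I/O, the classic serial bottleneck, matters: SeisSpace’s parallel input and output is precisely what keeps the serial fraction small.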
Initially SeisSpace will be an optional component of ProMax that will add 3D prestack time migration to the processing workflow. Eventually though, SeisSpace will replace ProMax in Landmark’s product catalogue. SeisSpace will share code and integrate with DecisionSpace - Landmark’s ‘next wave’ E&P technology that offers a common desktop and data viewers.
Speaking in Paris this month, AAPG president Robbie Gries forecast that the world will still depend on oil and gas for the next 60 to 80 years. Current reserves are estimated at 66 years of supply for gas and 37 for oil, but this assumes zero growth in the world economy. When probable growth is included in the calculation, existing reserves will only last a couple of decades! ‘Unconventional’ energy sources such as coal bed methane and tar sands will likely ameliorate the situation.
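The gap between the static figures and ‘a couple of decades’ follows from simple compounding: if consumption grows at annual rate g, reserves quoted as R/C years at today’s consumption C are exhausted after ln(1 + g·R/C)/ln(1 + g) years. A sketch of the arithmetic (the growth rates below are illustrative assumptions, not figures from Gries):

```python
import math

def reserve_life(static_years: float, growth: float) -> float:
    """Years until exhaustion when annual consumption grows at rate `growth`.
    `static_years` is reserves divided by current annual consumption."""
    if growth == 0:
        return static_years
    # Geometric series: consumption sums to C * ((1+g)^T - 1) / g over T years.
    return math.log(1 + growth * static_years) / math.log(1 + growth)

# 37 'static' years of oil shrink quickly under compound demand growth.
print(round(reserve_life(37, 0.00)))  # 37
print(round(reserve_life(37, 0.02)))  # 28
print(round(reserve_life(37, 0.05)))  # 21
```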
Risk of mergers
Today, oil companies find it easier to increase reserves by merger or acquisition, but the risk here is that global reserves are not affected by these transactions. What the industry needs are new methods for reserve discovery. To find such overlooked reserves the industry needs to overcome dogma, to think ‘out of the box’!
Gries offered a review of dogma-busting thinking such as Maury Deul’s work on coal bed methane – which turned a hazard into a revenue stream. Don Todd’s work in Indonesia led to the birth of off-shore exploration and the first production sharing agreement. Closer to home (for Gries) the sub-salt play in the Gulf of Mexico and Peter Vail’s work on seismic stratigraphy were lauded as dogma busters.
Gries announced that as of July 1st, paid-up AAPG members will have access to the entire AAPG digital library, from 1917 to the present.
Comment - Gries omitted to mention an interesting episode in the AAPG’s past. In the 1960’s and 1970’s when researchers worldwide were making spectacular discoveries in the field of plate tectonics, the AAPG’s reaction was to throw its weight behind the geological establishment with the publication of a Memoir backing the verticalist dogma of the day!
Paradigm Geophysical Ltd. has released a new version of its Geolog well data processing and analysis software package. The release extends Geolog’s Nuclear Magnetic Resonance (NMR) processing and interpretation, well log imaging, petrophysical analysis and electrofacies modeling.
The new NMR processing and interpretation module is the fruit of ongoing technical collaboration between Paradigm professionals and NMR experts at Chevron and other major oil companies. The new module provides NMR log processing from raw field data to final results, using proven and robust inversion options and supports the latest generation of NMR tools. Enhanced Precision Mode (EPM) and Density Magnetic Resonance Porosity (DMRP) processing options further extend the package.
The advanced Facimage electrofacies analysis tool kit is a field-proven module to enhance petrophysical and geological analysis. Consisting of a suite of routines for electrofacies analysis and core data modeling, Facimage makes the TotalFinaElf Multi-Resolution Graph-based Clustering (MRGC) algorithm available to the industry-at-large for the first time. MRGC analyzes the underlying structure of data in order to define natural data groups, free from operator bias. It provides a multi-resolution electrofacies classification, so that a final model appropriate to the scale of investigation can be selected.
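The MRGC algorithm itself is proprietary to TotalFinaElf and Paradigm, but the family it belongs to - graph-based clustering, where log readings are linked to their nearest neighbours and natural groups emerge as connected components, with no operator-chosen cluster count - can be sketched generically (a toy illustration, not the MRGC method):

```python
def nearest_neighbour_clusters(points):
    """Cluster points by linking each to its nearest neighbour and taking
    connected components -- a toy stand-in for graph-based clustering.
    Returns one integer cluster label per input point."""
    n = len(points)
    parent = list(range(n))            # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    # Link every point to its single nearest neighbour.
    for i in range(n):
        nn = min((j for j in range(n) if j != i),
                 key=lambda j: dist2(points[i], points[j]))
        union(i, nn)

    # Connected components become cluster labels.
    roots, labels = {}, []
    for i in range(n):
        labels.append(roots.setdefault(find(i), len(roots)))
    return labels
```

On well-separated groups of multi-log readings this recovers the groups automatically; MRGC’s contribution, per the description above, is doing so robustly and at multiple resolutions.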
Paradigm Name Service
Paradigm’s CORBA-based Paradigm Name Service locates projects across multiple servers and databases. Data servers are currently available for Landmark’s OpenWorks, GeoQuest’s GeoFrame and T-Surf’s GOCAD. More from www.paradigmgeo.com.
OFS Portal, the Houston-based grouping of upstream oil and gas suppliers and service companies has announced the successful implementation of ChanneLinx software. This provides validated e-catalog content to customers in standard output formats, such as those used by Requisite, Ariba, and Commerce One.
OFS Portal CEO William Le Sage said, “We selected ChanneLinx because of the seamless manner in which we were able to integrate its functionality into our existing framework. This will bring additional flexibility to our offering, enabling us to meet the needs of both customers and suppliers, and will accelerate OFS Portal’s plans to deliver e-catalog content to the upstream oil and gas industry.”
ChanneLinx’s solution is compatible with OFS Portal’s API/PIDEX-compliant product classification system and will allow OFS Portal members to manage catalog content online. Oil company buyers can browse customer-specific sections of the overall catalog in a secure manner in order to search content and approve contract pricing.
Petroleum Place unit Paradigm Technologies has completed the integration of accounting data resulting from Marathon Oil Company’s recent $993 million acquisition of producing properties from CMS Energy in Equatorial Guinea, West Africa. Using Paradigm’s data transfer process and proprietary business tools, Paradigm technicians moved the CMS data into Marathon’s ExcaliburEDGE system within a 24-hour period.
Paradigm’s Integration Director Donald Willis said, “The challenge was to extract all subsets of accounting and operational data from CMS Energy’s comprehensive dataset and integrate them within Marathon Oil Company’s environment. CMS has been using Paradigm Technologies’ Excalibur EDGE software for years and Marathon wanted to continue managing these acquisitions with a proven, industry-leading application. Our team utilized ledger balances reported prior to and following the exercise in order to establish and verify the stability of the dataset as it was integrated onto the Marathon server.”
IHS Energy Group’s Geneva-based Petroconsultants unit has launched Oil: Roundup – an e-mail based news service that provides a succinct weekly summary of oil markets and activity. Oil: Roundup uses information from various sources, including IHS Energy Group’s own databases, to form a ‘valuable resource for anyone with an interest in oil prices and changes to associated fiscal and political environments.’
Oil: Roundup editor Robert Copson said, “Oil: Roundup provides essential reading for many managers needing to keep track of fast-changing prices and markets.” Oil: Roundup is available free-of-charge and the news brief can also be accessed for free in the “News” section of the IHS website - www.ihsenergy.com.
IHS PropertyMarket offers instant visibility of Canadian oil and gas properties to the 7,000 users of AccuMap, claimed as Canada’s most widely used desktop exploration tool.
IHS’ Kris Dudas said, “AccuMap users can simply turn on a map layer in the AccuMap desktop mapping application and access the latest information on available properties. It’s like having an oil and gas property realtor at your disposal for the Western Canadian Basin.”
Accumap CTO Ganesh Murdeshwar added, “This is a huge advance in ease-of-use and customer reach in acquisition and divestiture. If you want to search for available properties today, you’d have to leave your AccuMap environment and perhaps log on to as many as 20 different Web sites.”
For property advertisers, PropertyMarket offers the most direct access to the largest market in Canada. More from www.ihspropertymarket.com.
The UK’s Industry Technology Facilitator is seeking proposals for its 2002 program in fields such as high resolution imaging, ‘brown field’ reservoir management, increasing recovery efficiency, accessing small reserves and improved well placement. The ITF is also interested in ideas for innovative technology that does not fall within the main themes but could significantly benefit the UK’s oil and gas industry. Proposals that do not fall within the themes will be assessed at regular intervals throughout the year.
New for 2002, is a special fund for potentially ‘game-changing’ technologies that are at an early, uncertain stage in their development. Under the Pioneer Programme, operators and major contractors will be invited to contribute to the fund. This will then be allocated to small pre-JIP projects in such a manner as to spread the risk between the contributors. The first contribution to the Pioneer Programme Fund will come from ITF itself, with £50,000 from 2001’s operating surplus being used to kick-start this important initiative.
Previously ITF membership was restricted to oil companies. For 2002 onwards, ITF is ‘seeking closer links’ with the service sector. More from www.oil-itf.com.
A survey released last month by data warehouse specialist Kalido Ltd. claims that 96% of global enterprises suffer from ‘information frustration.’ The survey, carried out by IT research house Harte-Hanks, studied ‘the most critical information issues that the world’s leading companies are facing today.’ Nearly 70% of the respondents felt that their information systems delivered inconsistent reporting, and that it took too long to collate information. Almost 60% found their current systems did not have the flexibility to handle change. A similar number were concerned about the accuracy of their data sources. Some 60% of respondents were planning data integration projects. While the survey indicated that tackling integration would be difficult, market studies (AMR Research) estimate combined growth for ERP, CRM and SCM vendors from $50 billion today to $96 billion by 2005.
Mary Fleming is the new executive director for the Society of Exploration Geophysicists. Fleming was previously Director of Programs at the American Statistical Association.
Network International (formerly NetworkOil) has promoted Stuart Page to Chairman and CEO and J. Boyd Heath to President and CFO.
Paul Frame has been elected Chairman of the Board of Directors of Seitel. He is to continue in his role as President and Chief Executive Officer. Frame replaces Seitel founder Herbert Pearlman.
Dane Isenhower has been appointed VP and General Manager of Petroleum Place Energy Advisors. Isenhower was previously director of the Houston office of Waterous & Co.
Bill Via has joined Subsurface Consultants & Associates (SCA) as VP Training. Via was previously President of independent Sharpe Energy.
Kurt Hillman is to lead UpstreamInfo’s global business development effort. Hillman was previously on the Upstream Merger Team with ChevronTexaco.
3DGeo Development Corporation has opened an office in Houston with 40 SGI CPUs and a 512 CPU Linux Cluster destined for 3-D wave-equation depth imaging.
Landmark’s GeoGraphix unit is to resell VoluMetrix’ FastTracker Windows-based reservoir model building software. The agreement gives GeoGraphix worldwide distribution and marketing rights over the FastTracker reservoir modeling and visualization product.
FastTracker uses innovative model building technology that includes automatic documenting and recording of the model building process. FastTracker’s ‘UpdateAbility’ functionality records every step in the model building process and allows for editing of model building steps so that a new model can be built incorporating updated information or new hypotheses.
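FastTracker’s internals are not public, but the ‘UpdateAbility’ idea - record every model-building step, allow any step to be edited, then replay the sequence to produce an updated model - can be sketched generically (hypothetical names, not VoluMetrix’s API):

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ModelBuilder:
    """Records each model-building step so the whole model can be rebuilt
    after any step is edited -- a sketch of the 'UpdateAbility' concept."""
    steps: list = field(default_factory=list)   # ordered (name, operation) pairs

    def add_step(self, name: str, op: Callable):
        self.steps.append((name, op))

    def replace_step(self, name: str, op: Callable):
        # Editing a recorded step; replaying then rebuilds everything downstream.
        self.steps = [(n, op if n == name else f) for n, f in self.steps]

    def build(self, model=None):
        model = dict(model or {})
        for _, op in self.steps:
            model = op(model)
        return model
```

Because users edit only the recorded steps and inputs, never the built model directly, every rebuilt model is reproducible from its recipe - consistent with the article’s note that there is no manual ‘fiddling’ inside the model.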
FastTracker, an interactive, dynamic system, provides geoscientists and engineers with the ability to assemble relevant geologic, logging and seismic information to build 3-D models based on a reservoir’s lithological and physical properties prior to export to the flow simulator. FastTracker extends the geophysical and geological workflow of GeoGraphix’ Discovery solutions suite downstream to 3-D reservoir modeling visualization and upscaling for export to the flow simulator.
Landmark president and CEO John Gibson said, “Adding FastTracker to the GeoGraphix Discovery solution is an indication of Landmark’s commitment to the PC revolution in the E&P marketplace. The availability of FastTracker as part of Discovery allows our customers to integrate workflows from prospect generation to reservoir modeling on their Windows-based desktop.”
Volumetrix MD David Cram told PDM “FastTracker was designed from scratch with the Petroleum Engineer in mind. The underlying philosophy is that it is the business process - not just building a model - that counts.”
VoluMetrix’s origins go back to Sperry-Sun and Dresser and an old BP Alaska project - ‘Decision Driven Reservoir Modeling.’ FastTracker, developed in C++ with no legacy code or libraries, takes a rigorous approach to making and updating the model. There is no manual ‘fiddling’ with the data inside the model - users can only manipulate the input. All objects - faults, horizons etc. - ‘know’ their ancestors and descendants. A right mouse click reveals object dependencies.
Drag & drop
Windows drag and drop functionality is exploited so that a model can be dropped onto the section viewer. A range of modeling options are available - Monte Carlo simulation, Geostatistics and geo-object modeling. The Landmark deal initially ties Volumetrix FastTracker to the PC-world of GeoGraphix. Subsequent extension into the UNIX OpenWorks environment is probable.
In the early 1990s, BP began looking at IT outsourcing as a way to reduce costs, gain flexibility and ‘release value’ in its UK Exploration unit. Science Applications International Corporation (SAIC) was selected and a new ‘open book’ ‘risk/reward’ outsourcing model was tailored to BP’s needs. Subsequent generations of this model have been successfully applied, notably to BP’s Houston unit in March 2000 (Oil ITJ Vol. 5 N° 4). BP claims its IT costs were reduced by 40% globally over the first three years of the outsourcing engagement and have continued at a 10% annual rate. Now BP’s Houston unit has awarded SAIC a further $360 million contract to provide more IT infrastructure and managed services.
The new contract is for a four-year period to provide IT services to BP’s lower 48 upstream business units in Houston. SAIC will act as a single point provider or aggregator of IT services including data management, web development, applications support, and consultancy services.
Mark Pierson, SAIC senior VP and manager of SAIC’s Energy Technology and Services Group said, “SAIC has established a long relationship with BP around the world, and this new contract allows us to implement a new service delivery model and IT architecture enabling BP Houston to take the next major step in achieving cost reduction.”
Steve Peacock, VP of BP’s Upstream Digital Business organization added, “The new service delivery model does several things for BP - it rationalizes a complex suite of existing contracts and services, and brings a sharper strategic focus to what we source in-house versus acquire from the service industry. It also positions us to take advantage of future technologies that will leverage Web-based, externally sourced services.”