PennWell’s three-year ‘long march’ from beneficiary of the FTC’s largesse to implementation looks as though it is nearing its goal. As reported in Oil IT Journal Vol. 4 N° 6, PennWell’s PennPoint subsidiary was created to leverage the dataset it inherited from Dwights after the merger with PI, way back in 1995. The divestiture was an attempt on the US Federal Trade Commission’s part to avoid a monopoly situation in oil and gas data delivery.
Since then, PI/Dwights has become IHS Energy, acquired Petroconsultants and generally strengthened its dominant position. Despite good intentions – and plans to develop a brand new ‘state-of-the-art’ database – PennPoint has so far failed to set the data world afire. This may be about to change as PennPoint – now re-baptized PennEnergy – is to ally with upstream e-business portal specialist PetroWeb.
1.3 million wells
PennWell has signed a ‘multi-year’ agreement with PetroWeb to ‘develop the data delivery infrastructure and interface’ for PennEnergy’s data. The data set comprises well and production data for over 1.3 million U.S. oil and gas wells.
PennEnergy Data CEO Jayne Gilsinger said, “Since receiving the FTC license to the historical well and production data file, we recognized the real challenge for a successful energy data product is to move away from the cumbersome and costly traditional collection and delivery processes. Our new data product is more affordable and streamlined to support today’s data management requirements.”
PennWell president Bob Biolchini added, “PetroWeb makes a natural partner to develop our data delivery framework, given its industry leadership position. We are fortunate to have the unique combination of oil and gas and information technology expertise with PetroWeb.”
PetroWeb president Dave Noel concluded, “The marketplace will have a powerful, new approach to data access, management and delivery. PennEnergy Data will offer the industry cost-effective data solutions based on leading-edge technology.” PennEnergy must now play catch-up with IHS Energy which has consolidated its market leadership while PennWell was getting its energy data act together.
San Francisco-based private equity firm Fox Paine is to acquire Paradigm Geophysical in a $100 million transaction. Fox Paine is paying a 38% premium over Paradigm’s recent average closing price and already has agreement from 42% of Paradigm’s shareholders.
Fox Paine CEO Saul Fox said, “We have been active investors in the oil and gas and power industries since the ’80s and understand the significance of Paradigm’s industry leading products and services. We are pleased to be associated with the Paradigm management team, led by chairman Eldad Weiss, who has built a dynamic, global operation serving the upstream petroleum industry.”
Weiss added, “This is a vote of confidence in our people, products and customer relationships. This deal will enable us to pursue our goals of delivering superior technologies, expanding through acquisitions and widening our range of products and services for the oil and gas industry.” The acquisition was made by Fox Paine unit Paradigm Geo-technology BV, located in London and the Netherlands. The deal is expected to close in August 2002.
Although the oil price has been in the mid $20s for a while, oil and gas companies around the world are pursuing the business strategies that evolved during the last downturn. One facet of this activity is that sooner or later, many folks who thought they had an oil company job for life find themselves ‘outsourced,’ with a brand new employer. I don’t personally have too much of a problem with this, having been ‘restructured’ myself a couple of times in my career, but I have been puzzling over the business model that underlies the outsourcing philosophy.
Recently, outsourcing offerings have come from top-flight management consultancies located in marble-clad buildings downtown. Management consultants are rather different beasts from other oil and gas service companies. Many companies perform de facto outsourcing, doing things like seismic processing, tape transcription, core analysis and so on. But most of these outfits aren’t located in the marbled downtown buildings – they are more likely to be somewhere out in the sticks, on an industrial estate. Generally speaking, the folks working for these companies, while making an honest living, are less likely than their consultant peers to have six-figure salaries and stock options coming out of their ears.
So my recent puzzling has been along the lines of: how does the top-flight management consultancy manage to a) pay its employees quite a lot, b) pay its shareholders sometimes substantial dividends and c) ‘leave’ value – in the form of a market capitalization – inside the company?
Where does this money come from? Barring short-term ‘irrational exuberance’ from investors, it can only come from the client. Paying top-dollar day rates may be justified in some circumstances. Take the case of the company auditor. Companies will pay a huge premium to get the approval of an internationally-recognized top firm. This is justified if the top-firm label virtually guarantees the fidelity of the accounts. The big accounting firms – at least in this context – generate cash from goodwill. Hence the catastrophic destruction of Andersen’s value in the aftermath of Enron.
The management consulting business leverages goodwill in a similar fashion. At face value, these organizations develop and apply new theories as to how to improve business, by generating more revenues or by cutting costs. As you all know, there has been a lot of debate recently on the legitimacy of a consulting outfit auditing the accounts of its clients. It is held as self-evident by just about everyone, except some of the more reticent consultants themselves, that advising a company on how to run its business and auditing the same company is a manifest conflict of interest. The recognition of this is leading management consultancies to separate their consulting and audit functions with more than just the ‘paper walls’ of yore.
As long as you are talking about an activity that is going to affect the corporate bottom line in a big way, then the overhead of the marble-clad building, the consultant’s stock options, the shareholders’ dividends and expectations of future capital gain may be justified. But what about more mundane outsourcing activities closer to home – like scanning reports, loading workstations and copying tapes? It seems to me that the ‘natural’ homes for these thankless tasks are the specialist companies located on industrial estates – much along the lines of the seismic processor. Such outfits will hopefully generate a modest amount of excess cash – but their business model is not exactly that of the cash-cow consultant.
What I can’t figure is where the SAICs, the Schlumberger-Semas and the Accentures lie in this space. Particularly as they all associate the outsourcing of rather mundane tasks with cost savings to the client. I can’t quite square cost cutting on already low-value activities with the business model of a top six (five? - who knows...) consultancy.
Just as the consultants are being forced into separating auditing from advising, I wonder whether there isn’t a case to be made for separating advising from execution. Maybe the consultants should eat some of their own dog food and reflect on what is, and what is not, their own ‘core business.’
Or maybe there is another agenda. One ugly possibility is that consultants are used as a staging post in post merger down-sizing. Farming the redundant folks out to an outsource partner is more elegant than outright firing. Alternatively, as one anonymous industry source told me recently, “The whole outsourcing thing really reflects oil companies’ management refusal to fund internal IT adequately. This has pushed IT managers down the outsourcing path - but they are doing it to prove to management what the true cost of running IT support really is. Management is in for a shock.”
I’m not quite sure if this next bit really fits in, but one of the consultants’ claims over the last couple of years is the ‘poor return on capital’ of oil and gas companies. This is often cited as a reason to outsource, merge or whatever is the latest fad. I was therefore amused to see that oils made three of the top ten European companies as measured by added value in a recent survey by the UK DTI. Multi-merger, outsource-crazy BP made the N° 3 slot with some € 36 bn added value in 2001. But pipping BP at the post is (until recently at least) merger-averse, ‘what’s outsourcing?’ old-timer Shell at a tidy € 40 bn.
Data integration is a hot topic these days and many tools have been developed for moving data. Usually, these tools are SQL scripts and flat file loaders. At InnerLogix we have been investigating the accuracy of both the data movers and the processes used for data integration.
Much petroleum industry data relies on reference values. Measured depth values are referenced to an elevation datum. Deviation surveys and well-paths refer to a north reference (grid north, true north, etc.). All position values are based on a cartographic reference system (CRS). Much of our data is also unit dependent and will probably involve the infamous NULL value standards.
We have compared data from different data sources using our company’s DataLogix application. DataLogix automates data migration and integration and offers version control and QC. DataLogix also lets us compare data from different sources. In order to make the comparison meaningful the software first normalizes all the data elements to a set of common reference values. Many of the data sources for this study did not contain enough information to enable such normalization. For such data sources DataLogix allows this information to be supplied and associated with each data source.
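The normalization step described above can be sketched as follows. This is not DataLogix code – names, unit factors and null conventions here are illustrative assumptions – but it shows why comparing raw values across sources is meaningless until units, null sentinels and elevation references have been reconciled:

```python
# Hypothetical sketch of pre-comparison normalization; the sentinel values,
# unit handling and datum convention are assumptions, not DataLogix internals.

FEET_TO_METRES = 0.3048
# Common 'magic' NULL conventions found in upstream data stores
NULL_SENTINELS = {-999.25, -9999.0, 999999.0}

def normalize_depth(value, unit, datum_elevation_m):
    """Convert a measured depth to metres below a common datum (here, MSL)."""
    if value is None or value in NULL_SENTINELS:
        return None                       # map every NULL flavor to one canonical value
    metres = value * FEET_TO_METRES if unit == "ft" else value
    return metres - datum_elevation_m     # re-reference to the common datum

# Two sources report the same pick differently (feet vs metres, same KB elevation):
a = normalize_depth(3280.84, "ft", datum_elevation_m=30.0)
b = normalize_depth(1000.0, "m", datum_elevation_m=30.0)
print(abs(a - b) < 0.01)   # the values only agree after normalization
```

Sources that lack the unit or datum metadata needed for this step are exactly the ones the article notes must have that information supplied and associated externally.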
We then tried to find people that knew how the source data was created. However, such people are not always still with the industry! Their knowledge has been lost and the value of many data sources is greatly reduced (some would say eliminated!). Using new technologies we were able to ‘reverse engineer’ many of these inconsistencies and restore the original value of the data.
Where possible we perform various quality checks. DataLogix uses statistics, geo-statistics, business logic, and fuzzy logic algorithms for QC. Such checks revealed major flaws in existing data integration processes. We discovered well-paths that had simply been copied from one data-source to the next without regard to the north reference. Well locations that appeared identical were actually based on different CRSs. Internal consistency is also an issue. Over time, master data sources are updated with data corrections and new data items, but these updates are not always reflected in the child data-stores.
We used the ‘differential-view’ feature of the software to compare the child data stores with their parents and discovered that several key updates had not made it from parent to child. We also discovered that some data was corrected in the child data store, but these changes were not reflected back into the parent data set. Since the differential-view feature is editable, we were able to reconcile these data sources.
This study showed that some existing data integration processes do not accurately normalize data before migration. We also found that few processes are in place to ensure that data is synchronized between data stores. With the use of the right software tools these errors can be efficiently detected and corrected. More on DataLogix from www.innerlogix.com.
Landmark has released a new version of The Oilfield Workstation/client server (TOW/cs) production data management system. The new release streamlines workflows through the provision of zone-level allocation, enhanced data import and export capabilities and customizable user desktops. TOW/cs now runs on Microsoft Windows 2000.
Apache Corp. is an enterprise user of TOW/cs. Production Systems IT Manager Lisa Petty Lehle said, “TOW/cs is a key piece of technology that we rely on in our day-to-day business to manage Apache’s production from more than 13,000 wells.”
Schlumberger’s Oilfield Manager (OFM) 2002 claims accelerated production forecasting, enhanced engineering techniques and improved data connectivity. OFM 2002 can create multiwell or multivariable stacked plots showing production from sub units.
New in OFM 2002 is analytical forecasting – production history can be matched using a variety of models of geometry and porosity distribution. Analytical modeling provides rigorous forecasts for wells with little or no production history, or wells that produce in the transient flow regime for an extended period.
We stated last month (and in an earlier issue of Oil IT Journal) that Peloton’s WellView was originally developed by Merak. Peloton pointed out to us that it developed WellView itself. WellView was marketed and supported by Merak from 1993-2000, while Peloton focused on software development. Today Peloton manages all aspects of its petroleum software products and has over 100 clients worldwide, from the largest oil and gas producers to one-man consultants. Oil IT Journal would like to take this opportunity to apologize to Peloton for our inadvertent misrepresentation of WellView’s origins.
AnTech Oilfield Software has announced its StringView Millennium Software Developers Kit (SDK). StringView lets developers use data from engineering drawings, databases and spreadsheets to create professional quality well diagrams and reports.
StringView includes customized string diagrams of multi-lateral, vertical, deviated and horizontal wells, as well as single and dual completions and multiple strings. The SDK can be embedded into existing IT applications and leverages existing digital data and resources to produce vector-based graphics that do not degrade when stretched or curved.
StringView Millennium SDK uses a feature called IntelliDepth to enlarge small components by increasing their plotted length whilst maintaining their linear positions relative to components featured in other strings. The SDK is built around Microsoft’s ActiveX technology and a full WYSIWYG graphical user interface.
UK-based Geologix’ Well Information Portal System (WIPS) leverages its GEO Software applications to provide a simple, intuitive and cost-effective way of collecting and maintaining information generated during the course of a well, together with post-well reports, documents and analyses.
User feedback drove the development of the new tools, which provide a compact, single-file format for well information along with the ability to distribute and share information. WIPS handles documents, reports, maps, diagrams, spreadsheets, photographs, audio files and movies in a single, compressible GEO database file.
WIPS lets users view well logs in an internet browser and to click on a log element (for instance a core symbol), to view the associated document (e.g. a core report, core log, core analysis spreadsheet, etc.) More from www.geologix.com.
Shell Deepwater Services, a division of Shell International Exploration and Production Inc., has contracted with National High Definition Systems (NHDS) for the delivery of a knowledge management solution for communicating with its deepwater drilling rigs. Following a 16-month trial period, Shell is installing the software on its deepwater rigs around the globe.
NHDS’ patented ‘High Definition Information’ (HDI) technology enables offshore rigs to send and receive information, perform complex calculations, and display daily drilling data (including actual and projected costs, as well as depth and trouble time), lessons learned, best practices, and health and safety alerts. From rigs or land-based offices, HDI software enables engineers to monitor the drilling progress of a single rig or compare the performance of multiple rigs in a region, as well as capture, share, and retrieve knowledge for solving problems, minimizing drilling costs, and avoiding mishaps.
NHDS president Stephen Beller said, “Shell came to us looking for a way engineers on deepwater rigs around the globe could share their knowledge, despite communication constraints that made traditional Internet solutions slow and impractical. Our software not only solved their bandwidth problem, but also provided an entirely new richness and quality of information, over and above the usual drilling data. The Virtual Forum provides engineers with a very efficient way to talk about what they’ve learned and to share knowledge that helps avoid and fix problems down the line.” More from www.nhds.com.
Robertson Research has stopped publication of the EIS Energy Information Service publications Drilling Weekly, Onshore Weekly, Production Drilling Monthly, International Explorer and N W Europe Activity Maps. The EIS Redhill office has been closed and the data and databases relocated to Robertson’s offices in North Wales. There they will be integrated with other Robertson products and services to add value, and to form the basis for new offerings, details of which will be announced in the coming months.
Robertson cites declining circulation following consolidation amongst oil and service companies as the prime reason behind what it describes as a ‘difficult decision.’ Additionally, the market for information services has become increasingly competitive, making the publications no longer commercially viable.
Satish Pai, who was on the way up from VP of the old GeoQuest unit to head up Schlumberger Oilfield Technologies, gave a bullish keynote on the growth of “i-business” within Schlumberger. With e-commerce projected to reach $4.3 trillion by 2005, and a $750 million deal done on IndigoPool last year, Pai sees this as evidence that ‘i-business has truly arrived.’
Schlumberger recently questioned some 30 major clients throughout the world to discover that security, connectivity and data analysis were key concerns. Companies are also looking at collaborative work and ERP integration along with e-trading and e-commerce. Pai claims that the advent of GeoFrame 4 has drastically reduced interpretation cycle times. Another novelty is that Eclipse now runs on Linux, and the rest of GeoFrame will be ported to Linux “by August 2002.”
Pai stressed the importance of the Schlumberger/Statoil data management alliance. This is revolutionizing the SIS offering in the fields of interpretation results management and web access thanks to the new Federator technology. SIS has been working to quantify the value of data management. In 1996, surveys showed that only 20% of an interpreter’s time was actually spent interpreting. In 2001, this had increased to 50%. Schlumberger’s goal is to go to 85%.
Schlumberger’s Network Solutions unit is now offering turnkey IT infrastructure deployment incorporating its DeXaBadge security solution. ChevronTexaco and Shell use the system. Schlumberger’s internal ‘internet’ SiNet has been re-baptized DeXaNet and has also been upgraded to offer secure broadband access to ‘frontier’ oil and gas provinces.
The ‘Expert Services’ branch leverages this global network infrastructure to enable knowledge management (KM) based decision making. KM and training initiatives include SEED – donating software and hardware to schools, NeXT – an educational joint venture with BP and Heriot-Watt University – and the SIS internal Technology Mastery Program, which tracks training and skill sets of employees.
Examples of killer synergies include QuickSeis, where rapid screening of seismic attributes has been successfully applied to Austin Chalk drilling prospects. Inside Reality’s real-time drilling and well planning is claimed as the ‘first 3D/VR workflow.’ Looking to the future, Pai sees a new technology and service paradigm as oil companies ‘go digital.’ This will enable information access any time, any place – from field operations to the ‘back office.’ By around 2005, technology will bring field data to external stakeholders through portals and exchanges. The new service paradigm will be built on IT infrastructure, data and applications like SAP and Oracle. Looking at SIS’ own application portfolio, Pai sees a new software architecture built around Open Spirit – a ‘Pack and Go’ real-time cache feeding data to a Shared Earth Model usable by component-based applications. These will run on a range of platforms – from high-end visualization for 3D/VR, through web-based 2D/3D, to desktop and the PDA.
Data Management guru Andrei Kalinichev introduced the new concept of the data ‘stream.’ The various data (as opposed to workflow) ‘streams’ bring together (repackage) different bits of GeoQuest technology to provide ‘joined-up’ data management. Current data streams are Well Stream, Seis Stream, Log Stream and Production Stream. All share the same web and security infrastructure. The streams were originally developed for BP – which has handed over the IPR to Schlumberger. When Schlumberger sells a ‘stream’, it will be customized to a client’s workflow – and could even include Landmark technology. Streams are ‘perpendicular to the data integration spectrum.’ The process starts by observing a client’s workflow for a month or so, before developing a stream – a path through client data and workflows. Third parties will likely be involved – Hayes, Kelman, Landmark and WesternGeco. ‘It doesn’t matter who you use, SIS will stitch them into your workflow.’
Another client contribution, from Statoil, is a web-enabled interface to data services, along with ‘business objects’ to capture user workflow. Web access to the Federator meta data catalog acts as a front end to the ex-Slegge data store (also known as GeoTrack or just ‘the E&P Data Store’!) which allows for management of interpretation results.
Secure Data Access
Steve Skilitami presented Schlumberger’s answer to the entitlements and access problem inherent in National Data Repositories - Secure Data Access (SDA). Entitlements should be transparent to the user. This requires read/write access at table & row level. SDA is bundled with the January 2002 Finder 9.2 release and leverages Oracle security technology.
BG outsourced its upstream IT/IM ‘Petrotech’ unit to SAIC and Schlumberger five years ago. Mark Setrem described the challenge set by BG management – to minimize unproductive time spent on data management, to provide ‘in your face’ data and to ‘maximize data value.’ BG spends a lot of time moving data through its GeoFrame-based units, keeping data in sync with overseas offices. BG embarked on a review of applications and determined that in the future, data management must support everything. There is to be ‘no more widget writing.’ An evaluation of GeoFrame determined that user buy-in was good – ‘at least as good as Landmark!’ Third party links remained an issue. BG is working with Roxar on the link to IRAP (waiting on an Open Spirit link – hopefully the panacea!). Another problem is that users are reluctant to change workflows and still use ‘Landmark-style’ workflows which don’t fit with GeoQuest products.
The aim of the BG Visualization Center is to ‘bring the right people together in the right place at the right time.’ Now that Schlumberger has acquired IR, the problem of multiple 3D viewers is very much in evidence. GeoFrame has three different 3D viewers and there are upwards of six others within the GeoQuest product line. This makes it hard for users. The aim is to have one integrated 3D solution, with an ‘Open API’ for third party access.
Chris Lockyear (BP) outlined BP’s research into new ways of learning - particularly with NeXT online training initiative. The Competency Home Page allows for self assessment and priority setting. The system ‘tells you’ how to achieve improvement and recommends suitable courses. Uptake remains ‘patchy’ with half of the users completing under 25% of the course. 70% completed the study program at home - they felt guilty doing it at work. E-assessment and e-learning have a role to play, but they are ‘not the whole answer.’ One to one conversation and conventional training are essential. Effective use requires setting expectations, providing a work-space and setting aside time for ‘hybrid e plus conventional’ learning during the ‘teachable moment.’
GeoFrame IV in NAM
Shell’s domestic NAM unit specializes in very large projects, with over 1000 wells and 20 seismic grid libraries, which presented a challenging migration task. NAM’s van de Sande stated that the move to GeoFrame IV was driven by its improved integration, especially with Drilling Office, and enhancements such as a ‘true shared database’ and synthetic seismograms ‘that work.’ The GeoFrame 3.8.1 – 4.0 upgrade process was ‘complicated’ and required better Oracle design competency and significant upgrades to hardware. Restoring GeoFrame proved time consuming – between 4 and 24 hours. The size of NAM’s projects stretched the GeoFrame upgrade process to the limit – ‘perhaps we are approaching the limits of the engineering.’
Terje Flaten (Statoil) shared Statoil’s ‘e-field vision’ of intelligent, self-actuated valves and drilling technology aimed at optimizing production. E-field business drivers are lower cost and the ‘skill gap.’ Key e-field processes include real time, right time and on demand data (such as 4D seismics). According to Flaten, ‘The live simulator is on its way.’ 3D has produced a ‘dramatic improvement’ in mapping. Permanent seabed sensors have yet to prove their worth. Surface imaging may be lacking, but downhole seismic technology ‘never really took off.’ Flaten deprecates modern ‘CAVE’ technology – saying we need a better user interface. While there is a lot of fiber in the North Sea, it is very under-utilized. This could be used for visualization. Statoil has combined fiber and hydraulic cables with a high voltage line to the Snohvit gas field to allow for remote, unmanned operations.
Gert van Spronsen described Shell’s ‘extensive multiphase experience’ notably on the FLAGS – 450 km 36” pipeline. Van Spronsen claims there is no commercial multi-phase modeling software so Shell developed its own tool ‘TwoPhase’ in-house. TwoPhase has been extensively tested against measurements made on Shell’s Backton (UK) multiphase test loop. In the 1990s, multiphase modeling was considered to be ‘mature,’ but use of TwoPhase was limited to domain specialists. Today this has changed. The in-house algorithms have been moved into DLLs, which are added-in to commercial modeling tools. The DLL contains Shell thermodynamics and the multi-phase routines – programmed in C++. The Shell DLL was incorporated into Baker Jardine’s PipeSim (now part of Schlumberger) allowing Shell to decommission TwoPhase in 2000.
Schlumberger’s information management focus is on both data access and ‘external and internal integration’. The Finder data footprint is expanding to include production, drilling data and core/sample management. The Hummingbird text search engine has proved functional and fast. Open Spirit remains key technology – third party vendor Infologic has added geochemical analysis to the Open Spirit framework. Portal development with Plumtree is another major focus. SIS increasingly outsources software development to Infosys which has 3000 developers based in India. This relationship is claimed to offer flexibility – if a customer has a problem, a 20-30 people team can be brought to bear on the problem ‘within weeks’.
What’s next ?
The Web Solutions unit is to combine the Digital Workspace, iSuite Applications with iStore Information Management into a new Enterprise Application Framework. This will involve the consolidation of the web interface of Finder, AssetDB, SeisDB and LogDB into the DecisionPoint PetroTechnical Business Framework. A new front end – “iSurf” will offer GIS and data browsing over multiple databases. Web-based data management workflows will underpin data transfer, comparison, editing and loading. A similar geological workflow ‘Sample Streams’ will capture core and cuttings attributes. SmartView, the replacement for GeoQuest’s embedded GIS browsers, will use ArcView 8.2 and allow for full well path labeling, shotpoint annotation, xyz scatter files and the export of ShapeFiles from Finder. Finder 9.3 (due for Aug/Sept 2002) will include the EuroFinder sample/core data model. Finder will migrate to Oracle 9i. A new ‘AssetDB lite’ product will be released to allow for management of physical assets ‘from the file room’ – AssetDB (heavy) is for physical asset management in the warehouse.
This report is abstracted from a 16 page report on the Schlumberger iDiscover 2002 Forum produced as part of The Data Room’s Technology Watch Reporting Service. For more details on this service contact email@example.com .
John Sherman’s top-level view of the state of the industry was a ‘tale of three curves.’ Over the last century, the oil price has averaged $16-18 per barrel. Globally, we have always appeared to be just behind the peak in world oil production. Moore’s law is driving the IT challenge and opportunity – one facet of which is ‘to put more compute resources down hole.’ Sherman cited a CERA study suggesting that the ‘efficient use of integrated technology can cut finding costs.’ Sherman illustrated such possibilities with some impressive Spec Decomp imagery of a deltaic environment two miles beneath the Gulf of Mexico.
Sherman claims ‘breakthrough’ status for Landmark’s new ‘modeling while interpreting’ technology - first revealed in Oil IT Journal Vol 6 N° 9. This is enabled by a move away from the traditional surface-based interpretation paradigm to a new ‘polyhedral-mesh’ based construct. Instead of working with surfaces, the action of picking a horizon or fault defines, or refines, a set of space-filling polyhedra. At any point in an interpretation, models that require fully-determined 3D spatial elements can be used. The boundary between geophysical, geological and reservoir engineering models is progressively removed. This ‘will take about another two years to develop fully.’
Another new technology has developed out of Landmark’s Decision Space economic modeling. Infrastructure-led interpretation incorporates surface facilities and their economics into the interpretation and modeling process. All this leads towards the Field of the Future – a.k.a. the E-Field where integration cuts across 4D seismics, model building and field automation. This is a part of the ‘real time paradigm shift,’ – a movement from waterfall (sequential) to parallel process. All this underpins faster prospect generation – giving companies a competitive edge.
Push left, push right!
Following his stint with Dell Computer last year, Sherman was able to provide an insight into Dell’s outsourcing business strategy of ‘push left, push right.’ Dell is always seeking to externalize upstream and downstream components of its business. It is pushing ‘left’ to outsource and streamline manufacturing, and pushing right to let the shipping company do more of the distribution. The equivalent for an upstream oil company for push left is clear enough in the Norwegian context – you ‘push’ data management into PetroBank. Pushing ‘right’ in the E&P context would involve outsourcing prospect analysis.
Nancy Benthein presented Landmark’s innovations - particularly the growing importance of the Linux operating system. Benchmarks of SeisWorks on Linux have produced spectacular improvements. For example the speed numbers for a Sun Ultra 60 are comparable to a bi-processor Dell, but the price differential is tenfold! Similar comparisons have been made with Zmap+ running on IBM Linux Intellistations versus SunBlades with a 4x speed improvement and a 5x price edge – giving an overall 20 x price/performance in favor of the IBM/Linux machine. David Malicki underlined the fact that Linux was making headway in the commercial world with Oracle and SAP ports. Linux is deployed by Google, Conoco, Shell, Amerada Hess, WesternGeco and CGG.
GeoProbe not on Linux!
Contrary to previous indications, Landmark has no immediate plans to port Magic Earth’s GeoProbe to Linux. No doubt this reflects the problems of marketing ‘high end’ visualization technology at the same time as discovering the power of commodity hardware and a free operating system!
SeisWorks enhancements improve horizon flattening, while different windows can now be locked together for coherent zooming and movement within cubes. Recumbent wells can be displayed. New ToolTips offer pop-up help on mouse-over – to show additional well information, for instance. SeisWorks PowerView, due for release in 2-4 months, will offer multiple synchronized views of horizons with different color bars. By year-end 2002, Zmap+ will be available in PowerView, which will also support 4D workflows. OpenVision now has probes (à la Magic Earth) and can display CAD models – allowing visual integration with surface facilities. Again, Linux is claimed to beat Unix hands down on price/performance.
Steve Smith, Dell's high performance computing business manager, believes that Linux 'breaks the link between hardware and software.' Dell has no proprietary Unix to protect, and Linux fits with Dell's direct selling strategy. Dell offers 'systems management, performance, reliability and service,' working with Red Hat for driver support. Linux supercomputers can be upgraded with new processors, avoiding the 'fork-lift upgrade' problem – having to remove the old machine to bring in a new one, à la IBM SP2 or SGI Origin. Amerada Hess has 500 workstations doing seismic processing in Houston. CGG has a 3,000-machine cluster. The Cray T3E is now a Dell cluster! Dell is working with Saudi Aramco on an 8-node, 16-processor Intel Xeon cluster for reservoir simulation. This machine will deliver 64% of the performance of an IBM Nighthawk 3 for about 3% of the cost.
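The implied price/performance advantage of the Aramco cluster can be made concrete with a little arithmetic – taking the quoted figures (64% of the performance for 3% of the cost) at face value as vendor claims:

```python
# Back-of-the-envelope check on the quoted cluster figures.
# The 0.64 and 0.03 fractions come from the article; treat them as claims.
perf_fraction = 0.64   # Xeon cluster delivers 64% of Nighthawk 3 performance...
cost_fraction = 0.03   # ...for about 3% of the cost

# Price/performance advantage = performance-per-dollar ratio of the two systems
advantage = perf_fraction / cost_fraction
print(f"Cluster price/performance advantage: ~{advantage:.0f}x")  # ~21x
```

On these numbers the commodity cluster comes out roughly 21 times better on price/performance – the same order of magnitude as the 20x SeisWorks/Zmap+ comparisons reported above.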
Supergrid extends PetroBank to allow arbitrary random line retrieval and seismic volume merging. On-the-fly post-stack processing can be performed during data retrieval. Aberdeen-based Shell Expro has outsourced its seismic data management to Landmark's PetroBank in a five-year contract. The contract covers post-stack navigation and trace data, metadata QC and loading of pre-stack and field seismic, the Surf&Connect web front end and a large seismic remastering project. Landmark also provides on-site support for workstation project management. Shell's legacy systems and data are migrating to PetroBank and other data stores. A demo showed a live, Hummingbird Exceed-based connection to PetroBank. 'Impenetrable' security deploys the 'Digipass' PIN-generated password system reminiscent of that used by John Nash in 'A Beautiful Mind.'
Sheldon Gorell (Landmark) described how new technology is coupling the reservoir simulator with a representation of surface facilities and flow lines. This allows for optimization of both wells and production facilities by linking the VIP reservoir simulator to the Surface Pipeline Network (SPN) modeling tool. The SPN tool was developed with BP for Prudhoe Bay’s massive infrastructure. The Prudhoe Bay case history showed how the choice of surface facilities – and the ability to gather and process hydrocarbons from neighboring discoveries influenced the ultimate recoverable reserves of the field and the overall project economics. The technique is now used ‘pervasively’ throughout BP. The SPN graphical user interface (GUI) is available now for internal use within Landmark – a productized version will be out within 6 months.
Landmark’s new simulator ‘Falcon’ has been developed from scratch under another BP partnership. Falcon supports unstructured grids through an embedded version of Veritas’ SureGrid. Falcon also supports multiple reservoir models and has built-in facilities and network functionality. A new 3D Decision Space Shared Viewer will be available in the 4th quarter 2002. This was also developed for/by BP to view the Thunder Horse subsurface and surface facilities.
Accenture's Ian de Snoo heads the Accenture/Landmark European Alliance, while Jon Lewis heads up the Landmark EAME side. De Snoo believes that this alliance 'is different!' It is 'core strategy to both Landmark and Accenture,' and is part of a 'five-year plan'. A joint team of around 40 people is co-located in Houston (see the Asset Management Center announcement on page 12 of this issue). The Alliance is backed by both managements and by 'the C-levels in our client base'.
BIC - ITO
Alliance components are Business Integration Consulting (BIC) and IT Outsourcing (ITO). BIC puts tools on the desktop so that clients can start on their added value work as soon as they arrive at work. BIC is ‘not just a portal’. One client is working towards a virtual team. In general, Accenture will supply the communications, desktop and IT infrastructure and Landmark will provide applications and data management.
Lewis told Oil IT Journal that the Alliance has ‘no religious conviction’ as to total outsourcing – some clients do everything inside the firewall. Others outsource IT completely. The ‘BIC value proposition’ is very open. The Alliance ‘could receive payment out of increased revenues from an asset.’ Bold claims are made for the Alliance’s potential – a 10% production increase, 10-20% productivity enhancement and a 10-20% reduction in cycle time. When an Alliance proposition goes in, it goes with a ‘guaranteed minimum cost reduction promise.’
Helen O'Connor and Ben Trewin gave a joint presentation on the Team WorkSpace (TWS) portal. TWS is a set of pre-defined workflows implemented in a fairly rigorous manner. Workflows are customized to a particular type of activity, and managers can track project status and approve milestones. In a set-piece demo, a 'virtual team' was built to leverage drillers' knowledge from elsewhere in the organization. Knowledge management software was used to capture discussions between economists. All in all, the ASP-hosted software appears to work as advertised. Bandwidth does not appear to be an issue.
Note that the 'Grand Basin' name has been quietly dropped. The application hosting service is now delivered directly from Landmark. The ASP demo used a link to Houston at 'around 50-100 kbps.' OpenWorks ran faultlessly. The service will be hosted from Stavanger, Aberdeen, Houston and Calgary. Landmark already has two oil company clients in Houston. Different ASP configurations are available to cater for client preferences – one client has retained some IT infrastructure and is moving it out piecemeal, another runs the ASP in-house, and two have gone for a complete outsource. The Landmark ASP/outsourcing offering can include legacy components and third-party applications.
Bill Shea demoed technology from Kidra, a Norwegian high-end visualization and interpretation service provider associated with Magic Earth. Kidra's technology rolls in components from Foster Findlay to allow on-the-fly seismic attribute generation. Magic Earth's GeoProbe allows 'intuitive investigation' of a seismic volume prior to interpretation. Arbitrary lines can be created to check auto-picking accuracy or to control and verify an interpretation. A history tracker keeps a record of interactions – such as deleted points. Foster Findlay's 'Azimuth Volume' structural attribute was used in an impressive combo display. The probes appear faster than ever. Geobody sculpting can isolate a body between fault planes – with amplitude extraction and transparency. Put the stereo back on and, according to Shea, "I guarantee that you will see faults and structural detail that you have never seen before – including small faults that may be very important in well planning and reservoir drainage. You will only see these phenomena in a high-end 3D visualization environment."
This report is abstracted from a 12 page report on the 2002 Landmark Stavanger City Forum produced as part of The Data Room’s Technology Watch Reporting Service. For more details on this service contact firstname.lastname@example.org.
Houston-based IT consultants ZettaWorks and Tibco Software Inc. have just announced ZettaWorks Operations Advantage (ZOA). ZOA offers real-time connectivity to oil and gas operators and is claimed to achieve rapid, low-cost integration of disparate systems. ZOA extracts user-defined data and reports from production facilities and delivers them to individualized portals.
ZettaWorks CEO Ken Neusaenger said, “Large cost savings may be realized in field operations by analyzing and comparing data in near real-time. One of the most frequent requests we receive is to automate the process of data collection and presentation from a set of disparate sources. Having a seamless and coherent view on data helps people ask the right questions and reach the right decisions.”
Tibco marketing VP Rene White added, “Integrating information onto the desktop or PDA is vital in the oil and gas sector. We are working with ZettaWorks on the development of ZOA as part of our oil and gas Industry Solutions Initiative.”
ZOA is built on the BusinessWorks (BW) integration solution, part of Tibco's ActiveEnterprise product family. BW is claimed to be an easy-to-use platform that solves integration challenges in 'bite-size chunks'. BW also provides standards-based 'comprehensive, cross-platform' web services for new and legacy systems, including internal application and business process integration, as well as real-time monitoring and management.
IHS Energy Group has moved into application service provision with the release of a hosted version of its Probe GIS information browser. Probe Hosted offers users browser-based access to a variety of tools for analysis and mapping of IHS Energy Group’s international E&P databases. By taking the hosting route, IHS’ clients automatically access the latest data and avoid having to manage software in-house.
Nancy Maher, Senior Manager of IT at IHS Energy Group said, “With Probe Hosted, we are not only increasing speed and convenience for existing users, but we also are making it appealing to potential users who require a high-degree of flexibility and power, yet do not want the IT requirement of managing their own software and data updates.”
IHS claims that the vanilla Probe has already proved its worth on the desktop as a front-end to IHS' databases. Probe lets end-users develop complex queries and retrieve results in the form of spreadsheets, graphics, forms and maps. Probe supports analysis and evaluation across a wide range of tasks including new ventures, exploration, negotiation, engineering, reservoir engineering and cartography.
OFS Portal is extending the API/PIDX transaction standards to meet the needs of its US and European customers. OFS Portal has been collaborating with LOGIC, the UK government-sponsored promotional organization, and the UK Department of Trade and Industry (DTI).
OFS Portal CEO Bill Le Sage said, “OFS Portal led the team contracted to PIDX in the US last year to define a set of XML transaction standards for oilfield e-commerce. We wanted to make sure that these standards also addressed the needs of customers outside of the US. Through our collaboration with LOGIC, DTI, and other UK groups we have worked with PIDX to extend the transaction standards to the EU.”
The PIDX standards cover typical business processes associated with procurement of oilfield products and services. OFS Portal has been involved closely in this effort since the middle of 2001, providing resources and funding, and leading the PIDX ComProServ standard.
The UK DTI sponsored a gap analysis of these transaction documents against emerging EU standards from the UK Business Applications Software Developers Association. The PIDX standards have been modified to take account of different currencies and European tax structures.
These XML-based e-commerce transaction standards are expected to be ratified at the PIDX General Committee Meeting in Westminster, Colorado this month.
Petris Technology, Inc. has announced a new release of its Winds Integrated Decision Support System (IDSS) web-based data warehouse application. IDSS V2.0 provides engineers, production personnel and managers with access to corporate operational, economic, financial and land information.
Economics and land
IDSS extracts and stores data from applications and business systems running on mainframes, UNIX servers and Windows. IDSS Version 2.0 adds economics and land information access to the existing financial and business queries.
Ascential’s DataStage XE is the core data integration technology inside the IDSS. DataStage automates data integration across multiple data sources, breaking down large integration tasks into smaller jobs that users can schedule sequentially.
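The pattern described above – breaking one large integration task into smaller jobs run in sequence – can be sketched in outline. This is a generic illustration of the extract-transform-load approach, not DataStage's actual API; the job and source names are invented:

```python
# Minimal sketch of sequential ETL job chaining, in the spirit of the
# DataStage approach described above. All names are illustrative.

def extract(source):
    # A real job would pull rows from an operational database.
    return [{"source": source, "value": v} for v in range(3)]

def transform(rows):
    # Normalize each row into the warehouse schema.
    return [{**r, "value": r["value"] * 10} for r in rows]

def load(rows, warehouse):
    warehouse.extend(rows)

def run_pipeline(sources):
    # One small job per source, scheduled sequentially rather than
    # as a single monolithic integration task.
    warehouse = []
    for src in sources:
        load(transform(extract(src)), warehouse)
    return warehouse

rows = run_pipeline(["land", "production", "financials"])
print(len(rows))  # 9 rows consolidated from three sources
```

The payoff of the chunked approach is that each small job can be scheduled, monitored and re-run independently – the property the Burlington Resources deployment below relies on for its monthly consolidation runs.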
One Petris client, Burlington Resources, will be using IDSS monthly to consolidate more than 10 million rows of data from six different operational systems. Using DataStage instead of hand-coding data integration functions, Petris reduced implementation time for Burlington by 25 percent, saving development effort and system costs.
Petris president and COO Jim Pritchett told Oil ITJ, “Energy companies evaluate their profitability on a property basis, but previously had difficulty capturing and analyzing property data effectively because it’s spread over multiple systems. By building IDSS, our customers can now get at this data much faster, to accelerate decision making and maximize profitability.”
Calgary-based Kobayashi and Associates (K&A) Ltd. has advised IHS AccuMap Ltd. on the development of an online acquisitions and divestment portal for Canadian oil and gas assets and companies. The PropertyMarket portal is now online at ihspropertymarket.com and is also available as a map feature within AccuMap.
The combination of AccuMap’s data analysis with K&A’s financial information is said to offer buyers ‘an accurate, independent property evaluation.’ The new portal claims ‘unparalleled market penetration’ into the Canadian and U.S. oil and gas sectors, providing direct access to over 7,000 AccuMap users.
AccuMap’s PowerTools will be available through the PropertyMarket to provide online economic evaluation of properties. K&A plans to use the AccuMap portal for all its new divestment projects as an additional, free service to its clients.
Kobayashi & Associates Ltd. is an independent advisory firm specializing in the divestment of oil and gas assets and corporations. In the last 12 years, K&A has been involved in 125 offerings to the oil and gas industry.
ESRI is offering a selection of its GIS software on Linux, in what it describes as a 'continuing commitment to open systems.' ArcIMS 4, ArcSDE 8.2, MapObjects-Java Standard Edition, and ArcExplorer 4 software are now all supported on Linux.
The latest version of ESRI’s internet mapping system - ArcIMS 4, which began shipping last month, now runs on Intel-compatible Red Hat Linux 7.1 in addition to Microsoft Windows, Sun Solaris, IBM AIX, and HP-UX operating systems.
Version 8.2 of ESRI’s spatial database engine is described as ‘the GIS gateway’ for managing spatial data in a commercial database management system. ArcSDE 8.2 for Oracle is the first ArcSDE release that supports Linux servers.
MapObjects-Java Standard Edition is a suite of over 900 mapping components that can be used to build custom, cross-platform GIS applications or applets. Because it is pure Java, MapObjects-Java Standard Edition can be used with Linux, Windows, and a variety of UNIX operating systems.
ArcExplorer 4 Java Edition is ESRI’s latest free GIS data viewer. Again, by developing Explorer in Java, ArcExplorer 4 is assured cross-platform support including compatibility with Linux, Windows, and a number of UNIX operating systems. ArcExplorer 4 can be downloaded free of charge from www.esri.com/arcexplorer.
PetroVantage (PV) and Triple Point Technology (TPT) are to integrate their technologies into a ‘comprehensive solution’ for petroleum industry trading, logistics, and risk management. PV, a subsidiary of Aspen Technology, provides collaborative software for trading and logistics to the petroleum industry while TPT supplies enterprise-wide transaction processing and risk management software.
The companies will work to integrate and jointly market their products to give downstream oil companies life-cycle deal management capability, from opportunity evaluation and deal negotiation through logistics coordination, transaction processing and risk management.
Data captured in TPT, such as physical inventories, paper and physical positions, and risk profiles, would be integrated into PV’s role-based consoles for trading and logistics staff. Deal attributes captured in PV are transferred to TPT’s deal capture system, ensuring accuracy and eliminating error-prone manual data entry. Deal costs calculated in PV are transferred to the TPT system, enabling a comparison of estimated versus actual costs.
PetroVantage president Charles Moore said, “Combining solutions from PetroVantage and Triple Point Technology will create breakthrough improvements in staff productivity, deal margins and risk management.”
Peter Armstrong, TPT president added, “TPT’s position supporting the energy marketplace with transaction processing and risk management tools will be enhanced by the physical trading and logistics solution from PV to improve decision making and operations coordination.”
Shell is the first major customer for Open Text Corp.’s latest knowledge management offering - Livelink VirtualTeams (LLVT). LLVT is described as a comprehensive, integrated team environment built on Livelink which combines a tested methodology for virtual teamwork with ‘state-of-the-art’ collaboration technology.
LLVT integrates technology developed by researchers from Open Text affiliate NetAge. The Virtual Teams Methodology was developed over a 20 year period by NetAge CEO Jessica Lipnack and chief scientist Jeffrey Stamps.
Royal Dutch/Shell group's Ben Krutzen said, "As first customers, we have licensed 10,000 seats of Livelink VirtualTeams. This marks a new phase in our close collaboration with Lipnack and Stamps to make our virtual teams more effective. The LLVT module intuitively guides users through the steps that make their work easier, and puts the relevant process information at their fingertips."
LLVT enhances team meeting productivity through asynchronous collaboration between dispersed team members using technologies such as virtual real-time application broadcasting, shared text pad and whiteboard, chat and instant messaging. Meeting information is captured on the fly to the knowledge repository.
Read the book
Lipnack and Stamps have written six books over 20 years on this subject, most recently, the ‘best-selling’ Virtual Teams (John Wiley & Sons, 2000). More from www.virtualteams.com.
Trade Ranger, the e-business portal backed by major oil companies, has published the first version of its CIDX Gateway. The CIDX Gateway is a bridge between CommerceOne's XML Common Business Library (xCBL) and the chemical e-Standard, supported by major petro-chemical companies and leading e-marketplaces. Trade Ranger uses CIDX to describe products and services online.
CIDX General Availability (GA) 1.0 supports the RosettaNet Implementation Framework 1.1. RosettaNet is used by the electronics industry to underpin its e-business and is a proven specification.
Take a TRIP
To facilitate interoperability between different XML standards, Trade Ranger has introduced the concept of TRIPs. TRIPs enforce business rules on document exchange, sequence and timing, removing ambiguity as to how business documents are exchanged and clarifying the purpose they serve.
Proof of concept
Trade Ranger believes that the CIDX Gateway proves the power of XML-to-XML transformation, lays the foundation for a framework-based architecture and is ‘significant progress towards a sustainable e-architecture’.
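The XML-to-XML transformation at the heart of such a gateway can be illustrated with Python's standard library. The element names below (Order, ItemDetail, OrderCreate, etc.) are invented for illustration and do not reflect the actual xCBL or CIDX schemas:

```python
import xml.etree.ElementTree as ET

# Hypothetical source document in an xCBL-like vocabulary.
src = ET.fromstring(
    "<Order><ItemDetail><PartID>ABC-123</PartID>"
    "<Qty>5</Qty></ItemDetail></Order>"
)

# Map source elements onto a CIDX-like vocabulary (names are illustrative).
dst = ET.Element("OrderCreate")
for item in src.iter("ItemDetail"):
    line = ET.SubElement(dst, "LineItem")
    ET.SubElement(line, "PartNumber").text = item.findtext("PartID")
    ET.SubElement(line, "Quantity").text = item.findtext("Qty")

print(ET.tostring(dst, encoding="unicode"))
```

A production gateway would of course validate against the target schema and handle the full document set, but the principle – parse one XML vocabulary, emit another – is as shown.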
The Houston-located Landmark/Accenture Asset Management Center (AMC) is described as a ‘state-of-the-art laboratory and demonstration facility’ designed to showcase new technology in field automation, systems integration and immersive visualization. These are said to make real-time asset management and the web-enabled oilfield a reality.
The AMC collects real-time data from field operations which is integrated with the reservoir model in a secure, multi-client hosting center. Centralized data and application hosting provides support for knowledge sharing and teamwork between asset team members, partners and service companies. Distributed access allows collaboration across immersive visualization rooms, partnership operations, remote office and home/hotel locations.
Landmark president and CEO Andy Lane said, "Landmark is committed to enabling real-time asset management for our clients and making the vision of the Web-enabled oilfield a reality. We believe that this will lower finding and lifting costs and maximize the value of our clients' assets. This center is a proof point for the future of the industry – everything we're showing in the AMC is based on technology that exists today."
Bill Warren, Accenture’s managing partner for the upstream added, “The AMC offers clients the opportunity to see what these technologies can bring to their businesses. For energy companies who seek to manage their intellectual capital more effectively, the combination of Halliburton, Landmark and Accenture can help lead them into the Web-enabled oilfield era.”
Last month Landmark also opened a new multi-client data management hosting center in Calgary. The data center, located at IBM’s e-business hosting facility, will deploy the ex-PGS PetroBank data management solution.
IBM’s e-business sales manager Byron Manastyrski explained, “Availability, security and scalability are the cornerstones of IBM’s hosting services. The Calgary facility’s state-of-the-art hardware, security systems and environmental controls ensure the utmost accessibility and safety of E&P customer data, and are built to scale quickly and efficiently with our customers’ e-business needs.”
Landmark Canada manager Darcy Cuthill added, "Currently, many E&P companies must archive redundant data stores at locations throughout Canada, making it difficult for them to provide their asset teams and business partners with consistent access to quality data. Now their business-critical data assets can be stored in a secure and scalable environment."
Mainland Information Systems Ltd. helped design and implement the hosting center. Mainland VP Michael Koury said, “Our mission is to ensure that the data and system requirements of Landmark and its customers are met without exception.”
Schlumberger Information Systems has just released a white paper on its new software and service offering the ‘Living Business Plan.’ The paper begins with a distinctly ‘dot-com’ era analysis of oil company finances – claiming anachronistically that oils have ‘historically produced a consistently low rate of return.’ The investment community still has seemingly ‘high expectations’ of oil companies and expects them to ‘manage risk and minimize revenue fluctuations.’ Heck, in the old days investors expected oils to take risks and maximize revenue!
The white paper makes a strong case for a holistic approach to the management of oil company data and financial information. The idea is to have one-time data entry or capture, and an efficient distribution system that ‘manufactures’ financial data in near real-time, ready for roll-up into a corporate financial plan, and offering the ability to try out scenarios such as acquisition opportunities.
Same old suite?
What new software underpins the Living Business Plan (LBP)? The LBP is not a new ‘killer app.’ Any company that already has the Merak suite, along with Finder, FieldView, OilField Manager and maybe DecisionPoint has all the building bricks for an LBP implementation. If you don’t have all these, no problem, you can build your own virtual software portfolio using Schlumberger’s LiveQuest Application Service Provision.
The potential of this new way of working is well illustrated in a fictitious scenario in the white paper. Houston-based ‘Bravo Oil’ used the LBP to evaluate the financial impact of a deepwater gas acquisition opportunity. The LBP allowed Bravo to go way beyond the traditional NPV and cash flow analysis by integrating future cost and revenue streams within pro-forma accounting and capital requirements.
The key to the LBP is described as a ‘shift in the way business is done.’ Rather than maintaining separate economics and asset-focused teams and data sets, the LBP offers the corporation dynamic sharing of information. Each group understands corporate goals and can see how its own contribution can affect them. The focus thus moves to a consistent treatment of high-value projects.