Veritas and PGS have failed in their bid to merge. A new company, Veritas-GeoServices, was to be created in what had been trumpeted as ‘a merger of equals’. But, as George Orwell put it in Animal Farm, some are more ‘equal’ than others.
Following protracted discussions, the companies recently signed an amendment to the original agreement that shifted the balance of power from a 60/40 split in favor of PGS shareholders down to a 55/45 split.
Veritas CEO Dave Robson was to be CEO of the new company while PGS CEO Reidar Michaelsen would have been chairman of the board. The negotiations were tough even before the deal was abandoned, as Michaelsen and Robson revealed, “Completing this transaction has been a challenge, but the compelling benefits for both companies have led all concerned to work towards a solution that meets all of the major objectives.”
“Ongoing consolidation among our customers and the need for continued investments in people and technology make this a great opportunity. We will be uniquely positioned to offer our customers a much broader array of sophisticated 2D, 3D and 4D geophysical data and services.” The combined companies would have mustered 21 marine crews, 8 visualization centers and 27 seismic processing centers throughout the world, along with a 400,000 square kilometer library of modern 3D.
A get-out clause was invoked by Veritas as attempts to secure financing for the new company failed to bear fruit. The companies had been exploring the availability of additional financial resources and evaluating several financing alternatives.
But current market conditions meant that finding such extra finance was a tough call. Merger plans were scuppered when PGS received a written communication from Veritas to the effect that its board of directors had withdrawn approval for the pending combination. In light of this communication, PGS announced that it was terminating the merger agreement forthwith.
Landmark has signed a technology access agreement with the Kuwait Oil Company (KOC). The five-year contract creates a strategic relationship that gives the KOC access to the full suite of Landmark’s geophysical, geological, reservoir, data management and economic software.
Landmark is also to provide on-site technical support and project consulting services. Landmark president Andy Lane said, “We are extremely pleased to have formed this strategic relationship with the Kuwait Oil Company, the first of its kind with a national oil company in the Middle East. This technology agreement will provide KOC with access to a fully integrated, state-of-the-art software suite that will enable their geoscientists and engineers to analyze opportunities quickly and efficiently.”
KOC chairman Ahmad Al-Arbeed added, “The partnership with Landmark and our open access to this leading technology ensure that KOC will continue to be one of the most efficient managers of hydrocarbons in the region and one of the lowest-cost oil producing companies in the world.”
Shell acquired London-based Enterprise Oil earlier this year and is in the process of subsuming it into the Shell Group. Ever since Oil IT Journal was launched back in 1996 as Petroleum Data Manager, Enterprise has been a particularly rewarding source of upstream IT news. I thought it was appropriate, therefore, to run through Enterprise’s IT efforts by way of a tribute to the departing team.
Enterprise was an early adopter of workstation technology being one of Landmark’s first three clients for its mammoth, 1,000 lb. wt. ‘Landmark III’ workstation back in 1982. Thus began a 20-year relationship between the companies, which was to have a major influence on upstream IT. Enterprise also co-funded development of the first (and some would claim still the only) truly integrated interpretation suite – Tigress.
Being digital made Enterprise aware of the need to manage data behind the workstation. Enterprise, like other oils, set out to build its own corporate database. Eocene, a forms-based Oracle database with a limited range of data types, went live in 1994. The experience with Eocene and the Tigress development demonstrated the potential of integrating the database with applications, and became a blueprint for a joint-venture with Landmark to design the Enterprise E&P Data Store (EPDS), a ‘tightly integrated application and data environment,’ built around OpenWorks.
Enterprise head of technology Tim Bird, one of the EPDS designers, also liked to proselytize for good data management practice. Bird criticized systems that focused on ‘storing and moving data with ever greater speed’ while overlooking data quality. For Bird, the focus had to shift to clearly-defined units of measure, standard naming conventions, data dictionaries and unique well identifiers.
Enterprise was also an early adopter of knowledge management. CIO John Keeble was an advocate of communities of practice to ‘remove time and geography as barriers to communication by setting up a knowledge community’. Pragmatism was Keeble’s watchword, ‘Remove KM theory and language - and then remove some more!’ For Keeble, data and information was all ‘stuff’ that needed managing and exposing to end users.
Portals and GIS were also taken on board by Enterprise to provide what was termed, technically, ‘data in your face.’ With the Regional Access to Portal Information and Data (RAPID) ArcIMS-based portal, the focus shifted from the data itself to its delivery to end-users. Multiple data sources were synthesized into a single GIS-based point of access to QC’d digital data.
RAPID proved to be the culmination of Enterprise’s data management efforts. The intuitive front-end provided users with access to data previously tucked away in the corporate data store and other repositories. Instrumental in this acceptance was an outsourced development, Web OpenWorks (WOW), performed by Exprodat. WOW quickly earned ‘killer-app’ status. Enterprise’s head of data management Ashley Dunlop commented, “Non IT-savvy professionals can drill down to binary data in a totally intuitive manner. The RAPID/WOW combination is much faster than using an application.”
Exprodat director Bruce Rodney told Oil IT Journal, “Enterprise had always understood the need for technology in executing their core business. It wasn't just the flashy spinning 3D seismic cubes. Data management and infrastructure also got management’s attention and funding. I remember one occasion when Exploration Director Andrew Armour grabbed the mouse during a demo, demanding to try the software himself. On the knowledge front, CIO John Keeble promoted an enlightened policy of ‘universal read access’ to all technical data - you had to make a case to hide data, rather than the other way around.”
Landmark VP John Sherman concurs, “Enterprise Oil was a ground-breaking leader in several key areas of upstream IT. They were one of the first companies to view data management and application integration as key parts of the supply chain. Enterprise was an early adopter and thought leader in basing the asset team approach on an integrated suite of data and applications. Enterprise was, in some cases, years ahead of the industry as a whole. Much of the design of OpenWorks is due to input from Enterprise. Enterprise was a pioneer in the deployment of distributed data management systems and sat alongside giants like Shell, Chevron and Amoco on the board that drove the design of OpenExplorer. Enterprise also developed an innovative QC system based on artificial intelligence to detect and correct data problems from multiple vendors.”
One cannot conclude this review without asking the question: did all this IT trailblazing do Enterprise any good? Consultants and vendors sometimes like to deprecate efforts at ‘doing’ IT in-house, and some will no doubt argue that Enterprise’s takeover was due to a ‘lack of focus’ on the core business of exploration (which of course raises the question of whether being taken over by a major is failure, or the ultimate accolade for an independent). The answer, for Enterprise at least, is that the early use of workstation technology in the discovery of the Nelson field was a determining factor in the company’s early corporate growth. In fact, early adoption both gave Enterprise a competitive advantage in E&P and allowed it to punch well above its weight in influencing upstream software development.
The key tools for cross-organizational collaboration are e-mail, bulletin boards, chat sessions, and virtual rooms. While these are powerful enablers of electronic collaboration, they are only initial capabilities. This article outlines the additional requirements for the ‘ideal collaboration environment.’
Users must be able to quickly identify, locate and contact specific staff or subject matter experts within or outside the corporation, and obtain contact information, levels of access and areas of expertise. The system should allow users to locate staff members from partial information – a search for an employee called ‘Robert’ who works in Paris should return a list of possible matches. Once key people have been located, the system should facilitate the organization of dispersed personnel into virtual teams on an as-needed basis. Electronic collaboration tools will help coordinate and conduct ad hoc and formal virtual meetings – possibly in virtual conference rooms with round-the-clock desktop teleconferencing.
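The partial-information staff search described above can be sketched in a few lines. This is a hypothetical illustration only – the record fields, the sample directory and the function names are assumptions, not any vendor's actual API.

```python
# Hypothetical sketch of a partial-information staff directory search.
# Fields and sample data are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class StaffRecord:
    name: str
    location: str
    expertise: str
    contact: str

DIRECTORY = [
    StaffRecord("Robert Dupont", "Paris", "rock physics", "x1234"),
    StaffRecord("Roberta Smith", "Paris", "data management", "x5678"),
    StaffRecord("Robert Jones", "Houston", "drilling", "x9012"),
]

def find_staff(name_fragment, location=None):
    """Return all records matching a partial name, optionally filtered by site."""
    frag = name_fragment.lower()
    hits = [r for r in DIRECTORY if frag in r.name.lower()]
    if location:
        hits = [r for r in hits if r.location.lower() == location.lower()]
    return hits
```

A query such as `find_staff("robert", "Paris")` returns a list of possible matches – both Roberts and Robertas at the Paris site – rather than demanding an exact name.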
Video teleconferencing should be available to remotely located desktop users. Automation allows an individual to be contacted immediately in a crisis. Some systems provide an ‘awareness’ feature: if the person sought is online, the system alerts them that someone is trying to make contact and supports an instant chat session; if not, the system phones and/or pages the individual until they respond.
Collaboration support lets staff and remote personnel brainstorm together and exchange insights. Business workflows unify related work activities to reduce cycle times. Intelligent search tools work across the organization to locate information using keywords and context, pruning the search space to provide a limited number of highly relevant matches.
Smart product delivery brings the right information to the right people as soon as it is available, even if not specifically requested. Security systems need to be proactive and disallow unauthorized access, detecting and disabling intrusions before damage or compromise occurs, and protecting systems from malicious code and viruses. Advanced security systems such as biometric recognition reduce the potential for unauthorized access. System monitoring identifies suspicious behavior from user profiles. A user logging onto the system at midnight on a Friday would probably raise a flag.
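The profile-based monitoring mentioned above – flagging a midnight Friday login – can be sketched roughly as follows. The profile format, thresholds and user names are assumptions for illustration, not a description of any real security product.

```python
# Illustrative sketch of profile-based login monitoring: flag logins
# that fall outside a user's habitual working pattern.
# Profile layout and thresholds are assumptions.

from datetime import datetime

# Habitual working window per user, hypothetically built from past activity.
USER_PROFILES = {
    "jsmith": {"start_hour": 7, "end_hour": 19, "weekdays_only": True},
}

def is_suspicious(user, login_time):
    """Return True if a login falls outside the user's usual pattern."""
    profile = USER_PROFILES.get(user)
    if profile is None:
        return True  # unknown users are always flagged
    if profile["weekdays_only"] and login_time.weekday() >= 5:
        return True  # weekend login for a weekday-only user
    return not (profile["start_hour"] <= login_time.hour < profile["end_hour"])
```

A login by `jsmith` at midnight would be flagged, while the same user logging in at noon on a Wednesday would pass unnoticed.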
The need for industry standards and commercial standards-based products is seen as the single most important factor for enabling application and data interoperability. But where industry standards are lacking, organizations need to define their own. The ideal collaboration environment provides services using standards-based products.
Collaborators are responsible for assessing the expertise, knowledge, and accuracy of information from other staff before basing their conclusions on new information sources. The ability to rely on information from others is referred to as ‘trust’ and poses a challenge for virtual teams whose members are remote and not known personally. How can you assure sufficient levels of trust when collaborating with possibly unknown co-workers? As we approach the ideal collaboration environment, we forget what it was like to collaborate when limited to the phone, video teleconference centers, and physical meeting rooms. A colleague recently asked how briefings were prepared before Microsoft PowerPoint was invented. No one could remember!
* Mitre Corp. affiliation does not imply support for the opinions expressed by the author.
This is an edited version of an article originally published in Crosstalk, The Journal of Defense Software Engineering.
Western Canadian operations can provide a stress test for communications, with turn-around times from spud to rig release of only 3-4 days. Calgary-based Datalog is supporting the data management and communications associated with such intense drilling activity with a new data service, WellHub.com. WellHub offers a central repository and delivery mechanism for wellsite information including logs, reports, documents and images. Along with vertical E&P datatypes, WellHub provides secure Internet access, instant messaging and confidential e-mail. Datalog claims WellHub improves efficiency by offering automated data access to partners, regulatory agencies and service companies. WellHub provides real-time viewing of geological and drilling data, including LWD, wireline and other logs.
WellHub was trialed by a Western Canadian operator drilling a large number of shallow wells in quick succession. On spud, the system informed partners and other interested parties by email that new information was available. The Hub provides a central data repository during operations. Data can be left on the Hub until all financial and wellsite operations have ceased and the AFE has been closed.
Following the trial, WellHub was adopted as the operator’s standard reporting system – saving operations personnel at least an hour a day each. An add-on, WellWizard, allows for real-time distribution of logs, charts and drilling parameters. WellWizard uses WITS and WITSML formats to integrate all sources of wellsite data.
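To give a flavor of what WITSML-style integration involves, here is a much-simplified sketch of consuming an XML log. The fragment is modeled loosely on WITSML log data (curve mnemonics plus comma-separated data rows); real WITSML documents carry namespaces, units and far more metadata, so treat this as an illustrative assumption rather than the actual schema.

```python
# Much-simplified sketch of parsing a WITSML-style log.
# Element names and layout are illustrative, not the real WITSML schema.

import xml.etree.ElementTree as ET

SAMPLE = """
<log>
  <mnemonicList>DEPT,ROP,WOB</mnemonicList>
  <logData>
    <data>1500.0,25.3,12.1</data>
    <data>1500.5,24.8,12.4</data>
  </logData>
</log>
"""

def parse_log(xml_text):
    """Return (mnemonics, rows) from a simplified WITSML-style log."""
    root = ET.fromstring(xml_text)
    mnemonics = root.findtext("mnemonicList").split(",")
    rows = [
        [float(v) for v in d.text.split(",")]
        for d in root.find("logData").findall("data")
    ]
    return mnemonics, rows
```

Because the payload is plain XML, any partner or regulator with a standard parser can consume the same feed – which is the point of exchanging WITS/WITSML rather than proprietary formats.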
‘Introduction to Knowledge Management (IKM)*’ is an important book for oil and gas knowledge workers for two reasons: books on KM are rare, and this one has special relevance to oil and gas, with an in-depth analysis of Statoil’s Faros KM experiment.
IKM was written for students at Norway’s NTNU research institute, and the textbook bravely attempts to define concepts, with many references to various KM gurus. Unfortunately, the nebulous nature of KM shines through the contradictory definitions. Taken as a whole, IKM has a kind of ‘fractal’ self-similarity: read almost any paragraph, and the asides and qualifiers provide a résumé of the whole.
What is KM?
The ‘what is KM?’ question is kind of answered in five or so pages. The answer is vague and in part, tautological ‘KM is the conceptualizing of an organization as an integrated knowledge system.’ KM is later categorized as being ‘concerned with effectively connecting those who know with those who need to know’, and by ‘converting personal knowledge into organizational knowledge’. That’s a bit easier to grasp. Things get even better on page 17 where we are encouraged to ‘capture content and bring knowledge to teams and communities’. But only a few pages later, we are told categorically that ‘knowledge cannot be captured!’
The book ranges idiosyncratically around its subject. The chapter covering infrastructure is some 50 pages long, but provides few hard technological facts. Lotus Notes (the granddaddy of KM infrastructure) receives no mention in the skimpy index, and gets short shrift in the text: ‘there are better alternatives – such as the web – as a basis for a collaborative environment.’ Twenty pages of the technology section are devoted to neural networks, genetic algorithms and case-based reasoning.
Such apparent digressions are really at the heart of the authors’ subject, a soft-sell of the Corporum KM product from CognIT. Corporum (with five mentions in the index!) is an Autonomy-like tool for extracting and classifying textual information. Two IKM authors are officers of CognIT.
Statoil’s Faros project leveraged KM in a decision support system for the ill-fated Asgard field development (see Oil ITJ Vol. 4 N° 12). IKM provides a substantial amount of information on Faros along with some insightful lessons learned. These include the necessity to regard KM as a continuum of technology, content and people; to build user-friendly systems with minimal clicks between query and results; to start small; to store information at unique locations while providing multiple, redundant paths to documents; and finally to ensure that the KM system offers a learning capacity.
The last chapters, on measuring KM performance and the future of KM, dissolve into a kind of ‘bluffer’s guide’ to KM and MBA jargon. One’s heart goes out to the NTNU students who have to learn this stuff! On balance though, oil and gas K-Managers should probably spend a few days plowing their way through IKM, because there are no doubt some experiential gems hidden away in the verbiage.
* Introduction to Knowledge Management. Wang, Hjelmervik and Bremdal. Tapir Academic Press. ISBN 82-519-1660-7.
Petris Technology has added collaborative project management to its Winds web-based Enterprise Application Integration platform. Plan-IT underpins project management by organizing and analyzing schedules, tasks, deadlines and resources. Project goals, milestones and timelines can be defined. Team members can upload documents for centralized storage and shared access. Petris CEO Jim Pritchett said, “The value of information is often significantly increased when shared. With reciprocal online relationships in secure, shared workspaces, Plan-IT users are able to act on and react to situations that could not have been dealt with independently. It’s a competitive advantage that can only be realized with this technology.” More from petris.com.
Pipeline and utility companies use UTSI International Corp.’s UTSeyes software to provide remote users with web-based access to data in the corporate database. Staff and customers can interactively request data from corporate database servers over the web. The product comprises a server on the corporate host, and a self-downloading Active-X browser plug-in. Permissions can be issued to individuals or groups to control data access. Data is streamed to users for local processing and display. Users can configure data points and time periods as desired - removing the need for periodic reports. Data is constantly available on-demand. UTSeyes is licensed by the server, not the number of clients, making for cost-effective deployment. More from utsi.com.
Halliburton Energy Services (HES) is ready to comply with the American Petroleum Institute’s (API) Recommended Practice 3901, the e-commerce transaction standard. The API ComProServ task group has been disbanded with all its objectives completed. The resulting spec is API RP 3901, Parts 1, 2, 3 and 4. The XML-based specification was developed to underpin e-business transactions involving complex products and services. The spec sets out to minimize the duplication of effort in paper transactions between service companies and operators. Among the 12 transactions included in the standards are requests for quotation, order creation, field tickets and invoices. HES has already completed two pilot programs using the new standards.
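The transaction names above (request for quotation, order creation, and so on) come from the article; the sketch below merely illustrates what assembling such an XML transaction might look like. The element names and structure are invented for illustration – the actual API RP 3901 schemas are not reproduced here.

```python
# Hypothetical sketch of building an RP 3901-style XML transaction.
# Element names are illustrative assumptions, not the real schema.

import xml.etree.ElementTree as ET

def build_rfq(buyer, supplier, items):
    """Assemble a hypothetical request-for-quotation document."""
    rfq = ET.Element("RequestForQuotation")
    ET.SubElement(rfq, "Buyer").text = buyer
    ET.SubElement(rfq, "Supplier").text = supplier
    lines = ET.SubElement(rfq, "Lines")
    for code, qty in items:
        line = ET.SubElement(lines, "Line")
        ET.SubElement(line, "ServiceCode").text = code
        ET.SubElement(line, "Quantity").text = str(qty)
    return ET.tostring(rfq, encoding="unicode")
```

The appeal of an XML spec like this is that both operator and service company can validate and process the same document automatically, removing the re-keying that paper transactions entail.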
Petris Technology and Theta Enterprises are offering four new production engineering applications via the Internet. These products will be offered on a monthly subscription basis via Petris’ Winds Now! Application Services Portal (ASP). The Theta products to be supplied in ASP mode are RodStar, RodDiag, XDiag and CBalance. According to Theta, these tools are the artificial lift industry’s leading rod pumping design, optimization and analysis software tools.
Theta president John Svinos said, “Being able to run these applications from anywhere is very convenient for the user. Now anyone can analyze their rod pumping wells, redesign them to fix problems, or balance them, no matter where they are.”
Operators can now access these engineering applications with only one contract. Petris claims that ASP helps producers reduce internal support requirements, extends the life of existing hardware and reduces capex. Petris Winds Now! can be provided via the Internet or a similar solution can be offered from within a company’s firewall.
Petris COO Pat Herbert added, “Theta’s partnership greatly enhances the solutions offered by Petris Technology to its clients. Theta Enterprises is a good complement to our growing body of software and services offered to the oil and gas industry. This specialized software will benefit those in the production engineering segment of the industry by helping them perform with greater efficiency and lower costs.” More from gotheta.com and petris.com.
Wood Mackenzie has just announced its Russia Pathfinder database and mapping tool for what is described as ‘the world’s most complex upstream industry’. The Russia Pathfinder supplements the asset and regional descriptions and analysis in Wood Mackenzie’s EnergyVision and Global Economic Model products and provides clients with direct access to Wood Mackenzie’s regional database.
Russia PathFinder contains field, licensing data and oil and gas reserves for 310 fields along with the locations of over 3,000 fields and discoveries. Production statistics and forecasts are supplied for 195 oil and gas fields. The graphical interface also includes all of Russia’s key trunk pipeline systems and related terminals and refineries.
VoxelVision’s GigaViz 1.0 has been successfully stress-tested by Norsk Hydro. Hydro used the high performance, cluster-based visualization software to investigate complex shallow geology near the gigantic Storegga slide in the Haltenbanken area. The Storegga slide took place about 8,000 years ago and left a headwall close to 300 km long at the shelf break.
A 2,300 by 2,650 line 3D survey was loaded on a VoxelVision 16-CPU ‘VoxBox’ cluster in Trondheim. Within a few hours, four horizons in complicated shallow geology were mapped and displayed. The horizons were then transferred to the interactive fly-through polygon viewer, revealing new clues to the origins of the Storegga slide.
In another test, the Linux software was successfully used to view a data volume of a staggering 135 GB. GigaViz offers interactive tracking, attribute analysis and AVO. An OpenSpirit link to other environments is available.
UK-based consulting firm Allomax has formed a new division to market its software. Allomax’s systems deal with risk management, costing and economic evaluation, competence assurance and project knowledge management.
Allomax clients include BP, Shell, ExxonMobil, Global Marine, Halliburton, Kerr McGee, Amerada Hess and Lasmo ENI. Allomax products include Casmax, a competence assurance system which maps the job descriptions of vacant positions against individuals’ qualifications, and Wellmax, a knowledge and project management system. Lessons learned during a project are stored in a Microsoft Access database. Procedures ensure that documents and best practices are kept up to date. A ‘process map’ is used to graphically link all aspects in a user-friendly environment.
Allomax has just signed a world-wide agreement to supply its Casmax competence assurance software to Shell International. More from allomax.com.
Enterprise Oil, now a wholly owned subsidiary of Shell, has had an active KM program in place since 1998. Mark Sawyers described Enterprise’s five communities of practice (COP) active in turbidites, fractured reservoirs, rock physics, salt and deep water. Each COP has 30-40 members, holds workshops and teleconferences and maintains dedicated websites. Enterprise uses Adept’s Connect (originally developed for BP) as an expert directory and yellow pages registry of skills and other personal information. For Enterprise, KM is more than ‘just IT’ and encompasses regular conferences, departmental ‘away-day’ retreats and inter-business conferences. Other Enterprise KM initiatives include the ‘Peer Group Assist’ process for structured workflow-based decision making and ‘Making Better Decisions’ for after-action review. A ‘Post Well Evaluation Conference’ is held to avoid inconsistencies in project risking. An intranet site captures each event into a Lessons Learned database. Another project ‘Influencing the Operator’ (ITO) was initiated to avoid the confrontational nature of the Technical Committee Meeting. ITO attempts to build an ongoing relationship with the operator throughout a project’s life span.
David Briggs from Performing Teams UK Ltd. has been applying management guru Richard Pascale’s ideas. Pascale says we need to “Mobilize people to self-discover the need to change: personal interest energizes change.” Briggs has worked with BP on a Community of Practice supporting polythene manufacture in BP’s European refineries. This effort suffered from a ‘not invented here’ syndrome. To implicate individuals more, BP tried the ‘manufacturing game’, involving teams working to experiment with different ways of doing business. In the upstream, this has helped BP ‘discover better ways of bringing in a well.’ The key message is to create an environment where ‘learning is risk-free and the feedback loop between actions and results is transparent’. Briggs observed that ‘informal communities that share tacit knowledge usually outlive and out-perform those which are brought together formally, even when separated geographically.’
Rick Jeffrey (Knowledge Leverage Inc.) was until recently KM advisor to PanCanadian (now EnCana). PanCanadian’s approach was to ask before every project: has this been done before? Where and when? And what was learned? PanCanadian’s KM guru is Larry Prusak, director of IBM’s Institute for Knowledge-Based Organizations. PanCanadian found that around 40% of time was spent on non value-added activities associated with poor KM, which translated into excess costs of $125 million per year. During one gas field’s transfer between business units, exploration information was lost. The new team spent about 6 months redoing G&G. The delay getting the gas to market cost an estimated $15-25 million in lost cash flow. KM started with the Manager’s Online Community offering web access to information such as change management, member profiles, a discussion forum etc. Other communities such as Offshore Drilling, Project Management, IT Fluency, Aboriginal Issues, High Risk AVO etc. followed. Story-telling was introduced with help from IBM’s Dave Snowden.
Helen Gilman (SAIC Consulting) described how KM could help companies deliver growth through merger and acquisition (M&A). Gilman warned that M&A is inherently risky. Mergers and acquisitions ‘tend not to work, between 50 and 70% of deals fail to deliver the anticipated value’. To maximize the chances of success, Gilman suggests tapping into knowledge of previous deals and to ‘embed KM into the deal.’ Challenges for knowledge managers involved in such activity, are different cultures (petrochemical vs. E&P), poor continuity in M&A teams – individuals tend to work only on one project and the intermittent nature of M&A activity. KM can help by accessing information captured in previous deals. The KM effort should focus on understanding one’s track record of delivering synergy targets, and on investigating pitfalls in due diligence.
Comment – Gilman’s advocacy of building a KM database of M&A activity is laudable but hardly within the capacity of most oil and gas companies, who ‘do’ M&A comparatively infrequently. Such information may be gatherable by a bank or other advisor – or perhaps by a consultant like SAIC. Congratulations to Gilman for the softest sell at the show!
Alf Michaelsen (Marathon) is developing a broad-based document and KM service throughout the company. The document management spans traditional functions like drafting and the library, but extends to new technologies – web design and the portal. Marathon’s legacy DMS dated from 1983. It was very reliable, but used only by specialists. Marathon elected to move from its legacy system to a modern EDMS – with infrastructure to link its offshore, UK and Eire locations to its US headquarters. The system has been built around Novasoft’s Cimage. Cimage Oil & Gas now supports over 350 users in eight locations and stores over 120,000 invoices, 700,000 drawings and 15 million document pages. The installation has resulted in an 80% reduction in information distributed in paper form (equivalent to 150,000 pages per month).
Gary Moucha is a member of Hess’ IT Advanced Technology team with responsibility for the ‘I-laboratory’ which develops proof of concept projects in conjunction with IS and management. Hess’ guru is Ian Scott of the London Business School. Scott’s work with BP and SAIC led him to believe that ‘finding who knows is more important than capturing the knowledge’. Hess translates this into practice with a global, standard search engine. KM concepts are expanding into HR, IT, refining and finance. Projects include an examination of down hole failures in the West Texas Seminole field, reducing rig move costs in Algeria and HP/HT drilling cost reduction in Norway. KM is enabled by a world-wide intranet and search engine, the Hess Connect people finder and discussion groups and knowledge asset templates.
Shell E&P Co.
Gayle Holtzinger (Shell E&P Co.) is providing Shell’s US sub with a ‘connected’ knowledge transfer (KT) environment. Connections can be in any direction between people, the organization and its best practices. KT happens at conferences and learning fairs – where they generated ‘a lot of electricity in the air’ and became a showcase for Shell’s successes. Shell currently has various ‘networks’ – communities – for topics like SAP plant maintenance, asset life-cycle documentation, HR, petrophysics, drilling and producing operations. The latter has developed a KM methodology called Practiced Excellence through Accelerated Replication (PEARL). Community members submit successful practices to a local focal point for input. An administrator selects the PEARLs, which are communicated to other foci. Upon adoption, a PEARL is deployed and usage monitored. Holtzinger commented on cultural differences. E&P folks are keen to act if something appears worth doing. Production is more deferential and asks for management approval first! Shell’s tools of the trade include NetMeeting and OpenText’s Livelink. Technology can remove time and distance barriers. Electronic KM needs dedicated full time support. Face-to-face interaction remains crucial.
Anne Kleppe is an ‘e-collaboration professional’ with Statoil. E-collaboration happens when two or more people make an IT-supported decision they would not have arrived at without the technology. Statoil is rolling out a full e-collaboration solution in 2002. The core component is content management – of documents and images. This integrates a KM spectrum of content management, collaboration and portal technology. 2002 will see a basic content management solution with secure collaboration through the portal. In 2003, an internet-based workspace solution with personalized information access will be added. 2004 will see learning on demand and wireless anywhere. Part of the program involves the re-establishment of a central service provider. Statoil’s inspiration for its collaborative effort comes in part from a paper by The Mitre Corp.’s Pam Dargan (see page 3 of this issue of Oil IT Journal for a synopsis of Dargan’s views).
For Tom Henriksen (Norsk Hydro), KM involves a collaborative approach to capture and use of enterprise information assets. These include databases, documents and most importantly, the un-captured tacit expertise and experience of individuals. Hydro’s guru is Ikujiro Nonaka of the Japan Advanced Institute of Science and Technology. Nonaka has developed a methodology for moving information around in the tacit/explicit knowledge space. Hydro is implementing these ideas with its e-collaboration – remote access infrastructure.
One facet of this is that Henriksen can use the TV in a hotel to receive his email – at ‘secure web hotels’. Henriksen’s presentation went way beyond KM to outline Hydro’s complex IT infrastructure. Hydro’s trading is migrating to a unified Enterprise Application Integration (EAI) – based system. Elsewhere the ‘overriding paradigm’ is the portal which targets specific groups of users, and has clearly identified objectives and business cases. Hydro deploys IM tools including Autonomy, Corporum, Lotus Domino, Documentum, NetMeeting, QuickPlace, Symphoni and Interwoven’s TeamSite. Chairman Reid Smith wound up the proceedings by putting K-managers in their place – “you wanted to be a rock star, but you ended up a roadie”.
This article is abstracted from a 10 page report on the IQPC Advanced KM in Oil and Gas Conference produced as part of The Data Room’s Technology Watch Service. For information on this service send mail to email@example.com
Oil ITJ – What is the scope of Shell’s Oracle GIS implementation?
JdLL – Almost entirely upstream. We have consulted for retail, but the real focus is the 35,000-strong Shell global E&P sector. Currently, the Oracle Spatial (OS) solution has been, or is being, deployed in Brunei Shell, PDO Oman and South America, and is extending to Houston and Australia.
Oil ITJ – What data types and extent will the Shell GIS offer?
JdLL – The vision is of a global, federated database. A pragmatic decision needs to be made as to what goes in and what the business units keep to themselves. We draw a line. Above the line, everything is pushed out to the global database. Below the line, sensitive information is kept locally and will not be visible at the global scale. In the North Sea, Shell Expro UK, Expro Norway and NAM are sharing data ‘above the line’, which is merged into one map view at Rijswijk by SIEP – along with its own and third-party global datasets from companies like Robertson Research and IHS Energy. This provides a synthetic view of the whole NW EU cluster. Anyone interested in regional plays, prospects or pipelines has a seamless view, avoiding trying to ‘manage’ data across assets.
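The ‘above/below the line’ rule can be sketched in a few lines of Python. This is a toy illustration of the partition-and-merge idea only – the record fields, the ‘restricted’ tag and the function names are our own illustrative assumptions, not Shell’s actual schema.

```python
# Toy sketch of the 'above/below the line' federation rule: each business
# unit splits its records into a shared tier (pushed to the global view)
# and a sensitive tier (kept locally). All names are hypothetical.

def publish_to_global(unit_records):
    """Partition a unit's records into (shared, local) by sensitivity."""
    shared = [r for r in unit_records if r["sensitivity"] != "restricted"]
    local = [r for r in unit_records if r["sensitivity"] == "restricted"]
    return shared, local

def merge_global_view(*unit_shares):
    """Merge each unit's shared records into one seamless regional view."""
    view = []
    for share in unit_shares:
        view.extend(share)
    return view

# Example: two units contribute 'above the line' data to one map view,
# while sensitive prospects stay local to the originating unit.
expro_uk = [{"name": "pipeline-A", "sensitivity": "open"},
            {"name": "prospect-X", "sensitivity": "restricted"}]
nam = [{"name": "well-B", "sensitivity": "open"}]
shared_uk, local_uk = publish_to_global(expro_uk)
shared_nam, _ = publish_to_global(nam)
regional_view = merge_global_view(shared_uk, shared_nam)
```

The point of the sketch is that the merge step only ever sees what each unit has already chosen to publish, so the regional view is seamless without exposing below-the-line data.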
Oil ITJ – It sounds like you must be confronted by multiple problems of nomenclature and data standards?
JdLL – Shell has always had standards for surface features, topography, wells etc. We are currently working on formations and plays, so that geologists will be able to populate the subsurface data model as consistently and easily as has been done for surface features. The common model naming conventions are grouped into the Shared Reference Library.
Oil ITJ – How do you go about enforcing procedures and standards?
JdLL – Data management is not ‘sexy’. But data, especially spatial, is interesting and our people appreciate the tools we are providing to organize spatial data properly. Our business units recognize the value of regional studies where standards really pay off. We all believe in the value of the information in our archives. In Rijswijk, our Basin Field Evaluation teams, who draft well proposals, are now building a GIS-enabled library of projects to look into analogues. As our ex-Exploration Head Roel Murris once remarked, “We probably have more oil in our archives than we’ve ever actually found!”
Oil ITJ – What key technologies are deployed in the Shell GIS?
JdLL – We have replaced our legacy GIS with the security and functionality of the RDBMS. ESRI and Oracle work together to support all ESRI products on Oracle 9i. Shell deploys nearly all of ESRI’s products. The casual user gets ArcIMS data fed from OS. The analyst will have ArcGIS on the desktop, while the data manager will likely have the full ArcInfo suite. We have some issues reading and writing to OS, and we suffer from (and have learned to live with) non-synchronized six-monthly release cycles for Oracle and ESRI products. OS has been deployed and is managed with ESRI tools. Locally, users can set up a GeoDatabase with SDE, which is a very flexible solution.
Speaking at Oracle World in Denmark last month, Oracle CEO Larry Ellison spent a lot of time bashing the opposition. Ellison says IBM’s mainframes are in for stiff competition from clustered Intel machines running Red Hat Linux. Clusters are ‘inherently robust,’ making fault tolerant computers out of commodity components.
Ellison was particularly scathing about the Windows/Unix implementation of IBM’s DB2 database which ‘doesn’t share a single line of code, [with the flagship mainframe version] and was developed by different teams in different countries.’
For Ellison, the next step is to “put everything inside the Oracle database - email, word documents, spreadsheets. So that when you search for say, ‘Shell Oil,’ you retrieve all the relevant documents”. According to Ellison, Microsoft is moving its email server into a database but, ‘This will take them several years - we have already done it’. The same goes for the Windows file system. You can already store Word documents, Excel spreadsheets, PowerPoints and OLAP reports in Oracle. For Oracle (and seemingly Shell Oil), all and any data belongs in a database.
Web Services - hype
Ellison was skeptical about the importance of web services, which ‘as we know are supposed to solve all known IT problems’. Except, that is, for the ‘ludicrous notion that web services will let you connect applications’. Web services is ‘just a standard protocol, a modern version of IP – that’s all!’ Ellison offered an analogy. The notion that web services is the lingua franca of IT is like suggesting that a cell phone will help you call France. “Web services is a great tool but it won’t make you speak French.”
CGG has added fractured reservoir characterization to its ‘Vista’ seismic processing modules. FracVista leverages CGG’s experience of preserved amplitude, 3D, 4D and converted wave processing to provide quantitative information such as direction and intensity of sub-seismic scale fractures for both carbonate and clastic reservoirs. FracVista combines inversion with proprietary geostatistical techniques to estimate the azimuthal variations observed during modern, high-fold, wide-azimuth P-wave seismic acquisition. By integrating such information into the regional stress framework, a better understanding of the impact of fractures on field production can be obtained. FracVista is proving popular in the fractured reservoirs of the Middle East.
Welcome to the 2002 Oil IT Journal CD-ROM Archive. This year, the acclaimed Archive contains nearly half a million words of focused reporting on upstream IT including -
* 1526 articles, the complete Oil IT Journal reference since 1996
* References to over 1300 companies
* References to 1300 individuals active in oil and gas information technology
* References to 1600 commercial products
* Over 50 MB of sponsor-contributed white papers, commercial presentations and video.
The Archive should auto-run once inserted into your CD-ROM drive. If not, double-click the file index.htm in the home directory of the CD. More information and support can be found in the readme.txt file on the CD.
Once you’ve tried the CD-ROM Archive you will realize the benefits of having Oil IT Journal online. Email firstname.lastname@example.org or visit www.oilit.com for details of how to upgrade this single user Archive into a site license for your organization - along with monthly updates.
Thanks to our Sponsors
The 2002 Oil IT Journal CD Archive was made possible with generous help from our Sponsors:
Foster Findlay Associates
LSI Logic Storage Systems
Schlumberger Information Solutions
Venture Information Management
Please check out the sponsor-contributed material on the CD and follow the links to the Sponsors websites for more information on their products and services.
As revealed in Oil IT Journal last month, Denver-based Exprodat Technology (ET) has sold its Web Archive Manager (WAM) to Landmark Graphics Corp. Under the terms of the agreement, Exprodat Technology will continue to develop and maintain the product for a minimum of three years, with marketing and sales taken over by Landmark with immediate effect.
ET president Bruce Rodney said, “This deal reaffirms our strategy of US-based technology innovation. We work with our clients, in this case Conoco Inc., to incubate concepts into products that are created market-ready.” WAM - a.k.a The ‘Archiver’ is a browser-based application for archiving E&P data, including components from multiple applications, data stores and documents. The software was developed in conjunction with Conoco for use worldwide.
Rodney continued, “Our information management vision is simple. Present the user with a map displaying both active and archived projects. Click on an active project to browse live data, click on an archived project to browse a snapshot of that project. Click on ‘search’ to find stuff, no matter where it’s located.”
WAM creates the physical archive and a permanently online database of archived objects, as well as a project ‘stub’ in Web format. Components can be browsed, documented and searched, even when a project is offline. WAM is the second in a series of web-based tools developed by ET. The ‘Browser,’ for online project data browsing, was sold to Landmark Graphics in 2001 and is marketed as ‘WOW’. ET is now working on the ‘Administrator’ for web-based system and project data administration.
Marathon Oil Company is implementing Wellogix’ WorkFlow Navigator to improve technical and commercial communication with well stimulation service providers.
Marathon’s Phil Snider said, “We are focusing on stimulation and fracturing because of the impact they have on well performance. Wellogix’ software ties together different aspects of the drilling and completion process and has become a significant aid to our knowledge workers.” Marathon anticipates that Wellogix’ web-based software will improve engineers’ effectiveness by providing a platform for global knowledge sharing and standardization. Initial deployment will be in Marathon’s Canadian operations.
Wellogix has also announced a new release of WorkFlow Navigator. V 6.0 includes planning and communication capabilities and real-time decision support for the specification of technical requirements for complex oilfield services. New features in version 6.0 provide higher information and decision visibility, versioning capabilities to assist users in tracking project progress, enhanced well planning capabilities, new collaboration features and new project management tools including the ability to transfer project ownership.
Wellogix co-CEO Jeff Livesay said, “WorkFlow Navigator 6.0 offers enhanced visibility and control over multiple internal and external team members and organizations, enabling significant improvement in our clients’ ability to plan and execute quickly and effectively.”
The Houston-based Information Store (I-Store) is looking for partners to help market PetroTrek, its upstream asset and data management software. I-Store CEO Barry Irani told Oil IT Journal that although its UpstreamInfo (UI) venture with EDS and Raytheon had failed, the PetroTrek suite was not at fault. Irani said, “Industry was just not ready for outsourcing its data management. UpstreamInfo was ahead of its time – even though the members appreciated the I-Store technology.”
According to Irani, while the UI experience showed that there was resistance to putting data outside the corporate firewall, the technology for sharing information with partners – dubbed ‘JV Solutions’ – proved popular. PetroTrek was leveraged to expose, in a secure and controlled manner, data stored in corporate databases. Now that I-Store has recovered all IPR on technology developed for UI, the company is seeking alliance partners to market PetroTrek. The PetroTrek concept is simple – leave data in situ and manage access through entitlements and web-based technologies. I-Store has been riding the internet wave since its initial vision in 1994 – Irani believes that this was a ‘good call’ at the time.
PetroTrek works with Finder, OpenWorks and other data stores. The philosophy is to work with, rather than displace, legacy systems. Caching techniques assure good response even for large data sets. Entitlement controls are sophisticated enough to limit user access to data from different reservoirs within a well section. PetroTrek includes a mechanism for synchronizing data in disparate databases. Irani claims that PetroTrek is very easy to deploy and learn, citing a 30-minute learning curve for end users.
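The leave-data-in-situ, entitlement-based model can be sketched as follows. This is a generic illustration of per-reservoir entitlements of the kind described above – the class, method and field names are our own assumptions, not PetroTrek’s actual API.

```python
# Minimal sketch of entitlement-based access: data stays in situ, and a
# per-user grant set decides which reservoir intervals of a well section
# a user may see. All names here are hypothetical illustrations.

class EntitlementStore:
    def __init__(self):
        self._grants = {}  # user -> set of (well, reservoir) keys

    def grant(self, user, well, reservoir):
        """Entitle a user to one reservoir within one well."""
        self._grants.setdefault(user, set()).add((well, reservoir))

    def visible_intervals(self, user, well_section):
        """Filter a well section down to the intervals the user may see."""
        allowed = self._grants.get(user, set())
        return [iv for iv in well_section["intervals"]
                if (well_section["well"], iv["reservoir"]) in allowed]

# Example: one user is entitled to only one of two reservoirs in a well.
store = EntitlementStore()
store.grant("alice", "W-1", "Brent")
section = {"well": "W-1", "intervals": [
    {"reservoir": "Brent", "top_m": 2500},
    {"reservoir": "Statfjord", "top_m": 2900}]}
```

The design choice this illustrates is that the source data is never copied or redacted – the entitlement check is applied at read time, which is what makes it possible to leave data where it lives.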
C & C Reservoirs
Jon Roseway has been appointed senior research geoscientist for C & C Reservoirs’ Geoscience Group in London. Roseway was previously with Amerada Hess.
John Knowles and Chris McCarthy have joined the board of DPTS Storage.
Andreas Ehinger has been named Director of the French Petroleum Institute’s Geophysics research division in an internal promotion.
Paras Consulting Inc.
Flemming Rolle has been named President of Paras Consulting Inc. and will head up the new Houston office.
Pam Koscinski is to become VP Data Acquisition with PennEnergy Data. Koscinski was previously with IHS Energy.
Maryann Stack has joined PetroWEB to work on the PennEnergy Data project (see Oil IT Journal Vol. 7 N° 5). Stack was previously with IHS Energy.
Sandy Esslemont has been appointed Chief Executive Officer of Roxar ASA. Esslemont was previously COO.
John B. Gibson has joined Veritas DGC as director of emerging technology. Gibson was previously manager of geophysical operations with Union Pacific Resources.
Stacy Kasper has been promoted to Director of Human Resources and Recruiting at ZettaWorks. Kasper came to ZettaWorks as a Corporate Recruiter from Smith & Associates.
Norwegian Technoguide, developer of the Petrel software suite, is changing its name and re-branding Petrel to reflect its expanded scope and vision. The Technoguide name will be phased out over the next year and the company renamed Petrel Workflow.
At the same time, the software previously known as Petrel will be rebranded Petrel Workflow Tools. Petrel Workflow Tools 2002 (PWT 2002) was announced recently and is said to be Petrel’s largest release yet, completing a two-year period of development focused on ‘giving users the ability to model a complete project in one single application’.
PWT 2002 now lets geoscientists interpret 2D and 3D seismic data, both in a conventional 2D interpretation window and in 3D space. PWT 2002 also includes ‘ground-breaking’ technology such as an intuitive process manager with predefined processes, and batch mapping functionality that allows users to quickly update models when new data becomes available. 3D autotracking is said to be ‘incredibly fast and stable.’
Petrel product manager Paul Hovdenak said, “With this release of Petrel, geoscientists can visualize and integrate all relevant data in 3-D space. There is no more important task for the production geoscientist than to quantify ideas. Petrel provides the common thread through which the sub-surface team expresses itself. Clients tell us that projects that used to take six months can now be completed in only three weeks.”
UK-based software house Midland Valley Exploration has a new major release of its flagship geological modeling package 3D Move scheduled for August. 3D Move V4.0 introduces dynamic structural modeling, visualization of prospect evolution and basin modeling from within the structural model.
The software aims to provide a structural model and risk profile which can be integrated into the decision-making process. Interactive modeling of basin framework and evolution is said to provide geoscientists with insight into reservoir development and sourcing.
The latest release targets model-building speed improvements and data integration and re-use. Two new modules have been added to the software for fracture analysis and petroleum system modeling.
Canned workflows are provided to guide users through typical tectonic environments. Model output is scale-independent and can be used equally for basin analysis or reservoir analysis. A new structural hub tool offers fracture generation and analysis, charge and reservoir extent risking, overlap analysis and enhanced GIS integration and visualization. More from mve.com.
Landmark has just opened its new Executive Briefing Center (EBC) as part of the celebrations of the company’s 20th anniversary. The EBC, co-located with the new Asset Management Center at Landmark’s Houston headquarters, will provide E&P company execs with state of the art demonstrations of Landmark’s software prowess.
During the EBC housewarming executives from some of the largest E&P companies as well as Landmark’s top technology partners were treated to a Landmark technofest. Landmark president Andy Lane recalled the early days, “Since the company’s four founders revolutionized the oil and gas industry with an affordable 3-D seismic workstation 20 years ago, Landmark has delivered breakthrough software products and services that have increased oil findings and production for our worldwide customer base. We are leading the evolution of oilfield asset management and operations software into real-time, remotely accessible systems. Innovation is at the forefront of everything we do, and our new Centers are flagship facilities to show how new technologies are changing the way upstream E&P companies do business today. The EBC combines advanced teleconferencing and presentation facilities with a leading-edge visualization lab in which we can demonstrate the full power of our systems to our customers, in a realistic, 3-D environment.”
Landmark VP Eric Johnson told Oil IT Journal how the EBC had been kitted out with top-flight technology from partners such as IBM, Sun and SGI. The jewel in the EBC’s crown is the high-end visualization center. Here Trimension’s new edge blending software ‘Scorpion’ is used to drive triple Barco DLP projectors. This allows the output from three NT graphics cards to be displayed with unprecedented clarity onto the curved hard-wall screen. Other systems available in the EBC computer lab include a top-of-the-range SGI Onyx, which was used to good effect by Magic Earth president Mike Zeitlin to demo GeoProbe’s latest bells and whistles.
Halliburton and Landmark also recently introduced a new Asset Performance Consulting service (APC), offering technical and management consulting to improve the performance of assets, asset teams and E&P portfolios. APC consulting services span the oilfield lifecycle and include exploration, development, production enhancement, divestment and abandonment. The APC leverages Halliburton’s best practices such as exploration play and prospect generation, development scenario optimization and mature field optimization.
ChevronTexaco is to deploy Digital Fountain’s technology to speed data transfer from its remote sites. The primary application will be transferring 1-5 GB data files from platforms and onshore drilling sites in an effort to speed up the interpretative process. Such data was previously transferred to tape and sent by courier services. Bandwidth limitations also led to the use of in-country professionals to perform analysis at considerable cost. Digital Fountain’s (DF) technology claims to optimize the use of existing bandwidth which, according to DF, is ‘saddled with the performance issues associated with TCP where high latency and loss are major issues’. DF’s flagship product Transporter Fountain is described as a ‘no compromise’ data delivery product that provides speed without sacrificing reliability. Digital Fountain claims that its solution ‘outperforms any standard FTP server by orders of magnitude*’.
DF leverages what is described as its ‘revolutionary’ Meta-Content technology whereby data packets do not need to be received sequentially. Meta-Content encodes data into a series of equations from which ‘an unlimited number of users can reconstruct a perfect copy of the original content’. Meta-Content is claimed to eliminate the need for retransmissions or receipt acknowledgement and makes data loss irrelevant. Transporter Fountain provides a drag and drop content management interface with support for all major attached storage file sharing systems, including NFS, CIFS, and SMB. The Fountain is available in two chassis sizes: a 1 RU Fountain Server with 1.5 Gbps output and 5 GB of storage, and a 2 RU Fountain Server with 1.5 Gbps output and 85 GB.
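The ‘series of equations’ description matches the general idea of rateless (fountain) erasure codes, where each packet is an XOR of a subset of source blocks and any sufficiently large set of packets, in any order, lets the receiver rebuild the file. The toy sketch below illustrates that principle with a peeling decoder; it is a generic LT-style illustration under our own assumptions, not Digital Fountain’s proprietary Meta-Content algorithm.

```python
# Toy fountain-code sketch: packets are XORs of source-block subsets, so
# order and loss don't matter - the receiver just needs enough packets.
import random

def xor_bytes(a, b):
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def encode_packet(indices, blocks):
    """An encoded packet is the XOR of a chosen subset of source blocks."""
    payload = bytes(len(blocks[0]))
    for i in indices:
        payload = xor_bytes(payload, blocks[i])
    return set(indices), payload

def random_packet(blocks, rng):
    """A rateless encoder draws packets of random degree, endlessly."""
    degree = rng.randint(1, len(blocks))
    return encode_packet(rng.sample(range(len(blocks)), degree), blocks)

def peel_decode(packets, k):
    """Recover the k source blocks by 'peeling' degree-1 packets."""
    packets = [(set(idx), payload) for idx, payload in packets]
    recovered = {}
    progress = True
    while progress and len(recovered) < k:
        progress = False
        for idx, payload in packets:
            if len(idx) == 1 and next(iter(idx)) not in recovered:
                recovered[next(iter(idx))] = payload
                progress = True
        # substitute every recovered block out of the remaining packets
        reduced = []
        for idx, payload in packets:
            for i in list(idx):
                if i in recovered and len(idx) > 1:
                    idx = idx - {i}
                    payload = xor_bytes(payload, recovered[i])
            reduced.append((idx, payload))
        packets = reduced
    return recovered
```

This also shows why acknowledgements become unnecessary: a lost packet is never retransmitted, the receiver simply consumes the next packet off the stream until decoding completes.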
Transmission rates can be adjusted to arbitrate and prioritize individual data transfers. ‘Congestion control’ is built in to the Transporter Fountain meaning that the administrator is in control of the rate of transfer, not the network conditions or the intrinsic limitations of a transfer protocol.
*Editor’s note – the ‘orders of magnitude’ claim would appear somewhat extravagant, our vanilla FTP exchanges from France to the US usually take place at speeds approaching the nominal bandwidth of our ADSL line.