April 1998


PI pulls plug on POSC - backs rival PPDM data model. (April 1998)

Charles Ivey, CEO of PI(Dwights) and Petroconsultants, announced that they will no longer support the Petrotechnical Open Software Corporation.

In a dramatic announcement at the Public Petroleum Data Model (PPDM) Association's Houston Member Meeting, Charles Ivey, CEO of PI(Dwights)/Petroconsultants, gave enthusiastic support to the PPDM data model and at the same time announced that PI would not continue its membership of the rival Petrotechnical Open Software Corporation (POSC). Ivey's position on data management technologies was set out in an extensive interview in PDM Vol 3 N° 1, in which he explained PI's philosophy of providing "Just in Time" data as opposed to the more theoretical models and systems developed by POSC, which he categorizes as offering "Just in Case" solutions. Ivey's attack on the POSC Epicentre data model was two-pronged: Epicentre, according to Ivey, is too complicated for common mortals, and cannot be engineered for performance.

Quixotic

Few would disagree with the criticism that Epicentre - with its highly normalized myriad of tables and relationships, described in the esoteric Express data modeling language - is complex. Defenders of Epicentre would argue that this complexity is the price to pay for a detailed mapping of the real world. The performance issue, on the other hand, is of prime importance to PI(Dwights), particularly in the North American context. Monthly updates of PI(Dwights)' massive North American database stress the relational database to the limit. Ivey considers that the relatively straightforward approach to database design adopted by the PPDM Association helps them load this data in a timely manner.

Babel

Ivey's talk was subtitled "The quixotic search for data integration and management" and likened the POSC Epicentre database to the Tower of Babel, with complex mappings of every imaginable real-world entity. Ivey cited Freeman Dyson, who wrote recently in Wired Magazine, "Big projects are guaranteed to fail because you never have time to fix everything." PI's approach is therefore to build on the PPDM data model, which has been augmented with subject areas for horizontal drilling and production, and extended in the land/lease and seismic navigation areas. The PI data model is referred to as the PIDM model and is deployed in the P2000 data management product.

Iris21 outside scope

Petroconsultants' Iris21 data model will not, for the time being, be integrated into PI's PPDM-based strategy. Ivey acknowledged that the Iris21 data model is "too complicated and unfriendly to the end-user". But Iris21 is the only data model that can handle Petroconsultants' concession-based international data. First attempts at federating the North American PIDM data model with Iris21 will therefore be limited to the deployment of a PPDM-like wrapper around Iris21, which will enhance ease-of-use while preserving the whole of the Petroconsultants dataset.



The Matrix - new licensing deal from GeoQuest breaks the mould. (April 1998)

A new software license pricing structure, integrated into a $7 million, three-year deal with Lasmo, is set to change the way E&P software is sold.

A recurrent complaint from IT managers working with the high-cost software deployed in E&P shops is the lack of flexibility in license costs. If a user wanted occasional use of n licenses for a product, then by golly, they had to buy (and maintain) n licenses. London and Scottish Marine Oil (LASMO) - one of the largest independent exploration and production (E&P) companies in the world - have just announced a technology partnership with GeoQuest that sets out to revolutionize the way in which E&P software licenses are sold. The new deal is the "Matrix", a pay-as-you-go license which caters for the varying usage that typifies the E&P department. This new technology partnership calls for GeoQuest to provide LASMO's world-wide operations unlimited user access to a wide range of GeoQuest software for three years. "We think this is going to be an enormous benefit to LASMO by allowing us to standardize on applications software," said Hugh Banister, LASMO's E&P systems manager. "Database and application integration is key, and LASMO took a strategic decision to go with a single supplier to achieve this."

$7 million deal

The deal - described as a strategic supplier agreement - is for an initial three-year period and is worth $7 million. Software is planned to be installed at all LASMO offices by the end of the first quarter of this year. Before installation, LASMO will conduct site visits to evaluate the current information technology infrastructure, as well as local internal support for hardware, software and data management. "This partnership will help us better understand the needs of LASMO," said Jan Erik Johansson, Business Manager of GeoQuest Europe and the CIS. "As technology partners, we share the common goals of improving efficiency and reducing cycle time in the exploration process." Adds Banister, "Our partnership will let us be more flexible in delivering software, services and training to personnel. If we open a new office, all the software required by geoscientists to perform their duties can be in place the day it opens."

The Matrix

The Matrix License - a new concept in licensing from GeoQuest - uses the Flex License Manager (Flex LM) to meter the use of each product. The license payment is initially based on anticipated utilization, but extra licenses are available at all times to cope with overload. At the year end, Flex LM indicates the true usage and billing is adjusted accordingly. Existing client site licenses may be incorporated into the Matrix. Barry Taylor, Software Products Sales Manager with GeoQuest, told PDM "Our customers want more flexibility. They are sick of filling in capex forms for new purchases or having to pay on a per license basis. They want the ability to scale up and down as workload demands - hence our Matrix marketing concept." The GeoQuest software suite has been chosen as the standard for all LASMO offices, and LASMO will also benefit from the release of new software modules and technology under the agreement. "This is going to be an enormous benefit to LASMO by allowing us to standardize on log analysis, seismic interpretation and data management software," said Banister, who also told PDM "LASMO wishes to remove the inadequacies that have been introduced in the past by technology and administrative limitations".
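By way of illustration, the year-end true-up under such a metered scheme might be computed as in the following Python sketch. The products, rates and the simple pay-the-difference rule are assumptions made for illustration, not terms of the actual GeoQuest/LASMO agreement.

# Illustrative sketch only: the products, rates and the simple
# "pay the difference at year end" rule are assumptions, not the
# terms of the actual GeoQuest/LASMO Matrix agreement.

ANNUAL_RATE = {          # hypothetical cost per concurrent license per year
    "seismic_interp": 25_000,
    "log_analysis": 12_000,
    "data_mgmt": 8_000,
}

def yearly_true_up(anticipated, metered_peak):
    """Compare the up-front payment (based on anticipated concurrent usage)
    with the usage actually metered by the license manager, and return the
    adjustment to invoice (positive) or credit (negative)."""
    prepaid = sum(ANNUAL_RATE[p] * n for p, n in anticipated.items())
    actual = sum(ANNUAL_RATE[p] * n for p, n in metered_peak.items())
    return actual - prepaid

# Example: a department that under-estimated its seismic interpretation usage.
anticipated = {"seismic_interp": 10, "log_analysis": 5, "data_mgmt": 3}
metered_peak = {"seismic_interp": 14, "log_analysis": 4, "data_mgmt": 3}

adjustment = yearly_true_up(anticipated, metered_peak)
print(f"Year-end billing adjustment: ${adjustment:,}")   # $88,000 due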



PPDM, POSC Geoshare - how many standards do we need? (April 1998)

PDM's editor Neil McNaughton muses on recent developments in the data modeling arena and asks - is it important to the buyer?

This month's announcement from PI(Dwights) endorsing the PPDM data model over POSC's Epicentre is the latest in a protracted series of alternating skirmishes and rapprochements between the two organizations over the last few years. To recap for those of you who may have come late to this edifying spectacle, the story starts around 1991, when POSC, located in Houston (but with strong backing from European national oils), and the Canadian Public Petroleum Data Model Association set out independently to build the "mother of all data models". POSC got some forward-thinking purist data modelers on board, whereas PPDM was founded by commercial practitioners of data modeling, including Finder Graphics Corp. - later to be subsumed into the GeoQuest organization.

Middleware

PPDM focused on producing a working database, defined at the level of tables and columns, with real-world compromises both in terms of scope and de-normalization. POSC's approach, in contrast, can be categorized as an attempt to model the real world of E&P in as complete and uncompromising a manner as possible. Another important difference in concept was the way in which applications would use these databases to share data. In the case of PPDM, all applications were to access the data through direct SQL access, whereas POSC developed a more sophisticated scheme using a "middleware" layer - the Data Access and Exchange (DAE) layer - which was to allow better separation of application and data, better future-proofing and all sorts of other benefits.
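The contrast between the two access philosophies can be sketched in a few lines of Python. The table, columns and the tiny access-layer interface below are invented for illustration; this is neither the PPDM schema nor POSC's actual DAE middleware.

# Schematic contrast only: table names, columns and the access-layer
# interface are invented -- not the PPDM schema, not POSC's DAE.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE well (well_id TEXT, well_name TEXT, operator TEXT)")
conn.execute("INSERT INTO well VALUES ('W1', 'Alpha-1', 'Lasmo')")

# 1) Direct SQL (PPDM-style): the application is bound to the physical schema.
def wells_direct(operator):
    return conn.execute(
        "SELECT well_id, well_name FROM well WHERE operator = ?", (operator,)
    ).fetchall()

# 2) Access-layer (DAE-style): the application sees only logical entities;
#    the layer hides how (and where) wells are physically stored.
class WellAccessLayer:
    def __init__(self, connection):
        self._conn = connection
    def wells_by_operator(self, operator):
        rows = self._conn.execute(
            "SELECT well_id, well_name FROM well WHERE operator = ?", (operator,)
        ).fetchall()
        return [{"id": r[0], "name": r[1]} for r in rows]

print(wells_direct("Lasmo"))                             # a schema change breaks this caller
print(WellAccessLayer(conn).wells_by_operator("Lasmo"))  # a schema change is hidden here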

Expectations not met

The first observation on both of these efforts is that neither has met with the initial expectations. Despite vendor claims, there are no commercial products that use either PPDM or POSC in a way that could be said to correspond to the initial intent. Both models are used in different products - even in the same product - yet no "compliant" application can access data in another vendor's equally "compliant" database. There have been two major attempts over the years to bring POSC and PPDM together - to no avail - while the two major vendors appear to use the standards issue as a marketing football, with both GeoQuest and Landmark settling on POSC's Epicentre data model, but with neither offering interoperability with the other via a DAE.

The outsider

The other runner in this race is Geoshare, which proffers yet another data model. Geoshare has not had the benefit of as much promotion as POSC or even PPDM, but it is the de-facto standard for the exchange of data between E&P applications, and is therefore as near as we have got today to interoperability. Next month's PDM will report from the POSC and PPDM member meetings as well as the Geoshare user group, and we'll be giving a "state of the nation" analysis of data modeling and models - who is using them and for what. Meanwhile we invite you to read the contributed article by Mark Chidwick (Panther) and Bruce Rodney (Exprodat) for another view on where we are on the long and winding road to interoperability.

Who'll pay?

In the interim, PI's abandonment of POSC contrasts with the apparently enthusiastic claims for POSC compliance made by both Landmark and GeoQuest. It is hard to extract a straightforward editorial line from these developments. On the one hand, data gathering and use may mandate different models. On the other hand, how many standards can we manage? As a variant on my conclusion last month, "buy not build" is all very well, but if we end up with complex and hence costly systems, who is going to pay for their maintenance? The oils of course, in more ways than one.



GeoQuest's GeoForum 98 - sun, surf and … automation. (April 1998)

Francis Mons, Vice President of GeoQuest EAME, welcomed around 450 customer delegates from 79 oil companies, together with some 100 GeoQuest personnel, to the 1998 edition of GeoForum, GeoQuest's European technology forum. Mons mused that the sum of the production from all the oils present would exceed that of OPEC. Given that three of the best hotels in Cannes had been taken over for the event (with posted room rates of around $650 per day), the cost of the event was probably worth a few days' OPEC production too. PDM brings you the highlights in the articles that follow.



Keynote address from Thierry Pilenko, GeoQuest's new president. (April 1998)

Pilenko gave a top level view of where GeoQuest is going today. The target horizon for Pilenko's talk was 2005-2010, and the objective - 'Automated Reservoir Optimization'.

In the same breath, and following each mention of the word "automation", Pilenko stressed - as if to reassure the assembled mouse-clickers - that "We don't want to remove creativity, just to eliminate the tedious aspects of the interpretation and optimization process". Top-level talks between service companies and oils have led GeoQuest to set themselves a very high objective: in the above time-frame, the aim is to halve finding costs by upping discovery size, diminishing the number of dry holes and reducing drilling costs. The real challenge, Pilenko stated, is "to do this with existing staffing levels by maximizing user potential".

New paradigm

The new paradigm for reservoir optimization will involve more new players, starting with sister companies in the Schlumberger group. To date, GeoQuest has provided only a "small piece of the cake". To move to real time, new sensors and actuators will have to be developed and integrated into the software environment. Among the new players will be Schlumberger's customers and third-party providers. Offering access to the GeoQuest software environment has been a "major strategic decision". Open access will be provided through standards, allowing third parties to develop their own ideas on a common software platform which will integrate the new automated process environment. In this context, Pilenko affirmed that GeoFrame is already offering such services to third parties and is "100% POSC compliant". (Editor's note - for those of you who have escaped PDM's editorial line on POSC "compliance", we would encourage you to refer to PDM Vol 3 N° 2.)

Geology cube

An example of how all this will hang together is the move from today’s interpretation environment – still essentially the 3D two-way travel time seismic cube – to the near-term objective of the velocity cube and depth cube deliverables. Further down the road, interpreters will be working directly on the "geology cube" which will integrate the seismic-derived information with corporate knowledge utilizing all the interpretation skills available. This geology cube will also incorporate quantitative measures of the uncertainty in its spatial attributes and will be delivered with tools to manage and audit such uncertainty.

Economic space

Once the play types have been defined and associated with certain seismic attributes, these can be studied throughout what then becomes the "prospect cube"; as well paths are developed, and potentially modified during drilling, this in turn becomes the "planning cube". Real-time input to the planning cube will come from surface-mounted sensors recording information transmitted from the drill bit. This will provide continuous information gathering, incorporated into the model in real time and allowing for changes in the well trajectory. Another example of such real-time adjustment is the possibility that a field could be discovered and appraised in one operation. All this leads Pilenko to another geometrical environment - one where the interpreter navigates in "economic space" and where dollar values can be computed in real time for optimal drilling decisions such as dog-leg severity.

Automation key

Pilenko's top-level analysis of the way in which Schlumberger's product line has evolved over the years showed how the company began with what were essentially tools and services. The next phase in Schlumberger's development followed from this, as the ever-increasing data volumes generated by the tools needed managing - leading to the development of data management solutions, the xxDB product line and, of course, Finder. Then followed - or perhaps follows, because this is where we are today - the integration of all of this within the GeoFrame environment. The next phase, slotted for the first decade of the 21st century, will involve the automation of many of these processes and workflows. The business impact of each of these phases has been considerable, but Pilenko anticipates that it is the last, automation phase which will offer the largest return on investment. "Automation will be more important than anything we have done in the past," claimed Pilenko.



ProMIS an Automated Information Factory (April 1998)

GeoQuest's Vlad Skvortsov introduced the concept of the Automated Information Factory (AIF), which offers production engineers information from different sources and different disciplines, often stored at different locations, to different quality standards and versions, and perhaps integrating paper data.

Ideally the same data should serve for studies at various timescales - reservoir (long term); field management (monthly) and well management (daily). The AIF intends to merge all these scales of observation in a Pilenko-esque datastore centered on Finder, but integrating web access and the use of Oil Field Manager PC based end-use tools.

Finder is presented in the role of a data hub, with data sources and sinks such as real-time data, field data, paper-based data sources, third-party digital data, Office applications and analytical software. The AIF (aka ProMIS) wraps all this up into a single data source, with automatic data capture and loading. Skvortsov claims that today we spend 90% of the available time on data preparation and only 10% on analysis; tomorrow, ProMIS is set to reverse this ratio. These tools rely on the extendibility of Finder, using Oracle technology to constitute an application database. In other words, there will not be one massive database for all applications.



The SEM - revisited (April 1998)

Ian Bryant of Schlumberger Doll Research described how the new reality of the SEM (shared earth model) is being developed.

The basic problem is that the reservoir is unevenly and under-sampled. As an example, the area actually sampled by logs in an oilfield may be as little as 0.0001%. The impact of this depends on reservoir geometry: for flat-lying beds you may get away with relatively low sample density, but for a labyrinth-type reservoir this is unlikely to hold true. Current 3D models honor some of the data but introduce a new problem - that of an implicit confusion between real data and interpretation. There is a requirement to visualize what we know and where we know it, and what and where we don't know. Typically, costly processing and interpretation may be performed on some datasets such as well logs or 3D seismics, but neither high-resolution log information nor 3D stratigraphic information actually gets into the model. At a well, complex reservoir information may be collapsed to binary sand/shale voxels, while seismic information may be reduced to top and bottom of reservoir. "If a picture is worth a thousand words, an image is worth as many wiggles."
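A rough back-of-envelope check of the order of magnitude quoted above - the field dimensions, well count and the logs' depth of investigation are assumed values chosen for illustration, not figures from Bryant's talk:

# Back-of-envelope check of the "0.0001%" order of magnitude.
# Field size, well count and the logs' lateral reach are assumed values.
import math

field_area_m2 = 10_000 * 5_000          # assume a 10 km x 5 km field
n_wells = 20                             # assumed number of logged wells
radius_of_investigation_m = 1.0          # assumed lateral reach of the logs

sampled_area_m2 = n_wells * math.pi * radius_of_investigation_m ** 2
fraction = sampled_area_m2 / field_area_m2

print(f"Area sampled by logs: {sampled_area_m2:.0f} m^2")
print(f"Fraction of field:    {fraction:.6%}")   # ~0.0001%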

Deja vu?

So is the SEM just visualization software revisited? Not according to Bryant, who describes the use of the SEM in the "validation gauntlet", whereby a fast simulator is used to predict and match iteratively, to obtain a number of models that fit the data together with a measurement of the associated uncertainty. This was demonstrated with a video of a 3D view of a fault block in Statoil's Gullfaks field. In one window a ray-traced seismic model was compared with the recorded seismic data. In another, the input geological model could be tweaked and the impact of such adjustments viewed in real time on the seismic simulator. The demonstration was sufficient to impress some Statoil personnel in the audience, who may have had second thoughts about the deal they have just struck with Landmark for an enterprise-wide computing solution.
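A minimal Python sketch of the predict-and-match idea behind such a "validation gauntlet": the forward model, misfit measure and perturbation scheme below are generic placeholders, not the Schlumberger Doll Research implementation.

# Minimal predict-and-match loop in the spirit of the "validation gauntlet".
# Forward model, misfit and perturbation scheme are generic placeholders.
import random

observed = [0.0, 0.4, 0.9, 1.3, 1.8]      # e.g. picked travel times (made up)

def forward_model(params):
    """Toy 'fast simulator': predicts the observations from model parameters."""
    slope, intercept = params
    return [slope * i + intercept for i in range(len(observed))]

def misfit(params):
    return sum((p - o) ** 2 for p, o in zip(forward_model(params), observed))

def validation_gauntlet(n_trials=5000, tolerance=0.05):
    """Randomly perturb the model and keep every candidate that fits the data
    to within tolerance; the spread of survivors measures the uncertainty."""
    accepted = []
    for _ in range(n_trials):
        candidate = (random.uniform(0.0, 1.0), random.uniform(-0.5, 0.5))
        if misfit(candidate) < tolerance:
            accepted.append(candidate)
    return accepted

models = validation_gauntlet()
slopes = [m[0] for m in models]
print(f"{len(models)} acceptable models; slope range "
      f"{min(slopes):.2f} - {max(slopes):.2f}")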



The Petrotech Outsourcing Project (April 1998)

The client side of the outsourcing story was presented by Paul Blair of BG E&P. The decision to outsource was made as part of the de-merger of British Gas which resulted in the creation of BG E&P.

Prior to the Petrotech project, British Gas's E&P effort was organized into "resource intensive" asset-based teams. A modern enough business paradigm, you might think, but there are no sacred cows in the cultural revolution of BPR. At de-merger, E&P was to downsize by 45% in a move to a functionally-based organization designed for better utilization of resources. The lead role in the "Petrotech" outsourcing project was awarded to SAIC. This arrangement came about from BG's desire to have "centralized control through a single point of contact with the primary partner".

Working under SAIC are GeoQuest, Landmark and other contractors. Petrotech is described as a partnership, based around a core of service level agreements. A risk/reward cost model is used, and cost savings are shared between BG and the providers. Performance is metered regularly by a "Balanced Business Scorecard". Petrotech has now been up and running for one year. Blair described outsourcing as a "major, non-trivial task". The first three months were a transition phase, with BG staff retained to assist with the process. Subsequent to this transition, BG has experienced an (unplanned) 100% staff turnover, with many taking the voluntary retirement package on offer. Blair suggested that a retention bonus might have been a better ploy than paying people to quit!

Major findings after the first year:

* Expectation levels were unrealistic
* There was confusion over the respective roles of the legacy corporate IT services and Petrotech - the latter was blamed for some of the failings of the former
* Recruiting and retaining high quality staff proved tough
* Keeping members of the "old guard" in key positions was considered a mistake
* Friction was generated between the different "cultures"
* New technology was introduced "too slowly" - a new approach is slotted for 1998

Sensitive issue

On the positive side, the new professional approach – notably in cartography – overcame some skepticism on the part of the user community. Additionally, substantial cost savings have been reported – as much as 40%, although the situation before the change was such that the baseline has been hard to establish. Outsourcing is a sensitive issue to E&P personnel and Blair was probed by questions from the floor as to the overall efficiency and gains accruing from the outsourcing effort. Blair opened up and stated that the outsourcing decision was taken at "a high level" in the organization and that not all the results have been positive. "We have lost expertise – outsourcing is a balancing act. In some areas the service is not as good as it was before the outsourcing initiative." The main positive point to date has been the cost saving.



GISWeb provides flexible low bandwidth access to multiple data sources (April 1998)

Agip, in cooperation with GeoQuest EAME, have developed a web browser for Finder/Enterprise and other data sources.

The main design constraints were to provide access to geographically dispersed data throughout AGIP's worldwide operations, over low bandwidth links, and to access heterogeneous E&P data stores. The technology involves a three-tier structure, with the GISWeb Java client talking through CORBA links and a "dispatch middleware" to CORBA data servers grafted onto a variety of standard E&P data stores. The technology can be connected to any type of E&P data store by the association of a CORBA driver, which allows the existing datastore to remain unchanged. Servers have been developed for Petroconsultants' IRIS21, AGIP's Forall environment and GeoFrame. Maps are drawn vectorially on the client screen and can be zoomed, panned and selected. Intelligent, scale-sensitive data transmission economizes on bandwidth and ensures reasonable performance even over low bandwidth internet connections. Apart from the Java client, another version of the tool exists as a Finder Smartmap client. This works in a similar way to GeoQuest's GeoWeb product, except that it is not limited to a single pre-defined Finder map, and access to foreign data is facilitated. Herve Ganem described the job of converting third-party data stores to run as CORBA data servers as "relatively simple", emphasizing that the data in the original data stores required no modification for this technology to work. Paul Haines (GeoQuest's Head of Data Management Product Planning) told PDM "These local-level developments are of considerable interest to us in the GeoQuest software division. They provide us with feedback on customer requirements and deliver working prototypes of software modules. We track such efforts closely and will integrate the results of such efforts into our product line if client demand is there".
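The driver-per-datastore idea can be sketched schematically as follows - each store is wrapped by an adapter exposing one common interface, and a dispatcher routes the client's request to the right adapters. Class and method names are invented for illustration; this is plain Python, not the actual CORBA IDL used by GISWeb.

# Schematic sketch of driver-per-datastore federation; names are invented.
from abc import ABC, abstractmethod

class DataServer(ABC):
    """Common interface every wrapped datastore must expose."""
    @abstractmethod
    def map_features(self, bbox, scale):
        """Return vector features inside bbox, thinned for the display scale."""

class Iris21Server(DataServer):
    def map_features(self, bbox, scale):
        # Assumed: query the IRIS21 store here and translate to vectors.
        return [{"source": "IRIS21", "bbox": bbox, "scale": scale}]

class GeoFrameServer(DataServer):
    def map_features(self, bbox, scale):
        # Assumed: query GeoFrame here and translate to vectors.
        return [{"source": "GeoFrame", "bbox": bbox, "scale": scale}]

class Dispatcher:
    """Stand-in for the 'dispatch middleware' between client and servers."""
    def __init__(self, servers):
        self.servers = servers
    def fetch(self, bbox, scale):
        # Scale-sensitive: only request detail appropriate to the zoom level,
        # which is what keeps the payload small over low-bandwidth links.
        features = []
        for server in self.servers:
            features.extend(server.map_features(bbox, scale))
        return features

client_view = Dispatcher([Iris21Server(), GeoFrameServer()])
print(client_view.fetch(bbox=(5.0, 43.0, 7.0, 44.0), scale=1_000_000))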



Data Architectures - From Independent through Integrated to Inter-operating Systems (April 1998)

Mark Chidwick of Panther Software (Calgary) and Bruce Rodney, Exprodat (UK), investigate the state-of-play in data architecture and design. They conclude that the virtual databases currently being developed may offer significant gains in terms of integration, but that interoperability is still elusive.

Isolated Systems

Databases were not invented to meet any known business need. A database is an IT construct for amalgamating data in a controlled and secure environment. But is this what business users want or need? When computer applications were first deployed in E&P, business needs were met directly by isolated applications with their own local data. Soon networks of interconnecting links developed to join these isolated systems. Half-link technology was one answer to point-to-point links. But such solutions are characterized by a lack of robustness and data leakage. Another strategy was to build a single central database. Unfortunately though, there is no one 'correct' model for E&P data, and different data models demonstrate better performance in different domain problems. The single centralized database has not been found to be a practical solution.

Integrated Systems

The last decade has been characterized by centralization and integration. In the utopian ideal, all software vendors would access the same physical database, oil companies could retire their legacy systems, and small players could build against the new model and provide best-of-breed applications. In practice, islands of integration have developed. The concept of a three-tier data architecture became the accepted data management approach. The corporate layer contained the approved, secure, company version of data; the project layer contained multiple geographically-limited project databases; and the top or application layer contained copies of data in use by interpretive applications (see Figure 1).

Figure 1

This architecture allows project databases to be focused and efficient, while the corporate database provides a view across all projects. Data duplication, reconciliation and back-population remain major concerns. Furthermore, the architecture may be fundamentally flawed: the system needs detail at both the operational and application levels, detail that is difficult to carry across the wide scope of the corporate mid-layer. In practice, direct demand-population of the project databases often undermines such formal architectures.

Epicentre: The Last Great Database?

Epicentre is the most recent effort at designing the single industry data model which would unite all software vendors (see Figure 2). However, take-up of POSC has not met expectations. The issues inherent in the single data model architecture have kept the smaller developers away, and despite claims of POSC compliance by all the major vendors, the industry is no closer to interoperability.

Figure 2

Interoperability through Business Objects

Recently, POSC has changed direction. Rather than focusing on identical physical implementations of Epicentre at the database level, the current Interoperability Workgroup provides the architecture to share data at the 'business object' level. These business objects are best visualized as domain subsets of Epicentre held in application memory. POSC's major contribution has been the logical model that describes E&P data and the inter-relationships within the data - whether implemented in a physical data store or in memory. In principle, applications have no knowledge of where the data came from - they communicate at the object level, not the data level. This architecture insulates the application from the database. Rather than multiple applications executing against a single database, a single application can reference multiple databases. No longer do the models need to be identical, or even similar. In practice, there will be little motivation for the larger vendors to absorb the cost of re-engineering their well-accepted applications to this new object standard. However, there may be no need for the major application vendors to integrate themselves: service providers could build the integration tools, using the existing development kits to map data and application events to the new object model.
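A toy illustration of sharing data at the business-object level: two stores with quite different internal layouts both surface the same in-memory 'well' object, so the consuming application never sees either schema. The object shape and the store layouts are invented, not the POSC Interoperability Workgroup specification.

# Toy illustration of the business-object idea; object shape and stores are invented.
from dataclasses import dataclass

@dataclass
class WellObject:                # the shared, in-memory business object
    uwi: str
    name: str
    td_m: float                  # total depth, metres

# Store A keeps wells as nested dictionaries keyed by UWI, with depths in feet.
STORE_A = {"100/12-34": {"well_name": "Alpha-1", "total_depth_ft": 9850.0}}

# Store B keeps wells as flat tuples: (uwi, name, depth in metres).
STORE_B = [("15/9-F-12", "Bravo-7", 3120.0)]

def wells_from_store_a():
    for uwi, rec in STORE_A.items():
        yield WellObject(uwi, rec["well_name"], rec["total_depth_ft"] * 0.3048)

def wells_from_store_b():
    for uwi, name, td in STORE_B:
        yield WellObject(uwi, name, td)

def deepest_well(sources):
    """Application code: works on WellObject only, unaware of either schema."""
    return max((w for src in sources for w in src()), key=lambda w: w.td_m)

print(deepest_well([wells_from_store_a, wells_from_store_b]))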

Virtual Databases and Interconnectivity

As the current solutions do not meet our business needs, an alternative to the physical database model has emerged (see Figure 3). The idea is to provide a data management solution that is user-centric rather than model-centric. In this solution model, data management is approached from the desktop, providing a geocomputing environment that integrates both data and functions. Interconnected databases are serviced by a lightweight 'federating' database or catalogue, populated and maintained by a group of data servers. In this model, all applications appear to access data in any of their native data stores. Furthermore, each of the applications may be provided by different vendors, allowing us to achieve best-of-breed integration of tools. The actual structure of the underlying data models is now a distant concern, and users are free to focus on the core business process.

This virtual approach is not without problems. Most significantly, the technology to implement this type of solution is only now emerging and is not in widespread use in E&P. GeoQuest's Finder Enterprise is probably the most advanced, 'federating' operational and project databases via a 'metabase' kernel. Another big issue is one of standards. In order to interconnect successfully, heterogeneous systems need either exact matches on business object identifiers, such as well and seismic line names, or a mechanism for translating between different schemes. This isn't a new problem in data management, but it is a difficult one. Finally, data duplication isn't removed: project databases still need to be physically instantiated, i.e. there is still multiple-source to multiple-project loading taking place. On the plus side, back-population of data is less of an issue, as data remains largely in situ. The virtual database may also be easier to maintain - there is less data duplication, and natural support for heterogeneous architectures. A virtual database may provide only 'cosmetic integration', but it does meet business needs. Business users don't care where the data actually resides, provided it appears to be integrated.

Figure 3
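To make the catalogue-and-identifier point above concrete, here is a minimal sketch of a federating catalogue that resolves differing well-name schemes onto one key before routing a query. The stores, identifiers and alias table are invented examples, not a description of Finder Enterprise or its metabase.

# Minimal federating catalogue: it owns the mapping between each store's local
# well identifiers and a single catalogue key, so queries can be routed while
# the underlying stores remain untouched. All names below are invented.

# Each native store keeps its own naming scheme for the same physical well.
PROJECT_DB = {"NS 30/6-A1": {"status": "producing"}}
CORPORATE_DB = {"30/06-A-01": {"operator": "Lasmo"}}

class FederatingCatalogue:
    def __init__(self):
        # catalogue key -> {store name: local identifier}
        self.aliases = {
            "WELL-0001": {"project": "NS 30/6-A1", "corporate": "30/06-A-01"},
        }
        self.stores = {"project": PROJECT_DB, "corporate": CORPORATE_DB}

    def resolve(self, key):
        """Gather what every connected store knows about one catalogue key,
        leaving each store's data in situ and unchanged."""
        merged = {}
        for store_name, local_id in self.aliases.get(key, {}).items():
            merged.update(self.stores[store_name].get(local_id, {}))
        return merged

catalogue = FederatingCatalogue()
print(catalogue.resolve("WELL-0001"))   # {'status': 'producing', 'operator': 'Lasmo'}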

Full circle?

It may appear that we've come full circle, back to isolated systems. In fact we've advanced considerably, to what is hopefully a dynamic equilibrium between centralized generalist databases and distributed specialist databases and applications. Excessive focus on the database itself is unhealthy, and causes us to lose sight of the business objective, which is to get quality data to business tools on demand. This can be achieved equally well through the looser integration of separate data stores. Shared data models are neither necessary nor sufficient to meet the goals of a successful E&P interpretation system. Interconnected data stores are necessary, and this refocuses data management on the business process, rather than on data and models. We are entering an era of interconnectivity between islands of integration. Only time will tell whether this expands into interoperability. More from Exprodat at http://www.exprodat.com.



QC Data's Acquisition of AccuMap Creates a New Data Management Powerhouse (April 1998)

QC Data (Calgary) has acquired all interests in AccuMap EnerData Corporation. AccuMap is a supplier of data browsing and mapping software. AccuMap brings a customer base of more than 400 organisations and is used by over 70% of all Canadian petroleum-related companies.

"What has made AccuMap so successful is their dedication to be the best with the fastest, easiest to use and most reliable system," says John Redfern, President of QC Data's Petroleum Data Services division. "With the purchase of AccuMap, QC Data can offer the complete range of data management services." "With QC Data's Information Hub and the AccuMap Opportunity Network, as well as the strategic partners of both companies, well be able to put all the best oil and gas information onto the desktop in seconds." QC Data has also completed an agreement with Spatial Data Warehouse, a not-for-profit organization formed to take over the Alberta Government's land mapping activities. QC Data and its partner, Martin Newby Consulting Ltd., will be managing the organization's $50 million database. QC Data was acquired eight years ago by its present Chairman Alfred H. Balm, the Dutch billionaire and owner of the Emergo Group. Balm recently sold his holdings in Canadian Fracmaster for approx. $600 million. In 1993, he hired Michael Pfeiffer, then President and CEO of Hughes Aircraft Canada Limited and gave him a free hand to build QC Data into "a global leader in the delivery of technical solutions and services".



European FINDER Rationalisation Project - status report (April 1998)

Finder customisation is popular, but often leads to maintenance problems. Hugh Banister (Lasmo) has supplied PDM with the following status report on a collaborative effort to share the cost of Finder customisation.

The EuroFinder initiative is designed to federate the efforts of its member companies in the field of Finder customization. The idea is simple: going it alone is costly and leads to a hard-to-maintain, site-specific version of Finder. This has proved a particularly painful experience when upgrading from one version of Finder to the next. Applying the 80/20 rule to the various bespoke developments has allowed EuroFinder members to agree upon a common set of customizations which can be performed professionally and whose cost can be shared. The objective of the EuroFinder project is to provide a single physical implementation of the FINDER data model at all participating European client sites. A further objective is to move client-based modifications into an environment more easily supported by GeoQuest, with the added expectation that these updates will be included in future Euro-FINDER releases.

POSC-based development

EuroFinder development is to be based upon the Petrotechnical Open Software Corporation's (POSC) Epicentre data model where possible; otherwise, changes to Finder may be submitted to POSC for possible inclusion in Epicentre. User groups from Norway, the United Kingdom and Germany are cooperating in the project, as is GeoQuest.

The Project has been divided into four phases. These are:

Phase I - preliminary identification of clients' and GeoQuest customizations.

Phase II - construction of a database of FINDER customizations and prioritization of their rationalization.

Phase III - rationalization of customizations and cross referencing of the FINDER and the customizations to Epicentre.

Phase IV - generation of the European FINDER product as specified.

Phase I of the EuroFinder project has been completed. This phase was sponsored by GeoQuest UK and open to all FINDER clients in Europe at no cost. Phase IIa of the project has been funded by 14 participating companies and is underway. DataBasix/Venture, a joint alliance between two data management consulting firms, has been contracted to complete the Phase IIa work.

Benefits

The anticipated benefits from the EuroFinder initiative are:

More efficient data exchange between participating Oil Companies.

Reduction in costs associated with support and version upgrades.

The convergence of the FINDER corporate data store with the FINDER project store.

Greater FINDER functionality (e.g. data exchange, standards, loaders/unloaders, version control, forms and reports).

Improvement in the ability of GeoQuest and Oil Companies to communicate model extensions and associated documentation between themselves and with third-party companies.

The opportunity for GeoQuest to better understand the needs of Oil Companies, most particularly in terms of data types and how they are used.

The main challenges are reaching a consensus on extensions to the FINDER data model, and getting buy-in from GeoQuest to develop and maintain the Euro-FINDER product. EuroFinder is looking for more Finder user companies to participate in later phases of the project. For more information, contact Banister at hugh.banister@Lasmo.com



Halliburton Energy revenues up 42% as antitrust investigation mooted (April 1998)

Halliburton's Energy Group's 1998 first quarter revenues totaled $1.6 billion, an increase of 42 percent compared to the 1997 first quarter. Revenue growth was particularly strong in international markets, where revenues increased by more than 50 percent from a year ago.

Following last month's announcement of a proposed merger between Halliburton Company and Dresser Industries, Inc., the companies have received requests for additional information from the Antitrust Division of the U. S. Department of Justice. Halliburton and Dresser indicated that the requests for additional information were not unexpected and they plan to respond promptly to the Department of Justice. The companies expect to complete the merger during the fall of 1998.



GeoGraphix acquires Vantage Software (April 1998)

Vantage Software and its DSS-32 to be integrated into the GeoGraphix suite

Landmark's GeoGraphix subsidiary - the PC-based software arm of the group - has completed the acquisition of Vantage Software, Inc. for an undisclosed sum. Vantage Software provides Windows-based production mapping and surveillance software. Vantage's Dynamic Surveillance System 32 for Windows (DSS-32) is described as a "proven and full-featured software system designed for production and reservoir engineers to quickly monitor and analyze project performance on a single well or a group of oil and/or gas wells". DSS-32 is an open, data model independent environment, providing access to a wide variety of data sources via Microsoft Access, SQL Server, Excel, Oracle and Sybase, as well as any ODBC-compliant database. Robert P. Peebler, Landmark president and CEO, stated "We believe there is no other software available that provides the flexibility and openness to such a variety of data sources. Production and reservoir engineers can make better engineering decisions with a visual tool that quickly identifies production trends, and resulting opportunities for workovers and infill well locations." More information from http://www.geographix.com
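In practical terms, "any ODBC-compliant database" means an application needs only a connection string and standard SQL to reach the store. A minimal sketch follows; the DSN, table and column names are hypothetical, and this is not DSS-32 code.

# Minimal sketch of ODBC-style access: the same code works against Access,
# SQL Server, Oracle, Sybase etc., provided an ODBC driver and DSN exist.
# The DSN, table and column names are hypothetical; this is not DSS-32 code.
import pyodbc  # third-party ODBC bridge for Python

def monthly_oil_production(dsn: str, well_id: str):
    """Fetch (month, oil volume) rows for one well from any ODBC data source."""
    with pyodbc.connect(f"DSN={dsn}") as conn:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT prod_month, oil_bbl FROM production "
            "WHERE well_id = ? ORDER BY prod_month",
            well_id,
        )
        return cursor.fetchall()

# Usage: the DSN decides whether this hits Access, SQL Server, Oracle, ...
# rows = monthly_oil_production("PRODUCTION_DSN", "100-01-01-001-01W4")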



Open Spirit SIG invites partners (April 1998)

OpenSpirit is developing cross-vendor middleware and seeks partners for its Special Interest Group.

The OpenSpirit Alliance (see PDM Vol 2 N° 11) is a consortium of oil companies and E&P software vendors sponsoring the design and development of the OpenSpirit E&P Component Framework, application-independent middleware which will enable 'plug-and-play' integration of E&P software applications and datastores. It is intended that this Framework will become the de-facto industry standard for E&P business object middleware. The Alliance has selected PrismTech as its development and marketing partner for OpenSpirit.

Special Interest Group

In order to ensure that Oil Company and other developers can benefit from the OpenSpirit Framework, PrismTech has created the OpenSpirit Special Interest Group (OSIG).

Membership of the OSIG offers many advantages including early access to the OpenSpirit software through a pre-release (beta) program. OSIG membership costs just US$1,000 per annum per company site. More info from osig@prismtechnologies.com


