February 1999


Oracle 8i object technology core of new interoperability initiative. (February 1999)

Oracle is back in the fray with Project Synergy, which will offer E&P technologists professionally developed 'shrink-wrap' versions of two of the main industry data models: POSC Epicentre for the upstream and POSC/Caesar for facilities and construction. Both models are to be supplied as plug-in 'Cartridges' for Oracle's latest 8i database.

After a period of apparent quiescence (we thought they were through with the upstream) Oracle is back big-time. Partnered by Chevron and Statoil, Oracle will be working on Project Synergy which it is claimed "will allow oil industry E&P co-venture partners and newly merged oil companies to share disparate business-critical data and monitor asset performance quickly, easily and collaboratively". The secret weapons in Oracle's armory are the new Internet-enabled database, Oracle 8i, and Java. Through new "business intelligence" capabilities developed by Oracle and other applications vendors, oil companies will be able to make "better informed business decisions, dramatically shortening the time between investment and return".

Balanced Scorecard

In addition to the two data models from POSC and POSC/Caesar, Synergy will incorporate other products specific to the E&P industry as well as Oracle's horizontal Strategic Enterprise Management suite, including the 'Balanced Scorecard' acquired in 1998 from Graphical Information, Inc. Andrew Lloyd, who heads up the Synergy project within Oracle, told PDM "Balanced Scorecard is a means of driving the business through a number of internal and external metrics. Synergy adds the energy content to these tools by focusing on collaboration and sharing between the different disciplines." See the exclusive interview with Andrew Lloyd inside this issue of PDM.

POSC support

Assisting in the development of Synergy are the ubiquitous PrismTech - who will be chipping in with the Open Spirit interoperability framework - and POSC. Donald L. Paul, vice president of technology and environmental affairs for Chevron Corporation said "Chevron has long been committed to industry standards and open computing architectures that will provide interoperability between applications and platforms from different vendors. Project Synergy provides Chevron the opportunity to participate in a commercial implementation of the POSC specifications, using advanced computing architectures and Internet technologies developed by Oracle."

Back to basics

Recently the trend has been away from the humongous central database, a move which perhaps reflects an admission of defeat more than anything else. Project Synergy sets out to rehabilitate the central data repository. The complex transactions involved in collaborative projects, joint ventures, and mergers and acquisitions "will be simplified through the ability to base critical business decisions on common, shared data residing in a single repository, regardless of the application". Added attractions for future Synergists will be the various other horizontal cartridge-based technologies including Spatial, Imagery, Time Series, and Text Management.

Innovative

Epicentre was a groundbreaking development at its inception and implemented some very innovative technological solutions. Up to now, the deployment of the complex Epicentre model using conventional relational databases has proved problematical. The new object extensions to Oracle 8 will allow for the development of a database more in line with the original intent of Epicentre. Oracle claim that Oracle 8i makes a faithful, object-relational deployment of Epicentre a reality. This will be achieved by taking the Express code used in Epicentre and POSC/Caesar and converting it into the more generic Unified Modeling Language (UML) beloved of today’s object zealots.

Beans

PrismTech’s Epicentre Builder will be integrated with Oracle’s Designer 2000 to aid in the conversion. Another facet of Synergy is to offer developers Enterprise Java Bean components rather than the more traditional C++/SQL interfaces. A move to the latest web markup language, Extensible Markup Language (XML), is also mooted. This will allow self-describing data structures to be created and accessed by different vendors – yet another route to the grail of interoperability. Products developed under Project Synergy are expected to be in beta test within six to twelve months, and will be marketed through Oracle Energy. Meanwhile PDM offers its readers the first of a three-part primer by Nigel Goodwin of Essence Associates on the technology under Synergy's hood - Epicentre and the POSC/Caesar data models.
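To make the contrast concrete, here is a purely hypothetical sketch of the kind of coarse-grained component interface a developer might be handed instead of C++/SQL - the WellHeader name and its methods are our own invention, since no Synergy API had been published at the time of writing:

    import java.rmi.Remote;
    import java.rmi.RemoteException;

    // Hypothetical remote interface for a well-header business component.
    // In a real EJB deployment this would extend javax.ejb.EJBObject; plain
    // java.rmi.Remote is used here so the sketch compiles with a standard JDK.
    public interface WellHeader extends Remote {
        String getWellName() throws RemoteException;
        double getTotalDepth() throws RemoteException;   // measured depth, in metres
        String getOperator() throws RemoteException;
        void setOperator(String operator) throws RemoteException;
    }

A client would obtain such a component through a naming-service lookup and call it remotely, rather than embedding vendor-specific SQL - which is precisely the decoupling the component approach promises.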



Standards and openness proliferate, but which is the right one? (February 1999)

PDM's Editor Neil McNaughton reflects on two new standards initiatives which have arrived at a time when the industry is struggling for survival. He concludes that we are lucky that the major players in IT are taking a serious interest in the upstream.

There is good news and bad news this month. Good news in that data management and interoperability are getting some serious attention. Bad news that they might just be getting too much attention. First for the good news. A short while ago, when we built the last E&P Pleasantville on the strength of ephemeral $17-ish oil, when data management reached something of a summit (as shown by Helen Stephenson's survey in last month's PDM) and when we were forecasting dire personnel shortages and new ways of doing business, something important happened. We got noticed!

Bill, Larry and Hasso

We got noticed by some of the biggest players in IT: Bill Gates, Larry Ellison and Hasso Plattner, respectively the shot-callers at Microsoft, Oracle and SAP. They got their minions to beaver away at checking out the industry and found - a challenge. Bill, Larry and Hasso then instructed the said minions to go forth and offer a solution to the challenge of oil industry data management, and they did. Each with their own special a priori. Just as Henry Ford offered the Model T in any color 'so long as it is black', our new champions offered similar flexibility. For Microsoft, it could be anything 'so long as it is COM', for Oracle anything again 'so long as it is Oracle 8' and for SAP, anything else 'so long as it is BAPI'.

Buttered toast

The concomitant bad news - I know you've already guessed - actually fits in with a very general principle of science. The 19th century French industrial chemist Henry Le Chatelier conjectured that "A change in a variable that determines the state of a system's equilibrium causes a shift in the position of equilibrium in a direction that tends to counteract the change in the variable under consideration." Which, being interpreted, could be considered a formalization of what is known in the UK as Sod's law, in the US as Murphy's law and in France as the law of falling buttered toast. It is reassuring to see that Le Chatelier's principle is still at work in the field of E&P interoperability. How is that? I hear you ask.

N-cubed?

Well the argument goes as follows. First you have a growing multitude of applications which can't talk to each other. Then humankind, in the form of standards organizations, enthusiasts and data managers in general, decides 'wouldn't it just be great if we all agreed to fix this by standardization'. Then Le Chatelier's principle kicks in by returning, after a few years of effort, a growing multiplicity of standards! Or as Bill Sumner suggested last year, "an N-squared problem of solutions to the N-squared problem". Or again as Jim Theriot of POSC put it, an "N-cubed problem of mapping between different applications convolved with different versions of data models". Whatever N was before, it just got bigger by 2.
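For readers who like their pessimism quantified, one reading of the arithmetic (our own back-of-envelope sketch, not Sumner's or Theriot's workings) goes:

    % point-to-point mappings between N applications
    \binom{N}{2} = \frac{N(N-1)}{2} = O(N^{2})
    % with V co-existing data-model versions of each application
    \binom{NV}{2} \approx \frac{N^{2}V^{2}}{2}

Every new 'standard' simply adds another node to be mapped, which is why N just got bigger.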

And OpenDX?

Actually N grew by 3 this month with the announcement by Oilfield Systems of yet another interoperability initiative, OpenDX. You may have spotted our attempt to have two front pages for PDM this month. We really would have liked to have three, with DAEX on the third, but that was beyond our own technology barriers. Let it suffice to say that for the community of DAEX users and developers, the OpenDX initiative makes a tempting addition to the interop sweepstakes.

Flattery

So although the interest bestowed on our business by the greats of the IT world is flattering, some of these considerations are going to make IT investment in the upstream quite a tricky issue in the medium term. I would suggest though, in view of the nature of the protagonists and in the light of the above analysis, that when analyzing the various offerings you eliminate the words 'open' and 'standard' from the discussion and judge their merits on what is left. To kick off the analysis we offer our own preliminary review of how the commercial cards are stacking up below. Despite the caveats, all this new-found enthusiasm is great, we just hope it withstands the undoubted negative pressure of the downturn. Sure makes the data manager's job interesting.



Who's who in interop country? (February 1999)

With two new initiatives announced this month, PDM offers a check list of who is doing what in the three main interoperability initiatives currently proposed.

Initiative: COM for Energy
Scope of interoperability: Technical and financial
Sponsors: GeoQuest, Landmark, Microsoft, PriceWaterhouseCoopers, SAP
Support*: BP Amoco, POSC, Statoil
Technologies: COM, SAP BAPI

Initiative: Synergy
Scope of interoperability: Facilities, subsurface, technical and financial
Sponsors: Chevron, Oracle, Statoil
Support*: Prism Technologies, POSC
Technologies: Oracle 8i, Java, Epicentre, POSC/Caesar

Initiative: Open Spirit
Scope of interoperability: Subsurface
Sponsors: CGG, Chevron, Elf, GeoQuest, PGS, Shell, Statoil
Support*: Prism Technologies, POSC
Technologies: CORBA, vendor data stores, POSC Business Objects

* By support we mean both technical and moral - i.e. quoted as supporting the initiative.

Interop space

This is a deliberately simplified analysis (pace David!) of the breakdown into technology and support. The important facts to emerge are the polarities of the different initiatives. They are several, and on both technical and commercial planes we would offer the following as possible constituents of a multi-dimensional ternary diagram which we invite you to draft yourselves:

COM for Energy vs. Synergy = Microsoft vs. Oracle

COM for Energy vs. Synergy = COM vs. CORBA

COM for Energy vs. Synergy = SAP vs. Oracle Financials

GeoQuest's presence in both COM for Energy and Open Spirit precludes an editorially tempting GeoQuest vs. Landmark categorization, but it is historically true that Landmark have shown increasing leanings towards Microsoft COM and away from the UNIX CORBA of OpenSpirit. Yet another slice through interoperability space shows Landmark and GeoQuest as outside of Synergy, a situation that perhaps reflects vendor reticence at throwing their own data models away and starting over with new technology. Java and Microsoft's ActiveX must be slugging it out in yet another dimension of the hyperspace.



Fletcher Challenge Energy chooses Kelman Technologies for management of its exploration data. (February 1999)

Calgary-based Fletcher Challenge Energy Canada Inc. has selected KTI’s DMASS solution for the archival and maintenance of its seismic library. On-line data access will be provided off-site by Kelman's Archive Division.

The contract was awarded following a successful pilot project, conducted and evaluated in late 1998. "We have thoroughly evaluated the Kelman solution in addressing our on-line data management and retrieval requirements for our seismic data," said Jeff Allison, Geophysical Operations Supervisor at Fletcher Challenge Energy Canada. "We have been extremely impressed with Kelman’s professional services and attention to detail. It is obvious that this solution has been engineered end to end," he said.



Epicentre and POSC/Caesar – perpetuating the schism between upstream and downstream cultures? Part One of Three. (February 1999)

Nigel Goodwin from Essence Associates offers PDM readers a primer on modern data modeling with reference to POSC's Epicentre and the POSC/Caesar facilities data model, both of which are at the heart of Oracle's latest Synergy initiative.

Every software application, from word processors to banking systems, has a data model which is simply a definition of the structure of the data used in the program and how it is stored. Formal data models are defined in a standard language such as the Structured Query Language (SQL), or rather in the Data Definition Language (DDL) component of SQL. A real-world database consists of a database management system (e.g. Oracle), a data model, user interface components, the actual data held in the database, and a set of procedures for managing the data.
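As a trivial illustration (the table, columns and connection details below are invented for this primer, not drawn from any standard model), a fragment of a data model expressed in SQL DDL and issued through Java's JDBC interface might look like this:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class TinyModel {
        public static void main(String[] args) throws Exception {
            // Driver class and connection URL are placeholders - substitute your own database.
            Class.forName("oracle.jdbc.driver.OracleDriver");
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@localhost:1521:ORCL", "scott", "tiger");
            Statement stmt = con.createStatement();
            // The DDL below is the 'data model': it defines structure, not content.
            stmt.executeUpdate(
                "CREATE TABLE well (" +
                "  well_id     NUMBER PRIMARY KEY," +
                "  well_name   VARCHAR2(64) NOT NULL," +
                "  spud_date   DATE," +
                "  total_depth NUMBER)");
            stmt.close();
            con.close();
        }
    }

The data itself is then inserted and queried with the DML part of SQL; the DDL statements are what the modeler, rather than the end user, cares about.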

A short history of E&P Data Models

Early on in POSC’s existence it was realized that in order to produce a coherent industry standard data model, an overall framework was required. This framework would make up the foundation of the detailed data model. A ‘high level model’ was first produced which defined basic concepts such as ‘activity’, ‘property’, and ‘object of interest’. The remainder of the model was then derived with more and more detailed specialization of these basic concepts.

The first Epistle

At about the same time as POSC started out, the Epistle (www.stepcom.ncl.ac.uk) group was formed to aid technical collaboration between projects working on an ISO standard for data models for the process industry. One focus of Epistle was the requirements of data models for data sharing as well as for data exchange. Epistle itself made extensive use of on-going work at Shell which defined principles and guidelines for how to produce a ‘good’ data model. POSC, through its London office, was a regular attendee at Epistle meetings from the start. A criterion used by Shell and Epistle for a good data model was flexibility. A good model should cater for changing business practices and environments. It was recognized that while hardware and software come and go, data, and the structure in which data is held, can have a lifetime of many decades.

Entity types

The principal guideline aimed at ensuring a good data model was that entity types - the building bricks of the data model - should represent the essence of things, rather than how they relate to other things. As an example, an entity type ‘father’ is generally not good data modeling practice – instead, there should be an entity type ‘person’, and through relationships with other entity types we can model the role of ‘father’. Epistle, in close liaison with Shell, produced their own ‘high level model’ implementing such techniques. In 1992 POSC produced version 1 of Epicentre which generally encompassed the subsurface domain, but also impinged on the facilities arena. During review of version 1, it became clear that there were some problems with the way equipment and materials were modeled. A liaison was established with Epistle specialists to see how Epicentre might benefit from Epistle. The first realization was that, although the two data model groups had had almost no prior contact, the high level models were remarkably similar. Indeed, Philippe Chalon, the POSC data model project leader at the time, said that had he known of the existence of the Shell/Epistle high level model, he would have used it and saved a lot of time.
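To make the 'person, not father' guideline concrete, here is a minimal sketch in Java - our own illustration, not Epistle's notation - in which the role is carried by a relationship rather than by a dedicated entity type:

    // The entity type captures the essence of the thing...
    class Person {
        final String name;
        Person(String name) { this.name = name; }
    }

    // ...while the role of 'father' is expressed as a relationship between
    // persons, not as a separate 'Father' entity type.
    class Parenthood {
        final Person parent;
        final Person child;
        Parenthood(Person parent, Person child) { this.parent = parent; this.child = child; }
    }

    public class RoleDemo {
        public static void main(String[] args) {
            Person fred = new Person("Fred");
            Person jane = new Person("Jane");
            Parenthood p = new Parenthood(fred, jane);
            // The role is derived from the relationship, so the Person entity
            // never needs restructuring when roles change.
            System.out.println(p.parent.name + " is a parent of " + p.child.name);
        }
    }

If business practice later demands guardians, trustees or step-parents, only new relationship data is needed; the 'person' entity type is untouched, which is the flexibility the guideline is after.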

POSC/Caesar

This liaison between POSC and Epistle resulted in a rework of the equipment and materials area in Epicentre. The principal changes were the distinction between physical objects and the logical, functional requirements for those physical objects, as well as a more flexible way of modeling equipment classification schemes, as standard data rather than entity types. Soon after this collaboration, the joint project POSC/Caesar was set up. Caesar had already been in existence for some years, and was a regular attendee at Epistle meetings. It seemed a natural choice that there be a formal agreement between POSC and Caesar so that redundancies could be eliminated and greater weight put behind a common industry data model representing the complete lifecycle of an oil and gas field.

Hurdles

However, it was not completely plain sailing. Some of the hurdles in setting up the venture were:

the relative priority between an ISO standard and an oil and gas/POSC standard

object technology, whatever that meant

the business need for data integration between surface/facility information and subsurface/geological information

the diversion of POSC resources.

In particular, the pursuit of an ISO standard imposes constraints on the technical contents of the standard, and on the process for defining the standard. However, particularly for the large engineering companies, ISO standardization was seen as a priority. POSC, in contrast, had explicitly decided to avoid being ‘sucked into’ the world of ISO.

Business value

The POSC/Caesar specification (http://www.posccaesar.com), like Epicentre, consists of a data model and a reference library. The data model is defined in Express, although a slightly different version from Epicentre's. In terms of business value, the reference library is the primary means of achieving data sharing and exchange. The reference library allows different organizations to standardize on natural language such as ‘centrifugal pump’, and allows industry-wide selection based on this standard language. The data model is somewhat secondary – it is a convenient mechanism for storing the data in a flexible manner. Although POSC/Caesar publish a data model, they have in the past delivered the reference data library using an alternative data model, and current implementations of POSC/Caesar use a variety of data models and application programming interfaces, although the data models all share a very similar style. It is possible to adapt the POSC/Caesar data model to create implementations based on anything from MS Access to the latest object oriented DBMS. POSC/Caesar do not currently publish an API, nor any rules for conformance of implementations, although there is some push by members for a definition of a standard API – which would almost certainly be based on one of the current API standards.
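One way to picture the division of labour between data model and reference library is the following hypothetical sketch, in which the classification 'centrifugal pump' lives as standard data rather than as a hard-wired entity type; the class and identifier names are ours, not POSC/Caesar's:

    import java.util.HashMap;
    import java.util.Map;

    // Reference library: standard classes held as shared data, not as entity types.
    class ReferenceLibrary {
        private final Map<String, String> classes = new HashMap<String, String>();
        void define(String id, String preferredName) { classes.put(id, preferredName); }
        String nameOf(String id) { return classes.get(id); }
    }

    // Data model side: a generic physical object pointing at a standard classification.
    class PhysicalObject {
        final String tag;       // plant tag number
        final String classId;   // reference into the library
        PhysicalObject(String tag, String classId) { this.tag = tag; this.classId = classId; }
    }

    public class ReferenceDemo {
        public static void main(String[] args) {
            ReferenceLibrary library = new ReferenceLibrary();
            library.define("RDL-0001", "centrifugal pump");   // identifier invented for illustration
            PhysicalObject p101 = new PhysicalObject("P-101", "RDL-0001");
            // The shared vocabulary, not the storage schema, is what enables exchange.
            System.out.println(p101.tag + " is a " + library.nameOf(p101.classId));
        }
    }

Two organizations using quite different storage schemas can still exchange equipment lists meaningfully, provided both resolve their classifications against the same library entries.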

Burden

Although the POSC/Caesar reference library can be implemented using old versions of Oracle, or relational databases such as MS Access, this puts some extra burden on the application developer. In addition, the facilities industry makes extensive use of on-line and off-line documentation. Older technology can be successfully used as a catalogue, and as a set of pointers to document files, but it is less successful at storing documents within the DBMS, or at storing graphical elements of documents, which might have links to the raw data. Some specialized engineering databases - which may be layered on top of Oracle or object oriented DBMSs - are designed to relieve application developers of the burden of aspects such as handling units of measure, and dealing with inheritance.
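To give a flavour of that burden, here is a deliberately naive sketch - our own, not any vendor's API - of the kind of unit-of-measure handling an application developer must otherwise code by hand:

    // A value is only meaningful together with its unit; engineering data stores
    // that handle this for the developer remove a common source of errors.
    class Measurement {
        final double value;
        final String unit;
        Measurement(double value, String unit) { this.value = value; this.unit = unit; }

        // Hand-rolled conversion table - exactly the chore a specialized DBMS layer can absorb.
        Measurement toMetres() {
            if (unit.equals("m"))  return this;
            if (unit.equals("ft")) return new Measurement(value * 0.3048, "m");
            throw new IllegalArgumentException("No conversion from " + unit);
        }
    }

    public class UnitsDemo {
        public static void main(String[] args) {
            Measurement depth = new Measurement(9800.0, "ft");
            System.out.println("Total depth: " + depth.toMetres().value + " m");
        }
    }

Multiply this by hundreds of properties and dozens of unit systems and the attraction of having the database layer do it once, consistently, is obvious.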

Next Month Part 2 - Inheritance, abstraction and data modeling styles.



Oilfield Systems pitches into the middleware battlefield with OpenDX (February 1999)

Forget Open Spirit, COM for Energy and Synergy, Oilfield Systems is opening up its hitherto proprietary DAEX technology with its own OpenDX initiative. OpenDX is intended to widen the user base of the DAEX software which provides data exchange between industry standard software packages.

The purpose of the OpenDX Initiative is to focus industry experience and effort on the enlargement and deployment of the DAEX data migration technology, particularly among users of Finder, OpenWorks, GeoFrame, Geoshare and Recall. The OpenDX Initiative will deliver an extended DAEX framework and a progressively richer set of components for moving data in and out of the most popular datastores. The OpenDX Initiative is an industry-led project funded by the participants. Oilfield Systems will provide DAEX source code for existing links and will define the data interface specifications. The initiative was the brainchild of existing and potential DAEX customers who have offered their individual services and skills to contribute to the accelerated development of links. OpenDX will leverage these efforts by minimizing duplication of effort and maximizing feedback to the developers. Oilfield Systems will encourage other vendors to participate in this Initiative, especially GeoQuest, Landmark and Baker Hughes, who have attended recent meetings and are said to be supportive of the concept.

Data loading

While various industry initiatives may promise a long-term goal whereby data will remain in situ and not have to be moved between project data stores, there will still be a need for DAEX-type solutions for populating project data stores from national repositories, corporate data stores, partners' project stores, master log stores, applications and other disparate sources of data. In addition, existing data will require quality audits, indexing and mismatch resolution before it can be used with confidence in any data store. This is the target focus for OpenDX. Membership of OpenDX is on a sliding scale starting at $1,000 per annum. More info from opendx@oilfield-systems.com or www.oilfield-systems.com.



GeoQuest and Altana Exploration Ltd. sign three year technology contract (February 1999)

Altana Exploration Ltd. of Calgary is to license GeoQuest's PowerHouse data management solutions and components of the GeoFrame reservoir characterization system.

M.W. Payne, vice president and general manager of Altana, said "We have spent three years looking for an affordable solution to manage and integrate public Canadian oil and gas industry data with our own expanding inventory of proprietary information. The products and services offered by GeoQuest make up the best database management and integrated application suite we have seen at a price we can afford. Within a year we hope to have a fully integrated suite of applications and common data sources for all geologic, geophysical, engineering and land functions." The PowerHouse solution will enable Altana to browse, retrieve, map and store data from its field locations using GeoQuest's Finder integrated data management system. Altana's E&P work also will be supported by GeoQuest's CPS-3 mapping and surface modeling system, ECLIPSE simulation software, and the Oil Field Manager well and reservoir analysis software. GeoQuest will also be supplying training in the new software. Altana is a subsidiary of The Montana Power Company of Butte, Montana, a diversified energy and telecommunications company, operating domestically and internationally, with 3000 employees and assets of $2.8 billion.



New Boss for the PPDM Association (February 1999)

The Calgary-based Public Petroleum Data Model Association (PPDM) has hired Scott Beaugrand as Chief Executive Officer. Scott was previously with Schlumberger Wireline.

The Calgary-based Public Petroleum Data Model Association has appointed Scott Beaugrand to the new position of CEO. The appointment reflects PPDM's strategy for model implementation throughout the petroleum industry, including an expanded marketing and international focus in addition to the current technical effort. Scott Beaugrand began his career in 1978 with Schlumberger Wireline in Alberta. He subsequently worked as a log analyst in the southern United States and held various management positions in Arkansas, Mississippi and Louisiana before moving to Houston headquarters to work on new products including the Modular Dynamic Formation Tester. A stint overseas followed, in Norway then Aberdeen, where Scott was support manager for North Sea operations, and finally Indonesia as operations manager. Scott returned to Calgary as a consultant in 1997.

Growth planned

Scott's appointment as CEO of PPDM demonstrates the Board's intention to raise PPDM's profile both domestically and abroad. The new CEO position replaces the Executive Director post previously jointly occupied by Mel Huszti and Mary Kai Manson. Beaugrand told PDM "The PPDM Association is a solid and technically strong organization with a recognized product and wide support from the industry. My brief is to communicate this to the world-wide E&P community and to grow the organization into the international arena. In the longer term, it is my intention to seek opportunities for further development within and outside the oil and gas business." Beaugrand went on to say "In the current economic climate, the need for cooperative work in the critical area of data management is even more important than ever. The uses of data management, the potential leverage of a standard data model and the PPDM Association's experience are really only just beginning."

Virtual community

The Association is described as a "virtual community" of technical experts in engineering, land contract, finance, and geoscience information. Volunteer participants define the business requirements and technical solutions, making PPDM the "business-driven standard". There are currently 120 member companies, including oil companies, software and data vendors, regulatory agencies, and industry consultants. Expansion of the Model is presently underway in subject areas of stratigraphy, land, spatial data, seismic, and reference tables. More information from the PPDM website at www.ppdm.org or contact the office at (+1) 403-660-7817.



geoLOGIC and M.J.Systems collaborate on Canadian raster well log library. (February 1999)

geoLOGIC Systems Ltd. will be working with M.J.Systems to complete the scanning and conversion of M.J. Systems’ Canadian logs to raster images before the end of 1999.

British Columbia, Saskatchewan and large portions of W5 and W6 are already available as raster images. The present initiative will ensure completion of the Alberta file in 1999. M.J.Systems is already delivering weekly updates of newly released logs and raster images for all western provinces. The aim is to offer full interactivity between M.J.Systems’ LogSleuth program and geoLOGIC’s geoSCOUT, the GIS version of geoLOGIC's software, which claims the second largest market share in Canada.

Continued growth

David Hood of geoLOGIC told PDM "We had a very successful year in 1998 growing from 25 to 60 employees and we moved to new, larger premises in the center of downtown Calgary. Expansion has continued in 1999 despite adverse market conditions and the company will continue to direct one third of total annual expenditure to R&D despite the short term difficulties that the petroleum industry is experiencing". geoLOGIC's proprietary and public data sets include tops, DSTs, core analysis, pipelines, logs (both digital and now, with M.J.Systems, raster), directional well surveys ("the most complete data base of its kind in Canada"), and well locations. geoLOGIC has a base of more than 300 customers in Canada and has begun to service customers in the US. geoLOGIC will expand its services into the US this year, with a move outside of North America scheduled for the year 2000. More from David Hood, Dhood@geologic.com.



Project Synergy; PDM Interviews Andrew Lloyd of Oracle Corp. (February 1999)

Andrew Lloyd, Senior Industry Director of Oracle Corp., describes the intent and scope of Project Synergy in an exclusive interview with PDM.

PDM - We cheekily suggested that Oracle had abandoned E&P when you didn't show up at the New Orleans SEG last year. Had you really left us?

Lloyd - Not at all, we were actually preparing for the Offshore Technology Conference. The upstream is a very important business to Oracle - we estimate our share of the upstream database market at around 84%.

PDM - What were the key drivers behind project Synergy?

Lloyd - With the oil price at a 25 year low and with the proliferation of mergers, both of oils and service companies, there is a greater focus on joint ventures, asset swaps and risk sharing in general. This activity generates a legal and practical need to share data; between companies, between departments and between disciplines. This may be to facilitate data sharing during the handover from subsurface to engineering, or again to make legacy data, locked up in filing cabinets, 'live again' by distributing it to asset teams and between applications.

PDM - And what exactly is novel about the technology?

Lloyd - The key technology in Synergy is Oracle 8i - the new internet-enabled version of the Oracle Object database. Project Synergy will have at its core a brand new implementation of industry standard data models from the Petrotechnical Open Software Corp. (POSC) and POSC/CAESAR. These data models will be encapsulated into Oracle Cartridges which are plug-in, domain-specific object-extensions for the Oracle RDBMS. The two new cartridges will be developed by Oracle's Server Technical Development Group.

PDM - so this would be the long-awaited shrink-wrap version of Epicentre?

Lloyd - Yes indeed.

PDM - And is the intention to have a single 'mega' corporate database, or would Synergy offer Oracle Cartridges to each department - and if so, how will you maintain synchronicity?

Lloyd - The underlying Oracle architecture helps us to be able to support either highly centralized or decentralized business and/or computing models. It is very likely that a large enterprise would wish to have reference data in one place and departmental data in another. This is essentially something all Oracle users get when they buy Oracle8; clearly, this will be useful when we deploy Project Synergy products.

PDM - Who exactly is the targeted end-user - the technologists or managers? Geoscientists or finance?

Lloyd - The targeted end-user of Synergy is, ultimately, everyone in the company! Domains to be encompassed include exploration, drilling, reservoir engineering, production and facilities. In one, or possibly two, repositories there will be all the data from seismics and geology, subsurface data, the Shared Earth Model, commercial data, facilities engineering and production.

PDM - What sort of characteristics will Synergy applications offer?

Lloyd - Applications are to include Decision Support Systems (DSS), Web-based data mining, the fusion of technical data with information in Enterprise Resource Planning applications such as those from SAP, PriceWaterhouseCoopers and Oracle Financials.

PDM - The scope of Synergy is a little unnerving, are you going to do the whole thing all at once?

Lloyd - The initial focus will be on two domain-specific areas and should allow for proof of concept before wider deployment. One area will be facilities, through Statoil's Odegard project; another will be drilling optimization, involving a collaboration with the Mobitech group.

PDM - Epicentre has been criticized in the past for its complexity, and its performance as an operational data store has been questioned. How is Synergy going to address such issues?

Lloyd - Our intention is to provide a practical, efficient implementation of POSC Epicentre which will scale across the largest E&P enterprise whether centralized or not.

PDM - In 'Doing objects with Oracle 8' (see PDM Vol 3 N° 11), Doug Benson explained that Oracle 8's object implementation was a kind of pragmatic subset of OO technology designed to offer a limited OO implementation that would actually work. So is it fair to say that the Epicentre Cartridge will be a compromise between the heavily inherited and recursive design of the Logical model and the 'relational projection' of current implementations?

Lloyd - Before Oracle8i, our technology along with everyone else's could not implement the model as intended. We will be challenging the model where we don't think it makes sense. We are employing POSC on our development and providing feedback on elements of the standard which we believe could be improved. Our intent is to provide a performant, useful implementation which remains as faithful to the model as makes technical and commercial sense. We hope that our implementation will not only prove the value of the model but also provide momentum to drive its future evolution.



New definition of 'Seismic Acquisition' from Veritas (February 1999)

While seismic surveying may be in the doldrums, Houston based Veritas DGC has been busy in the other 'seismic acquisition' business. Veritas first acquired Calgary seismic librarian Time Seismic Exchange in January before signing a letter of intent to acquire Enertec of Houston in a $24 million transaction.

Veritas intends to operate Time Seismic as a separate entity, building the library of non-exclusive 2D and 3D seismic data for customers in Canada. Dave Robson, Chairman and CEO of Veritas said "This acquisition provides the best means for us to integrate Veritas’ operating expertise and comprehensive services with Time Seismic’s proven ability to put attractive data library programs together." Time Seismic owns over 3,000 kilometers of 2D data in the Western Canadian basin and is currently conducting a 690 kilometer 2D program in association with Veritas.

Synergy

Commenting on the second acquisition, of Enertec, currently subject to board approvals, Robson said "Given the current low commodity prices and industry over-capacity, the challenges facing our industry today may be greater than ever before. We need to find better, more efficient ways of delivering services to our customers. Enertec is a quality operation with excellent employees and a strong customer base. Together we can realize significant synergies in delivering the best possible geophysical services to our North American customer base." The letter of intent provides for the exchange of 0.345 Veritas shares for each Enertec share. Following the acquisition, the former Enertec stockholders would own approximately 9.44% of Veritas. Based on current market price, the transaction is valued at approximately US$24 million. The companies presently expect the transaction to close prior to June 30, 1999. More from http://www.veritasdgc.com



Kerr-McGee goes for Oracle Energy Upstream (February 1999)

Energy and chemical company Kerr-McGee Corp. has tied together all of its E&P processes with the Oracle Energy Upstream applications suite.

Oracle Energy Upstream, rolled out to 250 users at Kerr-McGee, is an integrated suite of applications tailored specifically for handling the operational and accounting needs of the oil and gas E&P industry.

The Oracle Energy integrated applications will deliver the following benefits to Kerr-McGee:

quicker assessment of Kerr-McGee's assets,

a single comprehensive information management system for E&P business operations, including exploration, land acquisition, production and accounting,

cost-effective integration of the business processes of merged organizations as the company expands its oil and gas operations worldwide,

seamless integration with Oracle Financials,

integration of third-party tools, such as geological, geophysical and engineering applications.

"The fact that many oil and gas third-party tools run on an Oracle database allows us to bring a strong solution-set to our oil and gas needs." said David Bender, E&P information technology manager at Kerr-McGee. "We were also pleased with the way Oracle worked with us as an integral part of our implementation team, acting as more of a partner than as a software vendor." Oracle Energy comprises upstream solutions, downstream modules for supply, distribution and marketing; and a retail solution for enterprise fuels/merchandise management. Oracle Energy Upstream is currently available in the US. Global availability of the product suite is scheduled with the release of version 5.0 slated for May of 1999.



Courses and conferences (February 1999)

Upcoming E&P IT Conferences

March 23rd - Prism Technologies OpenSpirit SIG Workshop, Houston - gs@prismtechnologies.com
April 11-14 - AAPG Annual Convention, San Antonio - 916 560 2617
April 26-28 - PNEC Petroleum Data Integration, Houston - 214 841 0046, phil_crouse@msn.com
April 29-30 - PPDM Spring Meeting, Houston - Info@ppdm.org
May 6-7 - POSC Member Meeting, Houston - Info@posc.org
May 10-12 - Landmark Worldwide Technology Forum, Houston - Forum@lgc.com, 281 560 1000

