July 1998


Amerada Hess goes for Mincom's Geolog petrophysical suite. (July 1998)

Amerada Hess Corporation will be deploying Mincom's Geolog software in the UK prior to possible world-wide use.

Amerada Hess has selected Australian developer Mincom's Geolog well log management and interpretation software. Geolog will provide a comprehensive suite of tools to support Hess petrophysicists in the analysis of log data from exploration, development and production assets. Company petrophysicists specified a comprehensive set of functional requirements comprising 63 essential and 24 desirable functions. Three products were evaluated, and Geolog came out on top, meeting 62 of the 63 essential and 21 of the 24 desirable criteria. Hess also considered Geolog's maturity, level of continued development, and tools for building interfaces to other products in its business workflows to be key features of the product.

Presence

Brisbane-based Mincom has developed a strong presence in the European market with the Geolog package. In the three years since the UK office was opened, Geolog has become the market leader in the area of wireline log databasing, display and analysis. The software is used by oil companies such as Amoco, BP, Britannia, and Chevron, and service companies such as Robertson Research and Scott Pickford. Reseller agreements have also been struck with CGG-PetroSystems and Smedvig.

Geolog provides a flexible database supporting data management, together with a complete suite of tools for the log analyst. Features include:

A multi-zone, multi-well database.

Support for a wide variety of industry standard contractor log formats.

Interactive graphic log display, manipulation and editing.

Petrophysics, geophysics and cross-plot tools.

Cross-section building and interpretation.

Project management and mapping.

Graphics and image processing.

Links to open product databases.

Geolog is constructed around a central database, with modules plugged in as required. The Geolog environment supports open connectivity, allowing exchange of geological and geophysical data with other vendors' products. Connect is a graphical tool for moving binary data between Geolog and OpenWorks databases without the need for intermediate files. In addition to the Connect module, Geolog also provides generic data exchange through a Geoshare half-link. Geolog will initially be installed in Hess' UK subsidiary, with the potential for a wider deployment at a later date. The UK unit is a major element in the Hess group, producing some 126,500 bopd, 62% of the world-wide total. More information from http://www.mincom.com.



CGG's Stratimagic plugs and plays with Charisma (July 1998)

CGG-Petrosystems' Stratimagic seismic interpretation plug-in now shares data with Schlumberger-GeoQuest's Charisma seismic interpretation system using CORBA-based technology.

Stratimagic was introduced last year as a specialized seismic-stratigraphy-based interpretation plug-in for Landmark's SeisWorks (see PDM Vol. 2 No. 1). CGG-Petrosystems, the software division of Compagnie Generale de Geophysique (CGG), has now announced that the upcoming release of Stratimagic will feature links to GeoQuest's Charisma seismic databases. Stratimagic can now handle data from the two industry-leading seismic interpretation environments in addition to Integral Plus, CGG-PetroSystems' own-brand integrated workstation. "Since your seismic data is already loaded into your interpretation system, it is logical to expect cutting-edge software to have direct access to it, avoiding redundant storage and tedious data reading activities," says Marc Philton, Marketing Manager at CGG-Petrosystems. CGG claims sales of well over 100 Stratimagic licences in the 14 months following product launch. Based on the Sismage technology developed by Elf Aquitaine, Stratimagic is the first commercial software package to offer neural-net seismic facies classification, allowing for "a robust and proven process in the characterization of reservoir property variations".

Orbix

CGG-Petrosystems is developing data access links for its entire interpretation software portfolio, using the industry-leading Orbix product developed by Iona Technologies, which implements the CORBA distributed object standard. This allows direct network access to remote databases with little or no performance penalty, and without resorting to NFS disk mounts. Currently in preparation is a link giving access to GeoQuest's IESX seismic databases. CGG is a sponsor of the OpenSpirit consortium and will adopt the results of that initiative when they become available in a commercial version.
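
The mechanics behind such a link can be pictured as follows. A real deployment defines the interfaces in CORBA IDL and lets an ORB such as Orbix handle marshalling and networking; the sketch below is purely illustrative, using Python's standard xmlrpc modules as a stand-in for an ORB, and every name in it (TraceServer, get_trace, the port number) is our invention, not part of any CGG or GeoQuest API.

```python
# Illustrative stand-in for a CORBA data server: a remote object whose
# methods a client calls over the network, with no intermediate files
# and no NFS mounts. A real deployment would use an ORB (e.g. Orbix)
# with IDL-defined interfaces; xmlrpc is used here only to show the idea.
from xmlrpc.server import SimpleXMLRPCServer

class TraceServer:
    """Hypothetical server exposing traces from a project database."""
    def __init__(self):
        # Toy data store: (line, trace number) -> list of samples.
        self._traces = {(100, 1): [0.0, 0.1, -0.2, 0.05]}

    def get_trace(self, line, trace):
        """Return the samples for one seismic trace."""
        return self._traces.get((line, trace), [])

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
server.register_instance(TraceServer())
server.serve_forever()  # blocks, serving remote calls
```

A client then calls xmlrpc.client.ServerProxy("http://localhost:8000").get_trace(100, 1) and receives the samples directly over the network, with no intermediate file, which is the essence of the CORBA-based links described above.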

pressure

Phil Neri, Stratimagic product manager, told PDM "There are strong market pressures on products such as Stratimagic to inter-operate with other vendors' products as soon as possible. This requires good performance on large datasets, which has made us continue our own efforts to implement CORBA links using our own development resources. This activity is performed by a team that is independent from both the staff that we have placed in the OpenSpirit team, and our shadow group here in Massy that follows OpenSpirit progress". Neri further elaborated on the relationship with the POSC RFT (Request for Technology) by saying "Our CORBA links were developed with a close eye on the POSC model, and are mapped to Epicentre. This was however a standard production software exercise, and did not involve our POSC unit and RFT activity. In a nutshell, this is a development that is independent from OpenSpirit and the POSC RFT activities within Petrosystems". More information from www.cgg.com.



Prime numbers, beans, religion, objects and hype. (July 1998)

Innovations and new discoveries can be presented to the world in a variety of ways. Simply publishing them usually means they will be overlooked, whereas marketing can inflate expectations beyond what can realistically be delivered. PDM's editor Neil McNaughton examines some alternative approaches.

Pythagoras, living about 2,500 years ago and best known today for the theorem concerning the square on the hypotenuse, knew a lot about many other aspects of math. He knew, for instance, that square numbers are the sums of consecutive odd numbers, a fact that I found totally mind-boggling when I read it recently. My mind is easily boggled, I suppose. But what was more interesting was Pythagoras' marketing technique. Rather than publishing, or at least broadcasting his knowledge to the hoi polloi in the Agora, Pythagoras set up a secret society dedicated to the study of numbers. Since numbers had such magical properties, they were venerated, and various rules were imposed on the Pythagoreans.
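
For the equally boggled, the fact is quickly verified: going from n² to (n+1)² adds exactly 2n+1, the next odd number, so the first n odd numbers always sum to n². A three-line check (in Python, our choice of tool, not Pythagoras'):

```python
# n^2 is the sum of the first n odd numbers: 1 + 3 + 5 + ... + (2n - 1),
# because (n + 1)^2 - n^2 = 2n + 1, which is the next odd number.
for n in range(1, 10):
    assert n * n == sum(2 * k - 1 for k in range(1, n + 1))
print("verified for n = 1..9")
```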

Beans means..

One essential requirement, for instance, was that they did not eat beans "because they resemble testicles". The secrecy requirement could be taken as one of the first examples of a non-disclosure agreement (the chap who did 'disclose' the Pythagorean finding that the square root of 2 is irrational was drowned by Mr. P. himself). But the "no-beans" rule is more akin to the workings of the modern marketing person. Nothing to do with the technology, just some floss and spin designed to enhance by mystification. With the increasing focus that the IT community is placing on the Object Paradigm, it is useful to know whether we are in the mainstream of the theory, or staring in horror at the beans. Given that even the most elementary discovery in number theory can be associated with the most extravagant ideological claims, how do we sort out the fact from the fantasy in the great object debate? In case you do not realize the extent of the hype around the new technology, try this enthusiastic "in praise of objects" piece of purple prose: "Smith seeks nothing less than to revise our understanding not only of the machines we build, but also of the world with which they interact. Smith's search [..] ultimately leads to a radical overhaul of our traditional conception of metaphysics". I submit that we are back with the no-beaners here.

OO cuisine?

Over the last few months we have heard industry leaders outline their views on interoperability. POSC's ongoing Interop group is currently analyzing the submissions and will be presenting the results at the New Orleans SEG in September. The SEG conference is also the date for the big roll-out of the Open Spirit specification (closely linked with the POSC effort). In this issue of PDM we hear how CGG is using OO technology today to inter-operate with software from both Landmark and GeoQuest. In fact this latter announcement opens up an interesting question: do we need "official" standards for interoperability? Check out the side box below to see how this has not been the case for Internet telephony. So we await the SEG with considerable interest. We don't expect to find any secret societies, and if we spill the beans (oops..) on something of note, we sure hope not to get drowned by anyone. In my research for this article though, I did notice that beans are conspicuously absent from Cajun cuisine. Makes you wonder..



Interoperability as seen over the fence. (July 1998)

Have you ever wondered how all those telephone companies make their hardware inter-operate? In the emerging field of voice telephony over the Internet, a pressing need for interoperability between two of the major technology providers has led to a quick-fix solution to what might have become a protracted battle.

Writing in the authoritative "Pulver Report", Jeff Pulver reveals that Lucent and VocalTec will make their Internet voice telephony gateways interoperable by September 1998. Pulver further muses that "The only people who get slightly upset over these kinds of announcements are those involved in standards groups who feel that they should be the ones controlling the protocols they use to conduct their business". Pulver suggests that we should look for a battle brewing over "Business Standards vs. Standard Standards" in the future. Those interested in Voice on the 'Net can get the Pulver report from http://www.pulver.com.

<end of side box>



Model-Centric Interpretation for the Oilfield - the SEM unveiled. (July 1998)

A contributed article describing Schlumberger-GeoQuest's vision of the Shared Earth Model, by N. Abusalbi, T. Gehrmann, S. McLellan, and W. Quinlivan, Schlumberger GeoQuest.

The Exploration and Production (E&P) industry has been undergoing a process of re-engineering whereby integrated teams of experts, asset teams, are set up from a variety of geoscientific domains to manage, process, interpret, and derive value from the multi-disciplinary data now regularly being gathered. Equipping such asset teams in today's aggressive market requires both a Shared Earth Model (SEM) that bridges team disciplines and an integrated computing environment with new tools and techniques that allow discipline-specific views into, and staged analysis of, the underlying data. In addition, these new tools must allow geoscientists to build and refine the SEM as they move through the interpretation process, rather than at some late point in the process. We call this approach Model-Centric Interpretation (MCI). The ultimate goal is the construction of a dynamic reservoir model, based on geological, geophysical and petrophysical findings, that matches production history. Its predictions will be used to control existing facilities, as well as to plan and build new production facilities.

invalidation

Every new fact or interpretation must be put into the model context, potentially invalidating parts of the existing model. The tools and applications dealing with reservoir models already make active use of earth models today. RESCUE (REServoir Characterization Using Epicentre), for example, serves as an industry standard for the description of static models. Upstream, in the geological and geophysical (G&G) domain, a model-centric approach is less common. It is hard to press the daily correlation work into the framework of a structural volume model as used downstream. Nevertheless, the products of the G&G domain provide the evidence for the basic shape of reservoir models. Typically, G&G interpretation provides the definition of boundaries, the interfaces between more or less homogeneous volumes. The data entities themselves are defined by standard data models like POSC's Epicentre. The existing standards support the communication and exchange of individual data items among applications and disciplines. When it comes to the assembly of a model in an MCI process, distributed over different working disciplines, today's standards offer only limited support--with the effect that models have to be created over and over, often by dumping data and interpretation "over the wall" to a group tasked with model building but not with interpretation.

challenge

Figure 1 shows a variety of typical asset team tasks which contribute to and use an SEM. The tasks span disciplines and domains. Ideally, the results of one task are available to the next, and vice versa. The communication is done iteratively via an evolving model, or model versions, resident in a data store.

To support the workflow in an asset team, it must be possible to (a minimal code sketch of such a model follows, after Figure 1):

define a plan of which entities constitute the model to be built, before the shapes and properties of those entities are known; example: the list of horizons, layers or faults;

add SEM components as they become available; example: as interpretation proceeds new results are "published" to the model;

tag model components with unique semantics; example: one and the same data set may play different roles in one model where other models require separate data sets;

add relations between model entities and define model building rules; example: define zones and layers between geological interfaces, or truncation rules between faults in a fault system;

keep parallel model versions customized to their purpose but share whatever can be shared; example: a velocity model used to assist in depth converting seismic time interpretation requires careful definition of the overburden, whilst a reservoir model is usually indifferent to the overburden and more focused on the internal structure;

assert model consistency on changes of model components; example: interpretation adjusts the shape of an interface, so the layers bounded by the modified interface have to be adjusted as well;

merge the individual model components into a constrained, normalized form, the ultimate reservoir model; example: a complete model à la RESCUE;

create new model versions or model snapshots; example: alternative interpretations lead to different models, or different methods produce alternative models;

compare similar models and measure changes; example: property variation as derived from time lapse seismic surveys.

Figure 1. Model-Centric Interpretation Tasks
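
None of the above prescribes an implementation, but a minimal sketch may help fix ideas. The Python fragment below is our illustration only, not GeoQuest's design, and every class and field name in it is hypothetical. It shows a model that starts as a bare plan of named features, accepts published components tagged with the role they play, records rules between entities, keeps an audit trail, and versions itself by snapshot:

```python
import copy

class SharedEarthModel:
    """Toy sketch of an SEM: a plan of named features, published
    components with roles, relations and rules, and snapshot versions."""

    def __init__(self, planned_features):
        # (1) The plan: features can be declared before any shape or
        #     property data exists, e.g. horizons, layers, faults.
        self.features = {name: None for name in planned_features}
        self.relations = []   # (rule, entity_a, entity_b) triples
        self.history = []     # simple audit trail

    def publish(self, feature, data, role, author):
        # (2, 3) Add a component as it becomes available, tagged with
        #        the role it plays in *this* model.
        self.features[feature] = {"data": data, "role": role}
        self.history.append((feature, role, author))

    def relate(self, rule, a, b):
        # (4) Rules between entities, e.g. truncation, zonation.
        self.relations.append((rule, a, b))

    def snapshot(self):
        # (8) A model version is a deep copy at a point in time.
        return copy.deepcopy(self)

model = SharedEarthModel(["top_brent", "base_brent", "fault_A"])
model.publish("top_brent", {"twt_ms": [1802, 1810]}, role="horizon",
              author="seismic interpreter")
model.relate("truncates", "fault_A", "top_brent")
version_1 = model.snapshot()  # (5) parallel versions from a common state
```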

Solution

1. Building up an Initial SEM

The SEM, built with a standard such as Epicentre and with a rich repertoire of associations carrying model semantics and usage rules, can meet the above challenges. In the early phase of a reservoir model-building exercise, the G&G domain prefers an open modeling environment. Which surfaces and which layers are relevant is not yet known. The earliest model will consist of just a list of features that seem interpretable and are going to be correlated (Figure 2). In subsequent tasks, more and more detail will be added. Shapes and properties of the model's relevant features are subsequently collected. These are then added to the SEM with associations that clearly define their meaning in the model context. Applications will use the SEM as an information pool. These applications may fulfill their specific input data requirements by consuming the model definitions and the data which have already been associated with the model. The results are added in exchange as new model components, in whatever representation they are produced. The different specialists in the asset team will work in their domains--e.g. the seismic interpreter in travel time and the log analyst in depth--and publish their findings to the SEM in parallel (Figure 3). The SEM keeps a growing collection of shape and property information. These data may be incomplete, contradictory and inconsistent. They will have varying uncertainty, which is recorded in the model context wherever possible.

Figure 2. From Interpretation to SEM

In general, the collected data have to be consolidated into one domain, typically the depth domain. This requires, for example, a consistent and complete 3D velocity model in order to convert seismic travel times to depth. This particular velocity property model is an earth model in itself. Its focus is much broader, including features irrelevant to the reservoir, while it may lack detail in the reservoir zone. It is expected that the velocity model will share some features of the reservoir model.
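
Once such a velocity model exists, the consolidation step itself is straightforward. Purely as an illustration (the function and its made-up numbers are ours), here is a depth conversion through a simple layer-cake velocity model, accumulating interval velocity times one-way time:

```python
def twt_to_depth(twt_s, layer_tops_s, interval_velocities_ms):
    """Convert two-way travel time (s) to depth (m) through a layered
    velocity model. layer_tops_s: two-way time to each layer top,
    starting at 0.0; interval_velocities_ms: layer velocities in m/s."""
    depth = 0.0
    for i, v in enumerate(interval_velocities_ms):
        top = layer_tops_s[i]
        base = layer_tops_s[i + 1] if i + 1 < len(layer_tops_s) else twt_s
        if twt_s <= top:
            break
        dt = min(twt_s, base) - top   # two-way time spent in this layer
        depth += v * dt / 2.0         # one-way time times velocity
    return depth

# Example: water (1500 m/s) down to 0.8 s TWT, then sediments at 2500 m/s.
print(twt_to_depth(1.6, [0.0, 0.8], [1500.0, 2500.0]))  # 600 + 1000 = 1600 m
```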

2. Refining towards a consistent SEM

The reservoir model assembly phase will have to remove the contradictions and inconsistencies. Where components are incompatible, alternative hypotheses are sketched, which may be confirmed by a re-interpretation of the original data. As the model becomes consistent, more and more rules and associations are added.

3. Tracking in an SEM

All elements of an SEM are produced and/or modified by tasks like those described above. As tasks are performed, a history record should be kept so that, for example, a datum in the SEM can be traced back to the task, interpreter, and product that produced or modified it. The complexity of these relationships can be minimized by restricting the number of modifications made to data representations. This is very similar to versioning.
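
Such a history record can be as simple as an append-only list of immutable entries. A minimal illustration, with field names that are our own invention rather than any standard:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ProvenanceRecord:
    """One immutable history entry for an SEM datum."""
    datum_id: str      # which datum was produced or modified
    task: str          # the task that did it
    interpreter: str   # who performed the task
    product: str       # the application used
    timestamp: str     # when it happened (UTC, ISO format)

def record(history, datum_id, task, interpreter, product):
    history.append(ProvenanceRecord(
        datum_id, task, interpreter, product,
        datetime.now(timezone.utc).isoformat()))

history = []
record(history, "top_brent/shape", "horizon picking", "J. Smith", "PickerX 2.1")
# Tracing a datum back: all entries concerning it, oldest first.
trail = [h for h in history if h.datum_id == "top_brent/shape"]
```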

Figure 3. Knowledge Accumulation & Refinement of the SEM

Conclusion

Typical workflows in asset teams require an SEM which supports a process of knowledge accumulation and refinement. Today's standard data models like Epicentre provide building blocks for such a model. The definition of how to associate the individual blocks in an SEM is the next main step. The model semantics and rules must be standardized, a possible POSC project. This is a fundamental requirement for multi-vendor interoperability between applications interacting with the SEM. The SEM is the technical foundation for Model-Centric Interpretation, which offers business groups a real opportunity to reduce interpretation cycle time, increase interpretation accuracy, and yield improved implementation decisions.



New data from Australia's NW Shelf and Norway from Scott Pickford (July 1998)

Corelab subsidiary Scott Pickford has finalized work on two interpreted data sets from Australia's North West Shelf and the Norwegian sector of the North Sea.

The Australia North West Shelf Interpreted Lithology project (ANWIL) is a non-exclusive study focussed on the Bonaparte basin in the Timor Sea. Initial sponsors are BHP, Shell and Woodside. Deliverables include digital lithology files in workstation-ready formats, including depths for the top and base of each unit and all aspects of the lithological interpretation. A total of 212 released wells have been incorporated into the study, which was performed using Scott Pickford's Litholog software. Follow-up projects are planned for other areas including the Carnarvon Basin and Arafura Sea. Another new product from Scott Pickford is the second phase of the Norwegian Facies Knowledge Base (NFK - Phase II). This extends the first phase of the study from 60° to 64° N over the Norwegian continental shelf. Depositional facies interpretation of the section from TD to Top Miocene is delivered in workstation-ready format. Support data and an audit trail of the facies analysis are also available, and the depth interpretation is referenced to seismic two-way travel time. NFK is said to help identify new acreage opportunities and to model new play concepts and stratigraphic trap possibilities. Phase I has already been acquired by nine companies. More info from phils@scopic.com.



Database connectivity technologies compared. (July 1998)

Student dissertation offers good starting point for database connectivity

An investigation into the interconnectivity of Internet and database technologies can be found at http://www.hipstream.forc9.co.uk. These pages offer detailed coverage, with code examples, of HTML + Forms, CGI, ODBC, ISAPI, ActiveX (ADO and Active Server Pages) and other technologies such as IDC (Internet Database Connector), DCOM, Perl, ActiveX Components and Java Applets. Kevin Staunton-Lambert did this work as part of a final year university dissertation at the University of Huddersfield, UK. To benefit from these pages you should have experience with SQL databases and C++. Background to the project, and an impressive technology overview, can be had from the same website, in the subdirectory /dissertation/dissertation.html. Kevin is a keen advocate of ActiveX/COM/DCOM, which he describes as "the way ahead, particularly with large scale business solutions".
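
All of the technologies listed share the same three-step skeleton: read the request parameters, run a SQL query, render the result as HTML. The sketch below shows that skeleton in Python (our choice for illustration; the dissertation's own examples use the technologies named above), with sqlite3 standing in for an ODBC or ADO data source and an invented 'wells' table:

```python
# Minimal sketch of the request -> SQL query -> HTML response pattern
# that underlies CGI, IDC, ISAPI and ASP/ADO alike. sqlite3 stands in
# for the ODBC/ADO back ends; the 'wells' table is invented.
import sqlite3
import html
from urllib.parse import parse_qs

def demo_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE wells (name TEXT, operator TEXT, depth_m REAL)")
    conn.execute("INSERT INTO wells VALUES ('30/6-1', 'DemoOil', 3100.0)")
    return conn

def handle_request(conn, query_string):
    operator = parse_qs(query_string).get("operator", ["%"])[0]
    rows = conn.execute(
        "SELECT name, depth_m FROM wells WHERE operator LIKE ?",
        (operator,)).fetchall()           # parameterized query
    body = "".join(f"<tr><td>{html.escape(n)}</td><td>{d}</td></tr>"
                   for n, d in rows)
    return f"<html><body><table>{body}</table></body></html>"

print(handle_request(demo_db(), "operator=DemoOil"))
```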



CGG Petrosystems' view on interoperability (July 1998)

Continuing the interoperability debate, this month Jean-Paul Marbeau of CGG-Petrosystems presents the view of a medium-sized software developer with a particular interest in plug-and-play capability with the major vendors.

Interoperability has been one of Petrosystems' key priorities since the early 90s. For a mid-size software vendor like us, interoperability has always meant the ability to seamlessly exchange information with the main vendors' systems. This is a major reason why we originally joined POSC in 1991. Since that time, a lot has happened, both in terms of the re-definition of E&P business processes and of emerging I.T. technologies. This has led to a re-definition of the concept of interoperability, and we consider that the E&P technical software industry is currently entering a completely new era. Back in 1991, interoperability was mostly viewed as being achieved through data integration. With Integral Plus, we were actually one of the first companies offering such data integration. However, being based on a proprietary data model, our interpretation software was locked into its own integration. This is why we actively supported the design process of POSC Epicentre, the vision of a single data model adopted by the entire industry. The concept of business objects was still considered a distant prospect, and, with Epicentre, the POSC arena settled for a logical data model concept, half way between object technology and physical data models.

leadership

Seven years later, Epicentre's benefits have turned out to be quite different. It did provide the basis for modern corporate data management, and our leadership position in that domain, with the PetroVision suite of products we developed with PECC, is directly linked to Epicentre's proven industrial strength. However, Epicentre did not work for interoperability, first because the main E&P systems, i.e. GeoQuest and Landmark, did not fully implement it in their applications, but also because the paradigm had somewhat changed. The new organization of the E&P industry around multidisciplinary asset teams has modified the requirements with respect to interoperability. First, seamless access to the main project databases is a must, and will stay that way in the future. Quick convergence of the respective data models is not expected any more. It is a matter of survival for both the value-added software vendors and the technology groups within the oil companies to adapt to this reality. Second, the concept of the Shared Earth Model is now gaining momentum. This model-centric approach to asset workflow basically means that data integration is not enough. Different disciplines and software applications must share common high-level objects. Modifications made by one application must be immediately available to, and consistent with, the others. At this point, interoperability also means sharing context, graphics, and algorithms such as 3D geomodelling tools.

Light SIP

Such interoperability, across project databases, vendors and platforms, cannot be reached simply by using vendor A's or vendor B's Application Programming Interfaces (APIs), which are meant to work only in the proprietary environment of their owner. This is why we have been advocating a "neutral" SIP, and welcomed the first effort to build one, PrismTech's Light-SIP. In the meantime, object-oriented technology has become mainstream. Every significant vendor now states that its new developments are OO-based and written in the C++ or Java development languages. More importantly, the OMG has set the standards for distributed object technology, which, in theory, is the proper framework for the high-level interoperability described above. This is why Petrosystems developed CORBA-based data servers, enabling commercial applications such as Stratimagic to simultaneously access data from OpenWorks, Integral Plus and now GeoFrame project databases. This experience was for us the proof of the pudding, and in 1995 we started the development of GEM-3D, our new object-oriented application integration platform, based on CORBA protocols for accessing data through business objects, and integrating gOcad as our preferred 3D geomodelling tool.

Open Spirit

Finally came Open Spirit and the POSC interoperability RFT. This is for us the most recent leg of the interoperability journey. Technically, Open Spirit brings us two key features. First, it is a neutral Application Integration Platform (AIP), to be shared by a number of software vendors and oil companies. Second, it brings the full strength of distributed business objects, on top of the concept of CORBA-based data servers. This is the main reason that we consider the Open Spirit project as strategic for us, why we are sponsoring it, and why we are contributing two senior programmers to its development team. We expect the deliverables to be on time, first in terms of CORBA servers, by Q1 1999, then in terms of graphics and other more sophisticated distributed services. We expect to rapidly implement such servers in both our data management and our geoscience interpretation applications. We have also verified that the Open Spirit architecture makes it possible for us to keep several important GEM-3D features as framework extensions, such as its highly sophisticated interaction mechanisms and its integration of gOcad. The merger of the two frameworks will take place next year.

POSC RFT

POSC's new focus on interoperability is of course key to the success of this business objects approach. The POSC RFT covers both the I.T. architecture sustaining the interoperability and the definition of the main common business objects. The Open Spirit alliance did respond to the POSC RFT, and whatever the outcome, Open Spirit is meant to endorse and follow the POSC recommendations. POSC's other focus, on the SEM, is consistent with this new trend. However, in this domain, consensus in the industry may be further away, as 3D geomodelling, a fundamental constituent of the SEM, is still considered by several vendors as a competitive advantage, not to be shared. In conclusion, we consider that distributed object technology is the only sensible approach to interoperability in the E&P industry. Starting in 2000, there will be growing availability of applications and "applets" based on this technology. This will gradually lead to a profound re-definition of the E&P technical IT environment, with many newcomers, and possibly with a new definition of what an application is and of how clients will be prepared to pay for it. The technology is ready for this change, and PetroSystems is investing heavily in order to get ready for an early start.



Wavelet Compression aids ArcView GIS raster image display (July 1998)

ESRI is to incorporate the viewing component of LizardTech's MrSID raster image compression technology into ArcView GIS.

A new extension interfaces ESRI's ArcView GIS software with LizardTech's patented, wavelet-based compression and display technology, MrSID (Multi-Resolution Seamless Image Database). The MrSID ArcView GIS extension lets users rapidly decompress and view, within ArcView GIS, massive raster images that have been compressed with MrSID. In addition to the joint development effort for ArcView GIS, both companies have identified strategic areas and customers for further cooperation to adapt and integrate aspects of their respective software technologies. The extension uses MrSID's Selective Decompression feature, which decompresses only the imagery required for the current ArcView GIS display, at the appropriate resolution, thereby saving space and dramatically increasing speed.
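
MrSID itself is patented and proprietary, but the two ideas the extension relies on, multi-resolution decomposition and decoding only what the display needs, can be sketched with the open PyWavelets package. The snippet below is our illustration of the principle, not LizardTech's algorithm:

```python
import numpy as np
import pywt  # PyWavelets, used here only to illustrate the principle

img = np.random.rand(512, 512)  # stand-in for a large raster tile

# Multi-resolution decomposition: three levels of a 2-D wavelet transform.
coeffs = pywt.wavedec2(img, 'db2', level=3)

# Compression: zero out small detail coefficients (the lossy step).
compressed = [coeffs[0]] + [
    tuple(pywt.threshold(d, 0.05, mode='hard') for d in detail)
    for detail in coeffs[1:]
]

# "Selective decompression": the level-3 approximation alone is already a
# roughly 1/8-resolution rendition of the image (up to a constant scale
# factor), usable as a quick preview without decoding the full raster.
preview = coeffs[0]

# Full-resolution reconstruction from the thresholded coefficients.
restored = pywt.waverec2(compressed, 'db2')
```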

In the army now..

Instantaneous, seamless, multi-resolution browsing capabilities are claimed for virtually any size of raster image inside ArcView GIS. ArcView GIS users can now employ this detailed imagery as realistic backdrops using the MrSID ArcView GIS extension. An example of possible compression ratios was supplied by Roger Adams, GIS project manager for the U.S. Army's Integrated Training Area Management Program: "We now require that all imagery be delivered in MrSID format because we can compress data at very high ratios, even as high as 80:1. Recently we compressed over 250 files of 150 MB each (37 GB of imagery) into one single 522 MB file." The actual compression ratio achieved varies depending on image content and color depth, but is generally around 15-20:1 for gray scale and 30-50:1 for full color. More from http://www.esri.com.



Amoco makes world-wide purchase of GeoQuest's OilField Manager (July 1998)

Amoco Corporation and GeoQuest have signed a 'worldwide enhanced supplier agreement' to facilitate the introduction of GeoQuest's OilField Manager (OFM) software application into all Amoco business units.

Amoco has purchased a suite of OFM licenses which will be deployed in asset teams worldwide for use by engineers and technologists in Amoco's E&P divisions. OFM is a production and reservoir management database application for Windows 95/NT-based computing platforms. By standardizing on OFM, Amoco will be able to meet several different production and oilfield data management requirements, resulting in better management of oil and gas assets. "Amoco offers OFM as part of its core engineering application portfolio," says Greg Grotke, technical computing coordinator for Amoco's Technical Computing Solutions group. GeoQuest participated with the Amoco Technical Computing Solutions group in an introductory tour to give the Amoco business units the opportunity to view the core engineering applications.

PA to go

The worldwide tour provided an introduction to the software's capabilities and to how these will assist engineers and technologists in their reservoir assessments. This is being followed by specific technical demonstrations, training and data migration validation before OFM is fully implemented. "We have shared business goals, which include replacing Production Analyst (PA) with OFM and ensuring customer satisfaction," said George Steinke, vice president of Business Development for GeoQuest. After Amoco's $30 million investment last year in Landmark software (PDM Vol. 2 No. 7), the addition of OFM will bring some interesting interoperability issues. It could be that Landmark's adoption of COM/DCOM (PDM Vol. 3 No. 1 and 3) as the basis for its interoperability strategy has opened up a new interoperability route to the Windows environment of OFM.



SAP Selects ESRI as Development Partner (July 1998)

A new partnership is set to bring tight integration of Geographical Information Systems (GIS) and SAP R/3 business software.

ESRI has been selected by SAP as a development partner. As such, ESRI will work closely with SAP on joint development projects aimed at bringing geographic information system (GIS) functionality including spatial analysis and visualization, to various applications making up SAP's R/3 flagship product. "This is exciting work because we are responding to the needs of the market, both for SAP users and for ESRI users," says Jack Dangermond, ESRI's president. "By providing a total information solution, users in small, medium, and large organizations will have the ability to harness spatial analysis within the powerful SAP enterprise environment." Hasso Plattner, SAP's CEO said "We view this as an important step in advancing our software to help businesses get the most out of their software and data. With a large amount of data in corporate databases having a geographic component, accessing spatial information in R/3 will complement and further support a wide variety of business activities, from decision support to supply chain optimization and customer service." Using ESRI components within SAP's new Business Information Warehouse will provide users with the ability to use GIS analysis and visualization tools on information in SAP, such as customer location and sales information. In addition ESRI's ArcView GIS, ArcLogistics, and ArcView Business Analyst software will all have direct interfaces to the R/3 software through SAP's Business Application Programming Interface (BAPI) concept.



The Corporate Front Page (July 1998)

While a low oil price depresses oil company results, the service sector goes from strength to strength. How long can this go on?

Following the strong first quarter reported in last month's PDM, Petroleum Geo-Services (PGS) has now returned what it describes as its best-ever quarterly results. Second quarter net income increased by 41%, revenues by 29%, and operating profit by 52%. PGS has acquired Louisiana-based Acadian Geophysical Services Inc., a 3D seismic acquisition company specializing in transition zone and shallow water areas. The transaction cost was $55 million. A new seismic production record is claimed for PGS' Ramform Challenger, which acquired over 2,100 square kilometers in a 30-day period off the coast of Australia. During the six months ended June 30, 1998, the company invested $163.5 million in its multi-client library, primarily in strategic seismic surveys in the North Sea, Gulf of Mexico and Asia Pacific regions. Schlumberger also reports second quarter net income of $359 million, up by 17% over the same period last year. CEO Euan Baird commented: "The oilfield results remained strong despite the anticipated slowdown in the growth of exploration and production expenditures experienced during the quarter. The uncertainty surrounding the demand for oil will keep our customers cautious about upstream spending, and we are adjusting our operations accordingly." Western Atlas and Baker Hughes have cleared the U.S. antitrust review of their merger and report a second quarter revenue hike of 31% to $547 million, while operating profit was up 59% to $75 million. Meanwhile, Halliburton and Dresser have received clearance for their merger in Canada and from the European Commission. At the present time the companies have other regulatory filings in process with the U.S. Department of Justice and regulatory agencies in certain other countries. The companies expect to complete the merger during the fall of 1998.


