August 1997


POSC object technology ‘success’ as Discovery falters (August 1997)

POSC is ramping up its fledgling Object Technology, but the Discovery project – aimed at a joint POSC/PPDM data model – seems to have hit the rocks.

Speaking at the Object World West Conference in San Francisco last month, Alan Doniger, POSC's technical director, presented POSC's effort in the field of co-operative industry migration as a success story in the application of object technology. The objectives of the POSC initiative are to organize the E&P computing industry's migration to object technology, with the goal of providing interoperability through the re-use and exchangeability of software components. The usual benefits of cost, risk and cycle time reduction were claimed, and the mantra of "core business, buy not build and outsource" recited, before the results of POSC's Interoperability Architecture Group were presented.

RFT

Composed of members from POSC, Chevron, Elf, Schlumberger and Prism, this team is working on a request for technology (RFT) to define and build E&P Business Objects (BO). Despite the "success" claim, it is still early days for this group, which is considering such basic issues as:

Which features should be standardized to facilitate interoperability?

How should the BO interface be designed?

What services should be provided by BOs?

How should the Object Management Group's CORBA specification be implemented?

Clay Harter (Chevron), speaking for the POSC Interoperability team, listed the various levels of compliance with the Business Object paradigm. Level 0 means that applications are required to use the same data store to achieve interoperability, level 1 means that applications can run against different data stores, and so on, with increasingly robust handling of issues such as data integrity and inter-application coordination.

Nirvana?

The ultimate, level 5, implies that users can create "virtual applications" from plug-and-play components. It is interesting to note that the combined efforts of POSC to date have only specified a level 0 compliant model (Epicentre). Some members of the POSC object group have already expressed alarm at the prospect of having to jettison their Epicentre developments, and there is debate as to the extent to which the new POSC BOs will be Epicentre-derived or a radical departure. This is an important issue because, as you climb the levels of object interoperability, the standard data model's role is considerably reduced.
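By way of illustration only (this is our sketch, not the POSC specification, and the interface and class names are hypothetical), the difference between level 0 and level 1 can be seen in a few lines of Java: at level 1 a business object interface hides the data store, so client code neither knows nor cares whether Epicentre, PPDM or flat files sit underneath.

```java
// Hypothetical sketch - NOT the POSC specification, which is still at the RFT stage.
// At level 0 both applications bind to the same Epicentre store; at level 1 the
// interface below hides the store, so a PPDM-backed implementation could be swapped
// in without touching the client application.
import java.util.Arrays;
import java.util.List;

interface WellBusinessObject {
    String uwi();                    // unique well identifier
    double totalDepth();             // measured depth, in metres
    List<String> logCurveNames();    // available log curves
}

class EpicentreBackedWell implements WellBusinessObject {
    private final String uwi;
    EpicentreBackedWell(String uwi) { this.uwi = uwi; }
    public String uwi() { return uwi; }
    public double totalDepth() { return 3050.0; }   // stubbed - a real BO would query the store
    public List<String> logCurveNames() { return Arrays.asList("GR", "RHOB", "NPHI"); }
}
```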

The move to objects is undoubtedly important, but it is too early to talk of "success". Many issues need to be resolved first, such as which object model will prevail - competition between the UNIX CORBA model and Microsoft's ActiveX/DCOM is hotting up, with Software AG (the German software house behind the Adabas database) now offering ActiveX on Solaris. Other UNIX implementations of this technology will be rolling out in the next few months. Another weak aspect of POSC's BO push is the relatively poor attendance in the BO group. Few oil companies are prepared to get their hands dirty with this low-level stuff these days, and most vendors are either working on their own on this technology, or adopting a wait-and-see approach.

Discovery bogs

Meanwhile, at an altogether more prosaic level, Discovery, the collaborative effort between POSC, PPDM and major E&P software vendors plus a few Oil Cos. to come up with a workable, commercial-strength subset of the POSC Epicentre data model, is rumored to have gotten somewhat bogged down. The objective of the Discovery project, initiated in 1995, is to develop a physical relational data model that is a subset of Epicentre. This is not the first time such collaboration has been attempted; POSC and PPDM tried it before in the early 90's and it ended in tears then. Officially, Discovery is still alive and is still intended to make up the next major release (Release 4.0) of the PPDM data model, provided it is adopted by the PPDM membership at the next AGM. Our inside track on project Discovery suggests that this is unlikely. The process involved in "subsetting" Epicentre into a PPDM format has been painful and inconclusive.

Express

The essence of the problem is that Epicentre is defined in Express, a complex and rich language with high-level properties such as inheritance and complex attributes. Mapping to any relational projection is a difficult process, one which stressed the Discovery team to the limit and beyond. PDM understands that Discovery members are not overly keen on getting involved in other projects from hell like this - "total burnout" was how the state of most project members was described to PDM. Whether or not Discovery makes it to PPDM V4.0, one thing is clear. The mapping of Epicentre to the relational plane has proved a Herculean task for some of the best brains in the E&P data modeling business. This means either that we should be taking on some more rocket scientists for our model building, or maybe looking for an easier way of doing business. Objects? We wish!
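A toy example (ours, not drawn from Epicentre) shows why the projection hurts: Express supertypes, subtypes and aggregate attributes have no single natural relational form, and every choice trades joins against NULLs against weak constraints. The sketch below uses Java classes to stand in for Express entities.

```java
// Toy illustration of the Express-to-relational problem - not the Epicentre schema.
// An Express-style hierarchy such as
//   ENTITY activity ABSTRACT SUPERTYPE OF (ONEOF(well_activity, seismic_activity)); ...
// can be projected to tables in several ways, none of them free:
//   1. one table per entity, joined on a shared key (many joins, nullable foreign keys);
//   2. one table per concrete subtype (queries on the supertype become UNIONs);
//   3. one wide table with a type discriminator (many NULLs, weak constraints).
// Aggregate (list-valued) attributes need child tables plus ordering columns on top.
import java.util.List;

abstract class Activity {                 // Express SUPERTYPE
    String id;
    List<String> remarks;                 // Express AGGREGATE attribute -> child table in SQL
}

class WellActivity extends Activity {     // Express SUBTYPE
    String wellId;
}

class SeismicActivity extends Activity {  // Express SUBTYPE
    String surveyId;
}
```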



New Publisher for PDM (August 1997)

Message from publisher Andrew McBarnet

This month's issue of Petroleum Data Manager marks a transition, hopefully transparent to you, the subscriber. However, you should know that from this issue, PDM will be published by The Data Room, the company run by Neil McNaughton, PDM's Editor since we started in July 1996. The change is simply a 'moving on' as far as my company Themedia is concerned. I believe that we have created a very valid publication which meets a genuine need for information and comment on one of the key areas in the petroleum industry today. Our relevance has been reflected by the loyal and growing following which PDM has built up. It is gratifying to know also that each issue stirs discussion and debate, suggesting that we have been 'hitting the spot' with our coverage. It was always PDM's purpose to keep a critical eye on all developments in the field and I believe that we have honored that commitment with forceful viewpoints based on solid argument. Long may this continue ... Needless to say, I have no doubt that Neil is just the man to take PDM forward and enhance its status as a 'must read' every month for anyone with an interest in the future of data management in this industry.



Change of PDM ownership, stocktaking time, and a new mission? (August 1997)

Neil McNaughton reflects on PDM’s past and future from his new perspective as publisher AND editor.

Some corporate news from PDM this month. Our publisher, Andrew McBarnet, whose idea it was, and who launched and supported PDM during its first year, is bowing out to concentrate on his other ventures. PDM's editor is leaping into the breach and will, when the ink has dried on our agreement, be PDM's publisher through his company The Data Room. Otherwise nothing will change; we will still strive to provide our readers with timely news and authoritative opinion in matters relating to E&P data management and computing. We would like to take this opportunity to wish Andrew well in his new ventures.

This seems like a good point to take stock of where we have got to and where we are going. First, how is PDM doing? Well, Andrew's marketing acumen and (of course) PDM's great intrinsic worth have taken us to a subscription count of over 120 after our first year of activity. Subscribers are spread all over the world and include most major oil companies and contractors, with a split of around 50/50 between the client and service sides of the business. We are planning to introduce new services and subscription configurations in the near future, with special rates for multiple subscriptions to the same company and, soon, an Intranet service. We will also be pushing for more coverage (and subscribers!) in the US and Canada, and will be consolidating PDM as a truly international newsletter. Subscribers, contributors and other interested parties are invited to update their records with the information shown on the back page of this issue.

Explosive

Looking back over our first year we have seen an explosive growth of interest in data management, a growth which has spawned a plethora of conferences, quite a few papers and a host of data management oriented offerings from the vendors. Regular readers will have noted, though, that PDM's position vis-à-vis this enthusiasm is that there is more smoke and mirrors than substance in much of what passes for a data management "solution" today, and that the industry has a long way to go before data management becomes what it should really have been all along - a central but more or less transparent part of the E&P workflow.

So you know we are skeptical, and that in our view data management has a long way to go. But some recent papers have led me to believe that the data management problem is actually having a much more pernicious impact on the way we do our business than is apparent from the foregoing. Let me explain.

Processing is interpretation

As a first witness I will call Allen Bertagne who, writing in an editorial sidebar in the July Leading Edge, points out that with 3D data, the "interpretation" process is increasingly being performed in the processing house. Bertagne states that the processor has become the de facto interpreter and that interpreters had better become processors if they want to retain their traditional key role. This view tallies with the observation that whereas in the old 2D days fault alignment in particular required all the interpreter's art, today the faults just pop out of the 3D data. These "obvious" faults will, however, be more or less obvious, or may even be spurious, depending on how the data has been processed. Stacking velocity picks, migration velocities, decon and all the other tweaks that the processor makes do significantly influence the final result - and the map, and the drilling location.

Utopic

My next witness is myself. In last December's Leading Edge I wrote what I now realize was a hopelessly utopic view of integrated interpretation in an article entitled Trends in E&P Data Management. This article (please contact me at The Data Room if you would like a reprint) describes an idealized iterative workflow which vendors, consultants and others like to present as the way things are. In fact I had personal experience of the iterative paradigm and the close-knit asset team when working for a major consulting organization. But the iterations and asset team were centered on a table of mainly paper data, with relatively little help from the workstation. Insofar as I was writing of trends I was not wrong, but when I argued that seismic processing should be integrated into the asset team's workflow I was getting seriously ahead of current practice. This fact was made clear to me in the form of two reality checks, both from BP personnel. David Feinman, talking at the GeoQuest Forum97 (PDM Vol 2 No. 5), described a linear workflow designed to respond to very specific time constraints involved in developing Algerian gas fields. No iteration here. Janet Rigler (BP Exploration Houston), speaking at the PNEC Petroleum Data Integration and Management Conference (see elsewhere in this issue), concluded that existing data management and data delivery technologies are stretched to the limit in supplying a near-linear interpretation paradigm, and that the much vaunted iterative workflow is a myth.

Key role

So, to summarize: we really need to integrate processing into our workflow, both to retain control of the interpretation process and because oilfield development is intrinsically iterative, with new depth and petrophysical data coming in with each new well. But our current technology will not even allow us to iterate the relatively simple part of the workflow, that based on a single data set of stacked seismics. Why can't we do it? (This is where we came in.) Data management, that's why. Until we can manage our data properly, until new well information can be incorporated into interpretative reprocessing, until we can iterate around the structural/reservoir performance prediction route in a timely manner, we will be missing out on the full potential of our data and providing sub-optimal answers to oilfield development. This then is PDM's brief: to enhance awareness of the data management and E&P computing dilemmas with the aim of pinpointing the problems, and while we will shout from the rooftops when vendors come up with credible contributions to their solution, we will also continue our questioning and debunking of some of the more overblown claims of success.



GeoQuest develops Qatar database (August 1997)

GeoQuest has been awarded a $2.5 million contract by Qatar General Petroleum Corporation (QGPC) for a well log database population project.

The contract calls for GeoQuest to work with QGPC, the national oil and gas company of Qatar, to develop an online well log database. The database will hold a 4-year history of data from the three main fields operated by QGPC, including the Dukhan field, the largest and oldest oil field in Qatar. According to Ahmed Sidiqqi, QGPC's petroleum engineering manager, the objective of the well log database project is to have a centralized repository of all relevant petrophysical and geological information. "This database will assist the Petroleum Engineering Department's professionals in the exhaustive and efficient use of the available data, allowing better petrophysical and geological characterization of our reservoirs," said Sidiqqi. "The database will significantly enhance efficiency, thus saving on manpower costs over the long run." GeoQuest was selected because of its experience in handling such projects and its ownership of the GeoFrame and Finder systems, selected by QGPC.

Finder

Starting in October, GeoQuest and QGPC professionals will load data from 700 wells, 6,000 tapes and 185 wells with Repeat Formation Tester information (RFT) into the Finder integrated data management system. The work will include scanning 20,000 feet of film and digitizing 50 million curve feet. Most of the work will be done in Doha, Qatar, by GeoQuest experts in log data, log editing and data management. In addition to populating the database, GeoQuest will support QGPC users as the database goes live. "We are pleased to partner with QGPC to create this important database," said Benoit Barbier, vice president of GeoQuest Middle East. "Establishing long-term technical alliances with our large national clients is one of our major business objectives." Ongoing data management projects include work for the Abu Dhabi National Oil Company.



More on Geoshare Conference (August 1997)

Shortage of space in last month's PDM precluded full coverage of the PNEC Petroleum Data Integration and Management Conference held in Houston in June. In this issue we focus on that rarity - the client side case history.

But first, for our European readers, a new concept destined to warm the cockles of the hearts of all you survivors of 1986. Oil companies in Houston today have warped through the restructuring dimension and are actually paying "retention bonuses" to their cherished G&G personnel. What, you may ask, is a retention bonus? It is a bonus which will be paid at some time in the future - provided you are still with the company. The general feeling amongst those targeted by the retention bonus scheme was that they would rather have the money now! So it could be that these schemes actually serve as a reminder to employees of their worth in the marketplace - a far cry from the situation of recent years.

Outsourcing

Another insight/shock to the system (depending on your sensibilities) came from Cindy Pierce, describing the outsourcing program underway at Conoco, Houston. Conoco's radical approach to outsourcing involves not only outsourcing the task, but the people too, so that full-time Conoco employees are taken on by the contractor (GeoQuest in this case). The rationale behind this is of course that the new-look E&P department no longer considers activities such as data management sufficiently core business to merit maintaining dedicated company personnel. The outsourcing approach is thus designed to facilitate personnel movement into the contractor's employ, while at the same time guaranteeing a certain workload for the contractor and his new employees. Additional rationale for the outsourcing approach came from the realization that the contractor was better placed than the Oil Co. to "infuse new technology" and to offer a more flexible and cost-efficient data delivery service.

Tough world

Unlike similar European efforts in this field, which have been modeled on a phasing-in approach whereby "ownership" of the personnel involved is shared between Oil Co. and contractor (at least for a while), the Conoco technique is an instantaneous and complete move to the contractor's employ. While this may seem radical, as it undoubtedly was to the people involved, it does avoid the usual havering in the form of out-placement, counseling and so on, and must be better than being "let go". It's a tough world though when some may be getting retention bonuses while others are experiencing such upheaval. The results seen from the corporate viewpoint appear satisfactory; Conoco is entering the second year of this three-year contract with GeoQuest. While a lot of the first year was given over to the complex change management involved in this project, benefits are already accruing in the form of improved adaptability of resources and good integration of GeoQuest and Conoco processes. As yet, significant cost reduction has not been achieved, although this is probably explained by improved overall levels of service provision.

Debunking

Janet Rigler (BP Houston) debunked a widely held belief concerning the way the E&P asset team does its business. BP's data management objective in the Gulf of Mexico was to present its asset teams with a shared view of the subsurface. Following a common industry pattern, BP used to have in-house developers working on bespoke solutions - generally with Digital VAX hardware. All that has changed; the VAXes and the developers have gone and BP, like most others, runs vendor applications on Suns and Silicon Graphics boxes. The shared earth model should enable interpreters to iterate through the reservoir characterization process, taking account of new information from reservoir modeling as new wells are drilled. Well, it doesn't work like that. The limiting factor is data management. Because applications are licensed on different machines (seismics on Suns and reservoir modeling on SGIs), data must be moved around constantly. FTP and NFS have been used, but system privileges are hard to manage. Also, since no software shares the same database, audit trails become a problem in this environment. BP sees salvation in the short term in the form of Landmark's Common Access Interface, and in the longer term, possibly with POSC efforts such as RESCUE.

3DKI

A related issue, that of integration through visualization, has been addressed using a development from Western, the 3D Knowledge Integrator (3DKI). This was originally developed by Sierra Geophysical as the 3D Shared Canvas and has now become Landmark's Open Vision product. Jesse Black from BP gave more information on the development of this product in a joint venture between Western, Landmark and BP. The objective was to allow interpreters to access disparate datasets, and the starting point was the observation that it is easier to display data than to prepare data for display. BP was very pleased with this collaborative effort and described the partnership as "win-win-win", although they lost control of the project in 1996 with the onset of IT outsourcing. Amongst the lessons learned during the project were

Data integration can be achieved without reformatting

Vendors have difficulty integrating their own products; do not expect them to integrate their competitors'.

Notwithstanding the trend towards outsourcing and buy not build, there is a place for customized software and integration. This can be achieved through a judicious choice of partner.

3DKI is to be integrated into Landmark's OpenWorks release 4.0.

Going back to the outsourcing and IT performance issue, BP's IT budget was reduced from an annual $400 million in 1990 to $140 million today; simultaneously, they claim to be managing an order of magnitude more data. Some speakers from the floor wondered, however, if the industry was not throwing the baby out with the bathwater with such radical downsizing.

Anderson the Iconoclast

Roger Anderson left the ivory towers of the Lamont-Doherty Earth Observatory to let fly with both barrels at the assembled IT'ers present at the PNEC Data Integration Conference. US Government swords-to-plowshares programs are currently investigating the applicability of the military technology used in The Hunt for Red October to locating un-produced oil in the reservoir, which, Anderson claims, is a relatively trivial task. In an entertaining tour de force, Anderson claimed that submarine tracking generated orders of magnitude more data than even 4D seismics, and that a major part of submarine warfare technology involved data management - in particular, the real-time jettisoning of unwanted data. Introducing the concept of the Orchestration Layer, which can be described as middleware linking real-time acquisition and database servers to the interpretation application level, Anderson went on to speculate as to how real-time inversion, based on continuous 4D seismic monitoring, could be used to control oil well production. Just about everything that is trendy in software engineering will have a role to play in this "Fifth Dimension" oilfield system.

Beanz meanz Java

JavaBeans, Smart Agents - just about everything, that is, except anything that the E&P computing business uses today. On the current industry efforts to control its IT destiny (POSC et cetera) Anderson was particularly scathing, stating that the investment of the major players in the software business is orders of magnitude greater than that available to E&P IT "standardizers". A useful resource for those interested in tracking 4D seismic projects is located at http://www.ldeo.columbia.edu/4D4/the-list.html; there are currently over fifty projects inventoried. (PDM comment - submarine hunting and geophysics have been closely linked since way back; old TI DFS systems used to have a "BATTLE" switch, which was to be thrown, not when the crew went ashore, but to short-circuit the heat protection and allow the system to function till it melted. Likewise, the E&P software business tracks modern software developments rather closely, and is frequently one of the early adopters of new technology. This does of course mean that, collectively, the industry has witnessed many false dawns and that it will take more than saying "Java" three times for all our worries to be over!)



PetroBank revisited (August 1997)

DISKOS, the Norwegian national repository built around IBM's PetroBank, was the focus of an in-depth examination at the July gathering of the Petroleum Exploration Society of Great Britain's Data Management group.

Richard Eastgate from Norsk Hydro described the background to the DISKOS project, which involves a central repository for (ultimately) all data recorded on the Norwegian continental shelf. The data is stored and managed with IBM's PetroBank, which was largely developed for this project. The initial business driver for the project was cost reduction, to be achieved by the sharing of a centralized resource and by the use of one entry point for data. The ambitious nature of the project, the cost of data remastering and clean-up and the high cost of setting up the system have meant that DISKOS member companies have not as yet seen the anticipated cost savings. Other benefits have accrued, however, such as a reduction in physical data storage volumes, while the use of highly standardized formats and QC procedures provides users of the system with consistent data of high quality. Future benefits are anticipated in the shape of an open marketplace for data purchase and trade, which it is hoped will allow for vendor competition via the PetroBank repository. Direct workstation access is provided through a high-security environment linking company Intranets to PetroBank through firewalls.

Client-side

Bernd Lahmeyer from Norske Shell described the client side of the setup. Shell maintain a copy of the PetroBank spatial dataset in-house allowing for map based selection. A 2Mbit/second link to PetroBank allows for rapid downloading of requested data. Bernd described the following "success stories" for Norske Shell -

Easy loading of even tough datasets such as old 2D to Charisma

Timely location of legacy seismics for regional studies

Easy delineation of spec data

Successful loading of 3D data over the network

Statoil initiated the DISKOS project in 1992, but control was soon passed to the Norwegian Petroleum Directorate (NPD), which was perceived as best placed to drum up support from other Norwegian operators and to supervise and manage the project. The first phase, post-stack seismic data, was operational in 1995 when data loading commenced. As Kjell Arne, PetroData's president, explained, the loading of seismic data is subject to very strict QC controls - so strict, in fact, that 60% of the data initially proffered was returned as out of spec. Today data suppliers have cottoned on to DISKOS' requirements and returns are down to 10-15%. DISKOS standards have had a considerable impact on the way seismic standards are enforced in Norway.

Running smooth

Before PetroBank, the way in which the official SEG-Y standard was interpreted, especially for 3D data, was very much up to the contractor involved; Western, CGG and Geco-Prakla all use different flavors of the standard. Things have now changed to the extent that client companies now specify that the PetroBank standard will apply. The process now runs smoothly, to the extent that 12GB of 2D data can be loaded per day, with up to 100GB of 3D. The new standard is not, however, without problems: the format records CDP location in the SEG-Y header in a way which is incompatible with Charisma, necessitating data transfer through Geoshare. The current PetroBank configuration allows for 40 concurrent users accessing 16 terabytes on the robot in Stavanger. Data access allows for a theoretical maximum download speed of 100Mbit/second, representing approximately 10,000 to 15,000 kilometers of line seismic per hour. Initially, raw input data for PetroBank comprised clean SEG-Y seismics, CDP locations and entitlements. Today this is being extended to incorporate other E&P data types such as cultural data, well locations, navigation data, pre-stack SEG-Y data, field seismics in SEG-D format (stored off-line on high-density media), seismic velocities, and composite and trace well logs in LIS/TIF format. Well logs are also being QC'd and cleaned up to provide High Quality Log Data (HQLD) in standardized form.
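The "flavor" problem is easy to illustrate. A SEG-Y trace header is a fixed 240-byte binary record, and the bytes the 1975 standard left unassigned were used by different contractors for different things, CDP coordinates included, so a loader has to be told where to look. The sketch below is ours, with illustrative byte offsets, and shows the kind of configurable header read that PetroBank-style loading and QC imply.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Reads an X,Y coordinate pair from a 240-byte SEG-Y trace header.
// The 1975 standard defines source coordinates at bytes 73-80; where a contractor
// chose to put CDP coordinates (often somewhere in the unassigned bytes from 181 on)
// varies by "flavor", hence the configurable offsets below.
class TraceHeaderReader {
    private final int xOffset;   // 1-based byte position of the 4-byte X coordinate
    private final int yOffset;   // 1-based byte position of the 4-byte Y coordinate

    TraceHeaderReader(int xOffset, int yOffset) {
        this.xOffset = xOffset;
        this.yOffset = yOffset;
    }

    double[] readXY(byte[] header) {
        ByteBuffer buf = ByteBuffer.wrap(header).order(ByteOrder.BIG_ENDIAN);
        int scalar = buf.getShort(70);   // bytes 71-72: coordinate scalar (+multiply, -divide)
        double factor = scalar > 0 ? scalar : (scalar < 0 ? 1.0 / -scalar : 1.0);
        double x = buf.getInt(xOffset - 1) * factor;
        double y = buf.getInt(yOffset - 1) * factor;
        return new double[] { x, y };
    }
}

// Usage: new TraceHeaderReader(181, 185) for one contractor's flavor,
// new TraceHeaderReader(73, 77) for another - same file format, different meaning.
```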

5.5 Terabytes

Today the overall volume of data stored in PetroBank is around 5.5 terabytes. This is accessed daily by 40-50 regular users. These regular users are not end-users, however; PetroBank is a fairly hairy beast and the recommendation is that each member company has a few PetroBank gurus who perform data downloads on behalf of the end users.

The medium-term objectives for PetroBank are to have all seismics recorded on the Norwegian continental shelf on line in stacked form by the end of 1997. Field data is a pilot project today, but will be operational by the year-end. New data types (production data, geophysical logs and physical archival objects) will be incorporated in the PetroBank 2C release, and the network link is currently being upgraded to ATM.

Link to UK?

Arne stated that as well as concentrating on assisting member companies to recoup their investment in the project, PetroData would be working on facilitating virtual workgroups centered on PetroBank, allowing asset teams to be assembled from personnel from different partners in an association. A more long-term objective involves a link-up between PetroBank and the UK-based CDA project. (PDM comment: this link could present an interesting opportunity for real cost saving, given that CDA does wells and PetroBank does seismics. So why not a networked Norwegian/UK database with a suitable division of labor between PetroBank and CDA? This may stretch international cooperation beyond what is really feasible, and would probably be resisted by other vendors and service providers who have been left out in the cold so far.) The discussion livened up when the relative progress made by CDA and PetroBank was compared. The hot topic was the involvement and facilitating role played by the NPD, and why the DTI didn't take over CDA in the same way - which led to a heated debate on the merits of having a state organization running the show. While not resolving any of these weighty matters, the proponents' whistles were suitably wetted before the now traditional trip to the pub.



Sun rolls out new workstation line, hijacks PC bus and ups storage offer (August 1997)

While Microsoft is fooling around taking a minority stake in Apple Computer, Sun is defending its corner with the release of its new flagship Ultra 30 line of super-strength, single-processor workstations.

The Ultra 30 Model 250 and Ultra 30 Model 300 feature new levels of CPU and graphics performance; up to 2MB external processor caches; enhanced UPA memory performance; and 40MB/second UltraSCSI disks. These systems are the first Sun workstations to feature Sun's new multiple-channel 33 and 66MHz PCI I/O buses, borrowed from the PC world and opening the Sun machines to the mass market of low-cost, high-performance peripherals. The Ultra 30 machines are available with a 250MHz or 300MHz UltraSPARC-II processor and support the Solaris operating environment. The Ultra 30 workstations are Sun's highest-performance uniprocessor systems. The Ultra 30 Model 250 with 1MB external cache delivers a SPECint95 (integer) of 10 and a SPECfp95 (floating point) of 14.9, while the Ultra 30 Model 300 with a 2MB external cache delivers a SPECint95 of 12.1 and a SPECfp95 of 18.3. The new machines maintain full binary compatibility with application software - existing applications will run on the new machines with no modification.

Far better than PC

"The new Ultra 30 workstations clearly illustrate the difference between Sun's power workstations and PC workstations, highlighting the fact that a systems company like Sun is far better at creating high performance workstations than PC workstation vendors like Compaq who primarily assemble technology" a bullish Ken Okin, general manager of Sun's Workstation Products division stated. Sun claims data transfer rates to the processor "three times faster than PC workstations" and support up to 2GB of DIMM memory, "far surpassing the memory performance of current PC workstations". The UPA architecture also enables data transfer to the graphics accelerator at up to 800MB/sec., "many times" faster than PCs. Sun's high-capacity memory/graphics system benefits workstation users by accelerating throughput for compute-intensive and graphics or image-oriented tasks. In addition, the Ultra 30's new modular design protects customer investment by enabling users to upgrade to faster processors easily and cost-effectively with a simple module swap.

Creator

Of particular interest to E&P applications is the new generation of Creator Graphics technology, which offers 100 percent compatibility with other current products in the Ultra Creator line, thus protecting customers' investment in applications. Creator Graphics is now significantly faster, supports higher-resolution monitors and offers enhancements for video playback performance. Sun is also announcing a new 24" monitor featuring an enhanced High Definition Television (HDTV) aspect ratio (16:10) which enables users to view over two full-size 8 1/2 x 11 pages at a resolution of 1920 by 1200 pixels. The Ultra 30 workstations support up to two 24" monitors. The new Ultra 30 Model 250 and Model 300 workstations are available for immediate delivery. Quantity one (1) pricing begins at $16,495 for the Model 250 and at $21,495 for the Model 300.

New bus and Gigabit Ethernet announced

The Ultra 30 builds upon Sun's traditional architecture with the addition of support for the industry-standard PCI I/O bus. Sun, known for its technical innovation, has not only embraced this standard but has strengthened it with the introduction of a 64-bit 66MHz PCI I/O bus for sustained high-performance I/O. Sun's SBus, designed in 1989, will continue to be fully supported in its existing interface options and systems, but the way forward for Sun users, and for future products, is now PCI. Another industry first is the Ultra 30 workstations' support of the leading network connectivity solution, Gigabit Ethernet. With the Ultra 30's high-speed internal system architecture and Sun's PCI Gigabit Ethernet network interface card available this August, the Ultra 30 family offers superior network performance compared to existing PCI-based PCs using either 100baseT or Gigabit Ethernet network technologies. An ATM interface is to be added "at a later date".

Storage core business for Sun

Claiming 2 petabytes of RAID storage installed worldwide, Sun is focusing on the near-line storage market with extensions to its existing line of tape backup products using Digital Linear Tape (DLT - see PDM Vol. 2 No. 3). The Sun ETL 7/3500 and the Sun ETL 4/1800 are described as Sun's "most reliable, highest performance and capacity solutions" for backup of applications such as: active-use databases greater than 200 GB in size; archival of important historical, financial or legal data; and Hierarchical Storage Management (HSM). HSM is of interest to the E&P community since it provides an alternative route to storing "awkward" data types such as large blocked seismic data. A single Sun ETL 7/3500 library can back up more than one terabyte (TB) of data in only eight hours in native mode. The libraries offer up to seven TB of capacity with 2:1 compression, accessible via up to seven parallel DLT7000 tape drives, providing a data transfer rate of up to 70 megabytes per second with 2:1 compression. In addition, the libraries reduce the ongoing cost of storage media by providing two million mean cycles between failure (MCBF) and a two-year warranty on all library components to ensure reliable operation. The Sun ETL 7/3500 tape library has an entry price of $85,000 and the Sun ETL 4/1800 library has an entry price of $72,000.
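For what it's worth, the quoted figures hang together if one assumes the DLT7000's nominal 5MB/second native transfer rate (our assumption - the release only quotes the aggregates): seven drives give 35MB/second native, or 70MB/second at 2:1 compression, and 35MB/second sustained over eight hours is almost exactly one terabyte.

```java
// Back-of-envelope check on the Sun ETL 7/3500 figures, assuming a 5 MB/s native
// rate per DLT7000 drive (our assumption; the press release only quotes aggregates).
public class EtlThroughput {
    public static void main(String[] args) {
        double perDriveMBs = 5.0;                      // assumed DLT7000 native rate
        int drives = 7;
        double nativeMBs = perDriveMBs * drives;       // 35 MB/s native
        double compressedMBs = nativeMBs * 2.0;        // 70 MB/s at 2:1 - matches the release
        double tbPer8h = nativeMBs * 8 * 3600 / 1e6;   // ~1.0 TB native in eight hours
        System.out.printf("native %.0f MB/s, 2:1 %.0f MB/s, %.2f TB in 8 hours%n",
                          nativeMBs, compressedMBs, tbPer8h);
    }
}
```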



GeoGraphix announces Prizm V2.0 and future developments for GES97. (August 1997)

This new version of GeoGraphix's flagship log analysis product is both a component of the integrated GeoGraphix Exploration System (GES97) and a stand-alone product.

Prizm users can now

view and pick formation tops and exchange these with GES97

view multiple log tracks, with log or linear scaling, depths, lithologies and core descriptions

perform on-the-fly log analysis

depth shift and edit curves

view results in log, crossplot or report form

generate field summaries from curve data statistics

use any of ten standard interpretation techniques, or customize their own (one classic technique is sketched below this list)
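By way of example of what such a "standard technique" looks like (our illustration, not a description of Prizm's internals), the classic Archie water saturation calculation is only a few lines:

```java
// Classic Archie water-saturation calculation - an example of a standard
// log-interpretation technique. The parameter values are illustrative
// defaults, not Prizm's.
public class Archie {
    // Sw = ((a * Rw) / (phi^m * Rt))^(1/n)
    static double waterSaturation(double rw, double rt, double phi,
                                  double a, double m, double n) {
        return Math.pow((a * rw) / (Math.pow(phi, m) * rt), 1.0 / n);
    }

    public static void main(String[] args) {
        // e.g. Rw = 0.03 ohm-m, Rt = 20 ohm-m, porosity = 18%, a = 1, m = n = 2
        double sw = waterSaturation(0.03, 20.0, 0.18, 1.0, 2.0, 2.0);
        System.out.printf("Sw = %.2f%n", sw);   // ~0.22
    }
}
```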

A filtering capability allows projects containing thousands of wells to be managed: wells selected by a variety of criteria can be viewed, or one-click batch processing performed using the Multi Well Processing feature. This month GeoGraphix will be releasing GES97, which will deliver enhanced deviated well functionality, expanded support for international map projections, open data access to the GES database, a new and improved security system and installation process, and much-improved pathways of integration with its other 16- and 32-bit products, including ResMap and SeisVision. ResMap allows engineering data to be incorporated into geologic interpretations and to be viewed in map form. ResMap 1.0 enables GES WellBase users to augment their well database with monthly production, injection, and pressure information.

Bubble maps

ResMap provides a quick means to view this data in a graph directly from a map and incorporate that information into geologic interpretations, property acquisitions/evaluations, and field operations. Earth scientists will use ResMap to generate basemap display layers using bubble and pie symbols to further integrate production information into the GES mapping environment. ResMap also creates map layers within GES based on the ARIES for Windows™ economic and reserves analysis package from Landmark Graphics. This feature provides the geologist with the means to use an ARIES for Windows Access database directly as the source for production information and/or varying economic calculations as they pertain to each well, especially estimated ultimate recoveries (EUR) and cash flow. GES97 also paves the way for improved integration with Landmark's OpenWorks database. Through a combined Landmark/GeoGraphix effort, a Windows 95/Windows NT utility is under development to allow the easy migration of well data between an OpenWorks project and a GES97 project. Using similar technology, Prizm and SeisVision will also be able to interact directly with GES97 well data.

Enhanced

GES97 enhancements include increased flexibility in its deviated wellbore calculations, with support for wells containing only surface and bottom hole information, S-shaped wellbores, and beyond-horizontal wells. Presentation now provides improved point-and-click navigation while in Well Info, Shotpoint Info, Cross Section Define, or Query Tool modes, and zoom or pan in query mode allows faster access to information from a map. 121 new international map projection systems, including support for South America, Australia, Europe, the Middle East, Africa, and the Philippines, are now standard, and other applications, such as Microsoft Excel and Access, can be attached to the GES data files. This provides even greater data management and reporting flexibility for GES data using industry-standard applications.

The GES installation has been made more flexible for client/server configurations, and an improved security system using a Globetrotter software key and a license file is introduced. This provides greater flexibility in how GeoGraphix can provide secured software to best match clients' needs and configurations.

Real soon now

Looking further ahead, the newest addition to the GES suite of products introduces engineering to the GES Workbench. This fall, GeoGraphix will release ResMap, a "truly integrated mapping and reserve analysis system". ResMap enables you to:

Validate your geological interpretations with pressure, volumetric, and production data

Accelerate acquisition and divestment decisions using the only software suite that integrates geological, engineering, and leasehold data onto a single map

Improve your production and operations decisions by viewing data spatially

Enhance your reservoir understanding by integrating engineering and geology

Monitor enhanced recovery projects such as waterfloods, steam injection or CO2 injection to optimize your profitability

Together with GES, ResMap brings all your reservoir data together in one view. Does your geological data fit your production data? Are there specific production trends developing in your project area? ResMap lets you see your engineering data where it helps...on a map.

Property analysis

With ResMap, you can perform quick property acquisition analyses within GES. Simply select a well from a map and instantly display well and production data. ResMap accesses production data directly from major vendors' CD-ROMs. Post decline curves adjacent to the wells on your map. Quickly view remaining and ultimate reserves, in a multi-slice pie chart displayed on the map. Add a layer from LeaseMap that depicts your lease positions--and your competitors'--and then add a net sand isopach layer from IsoMap. Finally, you can own a system that enables true asset team integration by integrating all engineering, geologic, and lease data on a single map. ResMap "map enables" any numeric value residing in the ResMap Project database, including: net pay, oil saturation, porosity, gross pay interval, daily and monthly production, injection parameters, pressures, EURs, and volumetric parameters. ResMap will also connect directly to Aries for Windows to provide full mapping capabilities of Aries project data.



ER Mapper supplies Microsoft Flight Simulator Group (August 1997)

Those of you who are wont to fritter away your idle hours playing games on your computer may be interested to know that ER Mapper has won a contract to supply Microsoft with its software.

Microsoft will use the software to process images for products such as Flight Simulator. While this may be considered a frivolous item for PDM (well, it is August, the data management silly season!), the specs required by Microsoft make interesting reading.

I/O for all image types

ARC/Info and GIS integration

Vector processing

2D/3D imagery

Mosaicing and subsetting of images

Compositing capability for mosaics

Interpolating between images of different scales

On screen digitizing

Rectification and Georeferencing

World-wide map projections

False color and 3D rendering

This just goes to show that there is a lot of potential synergy between game software and the world of scientific computing. If you doubt this, look at the joint venture between SGI and Nintendo and check out the 3D graphics on your kid's 64-bit GamePerson!


