June 1997


Landmark launches its ‘Finder Killer’ and signs with IBM’s PetroBank (June 1997)

Landmark is showing off its latest product, Open Explorer, an Oracle/ArcView-based development offering desktop access to E&P data.

Picture this: you are the CEO of the Really Big Oil Co. and your latest share float has just brought you a few hundred million dollars to spend on new ventures. Yours is a brave new post-BPR organization with a flat structure, none of that middle management, and you are going to do some real decision making yourself. But where do you start? How do you access all that mission-critical data that your minions play around with on their desktops? Landmark Graphics Corporation believes it can now supply exploration decision-makers with just the kind of all-seeing viewpoint that this type of exercise demands, with the roll-out of the first demonstration versions of what has been described as the Finder Killer. Firing up Open Explorer (on a Unix box or a PC) you are presented with a Dr. Strangelove type view of the world. At will you conjure up, not the locations of the enemy submarine fleet, but a display of the world’s sedimentary basins, permits, even the 220,000 well locations of the demo data set supplied by Petroconsultants. Zoom in and country names appear and other details become visible as the map scale gets bigger. The display is a split screen, with the now familiar combination of map workspace on the right, and a "tree view" display of data types on the left a la Netscape 3.0.

Getting serious

Now you are getting down to some serious work. A database query pulls up production data from neighboring acreage in the form of a bubble map. Once you have selected your area of interest, another query displays open acreage, and even allows neighborhood analysis when searching for potential partners/operators in surrounding blocks. Pop-up windows allow for a wide range of queries, and can be customized. The English-language query builder is more like Access than Oracle Forms; in other words it is reasonably user friendly. Queries can of course be customized, which leads to the obvious question as to how Landmark are going to manage the problem of multiple versions of their product as each company customizes to its own ends. While recognizing that this is a potential (inevitable?) problem, Landmark claim that the customizable part of the product is well separated from the production software and that during upgrade, it should be possible to retain and run existing queries against the newer versions. This problem has snowballed into a major issue for Finder users, as we reported in last month's PDM. After a brief demonstration of a beta version of OpenExplorer, we cannot personally vouch that this interface does everything that you will want it to, but it does at least do what it does in the way you would expect it to. It is, as they say, intuitive, with a click on a well bringing up, for instance, its scout ticket. Currently around 50 forms are available for standard query/data display, with about 100 forms anticipated for when the product is rolled out around the end of the year. Access to external databases is also provided, with PetroBank at the top of the list (see below), but also Robertson Research in a North Sea context, QC Data in Dallas and so on.

Solid

Open Explorer is built with solid building bricks, Oracle and ArcView, so that database access is slick and fast, with large vectors brought to the display in a very snappy manner. Currently the technology behind this involves storing bulk data in Oracle BLOBs (binary large objects), but plans are afoot to incorporate one or both of the new spatial indexing technologies, ESRI's Spatial Database Engine (SDE) and Oracle's "hhcode"-based Spatial Data Option (SDO). These promise better access and more intelligent querying of spatial data. The underlying database is described as a superset of OpenWorks, which is being built up with "POSC" compliant additions such as seismics (from the POSC/PPDM Discovery subset) and reservoir modeling from the SAVE project. So some companies will soon have "POSC compliant" databases from GeoQuest AND Landmark in the same shop, and the senior management who have financed all these initiatives will, legitimately, then ask to see them "plug and play". Well, they won't, so you IT people who got us here had better dream up some good excuses, or think about changing jobs.
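As an aside for the technically minded, the appeal of spatial keys of the "hhcode" variety is easiest to see in a sketch. The fragment below is illustrative Python only, not Landmark's, Oracle's or ESRI's actual implementation: it interleaves the bits of a quantized latitude and longitude into a single sortable key, so that a map-window query becomes a handful of key-range scans on an ordinary index rather than a full-table coordinate comparison.

    # Illustrative sketch only: a Morton/Z-order style spatial key, similar in
    # spirit to interleaved "hhcode" keys, not any vendor's actual algorithm.

    def spatial_key(lon, lat, bits=16):
        """Quantize lon/lat and interleave their bits into one integer key."""
        x = int((lon + 180.0) / 360.0 * ((1 << bits) - 1))
        y = int((lat + 90.0) / 180.0 * ((1 << bits) - 1))
        key = 0
        for i in range(bits):
            key |= ((x >> i) & 1) << (2 * i)       # even bit positions: longitude
            key |= ((y >> i) & 1) << (2 * i + 1)   # odd bit positions: latitude
        return key

    # Wells that are close on the map tend to get numerically close keys, so an
    # ordinary B-tree index on the key column can serve most of a map-window query.
    print(hex(spatial_key(-90.0, 29.5)))   # a Gulf of Mexico well
    print(hex(spatial_key(-90.1, 29.6)))   # a neighbour, nearby key
    print(hex(spatial_key(2.3, 48.9)))     # Paris, far away key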

Killer?

Will Open Explorer "kill" Finder? An old IT joke used to go "How come it only took God seven days to make the world?" – the answer – "No installed base". Indeed, starting from scratch makes the developer's task a lot easier, but not the salesman's. GeoQuest recently claimed 50 sales per month, and that is a hard act to follow. One thing is for sure, Finder will probably get better with the arrival of Open Explorer, which itself will certainly win over predominantly Landmark shops and, if it lives up to its initial promise, many of those others who have not yet made up their minds. So now that Landmark has a serious GIS front end for OpenWorks, how does their offer shape up in the corporate data store marketplace? In what amounts to a "belt and braces" approach, Landmark has simultaneously signed with IBM to become what they term a PetroBank "reseller".



Facelift for ArcView GIS front-end (June 1997)

ESRI’s ArcView, the GIS front end used by Petroconsultants (see last month's PDM) and by Landmark in OpenExplorer, is getting a facelift with the release of a suite of extensions.

The ArcView Database Access Extension is described as a major update to the current Database Themes extension. The Database Access extension provides a single, consistent interface to data stored in either Spatial Database Engine (SDE) Version 3.0 or Open Database Connectivity (ODBC) compliant databases. Users will now be able to make standard SQL queries on any SDE database without the overhead of creating local copies of the query result. Data can also be retrieved as a result of joining several tables in the database together, and users can now access data from multiple databases at the same time. The ArcView Database Access extension is also specifically designed to leverage the features provided by SDE Version 3.0. The ArcView Dialog Designer extension provides Avenue developers with an interactive, easy-to-use dialog design tool for enhancing or customizing the ArcView GIS user interface. This fully integrated cross-platform extension provides a rich set of interface tools for building data input forms, custom tool palettes, and other sophisticated application dialogs that can include selectable scrolling lists, drop-down lists, radio buttons, and other common interface components. Other extensions include ArcPress, a graphics metafile rasterizer; ArcView 3D Analyst, which will provide a suite of tools for the creation, analysis, and display of surface data; and the ArcView Internet Map Server, described as an out-of-the-box map publishing tool for the World Wide Web.
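To give a feel for the kind of access this opens up, the sketch below runs the sort of multi-table SQL join that the Database Access extension now passes straight to the server, returning only the result set rather than a local copy of the tables. It is a hypothetical illustration in Python using the generic pyodbc module, not ESRI's Avenue API, and the DSN, table and column names are invented.

    # Hypothetical illustration of a server-side join of the kind the ArcView
    # Database Access extension issues; DSN, table and column names are invented.
    import pyodbc

    conn = pyodbc.connect("DSN=EPDATA;UID=scott;PWD=tiger")
    cursor = conn.cursor()

    # Join wells to their production records on the server and bring back
    # only the rows of interest: no local copy of either table is made.
    cursor.execute("""
        SELECT w.well_name, w.latitude, w.longitude, p.cum_oil_bbl
        FROM   wells w
        JOIN   production p ON p.well_id = w.well_id
        WHERE  p.cum_oil_bbl > 1000000
    """)

    for well_name, lat, lon, cum_oil in cursor.fetchall():
        print(f"{well_name}: {cum_oil:,.0f} bbl at ({lat:.3f}, {lon:.3f})")

    conn.close()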



Why don’t contractors supply data in workstation formats? (June 1997)

PDM Editor Neil McNaughton relates the recent EAGE meeting in Geneva and discusses the ‘cost of complexity’ – having presented a paper on this very topic himself.

I saw an ad in a magazine many years ago for a get-rich-quick scheme or some such. The wording of the ad suggested that the idea was being proffered as a service to humanity; nonetheless, a $5 charge was being levied to "eliminate the frivolous". Well, holding a convention in Geneva is a pretty good way of eliminating the "frivolous", in the form of the job-seekers, the retired and even the smaller stall-holders. Worse still, none of my regular hospitality-hunting drinking partners could afford to come. Only 2500 attended the annual conference of the European Association of Geoscientists and Engineers (erstwhile EAEG), and even the "big" stall-holders, the I/Os, Geco-Praklas etc., had what seemed like miniature versions of their usual padded emporia. While vendors reported fewer contacts than in recent years, they noted higher quality contacts. The frivolous were gone, and the E&P big spenders were back!

Mismatch

As in previous years, there is an impedance mismatch between the conference itself and the exhibition floor - as far as data management goes at least. Our subject of predilection is all but ignored by the main conference, but occupies the front of stage at the exhibition. As one of the couple of foolhardy individuals who did actually present a paper on data management, I would like to share some of my ramblings with you. What is data management? We tend to see data management in terms of applications, but behind each application there are users, and each of them may well have a very different idea of what makes up important data. Thus data management to an IT professional is all disk space, software licenses and network bandwidth. To his colleague the database specialist, it is all about clean data and database integrity.

Holy Grail

To seismologists, it may look more like formats and data loading, while geologists may just want to know where their core is. It is interesting to note that if you attend one of the very many specialized data management conferences which are springing up everywhere, then data management comes very near to being defined as software interoperability, with the desire to move data seamlessly around what has been described as the "whitespace" between these applications, using "best of breed" applications from different vendors. While this objective has become something of a Holy Grail of data management, we are today still very far from achieving it, and we will start by looking at why. We are children of history in E&P as everywhere else, and what we have today reflects our past as much as any idealized present we might wish we lived in.

Past is key?

Our past, in E&P computing terms, is very much influenced by acquisition. Data formats from the seismic and logging industries have been designed to acquire data in the field, and to do so in as efficient a manner as possible. This has led to some highly evolved formats which are complex in the extreme, and which are frequently customized by vendors and major oil companies, so that while they may be good ways of writing lots of data to tape, the trade-off is that managing them can be difficult and costly. To offer a simple but telling example of how this arises, consider multiplexed field seismic data. Data was recorded multiplexed for performance, but this is not a good data management format; indeed, demultiplexing some of the older legacy seismics is a non-trivial task today.
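For readers who have never had to do it, the core of demultiplexing is conceptually just a transpose: field tapes were written one scan (one time sample across all channels) at a time, while the workstation wants one trace (all time samples for one channel) at a time. A toy sketch in Python, assuming the samples have already been decoded into an array; in practice the hard part is decoding the legacy bit formats and gain words, not the transpose itself.

    import numpy as np

    # Toy demultiplex: field data arrives scan by scan, i.e. one row per time
    # sample holding every channel's value for that instant.
    n_samples, n_channels = 1000, 96
    multiplexed = np.random.randn(n_samples, n_channels).astype(np.float32)

    # Trace-sequential (demultiplexed) order: one row per channel, with all of
    # its time samples contiguous - what a workstation or SEG-Y writer expects.
    demultiplexed = multiplexed.T.copy()

    assert demultiplexed.shape == (n_channels, n_samples)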

Complexity

A more modern example of complexity for performance is to be found in the family of formats based on the API RP66 specification, which we will look at next. The specification has its origins in the Schlumberger wireline DLIS format, which was offered up to the API as a general-purpose data exchange format, became Recommended Practice 66, and has since spread out into areas such as POSC, RODE and Geoshare. Using these complex formats we can, with difficulty, move data from one medium to another, encapsulate different objects on the same support, and do a whole lot of clever things. But there is one thing which we can't do very well, and that is get our 3D surveys from the acquisition contractor, or trade partner, quickly onto the workstation. This leads to my first question for the industry - why don’t contractors supply data in workstation formats?

Martian viewpoint?

If you had arrived from Mars, or were just reborn as a business process consultant, this would surely be the first thing that would hit you in the face as something to fix right away. Of course, many people and organizations have been working at related formatting and data exchange problems, but not perhaps with a real focus on this particular issue. Today, what we seem to be good at is recording data in the field, putting it into robots or on shelves, and preserving it. Using the data somehow got overlooked. The accompanying figure introduces a rather dubious pseudo-metric of the cost and complexity of data management solutions. The graph shows a rather exaggerated interpretation of the cost of managing the different formats and data models which have been proposed. I make no apologies for the absence of scales and units, and even the ranking is subjective.

Free lunch?

This graph is just designed to underscore that in data management, as elsewhere, there is no such thing as a free lunch. Of course we have complex structures for a reason, generally performance or flexibility. But if the performance gain is in the field, or in a one-off transfer of data, then there may be no benefit in keeping the data in the same format through the rest of its lifecycle. The ranking of the different groups of formats is subjective, but if you believe that it has any value, it is interesting to sketch out some other projections of the cost space such as portability, ease of loading or application performance – all plotted against the cost of management. This is an even more subjective exercise so I will leave it to you to reflect on, but if you sketch out, for instance, the graph of performance versus complexity there are some apparent bad buys around. If you reflect on the difficulty we have today loading and maintaining clean data in the database, or the black magic involved in carving up a 3D dataset for a trade, you will appreciate that if the future holds a multiplicity of different RODE implementations, or relational projections of Epicentre, then things are going to be even more difficult to manage.

Rocket science

What is important to remember here too is that the people involved in managing data are often IT professionals without years of experience of the seismic industry, or staff previously involved in drawing office functions who may actually have the experience, but not necessarily be prepared to write a Unix shell script to facilitate data loading. If you want to run your data management department with rocket scientists you may; it is up to you, but you will pay the cost of this complexity. Another example of possible excessive complexity can be observed in the current techniques of database deployment. Pretty well everywhere you will see the division of labor between the corporate database or data store, and the project databases. This again adds a level of complexity to the system, which may or may not be justified.

Tiers of joy?

Just to clear up a common misconception, this tiering, which comes from the commercial database world, is not a "natural" or essential way of organizing data in other industries. It is done this way simply to avoid a heavy-duty SQL query effectively stopping all the ATM machines linked to the bank from functioning. In other words it is a compromise. In the E&P world we do not have any transactions, so why compromise? Other arguments have been advanced for multi (and sometimes very many) tiered deployment, such as the need to be able to change data without "corrupting" the data store, or maintaining multiple values for an attribute. This may or may not be a real issue; personally I would rather see one "correct" attribute propagated throughout the database as the result of an interpretation. In this context, it is interesting to see how Landmark position OpenExplorer. It can be used at the project database level, but equally can be used to assemble projects directly from the corporate datastore into the workstation. No middle tier at all, and a lot simpler.

SEG-Y revamp

I make no apologies for returning to a topic we touched upon in last month's PDM: the PESGB's initiative to revamp the SEG-Y standard. SEG-Y is the nearest thing we have to a workstation-ready format, but an aging specification and a multiplicity of implementations mean that it badly needs revitalizing. Over the last couple of years the SEG has been trying to reactivate the SEG-Y standards subcommittee with no success, probably because it is not a sufficiently glamorous topic. The PESGB effort is therefore timely, and the proposed link-up with the work done on PetroBank and the NPD should ensure a quick start. Maybe we will have workstation-ready data from our suppliers some time soon.
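For the curious, the "workstation ready" part of SEG-Y is not complicated: after a 3200-byte EBCDIC card-image header comes a 400-byte binary header whose key fields sit at fixed byte offsets. A minimal sketch in Python (standard library only, hypothetical file name) of pulling out the fields that most often trip up data loaders when implementations diverge:

    import struct

    def read_segy_binary_header(path):
        """Read a few key fields from a SEG-Y (rev 0) binary file header."""
        with open(path, "rb") as f:
            f.seek(3200)            # skip the 3200-byte EBCDIC textual header
            binhdr = f.read(400)    # the 400-byte binary header follows
        # Big-endian 16-bit fields at their rev 0 byte positions in the file
        sample_interval_us = struct.unpack(">h", binhdr[16:18])[0]  # bytes 3217-3218
        samples_per_trace = struct.unpack(">h", binhdr[20:22])[0]   # bytes 3221-3222
        format_code = struct.unpack(">h", binhdr[24:26])[0]         # bytes 3225-3226, 1 = IBM float
        return sample_interval_us, samples_per_trace, format_code

    # Usage (hypothetical file name):
    # si, ns, fmt = read_segy_binary_header("line_001.sgy")
    # print(f"{si} us sampling, {ns} samples/trace, data format code {fmt}")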



PetroBank ‘reseller’ role for Landmark (June 1997)

Landmark’s alliance with IBM extends offering to Data Bank arena.

Hitherto, Landmark has been conspicuous by its absence in the deployment of national and large oil company databanks. With an OpenExplorer view of, simultaneously, a PetroBank bulk data store and an OpenWorks database, this situation is set to change.

VAR

Landmark Graphics Corporation has just announced that it has expanded its business relationship with IBM Corporation to include a value-added reseller (VAR) agreement for IBM PetroBank Master Data Store. Customers can now look to Landmark to provide "an integrated E&P information solution from desktop applications through archived data management". "Data has become the new currency of exploration and production, and companies around the world need a flexible and scaleable information environment that doesn't require them to replicate data into a specified format or into a single repository," said John Gibson, executive vice president of Landmark's Integrated Products Group. "We are pleased to announce this new agreement with IBM that provides our customers with even more choices so they have the most usable and appropriate solution to meet their specific business needs." Chip Nemesi, general manager of IBM's Process and Petroleum Industry Solutions, said, "Customers tell us every day that their major exploration and production data issues are the challenges of accessing, sharing and analyzing data in a timely fashion. The linkage of Landmark's information management solutions with IBM PetroBank Master Data Store provides the broadest level of exploration and production data management functionality available on the market today." Landmark's information management solution provides "extensive flexibility for creating networked environments that can merge data from multiple sources and formats". This approach "provides the flexibility to develop a scaleable solution that leverages existing as well as new data sources".

Repackaging

Now of course this marriage will involve some repackaging, probably on both sides of the equation. After all, the OpenExplorer/OpenWorks combination is being marketed as a data delivery system covering the corporate to project levels of access. Landmark categorize OpenExplorer as their new "regional or enterprise data management system that is designed to support distributed asset teams with an open, scaleable and flexible environment to meet specific business needs throughout exploration and production", while PetroBank is described in the same breath as "an exploration and production technical data management solution for archiving data at the corporate, regional or national level with highly secure data access, retrieval and delivery capabilities".

ArcView

Spot the difference – both OpenExplorer and PetroBank offer the end user an ArcView-based GIS front end to the corporate datastore! While the announcement of the PetroBank/VAR agreement does offer Landmark a fast track to the regional databank marketplace, it does take the wind out of the sails of OpenExplorer. In our humble opinion, it would have been "cleaner" to position OpenExplorer as Landmark's corporate to project data management system, rather than to cloud the issue with the PetroBank hook-up. After all, despite all the claims about POSC compliance, Epicentre-based data models and all that, PetroBank is based on a very early release of Epicentre, while the seismic part of OpenWorks will be based on the Discovery subset if and when that appears. In other words, they will not be compatible, and an E&P shop will have two different databases, using different data models, to maintain and understand. Of course, a link-up between OpenExplorer and a regional PetroBank extra muros would be a very useful extension to the vision offered to the explorer.



IBM and Geco-Prakla stress importance of Inter/Intranets and Connectivity (June 1997)

In separate presentations, Alan Bays (IBM) and John Kingston (Geco-Prakla) analyzed the way in which communication technology is revolutionizing business practices.

Bays pointed out the paradox of how a consumer-driven technology (the Internet) is in many cases outstripping business infrastructures and technology. This kind of consumer-led activity is of particular importance in developing countries where, for instance, cellular phone technology is bypassing the aging telecommunications infrastructures, allowing a kind of technological leapfrogging. These technologies are also allowing the formation and growth of smaller "virtual" organizations, and Bays claimed that many smaller organizations have better IT than their larger cousins. This may seem a strange view coming from IBM, but it is one that our white-hot technologists at PDM wholeheartedly share. Bays estimates that by the year 2000 there may be as many as 100 million computers connected to the Internet. IBM are investing heavily in this technology with PetroConnect, an electronic commerce, GIS-based access tool for a range of third party vendors' data. You can check it out at http://www.petroconnect.ihost.com. It is an attractive tool, but as of now does not seem to offer a great deal more than you can find on the web elsewhere (at least not in the parts of the world that we pointed and clicked on). PetroConnect's success will of course depend on the extent to which third parties populate it with their data. This in turn is more likely to revolve around the charges that IBM levies on the third parties, rather than the $19/month that PetroConnect charges per user. Most international oil companies should be able to stump up that much without a rights issue.

High bandwidth

Kingston's view of the web and of high-bandwidth communication links was of a more seismic bent, and illustrated how these technologies are being used to pump very large seismic datasets around. Currently bandwidth of around 2 Mbps is achievable in a ship-to-shore environment, with up to 2.2 Gbps available inter-city. To put this into context, a 500 square kilometer prestack seismic survey can be transmitted overnight at a modest 88 Mbps, with data compression accelerating this even more. Putting this into practice, a recent survey conducted in Croatia was transmitted variously to Norway, Italy and the UK. This saved the operator 3 months in the seismic operations, and brought production forward by an estimated 5 months. Another client operating an offshore survey put all faxes, reports and other documentation onto a server on the boat, which was then mirrored by satellite to a remote site. Using a web browser, these documents could be imported into local applications, and fold diagrams, QA, navigation data and the like were then all available in the processing house. Looking into the future, but not that far, Kingston anticipates 8 Mbps for mobile remote stations in a couple of years, allowing for compressed data transmission in real time, while full prestack uncompressed datasets will have to wait for 5 or 6 years – that assumes of course that the volumes involved do not increase as quickly as transmission rates.
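The arithmetic behind the overnight transmission figure is worth a moment. A quick sketch, assuming a 12-hour transmission window (our assumption, not a figure Kingston quoted):

    # Back-of-envelope check of the overnight transmission claim.
    link_mbps = 88          # quoted sustained line rate, megabits per second
    window_hours = 12       # assumed "overnight" window - our assumption

    bits_transferred = link_mbps * 1e6 * window_hours * 3600
    gigabytes = bits_transferred / 8 / 1e9
    print(f"~{gigabytes:.0f} GB moved overnight")   # roughly 475 GB before compression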



Book review – Advanced Storage, requirements and capabilities by Mark and Linda Kempster (June 1997)

This white paper by Mark and Linda Kempster published by the Association for Information and Image Management (AIIM) is a remarkable compilation of information on the state of the art in storage media.

Setting the scene, the Kempsters forecast that over the next decade or so, significant cost savings will be made by storing data digitally near-line rather than on paper. This in turn will mean that storage volumes, which so far have been doubling every year, will probably accelerate to an increase of an order of magnitude every four years. This leads to an estimate of worldwide digital data stored in the year 2000 of around 600-1000 petabytes. On the demand side, datawarehousing technologies will allow users access to these huge data volumes, perhaps stored in multiple disparate libraries, managed by carefully designed metadata systems. The applicability and cost effectiveness of tape, robotics, optical media, magnetic hard disk and RAID systems are evaluated, and a chapter on "revolutionary storage media" explores the ultimate physico-chemical limits to data storage. While today's state-of-the-art technologies allow storage densities of around 125 megabytes per square inch (MB/sq. in.), in the future ion-etching technologies will allow storage of around 400 GB/sq. in. (by the year 2002). The theoretical upper limit for this technology is estimated at around 1.4 TB/sq. in. – a whole 3D survey on a postage stamp! Read speeds are estimated at a healthy 2 GB/sec – although this may prove to be a bit skimpy when refreshing the next century's true-color gigapixel 3D display. But do not get the impression that this is some kind of futurist report: all of today's technology is rigorously examined and catalogued.
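The postage stamp quip is easily checked. A sketch, assuming a roughly one-square-inch stamp and a post-stack survey of a few hundred gigabytes (our assumptions, not the Kempsters' figures):

    # Rough check of "a whole 3D survey on a postage stamp".
    density_tb_per_sqin = 1.4      # theoretical upper limit quoted in the review
    stamp_area_sqin = 1.0          # assumed stamp area
    survey_gb = 300.0              # assumed post-stack 3D survey size (late-1990s scale)

    capacity_gb = density_tb_per_sqin * 1000 * stamp_area_sqin
    print(f"Stamp holds ~{capacity_gb:.0f} GB, i.e. ~{capacity_gb / survey_gb:.0f} such surveys")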

On-chip catalogue

From 4mm through 34xx, 8mm, Exabyte, 3590, DTF, QIC, VHS, DLT, optical and 19mm – and that's just the tape. Other media families (optical disk, magnetic hard disk and "revolutionary") are treated with equal thoroughness. New technologies of off-tape cataloguing are also described, with IBM supplying catalogues on diskette for the 3590, and Sony going one better with a catalogue on a chip outside the tape, readable by the robotics. Estimates of the capabilities and anticipated availability of new media and formats are included – useful information for data managers planning a petabyte data store. Well researched and clearly written, this study includes useful World Wide Web references allowing the reader to track developments in this fast-moving field. Do not look here, however, for any E&P-specific information on the suitability of the different media – that you'll have to figure out for yourselves – but this book does give you the background for an informed decision. Advanced Storage – requirements and capabilities (© 1997 AIIM, 70 pages, ISBN 0-89258-314-2) is published by AIIM. More information can be obtained from AIIM, tel. (1) 301 587 8202, fax (1) 310 587 2711, email aiim@aiim.org, http://www.aiim.org.



New release of Panther SDMS (June 1997)

Panther have announced a new release – version 2.0 of their Seismic Data Management System (SDMS).

SDMS Version 2.0 includes new functionality to manage native SeisWorks data in addition to SeisX, SEG-Y and GeoQuest (output). SDMS V2.0 focuses on data types primarily used at the workstation – clean online 2D and 3D post-stack seismic trace and navigation data. Tools are supplied to track, query and browse data (using the SeisView component), and to access the data for project building. Tools for seismic trace QC are integrated into SeisView, allowing for the analysis of amplitude and phase, histogram plots, trace plots and examination of SEG-Y header information. Other functions include an inventory of duplicate versions of the same line/survey, statistics on on-line data volumes and usage, one-time data loading, offsite data access and drag-and-drop line selection. Finally, SDMS is customizable to individual site requirements.
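The kind of trace QC being described is straightforward to picture. A sketch in illustrative Python/NumPy (not Panther's SDMS code, and using a synthetic trace) of the amplitude histogram and instantaneous phase checks one might run on a post-stack trace:

    import numpy as np
    from scipy.signal import hilbert

    # Illustrative trace QC of the kind SeisView offers; synthetic trace here.
    trace = np.random.randn(1500).astype(np.float32)   # stand-in for one post-stack trace

    # Amplitude statistics and a histogram - dead or clipped traces stand out.
    print("rms amplitude:", np.sqrt(np.mean(trace ** 2)))
    counts, bin_edges = np.histogram(trace, bins=50)

    # Instantaneous phase via the analytic signal - useful for spotting
    # polarity flips or processing artefacts between reels.
    analytic = hilbert(trace)
    inst_phase_deg = np.degrees(np.angle(analytic))
    print("mean instantaneous phase:", inst_phase_deg.mean())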



Kelman – new European pitch and Gulf Resources Canada DM deal (June 1997)

Kelman Technologies out of Calgary, Alberta have just announced that they have signed with Gulf Resources Canada for the remastering and management of Gulf's legacy seismic dataset.

The contract involves the remastering of Gulf's 160,000 legacy tapes (corresponding to over a million km of seismics) to an STK/Magstar tape store running under IBM's hierarchical storage management system, ADSM. Simultaneously, Kelman made their first European conference pitch at the Geneva EAGE, demonstrating their data management technology. Kelman is first and foremost a seismic processing company, and as such knows a thing or two about managing seismic data. With their Kelman Archives division, they have packaged their remastering and data management know-how into an archive solution with what they term the "Virtual Tape Device" as its centerpiece. Other solutions to the remastering of legacy data involve either the transcription of legacy formats to SEG-Y, or the encapsulation, using one of the emerging RODE encapsulation schemes, of the old format onto a modern high-density medium. Kelman is circumspect about the merits of RODE, in view of its multiple implementations and complexity (see the editorial in this issue), and has come up with its own solution to the problem.

ADSM

This involves the encapsulation of the legacy tape format onto a hierarchical storage management system such as IBM's Adstar Distributed Storage Manager (ADSM). Rather than reformatting legacy data, the Kelman process captures everything that was on the original tape, including end-of-file marks, checksum information and tape status information. The resulting encapsulated data is stored under ADSM on either tape or disk – which physical medium is used is actually irrelevant with this technology. When the data is required, decoding software returns the data to the calling application just as if the original tape drive were spinning the tape. Kelman claim to have "broken the boundary between tape and disk", and a large seismic dataset can be processed without a single tape actually being mounted.
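While Kelman's own format is proprietary, the principle of encapsulation is simple enough to sketch: each physical tape record is wrapped with enough metadata (length, checksum, file-mark flags) that the original tape image can be reproduced bit-for-bit later. A hypothetical illustration in Python, with an invented container layout and file name; it is not Kelman's format or RODE.

    import struct
    import zlib

    # Hypothetical encapsulation of a tape image: each record is written with a
    # small header so the original sequence of blocks and file marks can be
    # reconstructed exactly. An illustration only, not Kelman's format or RODE.

    RECORD = 0
    FILEMARK = 1

    def write_record(out, kind, payload=b""):
        header = struct.pack(">BII", kind, len(payload), zlib.crc32(payload) & 0xFFFFFFFF)
        out.write(header + payload)

    def read_records(inp):
        while True:
            header = inp.read(9)
            if len(header) < 9:
                return
            kind, length, crc = struct.unpack(">BII", header)
            payload = inp.read(length)
            assert zlib.crc32(payload) & 0xFFFFFFFF == crc, "corrupt record"
            yield kind, payload

    # Usage: wrap two data blocks and an end-of-file mark into one container.
    with open("tape0001.enc", "wb") as out:
        write_record(out, RECORD, b"\x00" * 32760)   # a typical fixed-length block
        write_record(out, RECORD, b"\x00" * 32760)
        write_record(out, FILEMARK)                  # preserve the original EOF mark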

Kelman's archival technology is wrapped into an online Data Management and Storage System (DMASS) with three main components: data management from the desktop, on-line data retrieval, and automated mirroring of the archive to a separate geographical location.

Mirrored

This last element should be of particular interest to companies operating in a parent company/subsidiary environment, where the security of a remote mirrored site could provide a spin-off in the form of a duplicate data set at head office, although I know of more than one local manager who would rather die than let this happen! Kelman are understandably reserved about the techniques they use for encapsulating and restoring original data, but they appreciate the importance of this technology and have not ruled out putting it into the public domain. This, after all, need not be a commercial loss to the developer, as initiatives such as Netscape and Java have shown. But to convince the industry that their data is really safe, Kelman will have to show that their DMASS works on a wide range of hardware and software platforms, otherwise purchasers may find themselves locked into elaborate combinations of ADSM and Magstar 3494 robotics. These do not come cheap, and there are other much more economical alternatives whose perceived reliability problems may become less important in the context of an automated mirroring environment.



New version of VoxelGeo as ‘For sale’ notice goes up at CogniSeis (June 1997)

CogniSeis, a leading software provider to the petroleum E&P industry, recently announced the latest release (version 2.1.7) of VoxelGeo, its 3D volume visualization software.

VoxelGeo is an innovative volume visualization application that provides geoscientists with the ability to view inside seismic volumes, thus gaining a better understanding of spatial relationships between complex structural or stratigraphic features. This latest release of VoxelGeo includes substantial new functionality that enables geoscientists to access foreign interpretation databases using CogniSeis' new Uniform Links Architecture. VoxelGeo V2.1.7 "represents a major step forward for this innovative volume visualization product," said Bob Wentland, VoxelGeo Product Manager at CogniSeis.

ULA

"The new functionality that was included in this release incorporates significant new capabilities in addition to several enhancements in response to customer requests." Wentland added, "Chief among the many new features and enhancements included in this release is full support for CogniSeis' Uniform Links Architecture (ULA) to SeisWorks, IESX, and Charisma interpretation databases. With VoxelGeo 2.1.7, our customers now have complete read/write access to these databases, thereby allowing modifications made in VoxelGeo to be stored back in the original interpretation database. The ULA links are indicative of CogniSeis' commitment to providing open solutions to industry problems". The new release also provides bi-directional links to SeisX, allowing the interpreter to share interpretation between SeisX & VoxelGeo and a new internal animator to allow users to easily generate movie images of VoxelGeo data. Calling VoxelGeo a visualization product falls short of giving a clear picture of this remarkable product.

Rocket-fuel

It is in fact a fully-fledged seismic interpretation environment, with interaction with the data displayed either as conventional seismic traces or as volumes and surfaces. Derived from a sister product used in medical imaging, VoxelGeo allows for 3D image enhancement and analysis, and provides sophisticated automated data tracking based on seismic amplitude and other attributes. Control of voxel transparency allows tracking of reservoir "sweet spots", and thresholding allows the lateral extent of a seismic anomaly to be mapped. VoxelGeo can equally be used to study CAT scans of borehole cores; the US military even use it to QC solid rocket fuel!
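The thresholding trick is simple to illustrate. A sketch in Python/NumPy (not VoxelGeo's internals, synthetic data and an arbitrary cut-off) that masks an amplitude cube at a threshold and collapses the mask vertically to map the lateral extent of an anomaly:

    import numpy as np

    # Synthetic amplitude cube: inlines x crosslines x time samples.
    cube = np.random.randn(200, 150, 500).astype(np.float32)

    # Threshold the volume: keep only voxels whose amplitude exceeds a cut-off
    # (in VoxelGeo terms, make everything below the threshold transparent).
    threshold = 3.5
    bright = np.abs(cube) > threshold

    # Collapse vertically: a plan-view map of where any bright voxel exists,
    # i.e. the lateral extent of the amplitude anomaly.
    extent_map = bright.any(axis=2)
    print(f"anomaly covers {extent_map.mean() * 100:.1f}% of the survey area")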

Checkered history

CogniSeis has had something of a checkered history since its beginnings in 1978 as Digicon's Computer Systems Division. In 1987, the division was bought from Digicon by its management and became CogniSeis. Over the intervening years, CogniSeis acquired successively Geo Logic Systems, Inc., a developer of geologic software based in Boulder, Colorado; the VoxelGeo division of Vital Images, a medical imaging concern; and later, Photon Systems Ltd., a Canadian interpretation software company with offices in Calgary, Houston, and London. Meanwhile, Tech-Sym, a public corporation that owns other high-tech electronics companies, acquired all of CogniSeis' stock, and in April 1996 the ownership of CogniSeis, Syntron, Inc., and the Symtronix Corporation was transferred from Tech-Sym Corporation to GeoScience Corporation. Most recently, a loss at CogniSeis reduced Tech-Sym's 1997 first quarter earnings by $0.13 a share, reflecting the high upfront cost of developing and maintaining software in the high-tech E&P sector. Following this, CogniSeis, in a letter to their clients, announced that they were up for sale – an announcement which, in the words of Richard Cooper, CogniSeis' President, "has caused a great deal of interest, excitement, and speculation, not to mention some concern". Putting a brave face on things, Cooper states that CogniSeis sees a change in ownership as "a vehicle to potentially accelerate our product development and enhance the services we provide". Tech-Sym says they have received "indications of interest" from several companies regarding CogniSeis.



Image Management Conference - Dusseldorf (June 1997)

It is always interesting to take a look at how other people do things, so we paid a visit to the Image Managers at their main European get-together, held in Dusseldorf, to see where they were with respect to our own efforts in E&P.

It was a rewarding experience, not unfortunately because the imagers have a silver bullet to solve all our problems at once, but because they do have similar problems, and a slightly different approach to solving them. In fact there are many different domains which are converging on what could be broadly termed information technology and imaging, and document management is an important one. First though, what is image/document management, where does it come from and what do they do? You remember about 15 years ago everyone was talking about the paperless office? That was when the clever money went into the pulp business. Never before in the history of mankind has so much paper been produced, and probably so little actually looked at. There was, however, a group of stalwart individuals in banks and insurance companies who stuck at it, who thought that there was mileage to be had out of document scanning and set to it with a will. The result is some fairly amazing hardware, heavyweight devices that scan zillions of forms per day, software that performs optical character recognition on hand-written forms and some highfalutin object-oriented middleware to tie it all together. They even have their own standardization organizations (plural), with a POSC-like heavyweight, the Document Management Alliance (DMA), and a "lite" version (a la PPDM?), the Open Document Management API (ODMA). Their co-operative efforts are a tad in front of our own, with interoperability having (just) been demonstrated recently.

Not really ‘documents’

One important thing to note is that Document Management Systems are not really about managing documents. The heavyweight vendors that implement DMSs today are not called upon because their clients want to "save a tree" or move towards a paperless office. They are part of a more all-embracing change in the way work is being done, our old friend, Business Process Reengineering. Now this concept contains a fair element of hype, but simply put it is a way of re-organizing the workplace to take account of the following facts:

People are (finally) becoming quite computer literate

Client-server computing and networks actually work

Communications are such that tele-working is a reality (very important in Germany) and that email and video-conferencing are realities too

Streamlining the business and using fewer people to do more work is the current fad.

In upstream oil one can imagine how BPR could be used to model the process involved in approving a drilling location so that, for instance, the economics were actually calculated on a best estimate of all the costs involved, rather than, as I have seen, just passing on the location to a driller and finding (far too late) that the well was going to cost twice as much as the last one because there was an extra casing, more salt etc. etc., and that in the end it was impossible to test because of formation damage – the usual sob story. Maybe this is important after all…

AIIM, ODMA, DMA, WfMC

Of course we are digressing here, but next door to the Document Management Alliance booth in Dusseldorf was the Workflow Management Coalition, and behind them an army of vendors selling SAP-based products. Again, Document Management Systems are not just systems for managing documents. They are really Object Management Systems, and the interoperability that the DMA uses is based on the same CORBA technology that POSC is currently examining. The basic idea is simple: there are three components, a front end (client), a server and an index database. The client accesses the server, queries the index database and launches an application to process the document (or object). Traditionally this process would involve, say, a telephone salesperson accessing a scanned image of a client’s last angry letter while discussing the merits of changing insurance companies on the phone with them. Today DMA version 1.0 has been used to demonstrate a multi-vendor implementation at the recent US Image Management Conference. But individual IM vendors, while expressing interest in the technology, are not rushing to co-operate with their competitors. A member of the overall standards body (AIIM) confided that "it was all politics". It does seem though that there is a real need for standards in the EDMS community and that DMA is awaited with great interest.
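To make the three-component idea concrete, here is a minimal sketch in hypothetical Python (invented table names and file paths, not the DMA/ODMA or CORBA interfaces themselves): a client queries an index database for a document's location, retrieves the object, and hands it to a viewing application.

    import sqlite3
    import subprocess

    # Hypothetical three-component document access: an index database, an object
    # store (here just file paths) and a client that launches a viewing
    # application. Illustrates the architecture, not the DMA/ODMA interfaces.

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE documents (customer_id TEXT, scan_date TEXT, doc_path TEXT)")
    conn.execute("INSERT INTO documents VALUES "
                 "('CUST-042', '1997-05-12', '/imagestore/cust042/letter_0017.tif')")

    def find_document(customer_id):
        # The client queries the index database for the most recent scanned document.
        row = conn.execute(
            "SELECT doc_path FROM documents WHERE customer_id = ? "
            "ORDER BY scan_date DESC LIMIT 1",
            (customer_id,),
        ).fetchone()
        return row[0] if row else None

    def open_in_viewer(doc_path):
        # Hand the retrieved object to whatever application processes it
        # (an image viewer, in the insurance-letter example).
        subprocess.run(["xdg-open", doc_path], check=False)

    path = find_document("CUST-042")
    print("document found at:", path)
    # open_in_viewer(path)   # would launch the viewer on a real desktop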



Smedvig acquires Petresim Integrated Technologies (June 1997)

Those acquisitive Norwegians have been busy spending their PetroKroners again. Smedvig's Managing Director, Torkell Gjerstad, announced that Smedvig Technologies has acquired Petresim Integrated Technologies, Inc. of Houston, Texas.

Petresim is a leading engineering consultancy and reservoir management service company in the oil and gas industry. Petresim's services include full field development and reservoir optimization, reserves audits and asset evaluation of oil and gas properties, and acting as independent technical experts in the development of oil and gas fields. Petresim is currently engaged in a number of projects around the world, many of them in Latin America. Dr. Ab Abdalla, President of Petresim, believes that combining the know-how and software technology of the two entities will create "a new, technologically stronger and independent force in the area of field development and reservoir optimization, for the benefit of industry". Smedvig Technologies is an international oil field services company providing integrated field development and reservoir optimization services, and specializing in uncertainty analysis in reservoir modeling to quantify the risk associated with any proposed development. It provides cost-effective solutions to clients, optimizing the development of and improving recovery from their oil and gas fields. Smedvig Technologies is also a leader in reservoir management technology, providing permanent down-hole monitoring and production control systems. Smedvig Technologies' software products include the market-leading geological and stochastic modeling products for reservoir characterization, IRAP RMS and STORM, and a suite of other software for drilling optimization, advanced well design, and reservoir simulation. Ivor Ellul, President of Smedvig Technologies, Inc. in Houston, says "the merger of the two companies will provide the foundation and be the spearhead for offering comprehensive field development and reservoir optimization services to our clients in North America and Latin America." This acquisition follows on the heels of Smedvig's purchase of Scientific Software Intercomp of Denver, featured in last October's PDM.

 


