July 1997


Mixed News from Amoco on Landmark's Birthday (July 1997)

Landmark celebrates its fifteenth birthday this month and received a couple of ‘presents’ from Amoco in the form of a patent infringement suit and a major contract for the supply of software and services. The patent battle centers on the use of the Continuity Cube product.

Since November, when PDM revealed that Amoco were to "vigorously defend" their Coherency Cube patent, Amoco have been engaged in discussions with Landmark Graphics Corporation (LGC) over the ongoing use of LGC's competing Continuity Cube product. These discussions failed to produce an agreement, and on 7th July Amoco served a patent infringement suit on LGC and CAEX Services Inc. The suit was filed in the U.S. District Court for the Eastern District of Oklahoma over alleged infringement of US patent No. 5,563,949. Amoco are seeking monetary damages and injunctive relief to prevent Landmark and CAEX from distributing and using the Landmark Continuity Cube program, which is said to infringe Amoco's coherency patent. Amoco state that CAEX is a wholly owned LGC subsidiary.

‘not owned or controlled’

Landmark disputes this, saying that CAEX Services, Inc., the Texas corporation that was served in the suit, is not owned or controlled by Landmark Graphics Corporation or Halliburton Company. In an astonishing near-simultaneous announcement, Landmark revealed that they had signed a $30 million, three-and-a-half-year contract with Amoco to supply software, data management, support, training and professional consulting services to Amoco's worldwide exploration and production business units. Speaking of the good news, Robert Peebler, Landmark's president and CEO, said "Landmark and Amoco have a long history of working together, and these agreements further enrich and expand our relationship. Amoco is a leader in using innovative technology as a business advantage, so their choice of Landmark's broad and integrated suite of applications and professional services is an excellent match for both companies". It is not clear whether Amoco's "right hand" is acquiring the technology that its "left hand" is litigating against!

Powerful technique

The Amoco patent describes a method of computing the cross correlation of neighboring traces in a 3D seismic dataset. The resulting values are represented as maps and time slices showing local continuity. As described in last November's PDM, such techniques have proved very powerful in locating faults and other features which are not easily visible in the original data. The product at issue, Landmark's Continuity Cube, is part of the Poststack interpretative processing package. Poststack launches directly from SeisWorks and utilizes ProMAX algorithms. Landmark claim a high degree of integration with interpretation tools, allowing poststack processing to be performed on the fly. Landmark describe their Continuity Cube tool as a measure of the local lateral similarity of seismic data. The technique is apparently very similar to the Amoco method in that trace data within a sliding time window is "crosscorrelated with data from 2, 4, or 8 adjacent traces".
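For readers who want a feel for what either "Cube" computes, the following is a minimal Python/NumPy sketch of a continuity attribute built from windowed cross-correlation of adjacent traces. The window length, the choice of two neighbors and the normalization are illustrative assumptions only; neither Amoco's patented method nor Landmark's product is reproduced here.

import numpy as np

def continuity_slice(traces, t, win=32):
    # traces: 2D array, shape (n_samples, n_traces), one seismic line.
    # Returns, for each trace, the mean normalized cross-correlation of a
    # window centred on sample t with the same window on its two neighbors.
    n_samples, n_traces = traces.shape
    lo, hi = max(0, t - win // 2), min(n_samples, t + win // 2)
    out = np.zeros(n_traces)
    for i in range(1, n_traces - 1):
        a = traces[lo:hi, i]
        cc = []
        for j in (i - 1, i + 1):          # "2 adjacent traces"
            b = traces[lo:hi, j]
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            cc.append(float(np.dot(a, b)) / denom if denom else 0.0)
        out[i] = np.mean(cc)              # high = continuous, low = possible fault
    return out

Repeating such a calculation for every sample and every trace pair in a 3D volume yields the cube of continuity values that is then viewed as maps and time slices.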

‘contractual right’

Indeed Landmark do not claim that their technique differs from Amoco's, but dispute the merit of the suit on the grounds that Landmark has a contractual right to offer its Continuity Cube software pursuant to a 1991 agreement between Amoco and Advance Geophysical Corporation, which Landmark acquired in 1994. Landmark intends to "vigorously defend its contractual rights and ability to offer its Continuity Cube software to its customers".

Industry observers have suggested that neither of the "Cubes" is a radically new technique. Both use algorithms that are fairly common currency in the processing house; correlation is, after all, the bread and butter of the seismic processor.



CogniSeis becomes a Paradigm Target (July 1997)

Following the 'for sale' notice revealed in last month's PDM, CogniSeis have announced the port of SeisX (an integrated 2D/3D seismic interpretation package) to Windows NT. Meanwhile, Paradigm is believed to be preparing the take-over of CogniSeis.

"With the introduction of the new Pentium Pro II systems, we are seeing the gap between UNIX systems and NT systems growing smaller," said Russ Sagert, SeisX Product Manager at CogniSeis. "More importantly, because of the larger market for NT systems, the price for this technology is considerably less than comparable UNIX systems. Our customers, therefore, will realize significant cost benefits." Mr. Sagert went on to say, "SeisX NT was designed to provide the same comprehensive functionality of UNIX-based interpretation systems but at a significantly lower overall cost by taking advantage of the price differential between NT and UNIX systems. We believe that SeisX NT is the right choice for the value minded interpreter."

Take-over

Meanwhile, the talk of the town at the SEG Istanbul convention was of an imminent take-over of CogniSeis by Paradigm Geophysical. Paradigm, founded in 1988, has built up a strong reputation for high-tech interpretative seismic processing, particularly with its GeoDepth product for depth/velocity model building and depth imaging. Industry sources commented that there appeared to be a good fit between the Paradigm and CogniSeis product lines. CogniSeis can be found on the Internet at http://www.cogniseis.com and Paradigm at http://www.geodepth.com.



When does a patent become patent nonsense? (July 1997)

Neil McNaughton, Editor of PDM, looks at the issues involved in claiming patents for software products in the petroleum industry.

When – in yet another life! – I was working in "research", i.e. waiting to be posted to some unhealthy part of the world for real work, we used to be preoccupied with technical issues of weight, such as how the heck seismic inversion worked (we are going back a while now). Avid reading of Geophysics often brought illumination (just add up the samples!) and the published algorithm was passed on to the programmers, who did something completely different, but that's another story. At the time, it seemed that scientists just published their results for the greater good, and that everyone could pick up on the fruits of their labors; it was the natural way of the world.

Client from hell

Of course it was never really like that, and never has been. The seismic software houses were playing a fairly open, but delicate game. If they "discovered" something, like a novel decon or a new way of migrating, then they would weigh up the merits of publication – a kind of marketing, after all – against proprietary use. The disadvantage of the latter was that, unless the algorithm was published, the "client from hell" with an arm full of math degrees would be convinced that what was really happening was, well, just adding up the samples for instance. So publishing was in reality a route to credibility and hence sales. This did mean that others could copy, but investment in R&D gave the discoverer a time window in which to "get ahead" of the market and move on to higher ground while others began to cash in on the "old technology". Only in exceptional circumstances would a patent be applied for; this is, after all, a costly and time-consuming process, with no guarantee that the patent obtained will actually serve any useful purpose.

Old as the hills

Patents and patent litigation in geophysics are actually as old as the hills, or at least as old as Geophysics. In 1936, in Volume 1, there was a paper by a certain C. R. Hrdlicka summarizing pending patent litigation relating to seismic exploration. Since then, although Geophysics has regularly published patents, there have been no further "case histories" of litigation in this field. Of course, back in those days software did not exist, but in addition to patents for actual physical devices, many of the earlier patents were taken out on methods. Noteworthy among these were CDP (patent filed in 1950) and Vibroseis (1953). Today, while most geophysical patents are still applied for on the basis of "inventions", software algorithms are increasingly the subject of claims. Hitherto, these have served as territorial marks, to be negotiated if necessary, but rarely have they either made their inventors much money or been challenged in a serious way. To our knowledge, geophysical software patents have not before been the subject of high-profile litigation, although rumors abound as to behind-the-scenes arm-twisting and settlements. We asked James D. Ivey, a California-based attorney with E&P experience who now works in the field of computer systems and software, to provide some background on the state of the art in software patents. His contribution (see side box) shows that the goalposts have shifted somewhat in recent years and explains why patenting is often a necessity.

Controversy

In the software world at large, patents are the subject of considerable controversy. The arguments for patents are fairly obvious: if an individual or a company has invested a lot in developing an original algorithm, then it is reasonable for them to expect to be protected from others copying the algorithm and having a free ride. The counter argument goes that programming grows incrementally, and that in many cases it is almost impossible to write a program which is entirely novel. This has not stopped many software developers from patenting just about anything, from the look and feel of a GUI to sorting algorithms that a high-school student could well be expected to "invent" for homework. The League for Programming Freedom (http://www.lpf.org) has submitted a lengthy document to the US Patent Office which addresses these issues. They argue that the current legislation on patents is out of kilter with the modern world of software development, and cite examples of patents on such trivial ideas as "the use of different colors to distinguish the nesting level of nested expressions" (US patent no. 4,965,765) or "use of a host independent byte ordering" (4,956,809). The field of text data compression is cited as an area where software development has become "out of bounds. There are now so many patents in this field that it is almost impossible to create a data compression algorithm that does not infringe at least one of these patents". The costs of defending or proving patent infringement are also cited as potentially crippling for a small company. Even large software companies such as Oracle have opposed the patentability of software, arguing that copyright and trade secrecy legislation are better suited to protecting software development.

White flag?

In the geophysical context, Rutt Bridges of Landmark has questioned the overall economics of patenting, calling for a "truce on patents. Companies should re-examine the role of patents and litigation within their organizations. Key questions are: How much money does your company spend acquiring patents? How much time and research effort? How much money have patents made for your company? In these times of tight budgets, isn't it wise to rethink the patent policies of the 1970s and 1980s? The proposal is to create consortia of companies with unlimited cross licensing of seismic processing patents. This would speed up software development dramatically, reduce cost, and benefit all members of the group." (The Leading Edge, January 1994). So what should the industry do? Protect any invention, however trivial, at any cost? Or open up a free-for-all where even an algorithm which has cost years of original R&D can be implemented by anyone? Clearly neither path is satisfactory, but neither is the status quo – except for patent lawyers!



Modelers take a double hit! (July 1997)

A couple of quotes apropos of data modeling. First, from the June Byte Magazine, in an article entitled 'A career in data modeling': 'In the 1970's there was a brief push towards developing an 'enterprise data model.' But this idea has largely been abandoned, leaving many large, expensive, and uncompleted projects in its wake'. Sounds painful. Overheard at the PNEC Data Integration Conference: 'If a project takes longer than World War II then there must be something wrong.' Oh ye of little faith!



Landmark Celebrates its 15th Anniversary (July 1997)

The Amoco suit comes at an unfortunate time for Landmark, which is celebrating its 15th birthday this month. PDM reminisces with the Landmark old-timers.

Already the jellybeans have been dispatched (and eaten) and it is a time for reminiscence. Harking back to their origins, LGC remind us that in 1982, 3D seismic interpretation was done on massive and expensive mainframe computers (PDM old timers even saw it done by folding paper sections at every tie line to end up with colored concertinas). Landmark's founding fathers Roice Nelson, Andy Hildebrand, John Mouton and Bob Limbaugh revolutionized the oil and gas industry with an affordable "console-sized" workstation for geophysical interpretation.

Heavy man..

"Console-sized" was of course a euphemism for very big: the "Landmark III" weighed more than 1,000 pounds and was about the size of a (Texan) household refrigerator on its side. It was described as the world's largest DOS-based system, with 440 megabytes of disk storage and a price of nearly $250,000. John Mouton said "In the early days, we would have dearly loved to use an off-the-shelf workstation, but we judged everything as hopelessly weak. We had to design and assemble nearly every aspect of the system from the printed circuit boards to the custom cabinets. The power cables looked like fire hoses and an elephant could have stood on it without causing any damage." Landmark has always been an international company, and shipped its first three systems to BHP in Melbourne, Australia, Enterprise Oil in London and Sun Oil in Dallas. Since then, Landmark claims sales of over 75,000 software licenses to oil and gas companies located in 84 countries. PDM wishes LGC many happy returns.



Coherence Technology camp ‘optimistic’ about law suit (July 1997)

The Amoco methodology covered by the patent is marketed exclusively by Coherence Technology Company, Inc. (CTC) through its Coherence Cube products. Paddy Keenan, CTC's COO, said 'CTC has long maintained that Landmark's sale of its Continuity Cube program is a contributory infringement of the Amoco coherency patent.

Every time the program is used, the patented method is being employed. We are optimistic that the filing of the lawsuit heralds a new era of protection of these valuable intellectual property rights. We are extremely gratified that Amoco is sending a loud, clear message to Landmark and others that infringement of these rights will not be tolerated.' Notwithstanding the brewing storm, CTC is bullish about the Coherence Cube's future – they are even hiring! 3D interpreters, processors and technical sales engineers interested in being at the focal point of this hot technology can apply to ctc@coherence.com. Intriguingly, experience in ProMAX and SeisWorks is considered advantageous!



RODE Guru defends the use of data archive format (July 1997)

Eric Booth of RODE Consultants takes issue with PDM's questioning of the benefits and costs involved in the use of the SEG RODE encapsulation standard.

I believe that the editorial in last month's PDM on formats misrepresents the costs and benefits of the SEG RODE encapsulation standard. RODE is based on the American Petroleum Institute's RP66 version 2. RP66 is a very flexible and powerful, media-independent way of storing and exchanging data and related metadata. It forms the basis of many new formats including RODE, DLIS, GEOSHARE, WITS and PEF (POSC exchange files). The RP66 base standard defines number formats, low-level media bindings (which mean that the user does not need to know how the hardware stores the data), data structures and some common objects such as the file header, the data source and any data dictionaries used. It is, admittedly, difficult to understand and work with, but it is really the domain of the technical programmers amongst us and need not concern the end user. The data management professional needs to understand some of the features and facilities; they do not need to understand all the details. The data exchange standards are defined by schemas that implement data models: the POSC exchange file implements the EPICENTRE model, GEOSHARE implements a broad-based exploration model, and DLIS implements a well log model capable of storing data for any field or processed digital well log.

Simple model

The RODE model is extremely simple: we assume that the data to be encapsulated consists of variable length records separated by tape marks. The model allows tape marks within a data file and provides options to record all input tape status conditions. The status conditions allow old data to be encapsulated and recovered with the status values from the original media, which in turn allows seismic software (e.g. demultiplex programs) to attempt to recover errors. The only required object is the RODE-CONTEXT object; this requires that the creator of the data file records who they are, the internal format of the data, who wrote the software, which version they used, who the job was done for, where the data came from, etc. There is an ANCILLARY-INFORMATION object that allows users to save metadata by designation and value. It is not a required object, but it can be used to store any information about a data file (e.g. end point co-ordinates, ensemble ranges, and velocity fields – it's up to you). We also provided an indexing facility; it allows index files to be generated in RP66 format and saved on the media. So if the data model is so simple, why have there been problems, and why is RODE seen as difficult and expensive?
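Purely as an illustration of the kind of provenance the required RODE-CONTEXT object carries (and not the RP66 object syntax or the SEG schema itself), the attributes described above could be pictured as a simple record; the field names below are paraphrased, not normative.

from dataclasses import dataclass, field

@dataclass
class RodeContext:
    # Paraphrased from the RODE-CONTEXT description above; names are illustrative.
    creator: str             # who created the encapsulated file
    internal_format: str     # internal format of the encapsulated data
    software: str            # who wrote the encapsulating software
    software_version: str    # which version they used
    client: str              # who the job was done for
    data_origin: str         # where the data came from

@dataclass
class AncillaryInformation:
    # Optional designation/value metadata - end-point co-ordinates,
    # ensemble ranges, velocity fields, or whatever the archiver chooses.
    items: dict = field(default_factory=dict)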

There are at least three problems to discuss:

The early published examples and implementations were based on very early drafts of RP66 version 2 (we must use version 2 for large physical blocks on tape). As a seismic programmer, I had no previous experience of RP66 or DLIS, and made a simple error in the data channel. The RP66 experts did not notice the error and it has been propagated into some of the early implementations. I found this error late last year and submitted two papers to the SEG Technical Standards committee explaining the error, providing a simple generic coding fix and a corrected example. I have placed draft copies of these papers for review and comment on my web-site (http://www.rodecon.demon.co.uk), as they appear to have gone into a black hole at the SEG. The incompatibility problem is, therefore, well understood and a simple fix is available.

RP66 provides efficient structures for bulk data. Most implementations are designed for handling complex data models and do not take advantage of the simplicity implicit in the RODE model. The physical records can be double-buffered so that data is always available and the logical records extracted using one, or occasionally two, data moves. The software then needs to identify the logical record type and, if it is a data record, determine its structure. A RODE encapsulated file normally has only one structure, and this should be set up as pointer increments from the start of the logical record. The addresses of the encapsulated data and status values (including the data length) can then be returned directly to the user. Unfortunately, generic RP66 software tends to work back through the defining records for each instance of an encapsulated data record and re-determine the structure of the encapsulated record. RP66 structures and objects are complex (but then so is a video recorder) and implementation is not easy. RP66 is an accepted basis for many exchange formats, and implementers are getting better at creating files and also at handling some of the more common errors. Performance issues are, therefore, a matter of careful coding – in many cases de-encapsulation should be faster than reading short raw records from the media.
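A hedged sketch of the optimization described here, assuming a hypothetical reader in which physical blocks arrive from tape and every encapsulated data record shares one fixed layout. The length field, offsets and buffering scheme are invented for illustration and are not RP66 or RODE definitions.

def iter_encapsulated(read_block, header_len=8, status_len=4):
    # read_block: callable returning the next physical block (bytes), or None at end.
    # Yields (status, payload) for each encapsulated record. A real RODE reader
    # would take the offsets from the file's defining records once, then reuse
    # them as simple pointer increments, rather than re-deriving the structure
    # for every record.
    buf = b""
    while True:
        block = read_block()
        if block is None:
            break
        buf += block                                   # crude double-buffering
        while len(buf) >= header_len:
            rec_len = int.from_bytes(buf[:4], "big")   # assumed length field
            if len(buf) < rec_len:
                break                                  # record spans the next physical block
            record, buf = buf[:rec_len], buf[rec_len:]
            status = record[header_len:header_len + status_len]
            payload = record[header_len + status_len:]
            yield status, payload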

The last major issue is the use of metadata. If you are prepared to guarantee that your records are perfect and that they will never be compromised, then you do not need metadata stored with the data. In the real world, tape labels fall off, paper records are lost, staff leave, companies are taken over, databases become corrupted, etc. Storing metadata and index files on modern media is a good idea. Building large archives costs a lot of money, and it will be wasted if you cannot find the data. The selection of parameters to store and the allocation of these parameters to RP66 objects and attributes is non-trivial, and there is no defined standard. RP66 allows users to define a local content standard that defines the use of objects and attributes within a file and the quality of the data within the file. The data manager is responsible for deciding how data is archived and for ensuring that the data is correct and can be easily found. They may delegate the responsibility, but they cannot shirk it.

F-word

The flexibility and functionality of RP66 and RODE are a problem – you need to strike a balance between storing all relevant metadata and keeping the files simple. You can, however, ensure that the data is easy and efficient to read and include all the metadata on a single medium. End users are not interested in the physical location of data; all they want is access to it. The system must be able to identify the correct volume, load it, confirm that it is correct, move to a specific file, open that, confirm that it is the file that the user wanted, and then extract the appropriate data. This should happen automatically, without user intervention; volume and file identifiers are essential. Any large archive project should set standards for volume labels and file identifiers.

I believe that you should always be able to recover the metadata from the same media as the data if necessary. Hopefully you should never need to read all of the objects, but they provide a high level of disaster insurance, provided they contain the relevant metadata.

Eric Booth has 28 years' technical experience in the oil industry, including field acquisition, seismic data processing, systems management, system design, capacity planning, professional society standards committees, and research and development. He currently chairs the SEG Technical Standards sub-committee for High-Density Media formats and is working with the American Petroleum Institute's RP66 committee on the implementation of RP66 on new media. His company, RODE Consultants Limited (eric@rodecon.demon.co.uk), is active in a number of RODE-related projects for a variety of clients and has also developed ReadRP66, a stand-alone package to verify compliance with the API's RP66 version 2.01 and the SEG RODE schema.



Geoshare is talk of the town at Houston gathering (July 1997)

The First International Conference on Petroleum Data Integration and Management took place last month in Houston. The Geoshare User Group wanted a higher profile and delegated the management of their annual event to the Petroleum Network Educational Council, a division of Philip C. Crouse and Associates Inc.

The presence of around 140 attendees (half service companies, half oil company personnel) demonstrated the success of this "repackaging". The proceedings began with a couple of papers vaunting the merits of Geoshare as a data transfer mechanism. Geoshare, although a Schlumberger-sponsored development, is a truly open environment, and development kits can be obtained by any third party for a modest sum. The essential idea of Geoshare is that an application can output or receive data through a "half link". Stuart McAdoo described how the half link concept is implemented using an application-independent data model. This was initially designed in 1990 and now, at version 10, supports eight major data types including seismics, wells and surfaces. The data model is implemented on top of the API RP66 specification (see the paper on RODE in this issue for a further discussion of this technology). Data transfer thus no longer requires specific converters for each pair of sending and receiving applications. So long as both applications "talk" Geoshare, then for an exploration shop with N different applications you only need N half links. Without Geoshare, you will theoretically need (N squared – N)/2 reformatters. Just in case that is not entirely clear: if you have 6 separate applications you will either require 6 Geoshare half links, or 15 reformatters if you do it the hard way.
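The arithmetic behind the half-link argument can be checked in a couple of lines; the figures of 6 and 15 quoted above drop straight out.

def converters_needed(n_apps, with_geoshare):
    # One half link per application with Geoshare; one reformatter per
    # unordered pair of applications without it.
    return n_apps if with_geoshare else (n_apps ** 2 - n_apps) // 2

assert converters_needed(6, True) == 6
assert converters_needed(6, False) == 15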

Best of breed

Geoshare makes sense, then, in today's "best of breed" multi-vendor environment. Since 1991, when Geoshare was incorporated as a not-for-profit company, it has been administered by its members through two committees: Data Modeling and Encoding, and Ancillary Standards. Geoshare has a World Wide Web page at http://www.geoshare.com, where the data model is freely available together with a catalogue of commercial half links. The migration of the Geoshare data model to Epicentre has been mooted, but although no one wants to rule this out, it would seem very unlikely to happen in any foreseeable future; Geoshare practitioners are pretty happy with what they've got, thank you very much! Recently Geoshare has moved into the PC Windows domain with the appearance of a half link to MS Access. Bill Sumner (Independent Consulting Services - ICS) was particularly bullish about Geoshare, stating that Geoshare could become the industry standard data model once every application includes a Geoshare half link. Today there are around 50 packages that utilize Geoshare. Sumner's company ICS even has a GeoBasic high-level development tool, which uses the concept of a Universal Geoshare Receiver.

Not magic

Of course Geoshare is not magic. Jack Gordon of Conoco emphasized the care necessary to ensure data integrity during transfer, especially with the problems associated with topographic datum shifts, and when two applications have a fundamentally different view of data representation. The deployment and regular use of Geoshare is not for the faint-hearted. Companies making regular use of Geoshare either have substantial in-house resources to manage this environment, or will be involved in major outsourcing projects.

Valuable case histories of Geoshare-based implementations were described by various speakers. This month's actuality has left us short of space; we will return to these in a future issue.



Panther’s latest AppTrack release (July 1997)

Panther Software has announced release 2.5 of AppTrack, its software license manager.

AppTrack enables usage-based licensing, cost allocation and chargeback of software costs to users, departments and specific projects. AppTrack also offers data filtering, data reporting and graphing, with a rich set of data presentation capabilities. AppTrack was initially a bespoke development for Landmark Graphics Corporation, whose management needed to track usage patterns to enable a usage-based licensing (rental) scheme for their applications. Landmark markets Panther Software's AppTrack software around the world on a non-exclusive basis. Version 2.5 is a port to Solaris, and new functionality has been added to incorporate enhanced graphing, including aborts, denials and CPU usage. Spreadsheet reporting capabilities have been added, and unlicensed applications running on AIX, IRIX and HPUX can now be tracked in addition to SunOS and Solaris.
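By way of illustration only (this is not AppTrack's data model or report format), usage-based chargeback amounts to rolling per-session usage records up into a cost per project; the session log and hourly rates below are invented for the example.

from collections import defaultdict

sessions = [("SeisWorks", "North Sea", 3.5),    # (application, project, hours)
            ("ProMAX", "North Sea", 1.0),
            ("SeisWorks", "Gulf Coast", 2.0)]
rates = {"SeisWorks": 40.0, "ProMAX": 55.0}     # $ per hour, hypothetical

charges = defaultdict(float)
for app, project, hours in sessions:
    charges[project] += hours * rates[app]      # allocate cost to the project

for project, cost in sorted(charges.items()):
    print(f"{project}: ${cost:.2f}")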



Leading role claimed for ER Mapper’s new edition (July 1997)

Helen Burdett of ER Mapper has provided this introduction to the functions of this raster-based GIS tool, which interfaces with the major E&P software environments.

ER Mapper is the world's leading integrated mapping software. It offers unique data integration and enhancement functionality, enabling geophysicists to access and display the wealth of information contained within seismic surveys. Subtle faults and features fundamental to risk and cost reduction in exploration can now be identified, and the selection and positioning accuracy of target wells and features can be improved. At the recent EAGE conference in Geneva, Earth Resource Mapping presented their latest release, ER Mapper 5.5. This version included heightened 3D functionality, wizards and additional oil and gas imports, allowing the user to exchange data with Charisma, GeoQuest and SeisWorks. The positive feedback generated from the vast number of demonstrations given over the five days re-affirms the software's number one position in the market place, as well as having tested the stamina of Earth Resource Mapping's staff!

Meteoric rise

Earth Resource Mapping was founded in 1989 and has continued its meteoric rise ever since, to its current number one position. With such prestigious names as Landmark and Schlumberger amongst its distributors, its reputation is impeccable. Its user group similarly extends to the largest names in the industry, such as BP Exploration, Mobil, Texaco and Shell.

One of the major advantages of ER Mapper is its unique and revolutionary 'algorithm' concept, which replaces traditional disk-to-disk processing. Previously, all intermediate processing steps created new datasets, which consumed huge amounts of disk space. In addition, all these files had to be kept in case the user wanted to adjust or reverse any processing. In contrast, ER Mapper performs all its processing when it displays an image, without using disk storage and with the advantage of maintaining the integrity of your data. As an integral part of the petroleum industry, image processing has continued to reach new heights and can now help identify, and thus prevent, unnecessary and costly exploration.
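The 'algorithm' idea can be caricatured in a few lines: the processing chain is stored as a recipe and only evaluated when the display is drawn, so no intermediate grids ever reach disk. This is a generic lazy-evaluation sketch, not ER Mapper's implementation.

import numpy as np

class Algorithm:
    # A stored chain of processing steps, evaluated only at display time.
    def __init__(self, source):
        self.source = source          # callable returning the raw grid
        self.steps = []               # the recipe - no intermediate files

    def then(self, func):
        self.steps.append(func)
        return self

    def render(self):
        data = self.source()          # all the work happens here
        for step in self.steps:
            data = step(data)
        return data

# The chain below is kept as a tiny recipe rather than as three grids on disk.
algo = (Algorithm(lambda: np.random.rand(100, 100))
        .then(lambda g: g - g.mean())
        .then(np.abs))
image = algo.render()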

Processing techniques

ER Mapper's proven capabilities in the interpretation and processing steps and, finally, in map production confirm its fully functional abilities in the petroleum industry. ER Mapper provides new seismic horizon processing techniques, including real-time sun illumination to highlight subtle faults and channels. It can display merges of multiple horizons and attributes such as dip, azimuth, depth and porosity in a single merged view. This gives the user the ability to combine all data fields for a given area in one single display. ER Mapper has a full suite of cartographic production tools giving the user the opportunity to create professional mapping. These can be printed using ER Mapper's own print management tool to over 250 hardcopy devices, making additional printing software unnecessary. The software is available world-wide for Silicon Graphics and Sun Solaris workstations and PCs running Windows NT or Windows 95. As the industry has been driven into increasingly remote and hostile environments, satellite imagery offers up-to-date regional information to help plan and manage seismic surveys in such areas. Such a versatile tool as ER Mapper can now be deemed essential in visualizing the whole picture. Paul Batey, a senior geophysicist at BP Exploration and Operating Company, has described some of the uses for ER Mapper: "Because of inaccurate and out of date mapping, satellite imagery is incorporated to increase understanding of topography.

Elevational changes

With high resolution images combined with DEMs loaded into ER Mapper's 3D viewer, survey planners can highlight problems in the area, for example utilizing features such as elevational changes by adding contours, real-time sun shading and slope formulas".

ER Mapper allows you to move a light source in real time over a chosen display, controlling elevation and azimuth to make the image appear in three-dimensional relief. This process increases the ability to see faint features such as faults and channels which were not visible in the original data. Together with the other information gained, a seismic acquisition plan can be produced for an acquired acreage, locating seismic lines relative to topographic features and existing infrastructure. When surface geology is analyzed with respect to seismic locations, you can place ground truth within the context of seismic data and thus greatly reduce costs for planning wells in remote terrain. ER Mapper also integrates satellite and radar imagery for detecting possible hydrocarbon seepage on the sea surface, for offshore installations and for structural field mapping. This allows the user to fully visualize all forms of geographic data. ER Mapper is the only system currently available that allows you to interpret your data to this degree.
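Real-time sun illumination is essentially shaded relief: the local gradient of a horizon or DEM is compared with a light direction given by azimuth and elevation. The sketch below is a generic hillshade calculation under assumed conventions (grid spacing, angle definitions), not ER Mapper's own routine.

import numpy as np

def hillshade(grid, azimuth_deg=315.0, elevation_deg=45.0, spacing=1.0):
    # Returns shaded relief in the range 0..1 for a 2D surface (DEM or horizon).
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    dy, dx = np.gradient(grid, spacing)            # surface gradient
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(-dx, dy)
    shade = (np.sin(el) * np.cos(slope) +
             np.cos(el) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shade, 0.0, 1.0)

Moving the light source interactively amounts to re-running this with new azimuth and elevation values; subtle faults and channels show up as linear shadows.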

‘Quagmire’

Dave Cowen, senior geophysicist at Texaco Ltd., muses upon the quagmire produced by the massive amounts of data generated in interpretation in the oil industry. Because of this, "tools to study and analyze that data and visualize information quickly and easily are a must. We find ER Mapper just that tool". ER Mapper contains all the functions and capabilities you need to process and integrate your data, including image warping and rectification, full on-screen annotation and vector editing, and a real-world map projection database covering over 800 map projections and datums around the world. Other functions in the software include contrast stretching, raster modeling formulae, customizable filters and look-up tables. The President and Founder of Resource GIS and Imaging, Gerry Mitchell, sums up the advanced capabilities of ER Mapper 5.5 as "unequalled in providing productive solutions to data integration and imaging problems. Saving image algorithms, instead of having to save copies of multiple versions of the same datasets, solves the problem of managing large volumes of satellite imagery. ER Mapper's effective handling of vectors and the dynamic links to GIS systems provide, in minutes, integration of map and image data which used to take days".



PI/Dwights and Shell Services announce cooperation (July 1997)

Petroleum Information/Dwights (PI/D), and Shell Services Company (SSC) have announced a cooperative agreement to provide a range of information products and services to the global energy industry.

PI/D's primary business is the development and management of the industry's largest commercial databases. PI/D provides expertise in all E&P data processes, including data quality procedures, data management, data integration and data delivery, and data-centered object-oriented toolkits, in a secure environment. PI/D also provides P2000, the leading operational data management system in the industry, used to create, load, update and maintain well and production data. SSC is one of the Shell Oil Company family of companies. An independent operating company with over 2,000 employees and revenues in excess of $350 million, SSC and its strategic partners provide business process consulting and operations in upstream, downstream and enterprise-wide activities of the oil, gas and petrochemical business, as well as other related industries. VeriStream, SSC's data management service (featured in PDM Vol. 1 No. 6), provides data management services ranging from consulting to integrating applications, shared databases and infrastructure. Services to be offered by the new venture include large-scale digital and hard copy data archiving, client-server expertise, communication and inter/intranet skills, the creation, integration and management of all major E&P data classes, end user information object toolkits and in-depth application knowledge. Under this agreement, Shell Services Company and PI/D's consultants will provide system design and implementation, including legacy data conversion, system and database management, rigorous data security procedures, data/application interfacing, training and facility management.



Why Patent Software? (July 1997)

California-based attorney Jim Ivey (jim@iveylaw.com, www.iveylaw.com) discusses the issues involved in software patents in businesses such as the oil industry.

In the field of patent protection, one subject that separates software from other technologies is that software was believed by many to be unpatentable subject matter for the first several decades during which the software industry flourished. As a result, most software innovations were protected by trade secret. Patent protection and trade secret protection are inherently incompatible. Patent protection is, in essence, a negotiated exchange between the public and an inventor in which the inventor gives the public complete knowledge of the invention and, in exchange, the public gives the inventor the right to exclude others from practicing the invention for a limited period of time.

Secret

An inventor cannot keep an invention secret and obtain patent protection for the same invention, and therefore must generally choose between patent protection and secrecy.

What many do not understand is that choosing to keep an invention secret rather than seeking patent protection allows another inventor subsequently and independently to make the same invention and seek patent protection for it. If the latter inventor successfully obtains patent protection for the invention, the latter inventor can exclude the former, prior inventor from practicing the invention. In the context of software inventions, patent protection was largely ignored until relatively recently, partly because its availability for software was in doubt. Many developers of software chose to forego patent protection in favor of trade secret protection.

Petroleum Industry

Therefore, evidence of prior use and/or knowledge of an invention, which should bar issuance of a patent, is difficult to come by in many instances. This problem can be exacerbated in industries, like the petroleum industry, in which software is not the primary product/service but is instead merely incident to the primary products/services. For example, many major oil companies develop computer software for their own internal use and benefit and have no intention of marketing such software; their primary business is petroleum exploration and acquisition. Accordingly, the idea of seeking patent protection for software developed for internal use seems divergent from the primary focus and direction of the business. The result is that techniques and processes used in the computer software may have been in use for years but publicly available evidence of such use may be particularly difficult to come by, especially for a patent examiner with limited resources.

Prior use or knowledge

Increasingly, oil companies obtain technology, such as computer software to aid in seismic data interpretation and processing, from independent vendors. As a result, much of the software technology which was traditionally viewed as merely ancillary to the primary business of exploring for oil now has real, quantifiable value. Patent protection is more aggressively sought, while much of the use and knowledge which should be considered as prior art is, as a practical matter, unavailable to a patent examiner. The result is that some patents issue when they really shouldn't. However, evidence of such prior use and knowledge, if obtained, can be presented to a court during litigation, and the court can declare a patent invalid and therefore unenforceable. Evidence of prior use and/or knowledge can be obtained, for example, by a litigant with a sufficiently large stake in the outcome of such litigation.


