October 1997


Prism's Data Access and Exchange - the shape of things to come? (October 1997)

Prism Technology have announced what is claimed to be the first real-world implementation of POSC's Data Access and Exchange Layer (DAE). This middleware is intended to allow consistent access to different implementations of POSC's Epicentre database.

Ray Boucher of Prism Technologies, speaking at the POSC Focus/E&P Knowledge Work conference held in Oslo last month, described the fruits of the LightSIP project as "the first pure implementation of the Petrotechnical Open Software Corporation's (POSC) Epicentre Data Model". The Data Access and Exchange (DAE) layer was developed in the Elf-led LightSIP project and is intended to solve the problems of software interoperability and maintenance.

Middleware

The idea is to offer software developers a "middleware" layer that allows them to query a database without knowing too much about how the data is actually stored. The DAE middleware is a pretty clever piece of kit. It relies on modern database design to allow it to peek into a sort of "table of contents" within the database - the metadata - and adapt the incoming query from the application to the actual syntax required by the database. To the calling application it offers - or "exposes", in the jargon - a consistent set of data objects which are supposed to reflect a generic way of accessing E&P data types. For more information on this technology, and the key role that it plays in POSC's philosophy, see the article inside this issue, "POSC DAE dissected".
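
To make the mechanism concrete, here is a minimal sketch in C (the language of the DAE API - see the IBM involvement section below) of how a metadata-driven layer can work. Every table, column and function name here is invented for illustration; this is not the POSC DAE itself.

    /* Sketch of metadata-driven query translation (all names hypothetical). */
    #include <stdio.h>
    #include <string.h>

    /* One row of "table of contents" metadata: where a logical
     * attribute really lives in this vendor's physical database. */
    typedef struct {
        const char *logical_name;    /* name the application asks for */
        const char *physical_table;  /* table in this implementation  */
        const char *physical_column; /* column in this implementation */
    } MetaEntry;

    /* Metadata as it might be read from one vendor's data dictionary. */
    static const MetaEntry vendor_meta[] = {
        { "well.total_depth", "WELL_HDR", "TD_M" },
        { "well.name",        "WELL_HDR", "WELL_NAME" },
    };

    /* Adapt a logical request to the SQL this database actually wants. */
    static int build_query(const MetaEntry *meta, size_t n,
                           const char *logical, char *sql, size_t len)
    {
        for (size_t i = 0; i < n; i++) {
            if (strcmp(meta[i].logical_name, logical) == 0) {
                snprintf(sql, len, "SELECT %s FROM %s",
                         meta[i].physical_column, meta[i].physical_table);
                return 0;
            }
        }
        return -1; /* attribute unknown in this projection */
    }

    int main(void)
    {
        char sql[128];
        /* The application only knows the logical name ... */
        if (build_query(vendor_meta, 2, "well.total_depth", sql, sizeof sql) == 0)
            printf("%s\n", sql); /* prints: SELECT TD_M FROM WELL_HDR */
        return 0;
    }

Swap in another vendor's metadata table and the same application code emits that vendor's SQL - which is the whole point of the middleware.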

Interoperability

The technology allows one vendor to access data in another vendor's database; in other words, it will provide interoperability. Additionally, the DAE middleware is designed to be a constant in a changing world, so that a database can be upgraded without all the applications having to be re-written, thus solving the problem of maintenance. Defending the "pure implementation" claim, Boucher described existing commercial implementations of Epicentre as "using a POSC wrapper around a proprietary database". Regular readers of PDM will remember our criticism of vendors' claims for POSC compliance because of this.

IBM involvement

PDM featured the LightSIP project in the January 1997 edition, where IBM - the project manager - described LightSIP as "the first truly POSCian data management solution". At the time the project was envisaged by its sponsors, Elf, Shell and Statoil, as a scaled-down version of the full POSC DAE specification - hence the "light" tag. Keith Steele, Prism's CEO, will have none of this, claiming that the product they are developing is a full, commercial-strength implementation of the specification. At that time, Philippe Chalon, Methods and Standards department manager at Elf Aquitaine, stated "Elf considers that the availability of a POSC DAEF component is on the critical path of POSC take-up. We will be recommending our E&P applications providers to adopt the LightSIP product when it becomes available, as POSC standards are central to our technical target architecture."

The DAE is also variously known as DA, DAE, DAEF and LightSIP - combinations of Data Access and Exchange Facility and Software Integration Platform. It is a C Application Programmer's Interface (API) with functions and data structures defined in POSC header files. Delivered as a library, its behavior is defined in POSC's Data Access and Exchange specification. The software checks your Epicentre data store and is available as a freeware version that produces summary information (no detail) and as a commercial offering that reports data quality problems in detail (and suggests fixes where applicable).
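
To give a feel for what "functions and data structures defined in POSC header files" might look like, here is a purely hypothetical C header sketch. These declarations are invented for illustration; the real names and signatures are those in POSC's own headers.

    /* Hypothetical sketch of the *shape* of a DAE-style C library. */

    typedef struct DaeSession DaeSession; /* opaque handles */
    typedef struct DaeCursor  DaeCursor;

    typedef enum { DAE_OK, DAE_ERROR, DAE_NO_MORE_DATA } DaeStatus;

    /* Open a session against an Epicentre data store. */
    DaeStatus dae_connect(const char *store_name, DaeSession **session);

    /* Query by logical (Epicentre) entity name, not by physical table. */
    DaeStatus dae_query(DaeSession *session, const char *entity_name,
                        DaeCursor **cursor);

    /* Walk the result set, then release resources. */
    DaeStatus dae_fetch(DaeCursor *cursor, void *buffer, unsigned length);
    DaeStatus dae_close(DaeSession *session);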

Beta Testing

The product is entering a "formal" beta testing phase right away, with a full commercial release scheduled for March 1998. The final product will include native compiler versions, a projection meta-data loader and other enhancements. The following platforms will be supported: AIX 4.1.4 / Oracle 7.1.4, HPUX 10.20 / Oracle 7.3.2 and Solaris 2.5 / Oracle 7.3.2. Future priorities are NT and IRIX. Sponsoring oil companies have given Prism an assured further two years' financing for the project. Prism are to work with CAP Gemini to provide training in the DAE, in building applications using POSC specifications and in data migration.

Downside

Adding a layer between applications and their data is not without potentially serious downsides. Performance is the critical issue, and Prism will have to demonstrate a minimal performance hit from going through the DAE. Another area requiring attention is the stability of Epicentre. What needs to happen for all this to work? Well, for this technology to invade the desktop, major vendors such as GeoQuest and Landmark will have to buy Prism's DAE and implement it between their applications and databases. This may seem a tall order for a software house that has just revamped its product line around a proprietary database and is offering its own API to all comers. However, the technology is being tested at GeoQuest, Landmark and, of course, IBM.

IBM to implement DAE

The potential implications of this technology are great, and both IBM and Prism will be marketing the LightSIP deliverables as a stand-alone product for application vendors. In addition, LightSIP will be available for the Project Data Store platform in the next major release and is being integrated into the PetroBank Master Data Store environment. In the PDS environment, LightSIP will sit under a layer of Business Data Objects, which will be the preferred means by which applications address the database. For the Master Data Store (PetroBank), data management utilities and applications will be able to access the database either via LightSIP or via direct SQL access. Because IBM holds the ex-Tigress data model in trust for third parties such as PGS with the Tigress application suite, the addition of a DAE layer would seem a natural route. This might enable IBM to steal the POSCian high ground and open up the PDS to access by the other main vendors. The move from "poles" of interoperability within individual vendor frameworks to true data sharing is far from neutral for vendors: there will be winners and losers, with a migration from the "all things to all men" vendor to the famed "best of breed" plug-in. If it comes off, one sure-fire winner will be Prism Technologies.



Prism CEO POSC appointment (October 1997)

Bob Pindell, Chairman of the Board of Petrotechnical Open Software Corporation (POSC), announced the appointment of Mr. Keith Steele as an at-large director at the corporation's annual meeting in Oslo.

Mr. Steele, CEO of Prism Technologies, was appointed to serve a two-year term commencing immediately. "This is a great honor for both me and Prism, and I hope that I can make a valuable contribution to all of the issues facing POSC," said Keith. "My motives are undeniably in line with POSC goals and I intend to try to inject enthusiasm, energy and a little momentum into achieving them."



The Perfect Knowledge Work Talk Class (PKWTC) (October 1997)

PDM's editor went to the POSC/Focus Knowledge Work Conference and reports on the state of the art.

As an occasional attendee at POSC meetings, I am almost getting used to the successive shocks to the system as you discover that what you thought was important is no longer centre stage, and that some project that had hitherto escaped your attention has actually been consuming hundreds of man-years of effort and is just about to bear fruit. In this edition of PDM we have focused on some parts of POSC's activity which seem important to us, but this is necessarily an idiosyncratic and very incomplete view. It may seem ambitious for PDM, as an outsider, to try to capture POSC - it is - but what has made us decide to attempt it is that there are many people involved in G&G IT who haven't got a clue as to what POSC is doing. Furthermore, if you listen to POSC talks or visit the web page (http://www.posc.org) you tend to get grand claims for POSC benefits (we will save you money, standards are GOOD for you and so on) or incomprehensible technobabble at a much lower level. Where is the overview? Well, it is here in your hands, at least a bit - read on.

POSC/CAESAR

This year's shock to the system came from seeing POSC from a Norwegian viewpoint. Now all upstream IT-ers know Norway as the home of the first "relational projection" of Epicentre, in PetroBank, but while PetroBank got a brief mention, the focus of activity at this conference was not DISKOS, not even Epicentre, not even POSC, but POSC/CAESAR - the construction part of the equation - see separate article. Part of the non-CAESAR part of the conference was devoted to "knowledge work". Now this is a nebulous term, and I have pages of notes taken during the different offerings. I will spare you them and instead try to extract the essential commonality of the knowledge work exposé. First, though, may I offer you the following enigma for your consideration.

Gobbledygook

Most people working in IT have at some time or another been called upon to write computer code. This is a painstaking task, which even when done badly requires great precision, clarity and an understanding of exactly what one is trying to achieve. When the erstwhile coder moves up the corporate hierarchy, he or she may well be called upon to speak on IT and related issues in conferences - or to write in plain English - in manuals, sales literature and the like - explaining just exactly what their company, organization or product is trying to do. Having listened to some of the self-styled knowledge workers describe their wares one has to wonder - where did the precision go? How can you write detailed instructions to a machine one day, and the next, throw away any notion of clarity, talk in management speak, psycho and technobabble, deviate from the subject, not even have a subject?

Talk template

I think I have found out the answer to the mystery: it comes from the popular OO programming techniques involving "abstraction". These speakers are actually singing from the same hymn book, it is just that it is an abstraction. This has led me to propose what we ordinary folks would call a knowledge work talk toolkit or template. But since this is too explanatory, we will baptize it the knowledge work talk "class". Now first you must realize that "class" is technobabble for whatever you have in your mind at a given point in time. It could be a subroutine, a data structure, both or something quite different. It is an "abstraction", it doesn't have to mean anything. If you are ever challenged in using these techniques, if you are forced to say what you mean by your class, you can "populate" or instantiate it with whatever seems appropriate then, so you don't need to know what you mean when you actually say it. And seeing as what you say can and will be interpreted differently by the majority of your audience, who won't bother to challenge you, you can spread a multiplicity of ideas and warm feelings by using this technique.

The PKWTC

I'd like to have a go at a bit of abstraction before your very eyes, as it were, and offer you here a totally abstract class of knowledge work talk: the Perfect Knowledge Work Talk Class (PKWTC).

Part one - turn on the top brass by saying what they want to hear: cost savings, downsizing, asset teams, knowledge work etc. Deprecate - implicitly - the way we used to do business, without of course giving examples; let your audience instantiate them for themselves.

Part two - turn on the programmers by ensuring that they will have lots of work, and will be able to use shiny new tools fresh from the box: Java, OO, abstraction, code re-use (even if the code hasn't been used once yet, and maybe never will be!). Deprecate silly old fools who do things the hard way - again in an abstract way; most of your listeners will be doing it this way anyhow.

Part three (optional) - turn on the users by offering them functionality beyond their wildest dreams, such as interoperability. This is optional because there will probably not be any users present; they will all be working!

Part four - conclude with the conclusion you thought of in the first place - do not worry if there is no logical sequence leading up to this, there almost never is! Thus with the same first three sections you might conclude that outsourcing is a good thing, or that we need standards, or that Java is the answer, or business objects etc. Or - and I predict that this will be the new trend - that we should hire more people!

Now all you have to do is to go and populate the class with your own spin - to create an "instance" of the PKWTC. This is of course where the going gets hard. Ultimately, by describing what you are trying to do, by breaking down the specifications, you may find that you are back to writing code, with the precision that we mentioned earlier. Alternatively, especially if you started with a too-highfalutin' or poorly specified instance of the PKWTC, you may well find that what you are trying to do is impossible!



PDM interviews Gordon Phillipson, IBM's Eastern Hemisphere Oil Business Manager (October 1997)

PDM’s editor, Neil McNaughton interviewed Gordon Phillipson, General Manager for Process and Petroleum Industry Solutions for IBM's Europe, the Middle East and Africa (EMEA). Phillipson previously spent 27 years with Shell Oil Company, latterly as Head of E&P Computing at Shell International, so we were especially interested to hear Phillipson's views on POSC and the future of E&P computing.

N.McN. When you visit the PetroConnect website it is easy to be overwhelmed by the scope of IBM's offering in the E&P field. There is software, hardware, services - a finger in all pies! How would you define IBM's approach to supporting the E&P sector in the field of computing and data management?

G.P. IBM has undergone much reorganization over the last few years, and we now have two principal branches - "customer facing" and "brands". The customer facing people are there to gain an understanding of clients' business needs and to match IBM's offerings to them. The brands division contains the familiar hardware and software groupings such as PC, RS6000, AS400, our ES390 mainframes, solid state hardware, networks etc.

About three years ago IBM launched what have been termed "special views" of customers - who were previously grouped together according to their industry. These new views represent a different segmentation of our marketplace, such that different industries sharing the same methods are now grouped together. In our Process and Petroleum division, for instance, we have petroleum, petrochemicals, forestry, textiles and apparel - all linked by the common connecting thread of a process transforming an input into an output.

N.McN. This thread is obvious looking at the downstream oil business, but it is a bit harder to discern in E&P.

G.P. Of course, the Petroleum division is further sub-segmented into oil products, petrochemicals and E&P. Our mission is to populate these divisions with customer facing people with the relevant skills and backgrounds and also to supply technology appropriate for the sector. These are set up as Industry Solutions Units, with strategic and tactical roles. Many customers have their own IT architecture and plans and we respond tactically to these. Other companies may be seeking our strategic offerings and our consultants will engage with them to identify their business objectives and in this context we have noticed great interest in data management, where there is an urgent need to collapse time frames.

N.McN. Who do you talk to in oil companies, the CEO, the IT people?

G.P. If required we will talk to the company president. Businesses are increasingly taking over IT, having been frustrated with IT departments. The business now says "show me the value", and it is the requirements that determine the technology. What used to be a technological push is turning into a business pull, with the aim of a more effective use of IT. Of course not all of our customers are the same. IT departments come in different flavors: some with corporate standards - like SAP - some with different internal IT organizations, some with one IT department per domain; hence the focus on our client-facing activity.

N.McN. Can we now focus on IBM's E&P-specific offerings? IBM has had a long history of involvement with E&P data modeling, in particular with the Mercury project in the 80's and with PetroBank in the 90's. How do you situate your E&P domain activity today?

G.P. The industry at large has been taking a long look at itself and has begun to re-form its core business with a focus on assets, usually oilfields, asking the question "how do you improve these assets?". Do you acquire new assets and, if so, by purchase or by exploration? This leads on to issues of how to manage these assets and the view of the asset's lifecycle. This view itself evolves through a field's life, with issues such as more seismics and delineation wells early on, water-flood later, and so on. IBM spotted this developing trend and also did some assessment of market size. The first thing we realized was how dependent on data management the cycle was, which was of great interest to us because IBM is extremely strong in this field. Some 70% of all the data in the world is stored on IBM mainframes, so it was natural to try to apply this strength to E&P.

The analysis showed that there were some strong, competent products out there in acquisition, processing and interpretation. The industry has a long history of investment in these fields by companies such as Landmark, GeoQuest and others, and it would be extraordinarily difficult for IBM to contemplate gaining a competitive advantage in this area.

As I said though, we determined that the management of the data was critical to this activity - not just in the field of acquisition, but in tracking an asset all the way through its life-cycle. This involves managing persistent data - where media is as important as software, and where the read/load/manage cycle needs a standards-based approach that is viable for perhaps 30 years or more. This led us to implement our data management solutions on POSC, and we continue to believe that this is the way forward. This does not mean, though, that we believe that the industry's problems will be solved by building the "mother of all" databases.

N.McN. Wasn't that POSC's intent though?

G.P. Yes in the subsurface, but not for facilities. Let's return to the field asset. The asset is what is important; it is managed on behalf of the asset owner - or perhaps one or two owners. It is serviced by many more contractors. All this requires coordination and easy access to contract information. In the world of "leaner and meaner" companies we need multi-contractor access to data and links to business management.

N.McN. So POSC has been more successful in facilities than in the subsurface?

G.P. I was thinking particularly of the POSC/CAESAR compliant VAV project, which is going well, but we haven't finished yet. One reason the facilities area is ahead of E&P is the prevalence of the PC as the hardware platform.

N.McN. Does IBM see POSC's DAE as the way forward, or the relational projection of Epicentre?

G.P. We follow and implement POSC as far as possible; we are certainly not intending to promote a competing architecture. In fact the DAE, when it is released next year, will be promoted as the preferred means for applications to access the Project Data Store (PDS).

N.McN. Do you see the wave of business software tools such as SAP, Oracle and J.D. Edwards as crossing the divide between finance/administration and the E&P technical department?

G.P. That's not clear. We are currently working on links between PetroBank and the PDS - which today is a development of the Tigress database, but which is being migrated to Epicentre. PDS handles multiple copies of an interpretation perhaps done by different teams. PetroBank to SAP links would need to be implemented at the project level - we are currently working on this.

N.McN. Can we now take a look at the broader world of IT, and where you think that's going? To take one of your earlier statements, about technological push being replaced by business pull, I remember the days when everything just had to run on UNIX, as though that was the only criterion for an E&P buyer. Do you think that Java is shaping up to be another example of pure technological push?

G.P. An aspirin looking for a headache? To an extent, but Java is useful especially in the pace and ease of Inter/Intranet implementation. IBM's view is that the Internet's underlying suite of standards promote information for everyone anywhere. Java complements this vision very well but IBM is not in the Java/Windows 95 war - nor in the PC/Net PC war either. We will make our products compliant with what is required.

N.McN. Would you go as far as to say that the Internet has solved the problem of Client/Server computing?

G.P. Yes. Of course performance and availability of critical business applications still need improvement. Security is also an issue to be resolved before e-commerce will take off. Incidentally, the development of PetroBank has been in the forefront of secure systems. There have been orchestrated attempts to break into PetroBank, so far without success.

N.McN. What is IBM's attitude towards the POSC Business Objects/Interoperability initiative? Do you see this as an alternative route to the "mother of all databases" approach?

G.P. We are very strongly committed to object technology and use and promote OO programming techniques heavily. Without getting too technical, if you define an object as actions and associated non-persistent data, it begins to look like a mini-application. You will still need a database or databases for the storage of persistent data. Our experience in manufacturing comes in here, where an object framework fits the business quite naturally. In fact the applications we have developed using this technology have been used in other industries.

N.McN. In PDM we have been tracking the way the E&P software industry, far from moving towards interoperability, seems to be polarizing around GeoQuest's GeoFrame environment and Landmark's Open Works. Do you see IBM, with PetroBank and the PDS developing as a third pole here?

G.P. I don't like to think of a third pole. We have developed a POSC compliant architecture allowing plug and play for similarly compliant applications, which lowers the entry barriers for new vendors and promotes cooperation between, say, the petrophysicist, seismic interpreter and petroleum engineer. As businesses follow the asset-based paradigm I outlined earlier, speed of change becomes the dominant factor - wouldn't it be better if more companies adopted this POSC-based technology? That is the key.

N.McN. That still sounds like a third pole to me, and GeoQuest also claims POSC compliance.

G.P. If GeoQuest want, they can plug and play with us! Landmark already has an agreement with PetroBank. The problem is one of direct access to the data - comparable to a direct write to hardware in the Windows environment. End users want performance; when you take the high ground of standards you can starve up there. The same issues apply to the thorny problem of upgrading the data model. At some point you will have to make someone unhappy! I believe that the industry is at a turning point - vendors are to an extent at the mercy of their customers. The usual trade-offs apply: short-termism versus the long term, performance against interoperability. We need tangible support for standards and we need it soon.



POSC Data Access and Exchange Dissected (October 1997)

PDM attempts a layman’s guide to POSC’s Data Access and Exchange Layer, intended to federate Epicentre databases from different suppliers.

To understand the importance of the DAE featured in this month's lead, it is necessary to dissect POSC's objectives in the field of interoperability. For applications to be able to "plug and play" into a foreign data model, they would appear to need a complete understanding of the model's internal structure, so that they can be tailored to access the data directly. The drawback of this approach is that the application has to have an intimate knowledge of how the database is built, and that an application-to-database mapping must be hardwired for each new database encountered. A further problem arises when a new version of the database comes along: the application needs a re-write to be able to use the new, improved version of the database. Modern database designers use techniques that not only get around these problems, but also allow for a better separation of the high-level conceptual design of the database from the low-level tables and rows of the physical implementation. This is the world of logical data models, of tiered database schemas, of entity-relationship modeling and of the relational "projection".
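
A small, hypothetical C fragment may make the maintenance problem concrete. The schema names below are invented, and the dae_query() call anticipates the kind of interface sketched in this month's lead - this is an illustration, not any vendor's actual code.

    #include <stdio.h>

    int main(void)
    {
        /* Hardwired access: one query string per vendor, and another
         * again after each schema upgrade (all names invented). */
        const char *vendor_a_v1 = "SELECT TD_M FROM WELL_HDR";
        const char *vendor_a_v2 = "SELECT TOTAL_DEPTH FROM WELL"; /* post-upgrade */
        const char *vendor_b    = "SELECT DEPTH FROM BOREHOLE_TBL";

        printf("%s\n%s\n%s\n", vendor_a_v1, vendor_a_v2, vendor_b);

        /* Middleware access: the application states one logical request
         * and the layer resolves it against whatever schema is underneath:
         *
         *     dae_query(session, "well.total_depth", &cursor);
         */
        return 0;
    }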

Complexity

The essence of these methods is that you first design your data model as the data really exists in the real world. Imbricated hierarchical relationships between objects are described as completely as possible, so that many wells belong to an oilfield, many oilfields belong to a joint venture, many joint ventures belong to a company and so on. Simultaneously, many separate sets of wells will be owned by different joint ventures, with different combinations of companies grouped together in the joint ventures. Companies may also enter and leave the JVs, bringing a temporal element into the equation. If you think about how hard things can get with these simple high-level objects, you get an idea of how complicated the resulting logical model will become once it has completely described fluid properties, well bores, logs and all the other entities that make up the real E&P world. The scope required of POSC's Epicentre data model can be imagined from the description of Version 2.0 of the Epicentre Logical Data Model, which was "designed to serve the needs of a very broad variety of technical application programs from many different suppliers and meet the data management needs of all E&P companies throughout the world".
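
A minimal sketch in C structs - rather than Express, and with every type and field name invented for illustration - shows how much structure even these "simple" high-level objects accumulate once the temporal element is included:

    /* Hypothetical structs for the well/field/joint venture hierarchy. */
    #include <time.h>

    typedef struct Company  Company;
    typedef struct Oilfield Oilfield;
    typedef struct Well     Well;

    /* A company's stake in a JV carries a temporal element: partners
     * enter and leave, and interests change, over the JV's life. */
    typedef struct {
        Company *partner;
        double   interest;   /* fractional working interest */
        time_t   valid_from; /* membership start */
        time_t   valid_to;   /* membership end (0 = still current) */
    } JvMembership;

    typedef struct {
        const char   *name;
        JvMembership *members;  /* many companies per JV ... */
        int           n_members;
        Oilfield    **fields;   /* ... and many fields per JV */
        int           n_fields;
    } JointVenture;

    struct Oilfield {
        const char *name;
        Well      **wells;      /* many wells per oilfield */
        int         n_wells;
    };

    struct Well    { const char *name; double total_depth; };
    struct Company { const char *name; };

And this is before fluid properties, well bores and logs enter the picture.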

Logical?

The resulting "logical" data model defined by POSC has been written in a specialized language called Express, which came from the building and construction industry, where it is used to describe parts and components and the way they are assembled to produce sub-assemblies and finished products. The Express-defined logical model is complex in the extreme, and its manipulation is not for the faint-hearted, as PDM revealed in the Discovery story last August. The next step in modern database design is to fit the square peg of the logical model into the round hole of the relational database. A relational database needs everything to be described in two-dimensional tables filled with data in rows and columns. The process of creating a relational projection - or a "physical" data model - from the logical model is known as mapping, and is something of a black art. It is important to note that the creation of a physical implementation from a logical data model is non-unique. In other words, a single logical model such as Epicentre can be "projected" onto many different relational "planes".
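
To see what non-uniqueness means in practice, consider a single logical "well" entity projected onto two equally valid physical layouts. The DDL below (held as strings in a trivial C program) is invented for illustration; neither layout is any vendor's real schema.

    #include <stdio.h>

    int main(void)
    {
        /* Projection 1: one wide table per entity. */
        const char *ddl_a =
            "CREATE TABLE WELL (WELL_ID NUMBER, WELL_NAME VARCHAR2(64),"
            " TOTAL_DEPTH NUMBER, FIELD_ID NUMBER)";

        /* Projection 2: the same logical content normalised into a
         * header table plus a generic property table. */
        const char *ddl_b =
            "CREATE TABLE WELL_HDR (WELL_ID NUMBER, WELL_NAME VARCHAR2(64));"
            " CREATE TABLE WELL_PROP (WELL_ID NUMBER,"
            " PROP_NAME VARCHAR2(32), PROP_VALUE NUMBER)";

        printf("%s\n\n%s\n", ddl_a, ddl_b);
        return 0;
    }

Both store the same logical data, but an application hardwired to one layout cannot read the other - hence the need for a mapping layer.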

Commercial Epicentre

Thus GeoQuest has projected parts of Epicentre onto parts of the GeoFrame physical implementation; IBM has done likewise with PetroBank, and PECC/Petrosystems with PetroView. The poor application trying to access these different data models is back to the stage of a complete re-write for each environment. Interoperability nil!

POSC saw this problem a long way back down the road, and while the previously mentioned "Epicentre" implementations are described as POSC compliant, they do not really do it the way POSC intended. The protection built into the complete POSC specification relies on the existence of an intermediate layer between the application calling for the data and the database - the Data Access and Exchange Facility (DAEF). This software layer allows the application to see a consistent description of the data which is independent of the underlying data model. Better still, the DAEF is capable of peeking into the physical database, reading information about how the data is really stored - the metadata - and then supplying the data in the way the calling application requires. Enter the Data Access and Exchange specification and LightSIP.



Read the book! (October 1997)

If talk of Epicentre and the DAE is still all Greek to you, a jump start into the world of E&P database management can be had by reading our publication the Survey of Data Management in the E&P Business - details from PDM's publisher The Data Room - sdm@the-data-room.com.



POSC/CAESAR - grass looks greener (October 1997)

POSC’s attempt to hook up the upstream’s Epicentre data model to the Product Data Management world of the offshore construction business has encountered some serious problems.

Elsewhere in this issue you will find some explanation of how a data modeling tool borrowed from the Product Data Management arena - the Express modeling language - was selected by POSC as the tool of choice for E&P modeling. It was thus fairly natural for POSC to team up later with the Norwegian offshore construction industry on the development of a PDM-like extension to Epicentre to cover platform building and the like.

Unified?

The initial goal has been described as an attempt to find the Grand Unified Theory of data modeling, or alternatively to build the "mother of all data models". Things did not quite go according to plan, and the history is obscured by marketing claims. What seems to have happened is that first there was CAESAR, using ISO-STEP (more Express) to define its data structures; then along came POSC and said, how about tying this all in to Epicentre? So POSC/CAESAR was formed to do just that, but it turned out that this was too hard, and CAESAR decided to carry on as before, except that it is now called POSC/CAESAR anyhow. The only losers are people who believed that this one was going to fly and actually went to the trouble of coding some of the Epicentre extensions which were to encroach on the CAESAR part. They have been left out in the cold. Viewed from the E&P side of the equation, the grass does look greener on the construction side of the fence. This could be just an observation, or it may relate to Express being more appropriate for describing construction objects than E&P data structures.



LGC releases Parallel VIP (October 1997)

Landmark Graphics Corporation is shipping Parallel-VIP, described as a 'powerful and fast oil and gas reservoir simulator for both black oil and compositional simulations'.

This new release provides reservoir engineers and geoscientists with the computing horsepower and "full-featured", integrated parallel simulation software to simulate large, as well as small, hydrocarbon reservoirs more quickly and accurately. Parallel-VIP, which runs on distributed memory parallel processors, was rolled out on IBM's RS/6000 SP at the Society of Petroleum Engineers' Annual Technical Conference and Exhibition in San Antonio. "We believe that Landmark is delivering a powerful and unique capability to the industry," said John Gibson, executive vice president of Landmark's Integrated Products Group. "Essentially, Parallel-VIP will enable reservoir simulation to make greater contributions to the core business of our customers - managing and optimizing oil and gas reserves."

Mega-cell

Parallel-VIP and the IBM SP provide users with the solution they need to address the entire reservoir as well as manage the interrelationships between individual gridblocks. "By vastly expanding the scalability of Parallel-VIP to include even more processors, we are enabling our customers to significantly increase the processing speed of their reservoir models," said John Killough, Landmark's principal reservoir scientist. "Our customers can better quantify and manage uncertainty because they can conduct multiple iterations of their simulation models rather than be restricted to a single run due to time constraints. VIP can also process even more gridblocks of data - faster. Now models of one million to five million gridblocks can be simulated with less upscaling." "We are also excited about the application of Parallel-VIP to 4D seismic," Killough continued. "This capability will allow reservoir simulations to be performed at the resolution of seismic or geologic data, without the need for upscaling, which has traditionally lowered resolution considerably."
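
For readers unfamiliar with distributed memory computing, the toy C/MPI program below sketches the general principle behind simulators of this kind: each processor owns a slice of the gridblocks and exchanges boundary ("halo") values with its neighbours at every timestep. This is a generic illustration of domain decomposition, not Landmark's code, and the update loop is a stand-in for a real solver.

    /* Generic domain-decomposition sketch; compile with mpicc. */
    #include <mpi.h>
    #include <stdlib.h>

    #define N_GLOBAL 1000000 /* one million gridblocks */

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* Each processor owns a contiguous 1-D slice of the grid,
         * plus one halo cell at each end for neighbour data. */
        int n_local = N_GLOBAL / size;
        double *p = calloc(n_local + 2, sizeof *p); /* e.g. pressures */

        int left  = (rank > 0)        ? rank - 1 : MPI_PROC_NULL;
        int right = (rank < size - 1) ? rank + 1 : MPI_PROC_NULL;

        for (int step = 0; step < 10; step++) {
            /* Exchange boundary cells with neighbours. */
            MPI_Sendrecv(&p[1], 1, MPI_DOUBLE, left, 0,
                         &p[n_local + 1], 1, MPI_DOUBLE, right, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            MPI_Sendrecv(&p[n_local], 1, MPI_DOUBLE, right, 1,
                         &p[0], 1, MPI_DOUBLE, left, 1,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);

            /* Update owned cells (stand-in for the real solver). */
            for (int i = 1; i <= n_local; i++)
                p[i] = 0.5 * (p[i - 1] + p[i + 1]);
        }

        free(p);
        MPI_Finalize();
        return 0;
    }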



Epicentre - A Shrink-Wrapped Relational Projection at last? (October 1997)

The debate continues as to whether POSC should continue with the high level abstraction of Epicentre, or go for a quick fix with a ‘shrink-wrapped’ relational projection of Epicentre.

At the POSC meeting in Oslo last month, questions were asked as to progress in the development of a "definitive" relational projection of Epicentre. Readers of this month's lead and other articles in this edition will appreciate that this is not exactly a politically correct question. Nevertheless it has been asked - by the POSC board, of the POSC management - and was the subject of quite enthusiastic debate among members. There has for a long time been a push from on high - starting with the ill-fated Discovery project - to have a relational projection of Epicentre that could be supplied to the world as THE physical implementation of Epicentre. POSC members who have invested in the DAE type of solution are not over-enthusiastic about this potential climb-down. Nor are the major vendors who have already done the work and are pushing their physical implementations as the true path.

Anti-trust?

But at the heart of this issue are the future direction of POSC, the extent to which such a standard could be developed without infringing US anti-trust regulations, and what exactly the members' requirements are. Possibilities other than a relational projection are LightSIP, or a non-relational database. But the former is perhaps a bit youthful, while the latter has already been tried in the UniSQL implementation of Epicentre without producing much in the way of product. One of POSC's problems in this area is that it has to keep to the open systems high ground and cannot be seen to be underwriting one database management system at the expense of another. The move to a shrink-wrapped Epicentre would almost certainly involve opting for one vendor - in all likelihood Oracle - and perhaps a migration from Express to Oracle CASE tools. Oracle 8 must look like a tempting platform in this respect. POSC is currently involved in some serious soul-searching in this area.



ESRI announces ArcExplorer Version 1.0 and new magazine (October 1997)

ESRI has just announced the final release of ArcExplorer Version 1.0, an 'easy-to-use', free GIS data browser that supports Windows 95 and Windows NT 4.0.

ArcExplorer is designed to "dramatically" change the way that geographic data can be viewed and shared throughout organizations and the world. Simultaneously, ESRI is launching a new quarterly magazine, ArcUser, in response to growing interest from the ESRI user community in more technical information. ArcUser will be available in the first quarter of 1998 and will be mailed automatically, free of charge, to all registered users of ESRI software. Outside of the United States, ArcUser will be distributed through ESRI's network of worldwide distributors. More on http://www.esri.com.



PPDM Association releases model version 3.4 (October 1997)

The Public Petroleum Data Model Association (PPDM) has just announced the release of model version 3.4 (PPDM v3.4) for review and approval by the membership.

PPDM is the "other" E&P data model. There is no "logical" high-level description of the model here, just a relational implementation which is supplied for use by members. PDM (no relation) covered the PPDM extensively in the November 1996 edition. We concluded that, apart from the intrinsic merits of the different models, an understanding of the way the PPDM model has evolved is useful in mapping out the future - even for POSC. In particular we highlighted the difficulty of upgrading the installed base from one release of the model to the next. The new version 3.4 is now available to download from the PPDM web site at www.ppdm.org.

Calgary AGM

Highlights and implementation insights will be presented at the PPDM Annual General Meeting in Calgary, October 22 & 23, 1997 where the model approval process will be reviewed and discussed. (Traditionally, the model is approved via a membership vote.)

Release highlights include:

New subject modules for Production Reporting and Land Mineral Rights

User friendly, drill-down style documentation

Increased rigor & testing throughout model development process

Architectural consistency and increased integration for Well, Production, Seismic & Land data

Hydrocarbon production tracking and reporting is enhanced and expanded in PPDM v3.4 in readiness to tie into production accounting and other downstream systems. The seismic portion of the model has been improved to better support seismic acquisition and data management. Minor adjustments were made to the structure of the well tables for international well requirements. Land Mineral Rights handles the description of mineral rights both spatially and chronologically. It forms the nucleus for more extensive modeling in the Land subject area.

Mapping

Extensive drill-down documentation makes the model easy to understand and assimilate for both end user and technical staff. To facilitate take-up, a complete mapping between PPDM v3.3 and PPDM v3.4 will be available shortly. Additional supplementary documentation (such as description of testing and users guide) will be distributed as it is finalized.

Increased emphasis on model validation and testing occurred at every stage of development - business requirements, model design, model implementation with sample test data and sample queries. Workgroup activity through the alpha and beta testing phases helped to provide real world feedback and validation.

Stake in the ground

PPDM v3.4 is described as "a solid stake in the ground, marking a healthy evolution to consistency with the PPDM architectural design principles". Reference tables and constraints have been expanded to strengthen the value of the data asset through increased data reliability and early detection of errors and inconsistencies.

Extensive industry volunteer workgroup activity is the heart of PPDM. More than one third of the PPDM membership participated in some way to help develop and evolve PPDM v3.4. This contribution of time and expertise to define business requirements, model the subject area and validate through testing is invaluable. PPDM would like to make special mention of the following members for their exceptional commitment and contribution: Alberta Energy Utility Board (Calgary), Amoco Corporation (Houston), Petro-Canada (Calgary), PI/Dwights (Calgary & Denver). For more information, contact: Mel Huszti, Executive Director, Public Petroleum Data Model Association, (403) 660-7817 or info@ppdm.org. PDM will be reporting on the PPDMA AGM in the November edition.



Express - but is it the right bus? (October 1997)

POSC made a bold choice in selecting the Express data modeling language to describe E&P data. But doubts have been... 'expressed' regarding this exotic language.

POSC's selection of Express as its data modeling language was a bold decision, made when there wasn't much else in the marketplace. Is it the right language for defining THE industry-wide data model today? There are two sides to this issue: one is the suitability of the language, the other is the take-up of the language. The second issue may not seem too important, but it is. There are many examples of brilliant IT inventions, from ALGOL, through LISP, to OCCAM. But if there are not a lot of people using a language, it is expensive to deploy, and may even expire. The paucity of Express-related information on the web - nearly all of it POSC-related - looks gloomy in this respect. The Express language was initially used in the Product Data Management (PDM) arena, so it is used to describe the way nuts and bolts are assembled into the bits and pieces of a car - can this really help us describe an oilfield?

Early adoption

On the face of it, the answer appears to be a qualified "yes" in the POSC/CAESAR environment - i.e. in the construction field, where the parts of an offshore platform are pretty much like the parts of an airplane or car. In the upstream POSC area things are not so clear. It is not so much the complexity of the exploration business that is the problem - after all, an airplane is pretty complex - but rather the difficulty, and even ambiguity, in defining the data structures used in E&P. It is arguable that POSC suffers in this domain from being an early adopter of this technology. It does not appear that, POSC apart, Express has found much of a following outside the PDM arena. The underlying concept of the logical data model as a stepping stone to a robust implementation has, however, gained wide acceptance, but the tools used are either more or less generic Computer Aided Software Engineering (CASE) products or add-ons supplied by the database vendor, such as Oracle's CASE tools.


