January 1998


IHS GROUP ACQUIRES PETROLEUM INFORMATION / DWIGHTS (January 1998)

Acquisition Creates World’s Leading Source of Information on Oil & Gas Exploration & Production Activities

Following on from its takeover of Petroconsultants in 1996 (PDM Vol. 1 N° 4), Information Handling Services Group Inc. (IHS), the leading international information database publishing group, has announced the acquisition of Petroleum Information / Dwights LLC (PI/Dwights), based in Houston. PI/Dwights is the result of the recently completed merger of Petroleum Information Corporation and Dwight’s Energydata, Inc., together described as "the leading source of information on oil and gas production and well history data across North America and in the North Sea".

synergy

Miles Baldwin, Vice President of Corporate Development at IHS, told PDM "PI/Dwights and Petroconsultants have a perfect fit and represent great potential for synergy. Additionally IHS has a lot to offer the two units in terms of performant databases, web-based information distribution and adding value to delivered data. IHS has undertaken a large migration to electronic commerce over the last 6 years or so and some 95% of revenues are from e-sales. Our acquisition of Petroconsultants has been a runaway success for IHS so far, and we have every expectation that PI/Dwights will leverage this further." Michael J. Timbers, Chairman and CEO of IHS Group, described the acquisition of PI/Dwights as a "natural extension of IHS Group’s 1996 acquisition of Petroconsultants, the leading supplier of information on oil and gas exploration and production activities outside of PI/Dwight’s core North American markets." He explained that "IHS Group now represents the most comprehensive source of information on energy exploration and production around the world."

maneuvers

Corporate maneuvers revolving around Petroconsultants, PI and Dwights have a long history, with PI looking to acquire Petroconsultants as far back as 1971. Several attractive offers were made over the years, but they failed to tempt Petroconsultants' president and founder Harry Wassall, who preferred to keep his company rather than take the money. More recently Petroconsultants looked at both Dwights and PI, but their merger and the subsequent acquisition of Petroconsultants by IHS pre-empted these efforts.

Further down the road, a name change is a possibility for the new grouping, but this is causing some headaches. All the members of the new group have strong brand images and strengths in various parts of the world. Calling the new group "Petroleum Information/Dwights/Petroconsultants" is generally held to lack pizzazz. In countries where one or other of the constituent companies has a predominant position, it will be business as usual on the corporate branding front. Elsewhere, such as in the UK, some reorganization will be inevitable.

Suter to go

Christian Suter, president of Petroconsultants, told PDM that he had tendered his resignation about a year ago, after 12 years with the company. He will be leaving the group in May 1998. In the interim, the new merged unit will be jointly led by Suter and Charles Ivey, CEO of PI/Dwights. Suter will remain with the group in a consulting position after that date, while Ivey assumes the role of CEO of both companies. Suter told PDM "The combination of PI/Dwights and Petroconsultants has created many interesting possibilities for improved and more cost effective services to our combined client base".

To date, the technical input from IHS into Petroconsultants has been minimal, amounting to some exchange of experience in the field of secure communications. Suter foresees increased synergy coming from the association with PI/Dwights. To accelerate this process, an exchange of senior management roles is already underway. Matt Carter (from IHS) will be VP Finance for both units, David Noël (PI/Dwights) will head up the group's technology division and Dale Pennington (Petroconsultants) is in charge of sales, marketing and product support.

Model merger?

The thorny issue of a merger of the two constituent IT offerings, and in particular of the different data repositories deployed by the two companies, will be addressed pragmatically. The existing data delivery systems from both PI/Dwights and Petroconsultants will continue to be supported. For clients who require visibility of both datasets, loaders will be developed to move data from Petroconsultants' Iris21 to PI 2000 and vice versa. This is considered a more realistic route than a model merger, at least for the time being. Meanwhile, all future software and communications will be developed in common. Standardization issues are to be addressed too, and common naming conventions are to be developed.

As a result of these two acquisitions, energy information represents IHS Group’s second largest revenue stream, after its engineering information businesses. IHS now controls some $110 million per annum in E&P information services revenues, with a contribution of around $70 million from the PI/Dwights group and the remaining $40 million coming from Petroconsultants. This represents quite a hike in revenues for IHS, which hitherto grossed around $440 million per annum. Both companies will report to Chris Meyer, President, COO and CFO of IHS Group, who expects "to further expand IHS Group’s activities in this arena with selected acquisitions of additional energy related information businesses."

PI/Dwights and Petroconsultants share strong sales and operating synergies. Where appropriate, each company will develop database products across common software platforms, enabling customers active in several geographic markets to manage, evaluate and exploit all exploration and production information from a single solution. Both companies will also cross-sell each other's data products and services, combining common sales operations in Houston and the UK.



PDVSA $165 million data management contract claimed as largest ever (January 1998)

Petroleos de Venezuela SA (PDVSA) has awarded a three-year, $165 million contract to GeoQuest to manage PDVSA's exploration and production (E&P) integrated applications and data environment.

The agreement was effective January 1, 1998. Technical data from PDVSA's Exploration and Production Divisions will be consolidated in a corporate database compliant with international standards, enabling PDVSA's E&P personnel to process, analyze and interpret data and information through commercial applications. Within this integrated environment, GeoQuest will deliver specialized services to PDVSA's E&P personnel to enable them to use these data and information effectively and efficiently. "As a strategy to support the Apertura (Opening) process of the Venezuelan oil industry, since the beginning of this year PDVSA has begun implementing a radical organizational transformation. This reorganization replaces the vertically integrated affiliates with a functional structure in which PDVSA E&P has assumed all the responsibilities for upstream activities," said Marco Rossi, chief information officer of PDVSA E&P. "An E&P integrated environment," continued Rossi, "managed by a world-class corporation such as Schlumberger GeoQuest, is a fundamental step to strengthen the new entity technologically. In addition, this environment will substantially increase the staff's productivity and provide a common technology platform that will enable the adoption of best practices and, in general, enhance the organizational learning process of the newly created corporation."

By integrating technology and specialized services, PDVSA takes a step toward its goal of increasing production capacity to more than six million barrels per day by the year 2006. "As a strategic partner, GeoQuest's and PDVSA's visions for the future are closely aligned," said Alberto Nicoletti, GeoQuest vice president of Latin America. "Our corporate objectives are to improve the quality of the decision-making process and reduce the cycle time in E&P activities. Better data quality, advanced technology and ongoing training are key to making that happen."



PDM Editorial - are things really getting better? (January 1998)

Neil McNaughton discusses a remastering project that The Data Room was involved in recently and shows how in IT, even the simplest task can become complex. The best laid plans can come unstuck through details such as what characters are allowed in a file name. Rather than offering a new standard solution, he suggests that a little education might go a long way.

The impression given at the SMi E&P Data Management and Data Repositories conference held in London this month was that the opinion people hold of data management depends on which sector of the industry they belong to. If you are a vendor of data management software or services, then things are on the up and the problems are being solved. If you are an oil company client or consultant, actually working on the shop floor, then the improvement is sometimes harder to discern. Two client-side presentations at the SMi conference - from Tim Bird of Enterprise and Mairead Boland of Shell - illustrated this point, and a summary of their papers is given elsewhere in this PDM. I'd like to align myself with the client-side viewpoint on this one, chip in with a few anecdotes of my own and try to come up with, well, I'm not sure what - certainly not an answer. The common experience here is that while data management has been solved at a conceptual level, somehow it is just not being put into practice.

Devil in detail

As a starting point, let's take the havoc wreaked by the failure to respect some simple conventions. I would like to relate a recent personal experience: an outsourcing project that involved processing a very large number of legacy well logs and transcribing them to high density media. The details are not important, it is the devil therein that is. First it is worth observing that when you are working with a contractor, you cannot go into his shop and expect him to re-engineer all his software to your corporate standards, unless you are a very big corporation and this is a very big contract. This may mean using some software and computing platforms which will not allow you to do what you want to do - even in the simple area of naming conventions.

In the project in question, UNIX, VAX and PC hardware was in use. Unfortunately, you cannot expect much cooperation from the hardware in such an environment. A slash "/" may be OK in a file name on one system, but becomes a new directory in another. The PC world, at least in its DOS and Windows 3.1x incarnations, will only allow you to store 8 plus 3 characters in a file name. Yet another constraint was the acceptance or otherwise of whitespace in file names across the different operating systems. All of which meant that the well names changed three times in this one project as files were read in, processed and output.

As I said, short of re-engineering the whole process, nothing could be done to avoid these operating system quirks, so we just had to live with them. The important issue here was to be aware of the problem, to try to achieve a cooperative way of working with the contractor and to adopt an agreed work-around at each stage of the process - in short, to develop a more or less formal procedure for the work in hand. One might expect that this would be the normal way of doing business for any contractor, but while contractors will have procedures for the most commonly executed tasks, these may be constrained for reasons such as those outlined above. Elsewhere, no two datasets are alike, and in special processing no two jobs are alike. If you follow the procedures manual to the letter, you may not actually be doing what you want.
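
By way of illustration, here is a minimal Python sketch of the kind of agreed naming work-around such a project needs - a hypothetical convention, not the one actually adopted on the project - mapping well names to file names that survive every platform in the chain:

import re

# Characters that are unsafe in at least one system in the chain: "/" is a
# path separator on UNIX, "\" and ":" have meaning on DOS and VAX/VMS, and
# whitespace is not accepted everywhere.
UNSAFE = re.compile(r'[/\\:*?"<>|\s]')

def portable_name(well_name, dos_8_3=False):
    """Map a well name to a file name acceptable on every target platform."""
    name = UNSAFE.sub('_', well_name.strip())
    if dos_8_3:
        # DOS and Windows 3.1x allow only 8 characters plus a 3-character
        # extension, so long names must be truncated (and logged!).
        name = name[:8]
    return name

# The same well keeps one recognizable name through every stage.
print(portable_name('15/21a- 23'))               # 15_21a-_23
print(portable_name('15/21a- 23', dos_8_3=True)) # 15_21a-_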

Human error

Another recognized cause of error is the mis-typed entry - human error. Everyone agrees that this should be addressed by improved constraints on data entry, with checks, limits and lists of values being enforced by the database. But in the real world a lot of data is not even entered into a database - at least not at the point of capture. Probably the most common tool used for data entry is the spreadsheet. This is a devilish invention indeed. You may have read three thousand integer values into a database from a spreadsheet before you come across two values separated by a comma, or a note from the operator to himself saying "can't read this". Anything goes in a spreadsheet because there is no intrinsic data integrity: no checks, no ranges, no lists of values. Well, actually this is not quite true, and while the proper answer to how to use a spreadsheet is probably "don't", a reasonable compromise would be to investigate the data validation possibilities of your favorite tool. In MS Excel, areas of the spreadsheet can be set to pre-determined ranges of values, or restricted to values in a list. It is "just" a matter of educating people to use these functions. But the education bit is probably the hardest part - you may even have to educate your boss.
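
The same checks can of course be applied after the fact, when spreadsheet output is loaded into a database. The following Python sketch - column names and limits are invented for illustration - applies a range check and a list-of-values check to a CSV export and reports exactly the kind of garbage described above:

import csv

# Hypothetical constraints of the kind Excel's data validation can enforce:
# a numeric range for one column and a fixed list of values for another.
POROSITY_RANGE = (0.0, 0.6)
VALID_LOG_TYPES = {'GR', 'SP', 'RHOB', 'NPHI'}

def check_row(row, line_no):
    """Return a list of problems found in one row of a spreadsheet export."""
    problems = []
    try:
        porosity = float(row['porosity'])
        if not POROSITY_RANGE[0] <= porosity <= POROSITY_RANGE[1]:
            problems.append(f"line {line_no}: porosity {porosity} out of range")
    except ValueError:
        # Catches the comma-separated pairs and "can't read this" notes.
        problems.append(f"line {line_no}: porosity {row['porosity']!r} is not a number")
    if row['log_type'] not in VALID_LOG_TYPES:
        problems.append(f"line {line_no}: unknown log type {row['log_type']!r}")
    return problems

with open('wells.csv', newline='') as f:
    # Data rows start at line 2, after the header.
    for line_no, row in enumerate(csv.DictReader(f), start=2):
        for problem in check_row(row, line_no):
            print(problem)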



Interview with Charles Ivey (January 1998)

In an exclusive interview, Charles Ivey, CEO of the newly formed PI/Dwights/Petroconsultants group, told PDM of the way the new combined companies would be working in the future.

PDM - How will you approach the future, what changes are in store for the two companies, will they merge?

Ivey - We will continue to have two national flags - Swiss and American. In the longer term we are seeking to have a world-wide perspective. Already our customer base overlaps to a considerable extent, with perhaps up to two thirds of our present customers having subscriptions to both Petroconsultants and PI/Dwights data services. We therefore need to be particularly aware of the way in which our market is segmented between domestic and international customers. These different needs must be addressed with appropriate technology and data services. Because of our position we will be extremely focussed on providing our clients with standardized solutions, with a view to improving data availability and reducing cost.

PDM - Will the IRIS21 database from Petroconsultants be subsumed into the Dwights/PI P2000 database in the medium term?

Ivey - I would not say that at all. In North America, it is the market that has driven PI/Dwights to the adoption of the PPDM data model. Elsewhere in the world where the basin and license paradigm is more prevalent, Petroconsultants' IRIS21 data model has been very successful. IRIS21 has "boxes" for just about everything, and with the PROBE subset is a very comprehensive solution. We will be very open with all our clients and we will attempt to find the best solution to deliver all their data - whatever that solution might be.

PDM - You mentioned standards, how do you see POSC's Epicentre in this context?

Ivey - One thing is for sure, we do not believe that one solution can fit all needs. We are very keen on the concept of late binding, whereby objects are composite and created "just in time" rather than "just in case". So in a real-world solution, one could imagine both PPDM and POSC based data models at a client location - which would be integrated through object technology much in the same way as the web does this.

PDM - Both PI/Dwights and Petroconsultants have done a lot more than just deliver data for quite some time now. You could be said to be encroaching on the "data management" territory of the GeoQuests and Landmark Graphics of this world. Seen from the client viewpoint, this situation could lead to a multiplicity of browsers and data access engines being necessary to build a project. Whose data browser and data delivery tools will prevail?

Ivey - We have no desire to compete with Landmark or GeoQuest. Our business is in data delivery and management, browsing and selecting. For a company with another system, there are two ways to move your data around - by push, or by pull technology.

PDM - We hear a lot about these terms today, particularly in the context of the world-wide web. How do you define them in data management terms?

Ivey - Pull technology is the way things generally work today. If you fire up your data management system and it goes looking for data in someone else's system, then that is pull technology. If, however, you fire up and it is the server system that provides the data - on its own terms as it were - then that is push technology. The difference is fundamental to us - especially in terms of interoperability with other vendors' tools. We want to be able to push our data into anyone's system, and we can do this. On the other hand, a request to pull data from our system can quickly turn into a nightmare for us, as there may soon be hundreds of different ways of accessing our data, all of which we must develop and support. We are currently working with Landmark on pushing data into OpenWorks projects. Ultimately this will be achieved through a middleware layer using business object type technology, but I believe it is unlikely that there will ever be a single way of achieving this. Site specificities and the market mean that we have to, and will, provide the data transfer mechanism that is right for the job in hand.
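
To make the distinction concrete, here is a toy Python sketch of the two transfer styles - all class and method names are invented for illustration and bear no relation to any actual PI/Dwights or Landmark interface:

class ProjectStore:
    """Stands in for a client system such as an interpretation project."""
    def __init__(self):
        self.wells = {}

    def accept(self, well_id, record):
        # Push target: the vendor calls this on the client's system.
        self.wells[well_id] = record


class DataVendor:
    """Stands in for the data provider's delivery service."""
    def __init__(self, catalogue):
        self.catalogue = catalogue

    def serve(self, well_id):
        # Pull source: the client initiates and the vendor answers.
        return self.catalogue[well_id]

    def push_updates(self, store, well_ids):
        # Push: the vendor initiates delivery, on its own terms.
        for well_id in well_ids:
            store.accept(well_id, self.catalogue[well_id])


vendor = DataVendor({'W-1': {'status': 'producing'}})
project = ProjectStore()

record = vendor.serve('W-1')           # pull: client asks, vendor answers
vendor.push_updates(project, ['W-1'])  # push: vendor loads the client store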

PDM - Both Petroconsultants and PI/Dwights have a long history of actually using data, how do you intend to build on that in the new group?

Ivey - We have been working for some time on some new software - Powertools - which would be categorized as intelligent data retrieval with processing on the fly. So, for instance, an engineer selects a group of wells, and decline curves and perhaps net revenue are computed using company defaults - we are delivering data plus its meaning. This is a similar approach to that used in Microsoft PowerPoint, where the image selection tool displays thumbnails of images as they are browsed in the file locator. In this context we are working on an intelligent tool that performs statistical analysis on the fly upon selected data. Most software vendors have underestimated the data problem. I firmly believe that it is data that is the heart beating inside the exploration and production business.



Landmark welcomes UNIX extension of Microsoft's COM middleware (January 1998)

Microsoft has just announced plans to license and support the Component Object Model (COM) on platforms other than Windows. COM is the basis of Object Linking and Embedding - the technology that lets you use a spreadsheet from within your word processor.

Extending this technology to platforms such as UNIX would potentially let you work in a similar manner across the whole spectrum of enterprise computing. It would be conceivable to access UNIX-based well construction software, reservoir simulation, or anything likely to provide relevant data, from a spreadsheet in the finance department. More probably, the enterprise computing system would be configured so that the relevant data and tools were hard-wired together through the middleware, so that your data would be computed and maintained on the fly.
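
For a flavor of what COM automation looks like from a script, here is a minimal Python sketch using the pywin32 package - it assumes a present-day Windows machine with Excel installed, and illustrates the single-platform case rather than the announced UNIX ports:

# Drive Excel through its COM automation (IDispatch) interface - the same
# infrastructure that OLE embedding builds on.
import win32com.client

excel = win32com.client.Dispatch('Excel.Application')
excel.Visible = False
workbook = excel.Workbooks.Add()
sheet = workbook.Worksheets(1)

# Read and write cells from outside the application, no GUI involved.
sheet.Cells(1, 1).Value = 'Target production (b/d)'
sheet.Cells(1, 2).Value = 6000000
print(sheet.Cells(1, 2).Value)  # 6000000.0

workbook.Close(SaveChanges=False)
excel.Quit()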

CORBA Killer?

On a more general level, interoperability through COM would provide a credible alternative to the Common Object Request Broker Architecture (CORBA) offerings from the Object Management Group (OMG), which have been slow in getting off the ground. The focus on interoperating with Microsoft Windows is also highly relevant for a company such as Landmark Graphics Corp., whose product line spans both the Windows environment and UNIX. This fact was underscored in a communication from Landmark's John Gibson, executive vice president of Integrated Products, who stated "Landmark applauds Microsoft's commitment to COM interoperability. Consistent, cross-platform COM implementations will enable independent software vendors, systems integrators and industry standards organizations to create a new era of enterprise-level solutions integrating business and technical applications, based on Windows and UNIX."

Iona onboard

To date most interoperability efforts - such as the recent POSC Interoperability RFT and the Open Spirit initiative (PDM Vol. 2 N° 12) - have focussed on the CORBA side of the business, with interoperability with Microsoft's environment usually an afterthought. Turning the traditional approach on its head, Microsoft has teamed up with CORBA provider Iona Technologies PLC. Iona is one of the first licensees of COM and will be providing interoperability with its Orbix implementation of CORBA. In a simultaneous announcement, Silicon Graphics Inc. (SGI) has licensed COM for its IRIX systems. Other IT organizations jumping on the bandwagon include Software AG, which is supporting COM on MVS, IBM's mainframe operating system. Leading systems integrators Andersen Consulting, Electronic Data Systems Corp. (EDS), KPMG and Vanstar Corp. also said they would support COM across their customers' mixed-platform environments.

Try it yourself

Finally, a suggestion. If you have Microsoft Office installed on your system you can try out some of this technology for yourself. Try inserting an Excel spreadsheet into a Word document (Insert - Object), then editing it and printing the resulting document. This is not exactly distributed computing, but it is a best-case test of COM technology - components from one supplier operating on a single system. You may find some aspects of your test very positive; you may also, unfortunately, find that this test of software interoperability under optimal conditions shows some weaknesses. Editing in Excel may not quite produce the results you expected in Word. System resources in Windows 95 may get gobbled up rather quickly and - we have to say this - the whole thing has been known to fall over completely. Distributed COM is here to stay and will probably eat up a whole chunk of the marketplace, but that should not stop Microsoft from dotting a few i's and crossing some t's in both their operating system and COM code to make this technology work better - and work all the time.



Essence Associates announces new version of the E&P Software Product Repository (January 1998)

The Software Product Repository is described as an 'intelligent repository of information about software products in the oil and gas industry'. The repository relates software products to both business activities and information topics.

Levels of compliance with international standards such as POSC (Petrotechnical Open Software Corporation) and ISO STEP are also recorded. Version 1.2 has now been released and supplies information on almost 300 products from nearly 100 different vendors world-wide. A free demonstration database, containing a subset of the total information, is available from the web site. Alternatively, a CD version including a run-time MS Access license is available on request. Inclusion of product information is free to vendors, while users of the repository are charged a corporate subscription fee. A minimum of two releases of the repository are planned per year.

To provide the necessary resources to support, develop and market the Software Product Repository, Essence Associates has formed an association with Brookeswood Computer Consultants (BCC). BCC provide, in addition to general consulting services, accredited training courses on the POSC Epicentre data model. BCC will be the first point of contact for Software Product Repository support, development, sales and marketing. More info on http://www.essence.co.uk/essence/.



OFM and GeoFrame - an apology! (January 1998)

Jeff Woodward (GeoQuest Indonesia) took issue with our 'confused' style and some content that misrepresented the overlap between GeoQuest's GeoFrame scope and the PC-based Oilfield Manager product (OFM). As far as the confusion and style go, we'll try to shorten our sentences and improve our proof reading. As for the misrepresentation, as Jeff pointed out, OFM cannot be said to overlap with GeoFrame in the fields of 'simulation, interwell imaging, or drilling capability' as we wrongly implied. Our apologies.



Oz Yilmaz joins Paradigm (January 1998)

Oz Yilmaz, erstwhile research geophysicist at Western Geophysical and author of the chef-d'oeuvre Seismic Data Processing (SEG Investigations in Geophysics N° 2), has joined Paradigm Geophysical as Managing Director of the Europe-Africa-Middle East (EAME) region.

Yilmaz states that Paradigm "pioneered the shift in paradigm from time- to depth-domain analysis of seismic data". The company now has over 100 employees at four regional offices - Houston, London, Singapore and Beijing, and a "talented research and development team dedicated to developing the technology needed for in-depth solutions in exploration and development of oil and gas fields". For additional information, email info@geodepth.com.



PPDM Releases Petroleum Data Model Version 3.4 (January 1998)

Version 3.4 of the data model from the Public Petroleum Data Model Association (PPDM) has just been officially released and 'reaffirms PPDM’s position as the leading upstream petroleum data model standard'.

Details of the internal release to members were already covered in PDM Vol. 2 N° 10. PPDM Chairman, David Fisher, said, "Version 3.4 is a major step forward in terms of content, documentation, and industry acceptance. Designed and evaluated by people who use it in their business, the PPDM Model delivers real value for member companies." The Association's Co-Executive Director, Melvin Huszti added, "Version 3.4 expansion of the well, production, seismic and land data, has positioned the Model to support integration of both technical and regulatory reporting information. Our new comprehensive documentation, combined with increased testing throughout the model development process, has made it even easier to understand and take up the Model."

Commenting on the progress of PPDM, Dr. Charles Ivey, CEO of Petroleum Information/Dwights, said: "The PPDM data modeling effort is of enormous value to our industry. PPDM has indeed become the working standard for oil and gas companies. We will continue to support this standard and those customers who use it because it is the future of our business as well. PPDM works very well for PI/Dwights and our customers."

PPDM's Chairman invited further participation in the continued development of the Model: "To sustain and expand this business-driven value, more effort is required through the cooperative development process that has proven so successful." Volunteer work groups are now developing the Model in the areas of lithology and land contracts, as well as evolving architectural principles to exploit the latest enhancements in database technology. More info from info@ppdm.org.



Why keep pre-stack seismics near-line? (January 1998)

We have been fretting of late as to why some companies keep their field seismic data on-line or near-line while others dispatch it to shelving or even off-site storage. We asked Calgary-based Kelman Archives, a spin-off of Kelman Seismic Processing Inc., why this was one of their key offerings. Neil Baker, Kelman's VP Marketing, told us the following.

We here at Kelman have several reasons for offering pre-stack seismic data on demand. The seismic processing shops that we serve actually prefer to process their data from disk. This obviously reduces the time required to read the data into their systems for demultiplex and speeds the turnaround cycle for all our common clients. We have supplied processing companies with the software required to read in field data from disk. The data is received in its recorded format, so no changes are required to the receiving company's demultiplex programs. Tape mounts are zero and data is read in at disk speed.

Our oil company clients using AVO and other pre-stack interpretive tools have found these to be computationally cumbersome in the past due to high disk space requirements and long processing times. To turn these into viable pre-stack processes we have created an interactive demultiplex module which allows our users to retrieve pre-stack data from the robot and demultiplex on the fly. This can be performed on either single or multiple records.

Mirrored data

Elsewhere, clients may be focused on seismic field program design. Here legacy field data can be analyzed from the workstation so that field parameter selection can be optimized. During seismic acquisition, clients with in-house processing capabilities can now easily view, test and analyze the data as part of normal QC procedures - even in isolated areas.

In Canada, trading and brokering data is a large component of geophysical activity. With pre-stack data on-line, together with other imaged data such as observers' logs, survey data and other supporting documentation, purchased data can be accessed and delivered instantly.

Our system is also an on-line disaster recovery tool, with two robots holding "mirrored data". With most libraries currently containing only one copy of data, and the known problems of stiction causing data loss, this system allows us to recover and protect this asset for the future. We believe that field data is for using, not just for storing. Companies can only exploit their data to the full - especially in the field of high-tech pre-stack processing - if it is accessible.



GeoQuest: FOCUS on Interoperability (January 1998)

In last month's PDM an article entitled 'GeoFrame, POSC and compliance' questioned the extent to which plug and play facilities would be offered to GeoQuest's competitors. Schlumberger asked us for an opportunity to put the record straight, and has contributed the following article by Najib Abusalbi, Johnny Brown, and Sam McLellan.

In September 1997, the POSC membership hosted the FOCUS conferences on information and knowledge management in the E&P industry. At these conferences in Melbourne, Australia and Oslo, Norway, GeoQuest demonstrated, along with some of its commercial software products, an experimental, World-Wide-Web-based software interoperability platform called GeoSIP. GeoSIP is a pilot test-bed for evaluating methods of integrating Petrotechnical Open Software Corporation (POSC) and other industry standards into Schlumberger GeoQuest product lines and, at the same time, a simplified development platform for manipulating technical information, including geoscience data.

From the web-based GeoSIP applet, an asset team member interested in a fast interpretation of some key data to make a multimillion dollar well completion decision might, for example (a toy sketch of this workflow follows the list),

determine which data are already available in any number of POSC Data Stores or PDSs (e.g., from GeoQuest’s GeoFrame interpretation system or those from other vendor systems),

send a query to an external data provider for supplemental information on log data, such as those stored in GeoQuest Finder’s LogDB archive,

as a result of this query, express an interest in receiving log data as it becomes available in real time from selected reservoir wells,

display, select and load relevant data via an implementation of POSC’s business object specifications into the team’s local PDS,

add, delete, edit and run queries against the selected data using an implementation of POSC’s Data Access & Exchange (DAE) specification to better understand the data that the team member needs to interpret or display,

generate on-line results for the team’s interpretation decisions.
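
The following toy Python sketch mimics the first and fourth steps of this workflow - discovery across several data stores, then loading into the team's local store. All class and method names here are hypothetical; the real interfaces are defined by POSC's DAE and business object specifications:

class PoscDataStore:
    """Hypothetical stand-in for a POSC Data Store (PDS)."""
    def __init__(self, name, logs):
        self.name = name
        self.logs = logs  # {well_id: log curve data}

    def query(self, well_ids):
        # A much-simplified query interface; the real one is POSC's DAE.
        return {w: self.logs[w] for w in well_ids if w in self.logs}


def find_available(stores, well_ids):
    """Discover which of the requested data already exist in each PDS."""
    return {store.name: store.query(well_ids) for store in stores}


geoframe = PoscDataStore('GeoFrame', {'W-7': [2.31, 2.29, 2.35]})
other = PoscDataStore('OtherVendor', {'W-9': [2.40, 2.38]})
local = PoscDataStore('TeamLocal', {})

# Step 1: find out what is already available, then load it locally (step 4).
for source, curves in find_available([geoframe, other], ['W-7', 'W-9']).items():
    local.logs.update(curves)

print(sorted(local.logs))  # ['W-7', 'W-9']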

While GeoSIP represents the latest in a series of efforts by GeoQuest to verify the POSC specifications themselves, it is also part of a process to evolve GeoQuest products with these specifications.

As POSC makes seed implementations available for its oilfield service members, these members must continue to validate them, correct any errors or ambiguities, and champion these improvements into the original specifications before integration into specific commercial products. For the past few years POSC has assumed the role of promoting the formation of joint industry projects (JIPs) in which E&P companies and vendors jointly test and validate individual interoperable data solutions proposed by POSC. An earlier joint interoperability effort between GeoQuest, Elf and PetroSystems was conducted in 1995 and publicized in TechnoPOSC, POSC’s membership technical journal. It demonstrated the issues of shared access to a PDS built from POSC’s out-of-the-CD Data Access & Exchange (DAE) implementation and Epicentre relational data store. As a result of the POSC-sponsored and managed joint effort called RESCUE, which began around 1996, GeoQuest products will interoperate with products of other vendors via POSC’s new business object specifications.

Schlumberger has contributed both to the process of compliance verification and to the expanded acceptance of POSC specifications within the E&P industry. While requiring considerable coordination and commitment of resources, all of these efforts provide a practical, as well as recommended, means of

understanding the technical issues involved in using the specifications issued by POSC or implemented by its membership,

estimating the resources needed to implement these open specifications commercially within GeoQuest products, and

validating how resulting implementations satisfy the interoperability demands of E&P team-focused work-flows.

Schlumberger and its GeoQuest division have been strong, supportive members of POSC since its inception, serving on most of the work groups, committees and POSC organizational structures.



Conference Report (January 1998)

PDM attended the SMi conference on E&P Data Management and Data Repositories held in London this month. Despite a (hopefully) short sharp downturn in the oil price, the mood was relatively buoyant - at least from the data management vendors' viewpoint. Some dissent was expressed, however, by those at the coal-face.

Helen Stephenson noted more willingness to attack the problems, although she doubted whether technological solutions were keeping pace with ever-increasing data volumes. David Archer, President and CEO of the Petrotechnical Open Software Corporation (POSC), shared the positive mood. He told PDM "We feel that our users are optimistic and that they are starting to feel that the use of the POSC standard is becoming inevitable". The inevitability of standardization was the leitmotif of the conference, with strong support for the "there is no alternative" (TINA) position expressed by Stewart Robinson of the UK DTI. POSC itself increasingly has to find its own funding and is about to start up consultancy services. Archer confided to PDM "We need to start earning money".

Gigo

Tim Bird, head of Information Technology with Enterprise Oil plc, made the important distinction between data logistics and data quality. The focus of most work done to date by data management software vendors has been data logistics. This leaves us with "solutions" which only address half the issue. Modern data management systems focus on "storing and moving data with ever greater speed". In short, the QC/QA function has been overlooked. Unfortunately, if the proper constraints are not applied to the data as it is acquired, this business process can degenerate into the transport, at high speed, of bad data. As they used to say, "garbage in and garbage out" - but faster. Bird proposes a new focus, whereby data management is equated with data logistics AND data quality. This can be addressed in a multitude of ways, from clearly defined units of measure to standard data dictionaries, unique well identifiers etc. Bird also stressed the importance of defining standard naming conventions at project level, before it is too late.

horror

This theme was picked up and amplified in a horror story recounted by Mairead Boland (Shell Expro), subtitled "to hell and back, one woman's journey through the world of petrophysical data management". As an indication of the monetary value of data management, Boland related how a naming convention slip-up cost Shell around $5 million in a North Sea development. Boland also emphasized the need for cooperation between oil companies, contractors and government in the implementation of common naming standards.

Purist

Ugur Algan from Petroleum Exploration Computer Consultants (PECC) described the deployment of their flagship PetroVision data management tool, currently used at 11 sites world-wide, the most recent installation being at Common Data Access (CDA) - see article elsewhere in this PDM. PetroVision uses the POSC Epicentre data model, with data accessed in a "purist" manner through a Data Access and Exchange (DAE) layer. Algan agrees that the use of a completely specified data model such as Epicentre does not solve the data quality problem, but the constraints implicit in populating an Epicentre data model do go a long way to improve data quality. On the issue of leveraging the corporate data store by providing efficient links to vendor applications, Algan underscored the need for cooperation between vendors, which he qualified as "variable - some are monolithic, some are keen to interact". Commenting on the recent signing of the CDA contract with PECC, Stewart Robinson described PetroVision as the DTI's future "POSC window on the world". On the data quality debate, Robinson stressed the importance of peer group pressure from colleagues keen on sharing correct data. While recognizing that POSC is not "the solution", Robinson agreed that it is a component of a solution, and emphasized the practical contribution that a quality National Data Repository could make.

Security

Contrasting views of Inter/Intranet deployment were presented by Ron Winwood of Hardy Oil and Gas and Chevron's IT supremo David Clementz. Winwood gave an enthusiastic description of the deployment and real-world use of Hardy's Intranet, which allows for efficient communications between their subsidiaries in far-flung parts of the world. Video-conferencing to Australia was cited as a particularly effective use. On a rather different scale, Chevron has around 30,000 users in more than 90 countries, creating a wide battlefront for Chevron's defense against intrusion. Chevron has recently conducted an audit of its network and is using the results of the study to educate its data managers in security.

Moat

Clementz described Chevron's approach to Internet security as protecting the "castle's moat", emphasizing the risks to any system from hackers, unscrupulous competitors, foreign governments, thrill seekers and bandits. Security problems range from theft, sabotage and fraud to plain old accidents. A frightening array of network hacking tools is freely available on the Internet. These can sniff your packets, snatching passwords during transactions and allowing hackers subsequent access to the system. IP spoofing can allow a hacker to masquerade as a bona-fide part of the Intranet, and other nasty Internet hacking techniques - such as the "Ping of death" - can simply clog up the system. A 1997 FBI study (they don't spend ALL their time looking for BC's girlfriends) estimated that 75% of intrusions involved financial loss, and that of these about half came from outside attacks. Chevron takes all this very seriously and has a police force operating an active monitoring program. Much of this effort goes into educating personnel in the correct implementation of security policy.



CDA awards seismic phase to QC Data, Petroleum Exploration Computer Consultants (PECC) and Stephenson and Associates (January 1998)

Common Data Access Ltd., the UK's national E&P data repository, has signed a contract with QC Data (the operator of the existing well log and hard copy phases), Petroleum Exploration Computer Consultants (PECC) and Stephenson and Associates for the initiation of the seismic phase of operations.

The project will involve the establishment of a seismic navigation database for the UK Continental Shelf, together with a database of company entitlements, license and cultural information. This information is to be stored in PECC's PetroVision Epicentre database. Data access is to be via QC Data's Axxes front end, which is already installed in-house at most of CDA's client sites. Stephenson and Associates will initially populate PetroVision with cultural and license data from the DTI. Seismic positional datasets, starting with the last 6 years' worth of North Sea activity, will follow.

Entitlements

The establishment of entitlements will also be addressed early on in the project. CDA is to take a more pro-active role in the establishment of entitlements in view of the anticipated complexity of the task ahead. PDM understands that, contrary to established practice in the UK North Sea, all positional data will be viewable by all CDA members. In the past, seismic location maps have been considered as proprietary. This phase of CDA's development is a preparation for the on-line seismic phase which will be announced at a later date. Current thinking is that the future bulk seismic data repository will be "virtual". That is to say that entitlements will be managed by CDA but the data will be stored at a variety of sites - both commercial and in-house. Access to bulk data will depend on the previously established entitlements, but there is also hope that the hitherto tight restrictions on public domain seismics in the North Sea might be relaxed.



Energy Research Clearing House hosts NT 'Saints' (January 1998)

ERCH, in collaboration with Resource 2000, hosted a meeting this month to 'explore an industry-wide proposal aimed at standardizing enterprise solutions onto a single operating system'.

The SAINTS project (Strategic Analysis and Implementation of NT Solutions) is designed to "facilitate the migration and optimization of Enterprise class applications to Windows NT based servers and clusters". SAINTS is sanctioned and endorsed by Microsoft. The project focus is on high performance enterprise solutions for the energy industry. Issues to be investigated include scalability, compute performance, data management and I/O. Comparisons between UNIX systems and NT systems will be made, evaluating throughput as well as price performance. Cost to corporate sponsors will be $25,000 per year. More info from sbennett@erch.org or Randy Premont at www.r2kgroup.com.


