November 2003


Fugro buys Thales Geo

Fugro has bought offshore survey and geophysical specialist Thales GeoSolutions for €147.5 million cash. Included in the deal is innovative 4D/4C recording technology for deepwater deployment.

Fugro continues its spending spree with its largest acquisition to date. This time the target is offshore survey and geoscience specialist Thales GeoSolutions. The company has a forty-year history in marine positioning dating back to the early days of Decca and Racal Survey. Fugro is paying Thales €147.5 million in cash to acquire the debt-free unit.

Savings

Fugro anticipates some €30 million in savings from synergies and expects full integration to be achieved within six months at a cost of €25 million. This amount includes restructuring charges for a 10% headcount reduction (around 400 jobs will go).

€ 590 million

The acquisition will increase Fugro’s pro-forma turnover for 2003 from €380 million to €590 million. Fugro anticipates that certain non-core fixed assets may be divested after the deal has been completed.
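
The figures above also imply a rough size for the acquired business; a quick sketch of that arithmetic (ours, not Fugro’s), in Python.

    # Back-of-the-envelope arithmetic from the figures quoted above (ours, not Fugro's).
    fugro_proforma_2003 = 380.0        # EUR million, before the acquisition
    combined_proforma_2003 = 590.0     # EUR million, after the acquisition
    thales_geo_turnover = combined_proforma_2003 - fugro_proforma_2003
    print(f"Implied Thales GeoSolutions turnover: ~EUR {thales_geo_turnover:.0f} million")

    jobs_to_go = 400                   # around 10% of the headcount base
    implied_headcount = jobs_to_go / 0.10
    print(f"Implied headcount base: ~{implied_headcount:.0f} staff")
    # prints: ~EUR 210 million and ~4000 staff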

Efficiencies

The rationale behind the deal is that ‘substantial efficiencies and synergies’ are achievable. Thales GeoSolutions will also bring new technology and people into the Fugro organization—accelerating research and development of new technologies. The acquisition will lead to rationalization in marketing and administration activities and improved equipment use. Fugro’s global reach will be extended—particularly in Latin America and China.

Strategy

Fugro describes its current strategy as seeking a ‘strong worldwide and balanced position in all its activities’. The acquisition is said to reinforce Fugro’s position in the offshore survey, positioning and geotechnical market segments.

Serial acquisitions

In the last couple of years Fugro has spent around €200 million acquiring several oil service software houses and consultancies including Geologic BV, Volumetrix, Petcom, Jason Geosystems, Robertson Research and, most recently, SeiScan Geodata.

4D/4C

Included in the deal is new 4D/4C recording technology spun-off from development work originally done for the Australian Navy. This includes a novel ROV-deployed autonomous seabed multi-component recording system. There will be more on Thales/Fugro’s new ARMSS system in our report from the SEG International Exposition and annual meeting in next month’s Oil IT Journal.


Divestco divests!

Divestco has turned around its investment in IDC and Riley’s E-Log with a quick sale to TGS-NOPEC. The $9 million deal completes A2D’s US log library.

TGS-NOPEC is to acquire Calgary-based Divestco’s Riley Electric Log unit along with about 1.2 million depth-calibrated images of Canadian logs and a software license. The deal is worth about $9 million cash. Riley’s collection of 3 million US hard copy logs was bought last year for $4 million by International DataShare Corp—which merged into Divestco earlier this year.

Kotowych

TGS plans to merge the acquired assets into its Houston-based A2D unit. A2D president David Kotowych commented, “This acquisition accelerates the completion of our US digital well log inventory, adding tremendous value to the total well log solution we provide for our customers in this market.”

Hamilton

TGS president Hank Hamilton told analysts that it would likely “take some time” to realize the benefit from the acquisition— “Riley’s has a large inventory of paper logs. We need to get these into a digital format and this won’t happen overnight”. Hamilton stated that further guidance on the deal would be supplied after the transaction closes at the end of the year.


Free agents, captains of industry and risk

Industry employment trends show one growth sector—that of the ‘free agent’ or consultant. Oil IT Journal editor Neil McNaughton reflects on his personal experience of downsizing, free agency and starting up. He observes that the free agent, like it or not, is immediately exposed to the risks and rewards of capitalism, and wonders why this is not the case for the captains of industry.

Speaking at the SPE Annual Conference in Denver last month, Kemble Bennet commented on the changing demographics of the industry. In 1984 there were some 12,000 petroleum engineering students at faculties across the United States. Today? About 200! This staggering drop in numbers goes way beyond the general decline in activity levels. It is also occurring in what one would have thought was a relatively buoyant industry segment—lord knows what the picture is like for geophysicists!

Free agents

The only growth sector in the industry is that of what Bennet calls ‘free agents’—independent consultants in other words. Which is a funny kind of ‘growth’ really—more of a corollary of corporate downsizing. Having been such a corollary myself in the not too distant past, I’d like to share some thoughts on this topic with any other ‘free agents’ who might be reading this—and with those of you who may be ‘freed’ at some future date.

Entrepreneur

In an ideal world, free agents are a special case of the entrepreneur. They could be at the heart of the new, technology-facilitated distributed work paradigm. They manifest that noble aim of capitalism—something they share with those engaged in other forms of commerce—that they live and die by selling a product or service that someone else wants to buy.

Downsizing

The first problem with the ‘noble entrepreneur’ is that the downsizing process is forcing entrepreneurship onto the free agents. If they had actually chosen of their own free will to engage in an entrepreneurial activity, this would be perhaps a more auspicious start than being kicked out of a corporation. However it originates, entrepreneurship is not to everyone’s taste. Many folks do not have the psychological make-up or luck that is necessary to start out on their own.

Jinx

Even if you intend to go it alone, other folks who are still in gainful employ often just can’t quite get their heads around the idea of you as a noble entrepreneur. Heck—last week you were a geophysicist! I remember several awkward interchanges as I gave my ‘elevator pitch’ to ex-colleagues. They were more concerned about giving me sympathy than work—seeing me not as an ‘entrepreneur’ but as ‘unemployed’. For some, it seemed like they feared that my very presence was going to jinx them somehow! On reflection, the downsized engineer/wannabe entrepreneur’s main attribute is probably a thick skin rather than acumen.

Propitious?

It is inevitable that free agents and consultants are created at times of downturn—the least propitious time for them to get their own thing together. Why don’t folks jump before they’re pushed and set up their high tech consultancies while the going is good? Of course they often do—the dot com boom was a good example of a collective mania for doing your own thing on a variety of scales—from modest to megalomaniacal. But generally, two things conspire to make a timely exit into consulting a very risky venture—education and security.

Training

It was a great eye opener to me to take a course a few years back in setting up your own business. I finished by regretting that I had not been exposed to the world of commerce before. How could I have gone through life without understanding (well, up to a point) a profit and loss statement? Or without knowing that marketing is what makes the world go around? The answer is of course that they just didn’t teach any of this stuff to scientists. Besides, to a child of the sixties, commerce was a dirty word anyhow and to be avoided at all costs.

Security

The other factor arguing against setting up as an entrepreneur is that you need to be quite mad to willingly abandon the advantages that gainful employment brings. Forget the regular salary, benefits, pension plan and the rest. Your new business model is a) find out what folks want and b) provide it to them. Any failure in the evaluation of a) or in the execution of b) will immediately impact your ‘bottom line’ and what’s on the dinner table!

Risk-reward

This is, if you like, true capitalism—where folks are rewarded for taking risks. Interestingly, in the corporate world, the straightforward risk-reward equation has been distorted out of all recognition in the last few decades. In fact the risk-reward spectrum has virtually been inverted. Today, the low reward jobs are often where there is the greatest risk of seeing yourself downsized as your blue collar job goes offshore.

Captains

At the top of the corporate food chain, it is sometimes hard to recognize a capitalistic intent at all. Our captains of industry—apart from their extraordinary pay packages (which I don’t begrudge them—in fact I’d like one myself) display a personal risk aversion worthy of a lowly government employee or union man.

Peace of mind!

Of course executives’ stock options and bonuses are incentives to encourage performance. But set against these are: guaranteed pensions beyond the wildest dreams of the free agent, golden handshakes and worst of all, contracts that assure massive severance payments in the event of things going pear-shaped. At the top of the pyramid, the reward for failure is—financial peace of mind!


Oil ITJ Interview - Bill Bartling, SGI

Oil IT Journal spoke to Bill Bartling who heads-up SGI’s Energy division. Bartling describes SGI’s activity in storage, high performance computing (HPC) and visualization. SGI’s focus is moving from its MIPS/IRIX architecture to Linux and Intel-based clusters. SGI’s Linux deployment is significantly different from ‘commodity’ clusters, leveraging its HPC and visualization know-how.

Oil ITJ—What is SGI’s raison d’être?

Bartling—SGI is solving technical computing’s hardest problems—in defense, manufacturing, media and sciences including space, pharmaceuticals, genomics etc. The energy sector represents 10-15% of turnover. The common theme is very large data, massively parallel processing and visualization.

Oil ITJ—What is your contribution to Energy IT?

Bartling—We have three product lines in the energy business: storage management, computing and visualization. Storage management is SGI’s ‘best kept secret’ as data growth soars and distributed data management becomes the norm. Companies need to avoid replication and versioning. SGI’s CXFS* is ten times faster than NFS and scales to multi-exabyte file systems and terabyte file sizes. LightSand** allows for distributed file systems spanning thousands of kilometers. Distant parties share the same view of the data, supporting mirrored failover/backup systems. This has become very important post 9/11 now that the Federal government mandates that backup systems be at least one time zone apart. These solutions leverage OC48 high bandwidth fiber—distributing businesses without burdening IT. One SGI customer has one petabyte (PB) of data on spinning disk and is planning to go up to 5PB on disk with a forecast 50% capacity growth per year. Only SGI can handle this.
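
As a back-of-the-envelope check on those storage figures, the following sketch (ours, in Python, purely illustrative and not SGI code) compounds a 1 PB archive at 50% per year and counts the years needed to pass the 5 PB target quoted above.

    # Illustrative only: compound a 1 PB archive at 50% per year and count
    # the years needed to exceed the 5 PB target quoted in the interview.
    capacity_pb = 1.0      # starting capacity, petabytes
    growth = 0.50          # forecast 50% capacity growth per year
    target_pb = 5.0
    years = 0
    while capacity_pb < target_pb:
        capacity_pb *= 1.0 + growth
        years += 1
    print(f"{years} years to exceed {target_pb} PB (ending at {capacity_pb:.2f} PB)")
    # prints: 4 years to exceed 5.0 PB (ending at 5.06 PB)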

Oil ITJ—And on the computing front?

Bartling—With the new Altix 3000, SGI enters the high performance computing (HPC) cluster business. But our approach differs from other cluster vendors. To specify our Linux-based cluster offering we combined the good things from clusters and shared memory architectures (SMA). So SGI’s new clusters combine SMA’s single system image and global shared memory with the price and openness of the cluster. We have eliminated disadvantages such as the SMA’s proprietary OS and high price, and cluster annoyances such as managing multiple OSs, node bottlenecks and poor connectivity. An SGI cluster ‘node’ is built with 64-128 processors, a single copy of the operating system and up to 8TB of globally shared memory. These machines run a combination of Red Hat Linux and SGI’s ProPack extended Linux. ProPack applies SGI’s supercomputer R&D and Cache Coherent Non Uniform Memory Access (CCNuma) architecture to cluster-based computing. CCNuma operates at 6.4 GBytes/sec. as opposed to more conventional clusters running Myrinet at 1Gbit/sec. We recently sold a 256 processor system to Total in Pau for seismic processing. Paradigm Geo also likes our shared memory model and interconnect bandwidth. Another development is the use of graphics processing units—see below—for data processing. Look out for an announcement of a strategic move to Linux with our Onyx 4 soon.
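
The interconnect figures Bartling quotes mix bytes and bits; the minimal sketch below (ours, illustrative only) puts them in the same units before comparing them.

    # Illustrative unit conversion for the interconnect figures quoted above:
    # 6.4 GBytes/sec for the CCNuma interconnect versus Myrinet at 1 Gbit/sec.
    ccnuma_gbytes_per_s = 6.4
    myrinet_gbits_per_s = 1.0
    myrinet_gbytes_per_s = myrinet_gbits_per_s / 8.0   # 8 bits per byte
    ratio = ccnuma_gbytes_per_s / myrinet_gbytes_per_s
    print(f"CCNuma interconnect is roughly {ratio:.0f}x the quoted Myrinet figure")
    # prints: CCNuma interconnect is roughly 51x the quoted Myrinet figure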

Oil ITJ—Clusters are already widely used in seismic processing. But how applicable is the technology to reservoir simulation?

Bartling—CCNuma does offer improvement in seismic processing—systems can scale to 512 processors with just four instances of the operating system. But reservoir simulation is where the CCNuma architecture comes into its own. Schlumberger resells a CCNuma-based machine bundled with Eclipse. Total and Marathon use these systems and have reported that it outperforms everything except the IBM Power 4.

Oil ITJ—And the third line of business—Visualization?

Bartling—We continue to lead this segment and to innovate—especially with our high end Infinite Reality Onyx solutions. We have observed that the most influential demographic in visualization is the 12 year old male. Gamers have driven unbeatable price performance solutions from companies like ATI and 3DLabs—these are truly extraordinary machines. This is cool—so we’ll use the technology. Our next generation Onyx systems will use ATI cards. But the way we use them is somewhat different from the gamer. The Onyx 4 sports 24 cards on one computer—and soon 32. These are very high end Wildcat series graphics cards.

Oil ITJ—Who is using these systems in oil and gas?

Bartling—VoxelVision uses the Altix 3000 in an interesting manner—the computation is performed on the cluster with only the graphics file shipped to the workstation. Another remote visualization option is provided by our VizServer which has been used by BP to connect London to Baku and Cabinda. Elsewhere users have driven a 24 tile PowerWall*** with an incredible pixel density. This also allows them to load seismic attributes and reduce pixel depth.

Oil ITJ—What of SGI’s conventional MIPS/IRIX architecture?

Bartling—There are lots of good reasons to keep the MIPS chips. They run cool so a high density, low footprint is possible. This is important where speed of light latency issues come into play. We will continue to support MIPS and will increase clock speeds. But oil and gas will probably move to Linux.

Oil ITJ—We have noted that although the conventional wisdom deprecates chip building today, NEC seems to have woken the HPC world up with its super-calculators****. What is SGI’s take on chip manufacturing today?

Bartling—That’s a good point! All I can say is that today we are increasing spend in Intel/Linux space.

Oil ITJ—How important is the oil and gas sector to SGI?

Bartling—Oil and gas customers have plenty of cash and lots of opportunities. The downside is the shrinking population. We have taken these issues and decomposed them into an IT problem. Our offering is to support decision making with fewer people by leveraging data transformation. Storage, processing, visualization and collaboration all come into play. Our balanced technologies combine to increase ‘insight velocity’.

*CXFS—SGI’s multi-platform file system.

**LightSand—SGI partner lightsand.com.

***PowerWall—from FakeSpace.com.

****NEC’s 40 teraflop Earth Simulator.


ONGC selects Guardian for transcription

Indian ONGC has awarded Veritas unit Guardian Data a 280,000 tape seismic transcription project.

India’s Oil and Natural Gas Corporation (ONGC) has awarded Veritas unit Guardian Data a follow-on seismic transcription contract. The second tranche consists of approximately 70,000 tapes including 9 track and 3480 cartridges.

Legacy

The legacy data is being transcribed to high density IBM 3598 cartridges for use in ONGC’s Dehradun processing center. A previous ONGC contract was awarded to Guardian last year for the transcription of 105,000 9 track tapes and 106,000 3480 cartridges. This work is now half way to completion.

Loss

Seismic data recorded on aging magnetic media needs to be re-written to new tapes to preserve the data from loss and to maintain its usability. The process is time consuming and requires considerable knowledge and expertise in older seismic recording formats and a good knowledge of tape mechanical characteristics.

$ 10 million

ONGC’s $10 million computing center was delivered by Paradigm Geo in 2001 and was the largest computer installation in the country at the time.


ChevronTexaco intranet rewarded

ChevronTexaco’s intranet made the top ten in the annual Nielsen Norman design awards.

The Nielsen Norman Group, a ‘world authority’ on Web usability, has recognized ChevronTexaco Corp.’s (CTC) intranet in its third annual ‘top ten’ of intranets worldwide. The report, “Intranet Design Annual 2003: The 10 Best Intranets of the Year”, rewarded the CTC site for its ‘simple, streamlined, and consistent’ aspect.

Dimension Data

Dimension Data originally designed the CTC ‘Inside’ intranet following the three-way merger between Chevron, Texaco and Caltex back in 2001. The challenge was to create a ‘day one’ interim intranet site to promote the new company culture and serve as the initial online gathering place for all 53,000 employees. Further work on the site was performed by Dimension and CTC’s in-house experts to improve users’ experience with Inside and to enhance, improve and expand the site.

Harting

Dimension Data director Olivia Harting said, “We are honored by this recognition which shows we go beyond developing a visually appealing site - providing intuitive navigation and solid information architecture.”

Nielsen

Nielsen Norman Group principal Jakob Nielsen added, “Dimension Data and ChevronTexaco utilized best practices to develop an effective intranet organized according to how people actually use the information rather than in line with a company’s departmental structure. By developing an easy-to-use site, ChevronTexaco encourages employees to return frequently and, consequently, to be repeatedly exposed to the company’s vision and values.”

Quizzes

The ‘clear and easy-to-use site’ features news; shortcut navigation to frequently used intranet-based resources; Quick Question, an interactive feature that quizzes employees about company or industry facts and can serve as a quick polling device; the “CVX” stock quote, updated every twenty minutes; and more.


Kuwait Oil Co.’s Visual Reality Center

Schlumberger is to equip Kuwait Oil Co. with a state-of-the-art ‘Visual Reality’ center.

The Kuwait Oil Company (KOC) has awarded a contract to Schlumberger Oilfield Services for the supply of a ‘next generation’ 3D ‘Visual Reality’ Center. The center will be located in Ahmadi, Kuwait. The 3D Visual Reality Center is a collaboration and virtual-reality facility that integrates computing and communications with collaborative workflows.

Al-Ajmi

KOC Information Systems team leader Miriam Al-Ajmi said, “This facility will be an integral part of the effort to transform KOC into a ‘digitally intelligent’ company. The 3D Visual Reality Center will provide a collaborative environment where our multidisciplinary teams can view and manipulate information, facilitating decision-making in the areas of well planning and real-time drilling.”

Reservoir management

KOC specialists will use the center to solve reservoir problems and to advance the company’s real-time reservoir management initiative. By assembling expertise and combining technologies, the facility will help KOC geoscientists and engineers advance their interpretation and visualization capabilities. The first such center in Kuwait will feature the latest virtual-reality technological developments, including an 8 by 2.3-meter curved screen with a three-channel back-projected system using six projectors.

SIS

Schlumberger Information Solutions (SIS) will provide full-time collaborative expert consultancy services to ensure optimal utilization of the facility. This includes information technology (IT) support and software and domain science experts.

Q-Reservoir

In a separate announcement, KOC signed with WesternGeco for a Q-Technology based seismic reservoir project in Kuwait. WesternGeco’s high multiplicity Q acquisition is said to have ‘great potential’ for resolving some of Kuwait’s unsolved geophysical challenges.


New interpretation suite from Geo-Logic

Geo-Logic Systems’ new geological analysis package leverages INT’s component technology.

Geo-Logic Systems has just released LithoTect Interpreter (LTI)—an entry-level geological analysis package. LTI claims to offer advanced geological interpretation with comparable functionality to that offered by the major vendors.

State-of-the-art

LTI combines seismic, wells, logs, map, section and other data types along with coordinate conversion. The package offers ‘state-of-the-art’ depth conversion and subsurface reconstruction tools along with professional presentation capabilities. Data can be interpreted in 2D or 3D. LTI is used for well monitoring and MWD. LithoTect Interpreter is written in Java and runs on Windows, MacOS, Unix, and Linux. The user-interface leverages GUI technology from INT.

Geiser

Geo-Logic president Jim Geiser said, “The integration of INT’s technology has been instrumental in our decision to offer LT Interpreter at such a low price point.”

Plug-in

For special studies, LTI’s plug-in technology allows for the addition of structural restoration and balancing capabilities (including decompaction, isostatic adjustment, and forward modeling) on a pay-for-use basis. LithoTect Interpreter is available without the plug-in technologies at what Geo-Logic describes as a ‘truly affordable’ price of $5,000.


GeoProbe for seismic processing

With BP’s backing, Landmark now offers high-end visualization of seismic processing workflows.

Landmark Graphics has just released a special version of its GeoProbe high-end seismic interpretation software for use in seismic data processing. The new ‘ProMagic’ platform bundles GeoProbe with Landmark’s ProMax 3D seismic processing software.

Ellis

The software was developed in collaboration with BP. BP seismic specialist Dave Ellis explained the rationale behind coupling high-end visualization with processing software saying, “Seismic data processors must use the same tools as interpreters. You can’t process what you can’t see!”

Lane

Landmark president and CEO Andy Lane concurred, “Collaboration between processors and interpreters through a common 3-D visualization environment assures higher fidelity seismic data output, greater confidence in seismic interpretations and more rapid prospect generation.”

OpenWorks

ProMagic integrates well information and data from other OpenWorks applications. Processing parameter selection and QC can now be performed taking account of all relevant geological and geophysical information.


Apache signs I/O up for new seismic deal

Input-Output gets a groundbreaking contract for the supply of full wave seismic technology.

Apache Corp has just signed a memorandum of understanding with seismic equipment manufacturer Input-Output (I/O) to initiate a strategic technology alliance. The multi-year agreement provides for the adoption of advanced seismic imaging technologies that will be used by Apache throughout its global portfolio.

Farris

Apache chairman and CEO Steve Farris said, “In Egypt, we have learned first-hand how I/O’s technology improves the quality of our seismic images. We look forward to a collaborative relationship that will enable us to shape these technologies to meet the challenges we face in exploring for oil and gas.”

Full wave

Initially the alliance will focus its efforts on the deployment of I/O’s System Four full wave land acquisition platform. System Four records four component seismic data using I/O’s VectorSeis digital geophones. I/O’s AZIM seismic processing technique for anisotropic subsurface imaging will also be used.

Peebler

Bob Peebler, president and CEO of I/O, said, “This alliance will enable I/O to work shoulder to shoulder with a leading upstream company to gain a better sense of their seismic challenges and opportunities and to use that knowledge to make recommendations regarding technology deployment. This deal is a sign of things to come as oil companies align themselves with those who can deliver discernible bottom-line value via the appropriate deployment of technology.”

Sweat that asset!

In a keynote address given at the SEG last month, Farris claimed that independents like Apache can ‘sweat’ assets better than the majors. Farris cited examples from Apache’s acquisitions from Shell showing how the smaller company’s attention to detail has paid off.


SPE Denver ACTE 2003

The message from the keynote addresses was ‘we need oil and will do for a few decades to come’. Oil supply is OK for the next decade or so—in fact global oil production may not yet have peaked. An aging workforce is a concern as are gas supplies to North America. The environment was practically absent from the debate—surprising, since CO2 sequestration was a popular topic in the paper sessions. We noted a growing interest in real time production control and optimization through simulation. Here techniques used in the oil refinery are being applied to modeling wells, flowlines and production facilities. New technology is also successfully changing the game for monitoring of drilling and production operations. ConocoPhillips’ account of its Norwegian Real Time Operations Center left no doubt that remote, real time operations are here to stay. Roxar has moved simulation into the geological model with its RMS FlowSim breakthrough technology.

Keynotes

Kemble Bennet (Texas A&M) believes that technologies such as 3D/4D seismics, artificial lift and the digital oilfield of the future are helping to breathe new life into old fields—reversing natural decline. Demographics is a concern. US enrollment in petroleum engineering courses is currently around 200—way below its 1984 peak of 12,000. As many workers quit the industry, the only growth segment is that of ‘free agents’—independent consultants. Bennet points out that while oil will play a major role in energy supply over the next several decades, a large proportion of the workforce will vanish. Mark Sikkel (Exxon and the NPC) observes a ‘fundamental shift’ to a tight gas market and high prices. Two alternative scenarios are suggested. ‘Conflict’—restricting supply and encouraging demand—will push gas prices to $6 per million BTU. ‘Alignment’—of policies on supply and demand—will lead to moderate prices of around $3. A maturing conventional North American resource base will only supply 75% of demand. Non-conventional resources like coal bed methane, tight gas and shales will partly fill the gap. Halliburton Energy Services president John Gibson has been looking at the statistics, discovering that, for wells in the 10,000-15,000 ft range, there has been no change in drilling efficiency over a 20 year period. A rig still drills about 30 wells per year. While marketing departments tend to focus on successes and exceptional achievements, in the middle of the Gaussian distribution ‘technology is just not impacting drilling’.

Real time

We noted a lot of buzz around coupling surface facilities with reservoir models. This can be done just at the design phase or taken a step further to real time optimization during production. Techniques borrowed from the refinery are touted as having widespread application to the oilfield. But there is some reluctance to put sophisticated and potentially delicate equipment down hole. Real time optimization (RTO) is usually associated with the high end of the production business—high-tech, fiber-equipped wells and sophisticated downhole instrumentation. But RTO also has a role to play in the much larger market of ‘dumb wells’ and old fields where surveillance and alarms may be of critical importance to safety.

Sim/Opt software

Petroleum Experts’ software can now simulate aggregations of fields and production facilities. Simulator outputs from multiple fields can now connect to ‘arbitrarily complex’ surface facilities. Applied Flow Technology’s Mercury provides ‘intelligent sizing’ of pipelines by coupling network flow modeling with optimization. Scandpower’s Olga 2000 system models wellbore to pipeline fluid flow. An ‘APIS’ real time add-on couples calculations to SCADA or DCS controls to provide a ‘window into the pipeline’. Caesar Systems’ PetroVR models the big picture. A demo modeled gas demand and deliverability to identify and mitigate risks such as schedule slippage and HSE issues in the fabrication yard. New collaborative business simulation software, ‘Team VR’, extends PetroVR to reservoir, well and facility modeling. United Devices’ GridMP can also be deployed to distribute computation. A new release from Decision Team integrates information from SCADA, simulators, FieldView and TOW into a central database. A ‘hybrid artificial intelligence’ approach makes the system ‘self-learning’. SimSci-Invensys’ PipePhase was originally developed by Chevron. The latest release includes copy/paste of pipeline profiles and other tabular data from Excel. Vertical flow performance profiles can also be generated for well characterization.

Hardware

Wellsite hardware is a very busy area. Innovations we spotted cover data acquisition, surveillance and communications. Aks Labs’ new ‘Data Trap’ is a bolt-on data logger for gas well test and measurement. Data is recorded onto a Compact Flash card. The technology is used by Marathon and was the subject of an SPE paper (62881) on the field application of production analysis. E2 Business Services’ ‘PhoneHome’ intelligent remote video surveillance provides a virtual rig site ‘presence’. Motion detectors trigger recording and the technology can read car number plates as vehicles drive on site. The big news in data acquisition was the acquisition of startup Luna iMonitoring by IHS Energy. iMonitoring featured in our report from last year’s SPE ACTE. iMonitoring’s solar powered well data collection systems and remote automation technology will now be integrated with IHS’ FieldDirect service. Another company working this space is vMonitor, which provides web-based automation software for data acquisition, data management and integration. vMonitor has received an equity investment from Baker Hughes.

Well reporting

Epoch’s myWells.com is a single point of information on operational activity including electronic tour sheets, drilling and mud logs. Logs can be plotted and data queried with secure user access to the data hosting service. Geoservices was showing its new ‘gWeb’ real time web-based mud logging platform. On-site data and documents from multiple vendors can be collated and transmitted to client sites. gWeb uses Visean’s Secure Web solution. Access control is secured with RSA SecurID one-time passcode smart cards.

Reservoir simulation

Roxar is claiming a first for a new ‘Eclipse/VIP’-class simulator, now embedded in its flagship RMS geological model. By bringing dynamic flow modeling into the geological model, Roxar hopes to bring simulation to a wider user base. More scenarios can be screened before building the full-scale model. Static and dynamic data is kept in the same database. Decisions made are tracked and stored. Geologists and engineers ‘speak the same language’. A demo showed how multiple realizations of the geological model can be conditioned to dynamic data and matched with well test data. Pressure pulse data was analyzed with well-to-well animation. The new dynamic options integrate with the RMS tree-view workflow manager and support data mining of production data and infill well planning. Optimization Petroleum Technologies’ PE Office is described as ‘the Petroleum Engineer’s Microsoft Office’. PE Office performs production data statistics and analysis, performance analysis, optimization and forecasting. We saw a preview of a new 3D module for ScienceSoft’s S3Graph offering data mining and graphing for reservoir simulation and production data. An OEM version is embedded in Landmark’s SimResults tool. The software uses Microsoft DirectX graphics and ScienceSoft expects to leverage cheap gaming cards to allow high end visualization on a PC. S3Graph works with Eclipse, VIP, MORE, SURE, FrontSim etc. 3DSLnet is StreamSim’s new hosted service running 3DSL remotely from any web browser. Once the job is complete, output is compressed, encrypted and transferred back to the client machine. All data is destroyed on the server as soon as it has been transferred back to the client. 3DSL interfaces with Scandpower Petroleum Technology’s MEPO optimizer.

Data mining

Spotfire is now sold bundled with a data set from IHS Energy’s Gulf of Mexico deepwater study. The data set can be used to compare different estimates of oil in place with decline curve analysis. Acquisition candidates can be evaluated by looking for inactive wells with sizable OOIP, as determined by statistical analysis. Spotfire is a bit bewildering—but definitely powerful. If there are determinant statistics hidden in your data—this is the tool to weasel them out.

Papers of note

If you ever doubted the vendors’ hype on real-time remote operations you should have heard Mike Herbert’s (ConocoPhillips) report (SPE 84167) on the use of a remote visualization and control center on Ekofisk. The remote operations center—developed with Halliburton/Sperry-Sun, InSite and Sense Technology—supports drilling, geosteering and cementing operations, all carried out remotely from the Stavanger-based center. Saudi Aramco will require 50 large scale (1 million cells and up) reservoir simulations in 2006—a 20 fold increase over 2000, as Walid Habiballah related in SPE 84065. Aramco opted for PC clusters with a high speed MPI switch and OpenMP to run the Powers simulator, a dual parallel scheme with grid partitioning. A 9.6 million cell model of the Ghawar field matched 50 years of history from 3,200 wells in 8.3 hours. Parallel scalability was tested up to 125 processors and found to be 90% linear (computation was ‘super-linear’ but the network slowed things down). Grid partitioning schemes are critical for performance. Four clusters, each with 128 processors, are now installed. Luigi Saputelli (University of Houston and Halliburton) described (SPE 84064) how ‘real-time’ operates on different time scales: from fast (flow control at surface and SCADA), through planned well shut-in and injection planning, to slow (asset management). Optimization scope varies—from a single well focus to field-wide. In general the different numerical models used in optimization are not coupled. The University of Houston has developed a hierarchical ‘data-driven’ model of a self-learning, self-optimizing process. The control system uses Model Predictive Control, a ‘very mature technology’ taken from refinery control systems, to optimize parameters such as the bottom hole pressure needed to maximize production. In multivariate optimization, the least work path from existing to desired performance is determined by ‘exciting’ the system with small random perturbations. Multi-level (short and long term) optimization is reduced to a linear optimization problem of net present value.
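
Habiballah’s ‘90% linear’ scalability can be read as parallel efficiency, i.e. speedup divided by processor count. The sketch below (ours, in Python, not Aramco’s code) shows the calculation; only the 8.3 hour run time, the 125 processor count and the ~90% reading come from the paper, while the single-processor time is a hypothetical figure chosen to match.

    # Illustrative parallel speedup and efficiency calculation (not Aramco code).
    # The single-processor time is hypothetical; 8.3 hours on 125 processors
    # and the ~90% efficiency reading mirror the figures quoted above.
    def speedup_and_efficiency(t_serial_hours, t_parallel_hours, n_procs):
        speedup = t_serial_hours / t_parallel_hours
        return speedup, speedup / n_procs

    s, e = speedup_and_efficiency(940.0, 8.3, 125)   # 940 h serial time assumed
    print(f"speedup {s:.0f}x, parallel efficiency {e:.0%}")
    # prints: speedup 113x, parallel efficiency 91%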

This article has been abstracted from a 25 page illustrated report produced as part of The Data Room’s Technology Watch report service. For more information on this service please contact tw@oilit.com. CD-ROM proceedings and SPE papers are available from www.spe.org.


Folks, facts, orgs, et cetera

News this month from ISA, Input/Output, Geomodeling, Enertia, Baker Hughes, A2D, PIDX and EDS.

ISA has named Mark Franke as manager of its new Denver office. First customer for ISA’s software is Calgary-based Nexen.

~

Input/Output has appointed Chris Friedemann VP of Commercial Development.

~

Geomodeling Technology Corp. has established its European unit at the Rogaland Research Park, Stavanger. The unit is supported commercially by Statoil through its Supply Development Program.

~

Enertia Software has appointed Scott Biggerstaff as Business Analyst in Houston and Tim Wadle as marketing director in Denver.

~

Andy Szescila is to retire as COO of Baker Hughes after 33 years of service.

~

Peter O’Shea has been appointed VP International Business Development at TGS-NOPEC unit A2D Technologies.

~

Dan Collins of Halliburton and Steve Green of Weatherford International have been elected to the API-PIDX Executive Committee.

~

Earth Decision Sciences has just raised €5 million from a group of French investors.


Enertia Software—Well bore schematics

Enertia has released new software for data-driven wellbore schematics and field data capture.

Enertia Software has released new software for well bore schematics design and field data capture. Well Bore Schematics drafts multiple completions, downhole equipment, tubing, casing, pumps, cement, formations and perforations. Well logs and deviation diagrams can be displayed. Because all information is ‘data and date driven’, the schematic can be animated to show changes to the well bore over time. A new PocketPC-based Field Data Capture (FDC) tool supports collection of wellhead pressures, downtime, tank gauges, run tickets, gas meter readings, liquid meter readings and many other reading types at the point of origin in the field. Performance of the Pocket PC, even with large routes and 30 days of historical data, is described as ‘phenomenal’.
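
The ‘data and date driven’ idea can be illustrated with a minimal sketch (ours, in Python, not Enertia’s code): each component carries an installed and removed date, and the schematic for any date is just a filter over that list. The component names and dates below are hypothetical.

    # Minimal sketch of a date-driven wellbore description (not Enertia code).
    from datetime import date

    components = [
        # (name, installed, removed or None if still in the hole)
        ("13-3/8in surface casing",    date(1998, 3, 2),  None),
        ("9-5/8in production casing",  date(1998, 4, 10), None),
        ("2-7/8in tubing string",      date(1998, 5, 1),  date(2003, 6, 15)),
        ("2-7/8in replacement tubing", date(2003, 6, 20), None),
    ]

    def wellbore_as_of(as_of):
        """Return the components in the hole on a given date."""
        return [name for name, installed, removed in components
                if installed <= as_of and (removed is None or as_of < removed)]

    print(wellbore_as_of(date(2001, 1, 1)))   # original tubing in place
    print(wellbore_as_of(date(2003, 7, 1)))   # replacement tubing in place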


Meager third quarter results for service cos.

Core bucks the trend with ‘all-time’ record quarterly revenues. Elsewhere financial gloom abounds.

BHI

Baker Hughes Inc. reported a $59.5 million loss for the third quarter of 2003 including after tax charges of $151.2 million relating to its 30% interest in WesternGeco. BHI chairman Mike Wiley said, “Our oilfield divisions continued their strong performance in the third quarter with a consolidated operating profit margin in excess of 15% despite disappointing customer spending in the Gulf of Mexico. For the current quarter we expect continued modest improvements in the international markets and flat US activity.”

CGG

CGG reported a net loss of €25.5 million, including a €19.0 million provision for restructuring its land acquisition business. Further to the approval by a US court on October 21st 2003 of the PGS financial restructuring program, CGG will own 868,000 shares, i.e. 4.35% of PGS capital. At today’s share price this represents about $33 million—a tidy return on CGG’s investment of approximately $19.0 million.
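
The share figures quoted imply both a per-share price and a total value for PGS; a quick sketch of that arithmetic (ours, not CGG’s), in Python.

    # Arithmetic implied by the figures quoted above (ours, not CGG's).
    shares = 868_000
    stake_fraction = 0.0435            # 4.35% of PGS capital
    stake_value_usd = 33_000_000       # approximate value at today's share price

    price_per_share = stake_value_usd / shares
    implied_pgs_value = stake_value_usd / stake_fraction
    print(f"~${price_per_share:.0f} per share, "
          f"implied PGS value ~${implied_pgs_value / 1e6:.0f} million")
    # prints: ~$38 per share, implied PGS value ~$759 million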

Core

Core Laboratories reports ‘all-time record’ quarterly revenue of $105 million for the third quarter of 2003. The company reports that international oil companies are investing capex dollars to increase production from existing fields. Core has repurchased some 5 million of its own shares and has approval for an extra 2.9 million shares over the next 18 months.

Input-Output

Input/Output has announced a third quarter 2003 net loss of $4.8 million on revenues of $30.3 million. I/O president Bob Peebler said, “Despite lower than expected revenues, we are pleased with our continued improvement in year-over-year sales and gross margin and are beginning to see positive signs of increasing capital outlays by some of the E&P and seismic contracting companies.”

Schlumberger

Schlumberger has reported third quarter 2003 operating revenue of $3.51 billion. The company has posted a net loss of $55 million after charges of $298 million, including a $205 million impairment charge on the WesternGeco multiclient library. Chairman Andrew Gould noted, “continued strength in the Americas, Russia and the Middle East. The write-down of the WesternGeco data library reflects the distressed conditions of the multiclient business, particularly in the Gulf of Mexico. The SchlumbergerSema sale is a major step in our strategy of refocusing Schlumberger on our core business of technical services and reservoir solutions to the upstream oil and gas industry.”

TGS-Nopec

TGS-Nopec’s consolidated net revenues were $32.0 million, up 50% on Q3 2002, with operating profit of $9.2 million. The A2D unit added 77,000 logs from 41,000 wells to its digital well log library, bringing the total inventory to 1.65 million digital well log images from approximately 790,000 wells.

Veritas

Veritas’ fiscal 2003 fourth quarter showed charges of $59.9 million including a $7.6 million loss on the sale of the (RC)2 unit to Seismic Micro Technology (SMT). SMT paid $2.0 million cash for the unit which was acquired in 2001 for $33 million of Veritas paper.

~

See the article in this issue for reporting from Halliburton’s Landmark Graphics unit.


ISA rolls-out GeoBrowse 3.0

ISA’s new GIS front end to E&P data stores introduces quantitative analysis and spatial drill-down.

ISA has just released version 3.0 of its GeoBrowse dynamic database querying and visualization engine. GeoBrowse reads data stored in industry applications such as Finder, Geoframe, OpenWorks, Tigress, OpenRSO and Probe (and indeed others!). Multiple databases can be viewed and compared in a common map view.

Consistent

Data is displayed in a consistent, user-defined coordinate system with projection and datum conversions handled on the fly. Live queries ensure that data stays current and no data transfer or loading is required.
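
On-the-fly projection and datum conversion of the kind described above can be sketched with the open-source pyproj library (an illustration of the general idea only; GeoBrowse’s internals are not published and the coordinates below are hypothetical).

    # Illustration of on-the-fly coordinate conversion (not GeoBrowse code),
    # using the open-source pyproj library.
    from pyproj import Transformer

    # Source data stored as ED50 / UTM zone 31N eastings and northings
    # (hypothetical); the common map view wants WGS84 longitude/latitude.
    to_wgs84 = Transformer.from_crs("EPSG:23031", "EPSG:4326", always_xy=True)

    easting, northing = 450000.0, 6500000.0          # hypothetical well location
    lon, lat = to_wgs84.transform(easting, northing)
    print(f"lon {lon:.5f}, lat {lat:.5f}")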

Drill-down

The new version offers multi-database quantitative analysis and a flexible data source hierarchy. Spatial drill-down can be customized. A query results grid control groups results for detailed analysis. Other improvements include query builders, query cloning, schema browsers, hierarchy builders and support for database binary objects (BLOBS).

Development

ISA also offers bespoke development leveraging its technology. This was used to web-enable ConocoPhillips’ pipeline and facilities data, using ESRI’s ArcIMS.

HubCentral

ISA is also working to integrate technology from PGS-Tigress into a new ‘HubCentral’ solution that will add data management and migration to GeoBrowse’s functionality.


Oil ITJ interview—Dave Camden

Oil IT Journal has been reporting on the upstream knowledge, information and data (KID) catalogue over the past few months. We thought it was a good time to catch up with Dave Camden of Flare Consultants—purveyor of catalogue technology to industry—and ask just how the new technology is being deployed. POSC’s Alan Doniger also provided insightful answers to our queries.

Oil ITJ—What exactly is the deliverable from the catalogue? Starting with an Excel spreadsheet of ‘approved’ names of things – what do you do next?

Camden—Our deployment technology (now resold by Landmark Graphics) sits over existing information source systems. Source systems are used ‘as-is’. The Catalogue holds metadata that describes the individual entries in the underlying systems together with a link that can access those entries. For a document system you would link to the metadata in the DMS and hold this in the Catalogue together with the document ID. Likewise for a database you hold metadata about the item (say a well header in OpenWorks) and the item’s ID in the database.

Doniger—We can think of a catalogue in the sense we are defining as an adjunct to a federated system of data sources. Rather than simply bolting a common portal over multiple data sources, the catalogue idea calls for a thin and consistent characterization of the populated ‘items’ in the data sources that can be useful for qualified queries.

Oil ITJ—One can see how all new data capture software should use the catalogue in drop down lists etc. But do you integrate a new catalogue into legacy systems?

Camden—The whole point of the Catalogue is to bring together information from existing systems. Normally the Catalogue holds nothing but metadata—all the source systems stay as they are.

Doniger—I think of the catalogue concept as an analogue to a (book) library catalogue system, which in my youth consisted of catalogue ‘cards’ for each ‘book’ organized by ‘title’, ‘author’, and ‘subject/keyword’. Physical books were identified and organized by a vocabulary-based classification system. Our catalogue concept is an electronic form of this practice augmented with additional catalogue attributes and, as Dave describes, with value-added access mechanisms to digital data and document data stores.
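
A catalogue entry of the kind Camden and Doniger describe, thin metadata plus a link back to the item in its source system, might look like the following minimal sketch (ours, in Python, illustrative only; it is neither Flare’s nor POSC’s actual model and all identifiers are hypothetical).

    # Illustrative catalogue entry: thin metadata plus a pointer back to the
    # item in its source system. Not Flare's or POSC's actual model.
    from dataclasses import dataclass

    @dataclass
    class CatalogueEntry:
        title: str             # human-readable name
        item_type: str         # e.g. 'well header', 'document'
        subject_terms: list    # classification vocabulary terms
        source_system: str     # e.g. 'OpenWorks', corporate DMS
        source_item_id: str    # the item's ID in the source system
        link: str              # how to retrieve the item (URL, query, etc.)

    entry = CatalogueEntry(
        title="Well 31/2-7 header",
        item_type="well header",
        subject_terms=["well", "North Sea"],
        source_system="OpenWorks",
        source_item_id="WELL_12345",            # hypothetical identifier
        link="ow://project_a/well/WELL_12345",  # hypothetical link syntax
    )
    print(entry.title, "->", entry.source_system, entry.source_item_id)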

Oil ITJ—Do you have to wait until Landmark and Schlumberger have adapted all their database software?

Camden—No, but you do have to write adapters to provide the views from databases. We are working with Landmark, who already have a set of adapters for most of the commercial databases through its Team WorkSpace technology. We will use those adapters ‘under’ the Catalogue to provide ready-made connectivity. Connecting to existing document systems is similar but tends to be simpler.

Oil ITJ—The KID work comes from one or two companies—where the Catalogue is deeply engrained in all their tools. But what happens when this list meets another—perhaps less formal list in another company’s toolset? Is there any real hope of adoption of a common nomenclature—especially given local differences in terminology and usage?

Camden—We are not talking about low-level integration of the type that companies like Tibco are implementing or the kind of detail required for inter-application connectivity. When it comes to existing database models it is relatively easy to map from their low-level attributes to the Catalogue’s high level ones because they define things like data types very well. Problems can arise, usually with document systems, where the existing metadata is inadequate and so incapable of being used to properly populate the Catalogue.

Doniger—The goal of the POSC Cataloguing initiative and the associated evolving standards is to gradually develop common vocabularies—at least as Dave says at the large grained, high level portions of the classification vocabularies—so that the need for and complexity of mappings is reduced.

Oil ITJ—How does the Flare technology actually work?

Camden—We have a web-served system based on Lotus Notes. We either serve the Catalogue remotely or have a server at the client site. The Catalogue is essentially a set of Notes databases with a lot of functionality built around them to provide users with a ‘smart’ interface. This is necessary because the metadata model is quite complex and the users don't want to be exposed to it.


Apache to deploy NetApp storage

NetApp is to support Apache’s Australian operations with a FAS940 unified storage system.

Apache Corp.’s Australian unit is to deploy a unified storage system from NetApp to support its offshore Western Australia operations. The solution was supplied by NetApp reseller ASI Solutions.

FAS940

Apache has installed a NetApp FAS940 storage system configured with NetApp Data ONTAP software. Apache uses the FAS940 to store its 2.7 TB of Landmark format seismic data and interpretations.

Sun

Apache previously ran a Sun server with locally attached disks. But this proved complex to manage with different levels of patches to assure RAID reliability.

Clayton

Apache’s systems administrator Vic Clayton explained, “Ease of managing the information storage was a key issue for Apache in selecting an alternative solution, as was system reliability. The FAS940 offers storage area networking in a reliable and easy to manage platform.”

ASI

ASI Solutions installed the new system over a weekend. One problem—a card failure—was spotted by the system, which automatically notified NetApp, triggering overnight delivery of a replacement.


Oxy to build SGI visionarium in Qatar

Qatar’s first Reality Center will be built around an SGI Onyx 3000 visualization system.

Occidental’s Qatar unit has installed a $1 million SGI Reality Center visionarium for use in seismic interpretation and well planning. The Center incorporates state-of-the-art digital light processing projection technology, driven by an SGI Onyx 3000 visualization system. This is backed up by an integrated system of supercomputers, high-speed graphics systems, four screens and digital stereo projection equipment.

Lowe

Oxy’s Carey Lowe said, “The Reality Center allows for virtual walkthroughs of oil wells; multiple images can be viewed simultaneously, providing a comprehensive evaluation of the data. Using the system, seismic data can be used to simulate ‘cyber wells’—rotating them in space to test their effectiveness”.

Al Attiyah

Abdullah Bin Hamad Al Attiyah, Minister of Energy and Industry, Qatar, congratulated the team saying “The Reality Center will be an important asset in the development of Qatar’s oil and gas industry.”


Woodside to deploy Cisco network

Cisco is to kit out Woodside’s head office with a 10 Gigabit Ethernet LAN and pervasive WiFi coverage.

Woodside Energy has installed a Cisco IP network at its Perth headquarters and at selected offshore locations. Woodside’s implementation of converged Cisco networking will include applications such as voice and video over IP, unified messaging, security, wireless local-area networking (WLAN) and storage area networking (SAN).

WiFi

Head office will deploy a Catalyst 6500 switched LAN running over a 10 Gigabit Ethernet backbone. A WiFi wireless LAN (WLAN) lets mobile users connect on all 19 floors of the building. A Firewall Services module and VPN technology assure network security. IP phones offer connectivity to Woodside’s offices in Mauritania.

SAN

A Cisco SAN is the core of Woodside’s disaster recovery and data-management strategy. Head office is connected to a disaster-recovery site using Fibre Channel over IP (FCIP) across a 10 Gigabit Ethernet link.


Texas Net disaster recovery for Global

A new ‘campus concept’ offers advanced data replication, networking and living accommodation.

Oil pipeline and construction company Global Industries has contracted with Texas.Net (TxN) for disaster recovery services. TxN will provide Global with a mirrored ‘hot site’ with real-time duplication of headquarters’ data.

Campus

TxN offers a ‘campus’ providing a post-disaster data center and networking capability. Campuses provide secure, off-site locations where clients can co-locate mission-critical data and hardware.

Accommodation

Campuses provide accommodation for employees tasked with keeping businesses running during an emergency. TxN also provides disaster planning through partnerships with leading consultants.

Metuassalol

Global VP Celest Metuassalol said, “We evaluated four providers and none came close to offering the capabilities and peace of mind that TxN provides. A disaster recovery plan is a crucial element in our business operations, and partnering with TxN ensures fail-safe back-up and connectivity should we ever need it.”


Santos goes for Matrikon connectivity

OLE for Process Control is to be Santos’ connectivity standard for its Australian gas production.

Santos has awarded Edmonton-based Matrikon a multi-year contract for the capture of historical process information from its Australian operations. Santos has also standardized on Matrikon’s OLE for Process Control (OPC) Connectivity Suite for communication between plant devices and integration with AspenTech’s InfoPlus.21 historian.

Crisafulli

Matrikon director Sam Crisafulli said, “We are working with Santos to leverage IT to optimize production. Access to historical data ensures optimum operational decision making”.

Total InSite

Matrikon has also just released Total InSite—an ASP-enabled version of its Process Suite package. Total InSite leverages Matrikon’s SCADANet to provide alarms, predictive maintenance, optimization and industry-specific solutions over the Internet. Matrikon hosts the data and takes care of the infrastructure.

Capper

Matrikon’s oil and gas industry specialist Andrew Capper said, “Total InSite will provide numerous oil and gas applications—leveraging our 15 years of experience in the industry”. Total InSite collects data from all levels of an organization, from the field to production to management, and consolidates it into a single location.


eLynx MeterDOC collects remote flow data

New control technologies link wells to the data center via the cellphone digital overhead channel.

Tulsa-based eLynx Technologies (ELT) has just released two new control technologies for the gas industry. MeterDOC transmits well information from an electronic flow meter (EFM) to an eLynx Data Center using cellular phone technology. The information retrieved includes current flow, previous day’s production, spot and differential pressure, and temperature. MeterDOC provides user-configurable alarms including high line pressure or well downtime. Alarms can be routed to personnel via pager, e-mail or mobile phone.
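
The alarm behaviour described above (configurable limits on line pressure and flow, routed to a pager or e-mail) follows the usual pattern sketched below. This is our illustration in Python, not eLynx’s implementation, and the tag names and limits are hypothetical.

    # Illustrative alarm check (not eLynx code): compare the latest EFM
    # reading against user-configured limits and list any alarms to route.
    alarm_rules = {
        "line_pressure_psi": {"high": 900.0},   # hypothetical high-pressure limit
        "flow_mcf_per_day":  {"low": 10.0},     # near-zero flow suggests well down
    }

    def check_alarms(reading):
        alarms = []
        for tag, limits in alarm_rules.items():
            value = reading.get(tag)
            if value is None:
                continue
            if "high" in limits and value > limits["high"]:
                alarms.append(f"{tag} high: {value}")
            if "low" in limits and value < limits["low"]:
                alarms.append(f"{tag} low: {value}")
        return alarms

    latest = {"line_pressure_psi": 935.0, "flow_mcf_per_day": 0.0}
    for alarm in check_alarms(latest):
        print("route to pager/e-mail:", alarm)   # message routing itself not shown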

Powers

eLynx VP John Powers said, “MeterDOC allows producers to see current data instantly and to receive notification when a well is down. MeterDOC communicates with most types of EFMs, providing nationwide coverage in the US.”

CompressorDOC

Another new tool—CompressorDOC— uses the same communication route to inform personnel when a compressor is down. CompressorDOC includes a solar panel and battery power package and uses the cell phone digital overhead channel for communication with the eLynx Data Center.


INT’s GeoToolkit embeds TrollTech Qt

A new version of INT’s GeoToolkit library embeds cross platform technology from TrollTech.

Interactive Network Technologies (INT) has just released GeoToolkit 3.0—a new version of its suite of upstream oil and gas data visualization components. GeoToolkit’s C++ component suite now includes ‘high-level’ XY plot, seismic, well log and contour widgets and cross-platform printing capability.

Trolltech

New with version 3 is support for Qt, Trolltech’s cross-platform application development environment. Qt runs natively on Linux/Unix, Windows and Mac OS X and allows developers to use the same code across all platforms. GeoToolkit also integrates with Qt Designer.

Schatz

INT VP Paul Schatz said, “Trolltech’s portable GUI concept and INT’s reusable components lets developers focus on projects that contribute directly to the company’s bottom line. Collaborating with Trolltech was a natural progression for INT.”


iPod data delivery for Veritas

Veritas is delivering seismic projects to clients on Apple iPods.

Writing in Veritas’ Veritrax magazine, Tamara Oleksow reveals that the Calgary-based depth imaging group has adopted a novel seismic delivery mechanism. Anisotropic depth imaging projects covering over 200 square kilometers will be delivered on the Apple iPod—a hard drive-based MP3 music player. iPods engraved with the Veritas logo will be given away with the data. The iPod’s 20 GB hard drive replaces CDs and DVDs for delivering seismic data volumes. FireWire or USB II connections allow for rapid data transfer to client workstations.


‘Favorable’ macro environment for sector

Simmons & Co.’s Dan Pickering sees some light at the end of the tunnel for the oil service sector.

Speaking at the New York Society of Security Analysts’ 6th annual ‘Investing In The Energy Industry’ conference last month, Simmons & Co.’s Dan Pickering described a ‘favorable macro environment’ for the oil service industry.

Less explosive

This is due to OPEC’s success in maintaining an oil price in the $22-28 range (mostly) for the last 5 years. Moreover, down trending inventories are sparking a renewed drilling cycle—although Pickering describes the current cycle as ‘less explosive’ than previous booms and busts.

Increased guidance

Service sector players are all giving significantly increased earnings per share guidance for 2004. Rises in the 25-50% range over current depressed levels are forecast. These rises are to be offset against falling earnings expectations from the analysts as a slower pace of recovery is factored in.

Prospect Drought

This cycle is also slower because of the Gulf War, a prospect ‘drought’, rising F&D costs and lower activity by the majors. Excess capacity and a more discriminating Wall Street have also contributed to slowing things down.

Advice

Pickering’s advice to investors? Oil Service Stocks are not particularly cheap but do offer strong growth opportunities and should be more familiar to investors by now. Key factors for the investor will be the duration of the current downturn and relative performance of the sector.

Gassy stocks

Pickering warns that ‘gassy stocks’ are unlikely to repeat 2001 performance and suggests that investors ponder whether ‘broken’ subsectors—including seismic—represent value, or a value trap. The Simmons crystal ball doesn’t help a lot here! But Pickering advises that the macro view is all-important, pointing out that ‘contrarians typically win’.


Halliburton breaks-out Landmark financials

For the first time, Halliburton has lifted the veil on Landmark’s financial situation.

Oil service companies have tended to be somewhat reticent when reporting on their software activities. In its third quarter 2003 results, Halliburton bucks the trend by breaking out financials for its Landmark Graphics unit.

$ 132 million

Third quarter revenues from Landmark and ‘other energy services’ were $132 million, down $33 million from the same period last year. The reduction was attributable to the sale of Wellstream in the first quarter of 2003 and a winding down of a North Sea project.

Litigation

The unit reported a third quarter 2003 operating loss of $52 million compared to an $8 million profit last year. This result included a (contested) litigation charge of $77 million. Landmark Graphics revenues were down 4%, but operating income was up 22% as compared to the third quarter of 2002.

9 months

For the first nine months of 2003, Landmark and other energy services reported revenues of $410 million (down from $649 million). The unit is posting a $58 million loss for the nine months—as against a $122 million loss for the same period in 2002.


BP, ConocoPhillips sign with Oildex

E-business specialists TransZap has signed two majors for its web based revenue reporting system.

TransZap has signed up BP and ConocoPhillips for its Oildex web-based finance and production reporting service. Working interest holders now have online access to oil and gas revenue information through Oildex’s Owner Relations Connect.

Flanagan

Speaking at the National Association of Royalty Owners annual conference in Austin, Texas, last month, Oildex CEO, Peter Flanagan said, “BP and Conoco-Phillips join a growing number of industry leaders supplying owners with on-line tools to track well performance and effectively manage their investments in oil and gas.”

Scarborough

BP’s Lee Scarborough said, “By delivering paper-free information, BP shaves costs from the owner relations process and disseminates information quickly and easily, benefiting both royalty owners and BP.” According to the TransZap press release, Steve Kellert of ConocoPhillips said exactly the same thing!

Connect

Oildex Connect claims to simplify financial and operational processes for energy companies and to deliver essential real-time decision support. With some 500 corporate clients, it is said to be one of the largest data exchanges in the industry.

E-commerce

Services include: digital and scanned invoicing, customer relations data posting, check stub reporting, crude oil and gas data exchange, volume reporting and joint interest billing. By coupling this web-based data with electronic funds transfer, owners can eliminate paper altogether.

