April 2007


BP joins ‘Smart Chips’

BP joins construction industry standards body FIATECH, signing up to RFID ‘smart chips’ and ISO 15926 ‘work-in-progress’ projects. The WIP targets handover, maintenance and operations.

BP’s North America unit has joined the FIATECH standards body to further R&D on radio frequency identification devices (RFID) and its ground-breaking use of the ISO 15926 plant data management spec.

FIATECH

FIATECH is part of the Construction Industry Institute, a research unit in the College of Engineering of the University of Texas at Austin. The FIATECH board has representation from construction software and the chemical industry—but with the increasing complexity of offshore projects, the standardization movement is seeing take-up in oil and gas developments.

Smart Chips

FIATECH’s ‘Smart Chips’ project investigates emerging RFID technologies that can be adapted for construction, operations and maintenance applications. The project has already completed eight in-field pilots of commercial-ready technologies in construction industry applications.

Pipeline

Projects have addressed RFID applications for site asset tracking and inventory, materials logistics, pipeline monitoring and concrete maturity studies. Chevron’s Ignatius Chan said, ‘Smart Chips allows us to get first hand experience with a new technology at a fraction of the cost of doing it ourselves.’

ISO 15926

The ‘Accelerating Deployment of ISO 15926’ (ADI) project seeks to promote use of plant data standards that have evolved from early work by the Norwegian POSC/Caesar organization. The project has backing from major engineering companies including Fluor and Bechtel and software developers such as Autodesk, Aveva, Bentley and Intergraph. Oil company members include Aramco, ConocoPhillips and Shell.

Handover

The ISO standard is basically a huge database of names and functions for parts used in a construction project. The standard nomenclature eases interaction between owner operators, suppliers, engineering companies and software developers, during plant construction and at the crucial handover stage.

WIP

Similar wins are expected to accrue in the operations phase where ISO 15926 will underpin computer-based maintenance, operations and compliance. The project has now matured to the stage of a Work-In-Progress (WIP) ISO 15926 repository that is ready for deployment.

Greater Plutonio

ISO 15926 underpinned the construction of BP’s Greater Plutonio development (OITJ Sept. 06), a $4 billion project with six FPSOs and 80,000 equipment items. The system is integrated with BP’s SAP ERP package. More on ISO 15926 from http://en.wikipedia.org/wiki/ISO_15926—and in next month’s Oil IT Journal.


CO2 sequestration

A new market for oil patch technology is opening up in CO2 sequestration according to a study from the US Department of Energy.

The US Department of Energy’s Regional Carbon Sequestration Partnerships have matched power plants and other stationary sources, representing more than 3.8 billion tons of US CO2 production, with ‘candidate’ storage sites in a new online ‘Atlas.’

Brown field

Underground CO2 sequestration may open up a huge new market for oil and gas technologies and software. CO2 sequestration looks like a particularly attractive possibility when teamed with ‘brown field’ secondary recovery from depleted oilfields. The other economic driver for the techniques is proximity to a major CO2 source such as a power plant.

IPCC

Another estimate of the market potential can be had from a 2006 study by the UN Intergovernmental Panel on Climate Change (IPCC) entitled ‘Carbon dioxide capture and storage,’ which forecast that up to 40% of the world’s CO2 could be sequestered by 2050. This would entail building ‘several hundreds to thousands’ of CO2 capture systems over the coming century, each capturing some 1–5 Mt CO2 per year. Geological storage is currently considered the most economically attractive sequestration technique. Download the Atlas from www.netl.doe.gov.


Google Earth, hosted data and the way of the world

Oil IT Journal Editor Neil McNaughton tries to distill a ‘boilerplate’ technology talk from the many C-level addresses he’s heard recently and then writes last month’s editorial on some recent developments in geographical data management. The deal between Talisman and Valtus announced at the ESRI PUG looks like a ground-breaker—but can it be applied to other E&P data types?

It’s probably not the first time that I have observed in these columns that the talks given by the great and good of this industry have a boilerplate similarity about them. One year everyone will be banging on about the need for cost cutting and headcount reduction. A few years later folks will be warning of imminent doom as underinvestment in infrastructure and an aging workforce threaten our survival.

Pendulum

While the swing of the invest/cost-cut pendulum is just the way of the world, the aging workforce may reflect a tendency on the part of the oldies in charge of the show to ‘let go’ their younger colleagues. I suppose that’s the way of the world too. All of which is a way of introducing some of the issues that constitute the current upstream technology boilerplate and some maturing technologies that may actually be able to address them.

Boilerplate

My boilerplate of the month includes: the problems of a dwindling and aging workforce; of entering data again and again, of fixing bad data again and again, of the evaporating skill sets needed to handle tough problems like SEG, ESRI, LIS, LAS and DLIS formats, issues with geodetics and exploding data volumes. I could go on. In fact I will—how hard is it to build a project of intersecting 3D seismic surveys of different vintages, a few horizontal wells, some with bad deviation surveys, and cultural data with a different projection? All of which is layered on top of the hardware and infrastructure needed to run the show—file servers, virtualization, workstations, graphics subsystems. Not to mention different operating systems, Oracle versions and so on. All in a day’s work what?

C-level

Convolve this daily reality with a hiatus in educational input following the last downturn and you have the picture painted by the ‘C-level’ folks who get to hold forth at the industry conventions. Their answer to such problems is in general, more training (i.e. more people), more investment and more technology. More of everything in fact. As I remarked at this month’s SPE Digital Energy show, not so long ago we were flagellating ourselves as technology laggards. Now we have to sell our industry on its high tech leadership to compete for the young grads. But I digress.

GIS

I should say that this is the editorial that I was going to write last month after the ESRI PUG. But it got displaced by my rant over Microsoft’s HPC effrontery. So going back to the ESRI show, I think that geographical data is a rather interesting proxy for E&P data in general. It is complex (geodetics rivals seismics in the mathematical and domain-specific skills required). Data volumes are huge (especially for raster) leading to layered complexity and serious scalability issues. But the good thing about GIS is that, as a generic, horizontal data type, it is more likely to benefit from mass market developments than say, a cement bond log.

Google Earth

And the mass market is of course what has happened to GIS. It seems churlish to make a meal of this in the context of the ESRI PUG, but coincidentally, there were no fewer than four Google Earth-based announcements in last month’s edition of Oil IT Journal—and no, we were not trying to make a point! Commodity has come to GIS. Instead of pushing the complexities out to users, keep all that stuff on the remote server and just give them what they need. We’ve been here before, with hosted data offerings (Schlumberger et al. circa 1998) and with application service provision (various, circa dot com boom time). In the GIS domain itself, we reported on Microsoft’s TerraServer back in 1998. But these earlier services never really caught on. With Google Earth, the paradigm has landed.

Thin client

You don’t really need me to explain how the hosted data and thin client approach addresses pretty well all of the issues in our C-level boilerplate above. One time data entry? You got it! Hide the complexity? Yup—all the ‘clever’ stuff is managed off site on one single authoritative data set. OK, I know, GE does have a few issues with the alignment of roads and satellite imagery—but someone is working on it (I hope). Even the demographics and the aging workforce issues are largely addressed by the hosting model. Scalability is good too—both upwards and in the face of future downturns (also a way of the world I’m afraid).

Talisman

The GE effect has already galvanized ESRI into producing an ‘industrial strength’ version in the form of ArcGIS Online. But what impressed me most at the PUG was Talisman’s account of its spatial data outsourcing with Valtus. Somehow this really did sound like a breakthrough. Certainly, the idea of outsourcing your data management to a company that already specializes in that data type makes sense. The rest is just a matter of bandwidth.

Enerdeq

Will outsourcing break through to more generic E&P data types? IHS appears to be moving this way with Enerdeq and both the service majors have significant offerings in this space. All the same, having just got back from the SPE Digital Energy show, I can assure you that this topic is not yet in the hearts and minds of the upstream. This can be explained by a certain reticence in the face of the new technology. On the part of the vendors—how many license sales will we lose? And on the part of the oils—what will ‘application support’ and ‘data management’ do? We’re back to the old ‘people problem’ again—perhaps we need a generation to retire without replacement before this one is going to run.


Oil IT Journal Interview—Matthew McKinley, AVEVA Inc.

Matthew McKinley, Executive VP and Head of Operations for AVEVA’s Americas Region tells us how Aveva’s portfolio has extended beyond plant design management and into lifecycle data management, helping owner operators and contractors transfer plant data on project hand-over. A FIATECH board member, McKinley has a special interest in the emerging ISO 15926 standard.

What’s Aveva’s interest in the upstream?

Aveva started out in the engineering construction sector with tools for the design and construction of physical assets. The company has always looked up and down the value chain for more synergies and efficiencies, and we were especially interested in improving the process whereby the engineering prime contractor hands the completed plant over to the owner operator. Aveva’s foundation technology is the plant design management system (PDMS). But our portfolio has expanded into maintenance, operations and now, the upstream. Our process design and simulation of production facilities has led some of our larger clients to ask for better access to the data involved in these activities. Oil and gas clients are also interested in how they can access new resources by extending existing assets. Another area of interest is emergency management—getting back online quickly after an incident.

How do you ring-fence your design activity?

We are concerned with everything above the reservoir, to the exclusion of the subsurface.

Where are your big projects today?

Aveva has seen three years of strong growth and is involved in very big projects in Canada’s oil sands and in Brazil.

Our previous coverage in this space was at the USPI-NL conferences. Is Aveva active in the standards area?

Indeed. I am on the board of FIATECH, which was established to leverage technology throughout the asset lifecycle, particularly by exploiting ISO 15926 data interoperability. This is a kind of nirvana for our clients who receive all their plant data on handover in a standard model.

In the past, some major EPCs preferred to use their own proprietary models and weren’t interested in playing the standards game.

This is still a challenge. In fact we see our own proprietary system as giving us a competitive edge. But we can leverage standards for data exchange, although there may be real world issues of resource constraints. Du Pont and Dow Chemical were among the first to change their way of working with contractors. Today there are two major 3D design systems, Aveva and Intergraph. Clients want to execute projects in both environments. They may expect more interoperability than is currently possible.

What do you understand by lifecycle data management?

The process of build and delivery means that handover is an appropriate point for standards deployment. Interoperability is desirable during project execution, but this is more challenging.

But you should be well placed to do something about this.

Absolutely. In design there is still a challenge re data formats, especially given the time constraints we operate under. If we have the time we use XMpLant—Adrian Laud’s tool*. There are no problems doing it this way. But to introduce this approach in the middle of a project can be risky. It’s great for startup or at handover. We are doing this for clients in Canada and the US migrating legacy data into Aveva’s PDMS.

Can you tell us more about your ISO 15926 involvement?

Through FIATECH we are very active in ISO 15926. Our own plant information application is format and application agnostic. In the real world there are more concerns about interoperating with SAP and process data. This is achieved through a common reference system and can be used, for instance when a pump is failing, to see into the maintenance system and take appropriate action. The technology can also let systems hook into real time data streams. But if you want to learn more about ISO 15926, FIATECH’s annual conference is taking place this week in Washington DC.

How much of your business is VNET**?

Our traditional design tools make up 95% of the US market. But the rest of the world has around a 15% VNET share. New technology is a harder sell in the USA. To sum up, we see real benefits in doing the whole plant data thing right! BP’s work on the Greater Plutonio FPSO development in Angola was a game changer, in fact we are presenting a paper on this at the FIATECH show this week titled ‘Successful Interoperability for major projects using ISO 15926.’ The project involved building a virtual plant that was used for training and even to re-evaluate engineering design in the light of maintenance requirements—all checked-out in a virtual world.

* Noumenon Consulting.

** VNET is AVEVA’s lifecycle plant data integration platform.


ZEH Software—CGM for Petrel 2007

New package adds computer graphics metafile support to Schlumberger’s flagship.

Technical plotting specialist ZEH Software has released a new package, ‘CGM Extension for Petrel 2007,’ giving geoscientists the ability to export Petrel graphics to CGM-enabled applications such as printing solutions. ZEH used Schlumberger’s Ocean API to add CGM capability to Petrel.

Reeder

Scott Reeder, Product Marketing Manager at ZEH said, ‘CGM has been an industry standard graphics format for G&G software since the inclusion of CGM*PIP and CGM+ in the early 1990s. The introduction of CGM Extension provides Petrel users with the ability to move forward with new technology, while continuing to support existing standards and applications.’

SeisInfo

ZEH also announced a new release of its SeisInfo seismic information management package. Enhancements include basemap export to ESRI shape files, user-defined geodetic datums, visualization with Google Earth and GEOTIFF attachments inside SeisInfo’s MapViewer.


Enigma’s SmartMove adds project housekeeping, versioning

Enigma Data Solutions’ new release enables policy-based project archival and migration.

UK-headquartered Enigma Data Solutions has announced a new release of its SmartMove information lifecycle management (ILM) solution. SmartMove 2.4 provides ‘cost-effective’ policy-based tiered storage management. The new release adds housekeeping and versioning features for migration of files to secondary storage.

Copley

Enigma CEO Pete Copley said, ‘SmartMove gives organizations the ability to discover and classify data, enabling policy-based archiving and file migration. The comprehensive ILM solution is key to the efficient implementation of multi-tiered storage architectures, ensuring that your data resides in the most appropriate storage tier for its value to the organization.’

Housekeeping

Housekeeping allows organizations to keep different storage tiers synchronized, increasing efficiency of the filesystem and simplifying data management. Versioning addresses the growing requirement for retaining and managing multiple versions of files that are kept on secure secondary storage. Enigma has also added support for new storage systems including Archivas’ ArC, HDS’ HCAP and PolyServe.
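To give a feel for what policy-based migration involves (this is an illustration only, not SmartMove’s actual engine), the sketch below moves files that have not been accessed within a given period from a primary to a secondary storage tier and records each move so that an audit or versioning trail can be kept. The paths and the threshold are hypothetical.

```python
# Illustrative policy-based file migration (not SmartMove's actual mechanism):
# files on the primary tier untouched for N days are moved to a secondary tier.
import os
import shutil
import time

POLICY_DAYS = 180                                    # hypothetical 'inactive' threshold
PRIMARY, SECONDARY = "/data/tier1", "/data/tier2"    # hypothetical mount points

def migrate_inactive(primary=PRIMARY, secondary=SECONDARY, days=POLICY_DAYS):
    cutoff = time.time() - days * 86400
    moved = []
    for root, _dirs, files in os.walk(primary):
        for name in files:
            src = os.path.join(root, name)
            if os.path.getatime(src) < cutoff:       # last access older than the policy
                dst = os.path.join(secondary, os.path.relpath(src, primary))
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)                # housekeeping: free the primary tier
                moved.append((src, dst))             # record for versioning / audit
    return moved

if __name__ == "__main__":
    for src, dst in migrate_inactive():
        print(f"{src} -> {dst}")
```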


IQPC Oil and Gas Information, Knowledge and Process

IQPC’s recent London conference included ENI’s web-based reserves system, more on Shell’s information delivery infrastructure, BP’s million-item operations database and a new meaning for ‘DATA,’ which according to Hydro’s Lars Olav Grovik is a Norwegian acronym that stands for ‘double work for all!’

Stephano Ventura described ENI’s reserves information system, a common data environment for ENI’s financial reporting. A central database in ENI’s head office serves 20 geographical business units. ENI was able to preserve existing systems by ‘wrapping’ them into an E&P data warehouse. Different data reporting formats are transformed on the fly. Building blocks include an Oracle repository, an Apache Tomcat web server and Business Objects’ reporting tools. End users connect across the WAN with a web browser or Microsoft Office applications—leveraging Excel OLAP Services.

Custom reporting

A user group has been created to address cultural issues and customization to local reporting requirements. Usage monitoring allowed for fine tuning of the system’s functionality. The project was described as successful, but it was a ‘long and painful process!’ The project involved ‘limited’ outsourcing. Ventura is a strong advocate of in-house development on a project of this complexity.

Flare Solutions

Paul Cleverley (Flare Solutions) provided an update on information delivery to Shell’s new business, exploration and development teams. The idea is to use all available information in decision making and not to waste time assembling it. Planning for information re-use mitigates information loss when a project is put on hold. The solution, which is global to Shell, was designed to be easy to use. For Cleverley, ‘If you need to train, it is not an end-user tool.’ Cleverley stresses that straightforward full text search, as used by many oil and gas companies, limits retrieval accuracy. Shell’s best practice is to constrain text search with taxonomies to ‘pin down’ terminology for assets, production and E&P data types and project names. The system also leverages Shell’s metadata stores and search ‘helper’ applications like MetaCarta. Enabling technologies include Schlumberger’s DecisionPoint and of course, Flare’s Catalog (OITJ Feb 07). ‘New knowledge’ can be derived from usage patterns.
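By way of illustration only (this is not Shell’s or Flare’s implementation), the following sketch shows how a taxonomy constraint ‘pins down’ an otherwise ambiguous full-text query. All names and terms in the snippet are invented.

```python
# Illustrative sketch: constraining a free-text query with taxonomy terms so that an
# ambiguous word like 'brent' resolves to the intended asset and data type.
from dataclasses import dataclass, field

@dataclass
class Document:
    doc_id: str
    text: str
    tags: set = field(default_factory=set)   # taxonomy terms assigned at indexing time

def search(docs, query, asset=None, data_type=None):
    """Full-text match, optionally 'pinned down' by taxonomy constraints."""
    hits = [d for d in docs if query.lower() in d.text.lower()]
    if asset:
        hits = [d for d in hits if asset in d.tags]
    if data_type:
        hits = [d for d in hits if data_type in d.tags]
    return hits

docs = [
    Document("1", "Brent crude price outlook", {"market report"}),
    Document("2", "Brent field 1999 4D seismic reprocessing", {"brent", "seismic survey"}),
]

print([d.doc_id for d in search(docs, "brent")])                     # ['1', '2']
print([d.doc_id for d in search(docs, "brent", asset="brent",
                                data_type="seismic survey")])        # ['2']
```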

Bright Sky

The system offers fine grained control over results’ entitlements that makes sure confidential information stays that way. External data sources such as IHS, SPE, CASP, Telus, K-Res and C&C can be blended into the search results and visualized using Shell’s ‘Bright Sky’ knowledge map application. The system has an estimated 4,000 users and is replicated across Shell’s worldwide IT hubs with ‘cloned’ metadata. Deployment was performed by IBM using WebSphere-based automation. In the Q&A Cleverley revealed that the system was originally delivered as an SAP iView portlet. But there was little take-up for the portal and today the system is a stand-alone application.

BP

Maintenance performance analysis at BP’s production facilities was the subject of Ian Hendry’s presentation. BP has developed a tool for operations maintenance of onshore and offshore facilities that now has 4,500 users and one million ‘maintainable’ items. BP invests some £3.6 million per annum on data collection. Maintenance operations are highly dependent on data quality. Hendry was a strong advocate of IBM/MRO Software’s Maximo product that underpins BP’s system. BP has the same issues of data quality maintenance as Shell.

Hydro

Lars Olav Grovik (Hydro) explained that in Norwegian, the acronym DATA can be interpreted as ‘double work for all.’ A revamp of Hydro’s drilling and G&G software portfolio with the deployment of end user-centric tools like Schlumberger’s Petrel has contributed to a reduced focus on data management. Hydro’s data management was strong in the past, but is challenged, not by a lack of technology, but by a lack of manpower. A data management program kicked off in 2006 with a focus on end-user efficiency. ‘Data management should accept nearly anything’ and provide fast and simple access to data that is close to the end user. Hydro’s involvement in integrated operations has impacted the program with a requirement to enable cross silo data flows. Grovik observed that Exxon, BP, Shell and Statoil have very different policies. There is no ‘right’ way to do it. The issue of data mining has proved problematical. Hydro is keen to offer a simple GIS interface to all of its data and is likely to leverage Google Earth. If you can’t get the results of your study into Microsoft PowerPoint ‘it’s a big problem.’ This has been a particular issue for geological models.


Software, hardware short takes ...

Software news this month from IES, ESRI, IBM, Caesar Systems, Decision Dynamics, EVC and Deloitte.

IES is about to release V10.0 of its PetroMod basin modeling package. A new lithology database, developed in collaboration with Doug Waples, complements the standard IES dataset. A crustal heat flow processor computes heat flow maps throughout time and a heat flow calibration module speeds calibration. A 3D TecLink add-on, originally developed for Repsol, enables full thermal, maturation and 3-phase/n-component petroleum migration to be performed on 3D structural models.

~

ESRI’s new REST-based web services API allows for multiple layers of map data, tiled map data sources and automatic map projection management. REST is a simpler approach to web services than the usual SOAP protocols.
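For readers unfamiliar with the REST style, the sketch below shows what such a map request might look like. The endpoint, layer names and parameters are hypothetical and do not reproduce ESRI’s documented API.

```python
# Hypothetical REST-style map request (endpoint and parameters invented for the example).
# The SOAP equivalent would wrap the same parameters in a multi-line XML envelope sent by
# POST; the REST version is a single, cache-friendly GET.
from urllib.parse import urlencode

params = urlencode({
    "layers": "wells,pipelines",       # multiple layers of map data in one request
    "bbox": "-95.8,29.5,-95.0,30.1",   # area of interest (lon, lat)
    "srs": "EPSG:4326",                # map projection handled server-side
    "format": "png",                   # tiled/raster output
})
url = "https://example.com/mapservice/export?" + params

print(url)
# A client would simply GET this URL (e.g. with urllib.request.urlopen) and
# receive a map image; no SOAP toolkit or WSDL is required.
```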

~

Caesar Petroleum Systems has released V5.0 of its Petroleum Ventures and Risk (PetroVR) simulator. The new release includes expanded scheduling options that take account of real-world scheduling interdependencies, explicit definition of planned and unexpected downtime and automatic ‘choking’ corrections to keep forecasts within capacity limits. Project scheduling enhancements align prior and current simulations and group related activities and events.

~

A new ‘Red Paper’ from IBM, ‘GIS Based Enterprise Solutions with WebSphere Server and ArcGIS Server’ describes key components of Enterprise GIS with emphasis on the interface between ArcGIS and WebSphere, including the deployment of ArcGIS Server 9.2. The publication includes Java 2 Enterprise Edition (J2EE) code for application integration. More from www.redbooks.ibm.com.

~

Decision Dynamics’ Wellcore Enterprise 4.3 release includes a new business performance module to provide executives with a drill-down capability into key metrics for measuring, monitoring and managing operations. Wellcore’s ability to capture state changes related to business operations lets users track the ‘velocity’ of business processes through indicators such as forecast spend vs. budget vs. costs, AFE cycle times, operations cycle times and non-productive gaps between the different well lifecycle operations. The company reported revenues of $8.1 million for 2006, a stonking 95% rise on the previous year!

~

East View Cartographic (EVC) has created a digital 1:50,000 scale map of the Middle East in CADRG format. CADRG is a compressed version of the ARC Digitized Raster Graphics format, used by the military. 3,000 local and Russian topographic map sheets were scanned and formatted to create the resource. EVC has 15 terabytes of in-stock raster data and can create seamless maps for most parts of the world.

~

Deloitte has extended the geographical coverage of PetroView’s TGS well data module. This offers PetroView users access to the TGS-NOPEC catalogue of well logs and other borehole-related data via LogLine Plus. Coverage, previously limited to the Gulf of Mexico, now includes the UKCS, Russia, Canada, Nigeria, and key African locations.


iStore announces PetroTrek on Microsoft SharePoint Server

The Information Store has ported its upstream asset management solution to .NET and Virtual Earth.

The Information Store (iStore) has announced availability of its PetroTrek asset management solution on Microsoft SharePoint Server 2007. A Microsoft Virtual Earth GIS front end is also available. Launched in 1997, PetroTrek is used by BP, Chevron, PEMEX and Shell.

Fortune

Steve Fortune, IM director for BP’s Gulf of Mexico unit, said, ‘Products like PetroTrek are important to our operations and integral to our Field of the Future Program. SharePoint Server allows our teams to collaborate more effectively on managing our production assets. There are also potential cost savings from leveraging our existing information technology investments and infrastructure.’

Web services

PetroTrek lets users access, visualize and control data, enabling more effective use of information. The web services infrastructure lets asset teams combine information from multiple data sources into customizable web pages. The .NET-based solution can access data anywhere on the network, embedding role-based data security. PetroTrek presents information in familiar forms such as well logs, production charts, directional surveys and maps.

Irani

iStore president and CEO Barry Irani added, ‘Our goal is to help users deploy leading-edge technology to achieve higher levels of efficiency. By migrating PetroTrek to SharePoint Server and Virtual Earth, our customers can derive even more value throughout the enterprise.’


TGS bags Parallel—rolls out Prima 9 processing suite

TGS-NOPEC’s acquisition adds high-tech imaging algorithms—‘critical’ R&D mass attained. PrimaViz 3D viewer announced.

TGS-NOPEC Geophysical Co. has acquired Houston-based seismic processing boutique Parallel Data Systems (PDS). PDS was founded in 1997 and now has 21 employees and offices in Houston and Dallas. The transaction is subject to US regulatory approval.

Hamilton

TGS CEO Hank Hamilton said, ‘In working with PDS on a number of recent projects, we have been impressed with their ability to solve highly complex imaging problems on massive datasets while meeting very challenging project schedules. The combination of PDS with our existing TGS Imaging group will increase our capacity and our array of imaging algorithms. The deal also brings a critical mass for R&D into next generation seismic imaging technologies including wide azimuth 3D.’

PRIMA 9.0

TGS has also announced a new release of its flagship PRIMA seismic processing suite. New features include PRIMAViz, a 3D Viewer, depth domain gathers for AVO analysis and enhanced 2D project functions. PRIMA is used by over forty companies in their data analysis and prospect evaluation workflows. More from John Adamick, jada@tgsnopec.com.


AAPG Convention 2007—Long Beach, California

In our report from Long Beach, Chevron’s VP Exploration describes the five ‘discoveries’ industry needs to make in order to stave off peak oil. BP, ExxonMobil and Shell extol the merits of their high-end ‘visionaria.’ We bring you a first-hand account of the launch of Petrel 2007—with the promise of large dataset interpretation on ‘commodity’ hardware; news from Landmark’s geomodeling effort in both GeoProbe and GeoGraphix and new structural and sedimentological modeling tools from Midland Valley and AAPG newcomer, iGeoss.

The 5,200 or so attendees at the 2007 Conference and Exhibition of the American Association of Petroleum Geologists must have been either on the beach, or diligently attending the talks, because the cavernous exhibition area was somewhat traffic free. Those who did visit the beach will have noticed the large flare from the Wilmington field which has produced 2.65 billion bbls and 1.2 TCF gas over 75 years. It’s still producing some 75,000 bopd from 1,300 wells.

5 ‘discoveries’

Bobby Ryan, VP Exploration Chevron, gave the Division of Professional Affairs lunchtime address. The title of his talk was ‘Mapping the Route of the global energy industry’ or ‘a tale of five discoveries.’ Ryan is skeptical about peak oil but warns of access issues, demand growth and ‘challenged’ new resources. To offset these issues we need ‘discoveries’ in exploration, recovery, renewables, efficiency and talent. Ryan’s announcement that ‘the world is full of undiscovered resources’ was music to the geologists’ ears. These are estimated at 1.8 trillion boe, located in Siberia, Greenland, the Gulf of Mexico, Guinea and the Middle East. A map of drilling since 1997 showed lots of dry holes and a few discoveries. Ryan also pointed out that a 1% reduction in consumption equates to 180 million bbls/year, comparable to one or two large discoveries.

Visualization

Jim Thomson (BP) cited NOAA visualization guru Alexander MacDonald as saying ‘if people visualize something, they tend to understand it.’ To help its multidisciplinary teams ‘understand’ the subsurface, BP began building its Highly Immersive Visualization Environment (HIVE) in 1999. Today it has 17, three of which are high-end, triple-head, front projection systems with curved screens and stereo displays. Intriguingly, this massive, centrally funded investment came from headquarters, ‘no one asked for them.’ The latest systems are being upgraded with high resolution DLP projectors. BP’s HIVE technology is under constant review.

GeoProbe

In the early days of BP’s visualization effort, GeoProbe was the driver. Today the visionarium portfolio has grown to include CadCentre’s ReviewReality, Fledermaus VR, EarthVision’s CoViz, WalkInside and GIS. Thomson is a great believer in rear projection. This allows users to ‘walk up’ into the ‘intense zone’ of collaboration at the screen face. Other neat stuff includes SmartBoards alongside the HIVE for low resolution stuff like PowerPoint and TeraBurst technology for ‘hive to hive’ collaboration. This has had a huge impact on knowledge sharing and travel. BP is working on a GIS framework and toolkit although take-up has been slow with a lot of ‘skunk work.’ GIS systems for pipeline and asset management are under evaluation and BP is working on a paperless mapping system using TouchTable technology. HIVEs are powered by SGI Onyx machines, but this technology is at the end of its life. SGI is no longer in this space and BP is currently testing several high end three channel 128GB machines. Large format SXRD 4K projectors represent the biggest leap in resolution yet. An 8.3 megapixel 56” holographic LCD display also ran.

ExxonMobil

Marek Czernuszenko described ExxonMobil’s use of high end visualization in geoscience and training. ExxonMobil deploys the whole enchilada of interactive stereo and collaborative, walk-through 3D displays. These are used for virtual outcrop studies viewing LIDAR displays of field geology. Virtual field trips are safer and cheaper than the real thing and allow for fly-through of digital terrain models integrating satellite imagery and geological maps. In the facilities arena, CAD models allow engineers to walk through unbuilt plants, avoiding design changes and errors. VR is used for operator training and immersive well planning. The only downside is that some desktop application functionality and accuracy is lost. ExxonMobil’s visualization center features in the April 30th issue of Fortune Magazine.

Shell

According to Kevin Bradford, the workstation has entrenched geologists’ and geophysicists’ isolation. It’s time they talked to each other! Shell’s strategy of leveraging ‘technology’ plays is calling for a return to a collaborative environment. It’s all about integrated subsurface evaluation—from basin modeling, through rock, fluid, reservoir and pressure prediction. Holistic, parallel workflows may leverage heat flow, gravity and magnetics and sea bed logging. How do you achieve a holistic view across multi data types? Some believe volume interpretation and visualization are the silver bullets.

COTS

Shell is working to enable holistic workflows by evolving the visualization space. The proprietary systems of yesterday have been replaced with Linux-based clusters, GPU rendering and multi core, multi CPU systems. Displays now use High Definition LCDs and soon, DLP. Co-location involves bringing rooms to teams—moving away from intimidating, complex structures to a ‘natural’ environment. Collaborative environments in Shell scale from desktops to visionaria and the real time operations center. Shell currently has 13 VR Centers with full immersion and stereo, 17 ‘rooms’ and 20 ‘locations’. Shell is working hard to win over those seismologists who still favor the ‘paper on screen’ approach and has now reached ‘critical mass’ in the seismic interpretation community. Use of volume interpretation is on the rise and there are signs that collaboration is taking hold.

Large data sets

Phil Weatherill (Shell) continued with the lower end visualization theme, describing how Shell is performing multi-scale volume interpretation on the desktop. The aim is to perform true 3D observations of petroleum systems and less ‘PowerPoint interpretation.’ This approach is enabled by high end workstations like the HP XW 9300n and Appro’s WH5548. These can be used to animate very large volumes and view whole petroleum systems, enabling ‘powerful scenario testing.’ Volume interpretation software does the ‘heavy lifting’ but integration with other applications is also key.

‘Data’ Session

Information management and ‘data’ was relegated to a poster session in an obscure corner of the exhibition. Here, Les Denham (II&T) was showing off a method to turn large numbers of well logs into pseudo seismic volumes. These could be used to map sand percentages at mega scale (for instance across the whole of the Gulf of Mexico) or to investigate reflectivity variations. de Groot-Bril’s OpendTect was used to display the results. Don Downey (Chevron) was back on the metadata crusade, this time in defense of geological metadata standards. Downey’s ‘metadata workplan’ leverages XML templates to create and maintain metadata and to add data QC and validation. ESA’s Tony Dupont was showing how ‘true’ 3D could be displayed using the new functionality of ESRI’s ArcGIS 9.2. Seismic data is displayed by draping an image over the new multipatch feature. Eric Hatleberg was showing off Schlumberger’s revamped log mnemonics catalog—now available on the web.

Petrel 2007

Schlumberger’s Petrel 2007 launch was snazzy if insubstantial. Racing car analogies, guessing games and prizes clouded the message, which was that you can now interpret a 200,000 sq. km., 158GB dataset on a ‘commodity’ workstation. Data access is speeded by ‘disk roaming’ and parallelized access across four CPUs, so that a variance cube calculation could be run as a background job while moving around the data set. Is a four-CPU PC with 33GB of physical memory a commodity? Maybe, but the show lacked the pizzazz of early GeoProbe demos. Too much Formula One, not enough geology.

Linux

For high-end interpretation and visualization Petrel can be attached to a Linux cluster, for the analysis of Terabyte datasets running across 256 CPUs. Schlumberger is migrating GigaViz/InsideReality tricks and techniques into Petrel, adding virtual reality and haptic interfaces. Regarding HPC and operating systems, Linux is Schlumberger’s ‘primary’ cluster solution and there is also support for Windows Compute Cluster Server 2003.

GeoFrame

As Petrel takes over larger and larger datasets, what is to become of Schlumberger’s ‘legacy’ toolset, GeoFrame? The answer is that GeoFrame’s footprint is exploration—interpretation and structural modeling—although as Petrel introduces more seismic interpretation the dividing line is a bit blurred. GeoFrame’s strength is its capacity to handle hundreds of thousands of wells, multiple projects and hundreds of 3D surveys.

GeoGraphix

Landmark is rebuilding its 3D modeling framework with the new EZModel, embedded in GeoProbe and sharing its geometry engine with GeoGraphix’ Smart Section. We saw a demo of framework modeling from well data which uses stratigraphically-conformant bed mapping. A fault network tool correctly handles sealing faults and horizon modeling and is said to model thrusts. Looks like quite a credible tool for seismic-less interpretation—something the geologists seem to like doing!

This ‘n that

Frenetic activity in Western Canada’s oil sands is generating cores ‘by the truckload.’ Calgary-based Datacon’s RocksOnline solution characterizes cores and associated slabs and measurements so that they can be moved offsite to low cost storage. Petrosys now offers connectivity with IHS Petra, adding its high-end contouring and volumetrics to the geological interpretation package. Midland Valley Exploration has released ‘4D Move,’ extending geological modeling right back to basin sedimentary processes. Currently only turbidites are supported, but a carbonate deposition model is under development. Newcomer iGeoss is applying geomechanics to subsurface fracture characterization in its Poly3D application.

This article is a summary of a longer, illustrated report produced as a part of The Data Room’s Technology Watch Service. For more information please email info@oilit.com.


Escondido Selects Production Access’ Production Center

Package links departmental databases, business intelligence and real-time production data.

Texas independent Escondido has chosen Production Access’ Production Center (PC) application to manage its reporting. Escondido president and CEO Bill Deupree said, ‘Using Production Center, we plan to transform our labor-intensive, spreadsheet-based production reporting process into a streamlined, information-rich environment, while eliminating significant duplication of effort. Professional resources are hard to secure and getting the most effective work products in their hands is a key to leveraging all of our assets.’

Yurkanin

PA CEO John Yurkanin added, ‘PC brings productivity gains by leveraging business intelligence delivered from Production Access (PA). PA links departmental databases into a comprehensive information management system. By capturing well data early and enabling all levels of employees to access real-time information, the entire E&P business benefits.’

Petris

PA recently announced a 3-year growth strategy to invest in professional services, software development, and sales staff. A Change Control Board was established to manage the quality and commercial value of changes to the product suite. A Support Escalation Team was commissioned to give customers a voice in the incident escalation process. PA has partnered with Petris to offer PC as a component of PetrisWinds DrillNET.


Folks, facts and orgs ...

CERA, Fugro, GeoEye, Infield, MRO, ISS, MetaCarta, GXT, ISA, Isilon, Energistics, OpenSpirit, PGS, CIT, Merrick, SPE

Samantha Gross and Graham Spence have joined CERA’s Global Oil Group. Gross is a corporate responsibility and environmental expert, Spence is to analyze the North American gas market.

Fugro has acquired remote sensing and GIS specialist EarthData. The company has also opened a new office in Venezuela.

GeoEye has acquired General Electric’s MJ Harden Associates unit. GE will retain the pipeline GIS consulting, software and data management business within its PII Integrity Services Division.

Oil and gas analyst Infield Systems has appointed Steve Adams, formerly of OPL-Clarksons, as international sales manager.

IBM unit MRO Software has launched a free web site providing resources and best practices in the field of enterprise asset management, http://eam.mro.com.

ISS has opened a second UK office in Aberdeen. EU manager Brian Neve heads up the region.

MetaCarta has appointed John Donnelly III as VP sales. Donnelly was previously with Interwise Software.

Mary Norris has joined Venture Information Management as business development director. Dave Fraser has also joined the company as sales manager.

GX Technology has signed a JOA with Suelopetrol to offer high-end seismic processing services in Latin America.

OFS Portal’s EU rep, Dave Wallis, has been co-opted to the advisory board of the EC’s ‘e-Business W@tch’ ICT unit.

The ISA standards committee has launched a new website, www.eddl.org, providing information and training resources on the EDDL standard.

Clustered storage specialist Isilon Systems has signed with Schlumberger to provide project management and information systems to the oil and gas industry.

Dave Latin (BP), Steve Zelikovitz (ExxonMobil) and Roger Brown (Paradigm) have been elected to the board of Energistics (formerly POSC). Tracey Dancy (Paras) has been named West European Region Lead.

Brooks Klimley is to head up CIT Energy’s new Houston office. The company plans to do $3-$4 billion of financing in 2007.

Lynn Babec has joined OpenSpirit as VP Marketing. Babec was previously with Halliburton’s Landmark unit.

PGS is back into the exploration business with a ‘strategic investment’ in Genesis Petroleum.

Philippe Flichy has joined Merrick Systems as VP business development. Flichy was previously co-founder and CTO of GlobaLogix.

Leo Roodhart (Shell) is the 2009 SPE President elect. Roodhart was in charge of Shell’s ‘Game Changer’ technology program.


Conchango business intelligence at heart of Chevron’s SEER

Blog posting outlines Chevron’s migration from Excel to service-oriented business intelligence.

UK-based software house Conchango was behind much of the development of a Chevron/Microsoft flagship ‘SEER’ business intelligence application that has been deployed on the Captain Field in the North Sea and on the Californian San Ardo ‘i-Field’ (OITJ Nov 2006).

Thomson

In a recent blog*, Conchango’s Jamie Thomson notes that business intelligence often relies on a complex network of point to point, inter-application data transfers and Excel spreadsheets. In such situations, control over data formats, integration and quality has effectively been lost.

SOBI

To address these issues and provide Chevron with usable business intelligence (BI), Conchango has leveraged web services and internet standards to ‘wrap’ legacy applications, creating new, web enabled workflows. For service-oriented business intelligence (SOBI) to work, you need common data exchange formats, master data definitions and trusted systems of record (SoR) for critical data, common applications and interfaces.

Governance

Above all, SOBI demands a huge investment in data governance. As Thomson puts it, ‘Without a willingness on the part of the customer to clean up their SoRs, SOBI can’t work.’ Data owners can’t hide sloppy implementations or poor data quality behind a data warehouse. Data integrity starts at the SoR.

Facades

Data is made available to consumers via services—a.k.a. ‘facades’ or business objects. Facades serve data in a manner that can be understood by data consumers. Data can be composited from one or more SoRs into a single facade. Data is usually cached above the facades in a project data store before use.

XML

XML-based wrappers or ‘facades’ allow data to be shared between disparate data consumers. Despite some initial performance concerns, XML data transfer is a keystone of the project. Thomson says that with the right infrastructure, the system can even support real-time BI.
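The sketch below illustrates the facade idea in miniature, assuming two invented systems of record (a well master list and a production historian); names and fields are illustrative only, not Chevron’s.

```python
# Minimal 'facade' sketch: composite data from two pretend systems of record and
# serve it in a single, consumer-friendly XML shape.
import xml.etree.ElementTree as ET

# Pretend systems of record
WELL_MASTER = {"W-001": {"name": "San Ardo 12", "field": "San Ardo"}}
HISTORIAN = {"W-001": {"oil_bopd": 412.0, "water_cut": 0.83}}

def well_facade(well_id: str) -> str:
    """Blend attributes from both SoRs into one XML document for downstream consumers."""
    well = ET.Element("well", id=well_id)
    ET.SubElement(well, "name").text = WELL_MASTER[well_id]["name"]
    ET.SubElement(well, "field").text = WELL_MASTER[well_id]["field"]
    prod = ET.SubElement(well, "production")
    for key, value in HISTORIAN[well_id].items():
        ET.SubElement(prod, key).text = str(value)
    return ET.tostring(well, encoding="unicode")

print(well_facade("W-001"))
# <well id="W-001"><name>San Ardo 12</name><field>San Ardo</field>
#   <production><oil_bopd>412.0</oil_bopd><water_cut>0.83</water_cut></production></well>
```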

Master data

Mapping across different systems of reference, with typically different names for the same well, was a major problem for Conchango’s developers. The solution was to build a new system to handle master data like well names, along with mappings to objects in the SoRs, and to define a shared data structure that suited all stakeholders.

PPDM

Chevron suggested using the Public Petroleum Data Model Association’s (PPDM) data model for the master data store which proved problematical. The PPDM data model has some 1,500 tables and 11,000 database objects making it far too big for the job in hand. There were also issues with the large number of foreign keys and the size of some primary keys. It would have also been necessary to add tables for behavior-based safety (BBS) information, absent from PPDM.

Abstraction

In the end, Conchango abandoned PPDM and went for an abstract data model consisting of only three tables. This provided the flexibility required to model all of Chevron’s master data.
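The blog does not spell out what the three tables were; the sqlite3 sketch below shows one plausible entity/attribute/source-mapping layout that can hold arbitrary master data and resolve any SoR’s identifier back to a single master record.

```python
# Hypothetical three-table abstract master data model (not Conchango's actual schema).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE entity (
    entity_id   INTEGER PRIMARY KEY,
    entity_type TEXT NOT NULL            -- 'well', 'facility', ...
);
CREATE TABLE attribute (
    entity_id   INTEGER REFERENCES entity(entity_id),
    name        TEXT NOT NULL,           -- 'official_name', 'spud_date', ...
    value       TEXT
);
CREATE TABLE source_mapping (
    entity_id   INTEGER REFERENCES entity(entity_id),
    source      TEXT NOT NULL,           -- which system of record
    source_key  TEXT NOT NULL            -- that system's identifier for the same object
);
""")

# One well, known by different names in two systems of record
conn.execute("INSERT INTO entity VALUES (1, 'well')")
conn.execute("INSERT INTO attribute VALUES (1, 'official_name', 'Captain A-07')")
conn.executemany("INSERT INTO source_mapping VALUES (?, ?, ?)",
                 [(1, "historian", "CAPT_A07"), (1, "erp", "10004217")])

# Resolve the ERP identifier back to the master record
row = conn.execute("""
    SELECT a.value FROM source_mapping m
    JOIN attribute a ON a.entity_id = m.entity_id AND a.name = 'official_name'
    WHERE m.source = 'erp' AND m.source_key = '10004217'
""").fetchone()
print(row[0])   # Captain A-07
```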

Comment

Did it really take the advent of SOA to tell us that cleansed, unduplicated data and a rigorous taxonomy are necessary for enterprise BI or even information management? Probably not. But SOA and XML have at least given us visibility of each other’s data. And projects like Conchango’s are successful in getting the data issues fixed once and for all at source, rather than by every Excel user in the community, whenever new data comes in. For real-time, it’s the only way to go.

* http://blogs.conchango.com/jamiethomson


Cabot deploys Acopia Networks’ ARX file virtualization

‘FreedomFabric’ network operating system ‘works as advertised,’ simplifies storage management.

Houston-based Cabot Oil and Gas has deployed a file virtualization solution from Acopia Networks to serve its 30TB of online seismic data. The solution uses Acopia’s ARX intelligent file virtualization hardware devices running Acopia’s ‘FreedomFabric’ network operating system. FreedomFabric is claimed to simplify file storage management and lower storage costs by automating data management tasks and reducing disruptive storage management operations.

Burch

Cabot technical manager Norbert Burch explained, ‘Following technical issues with our previous stub-based storage system, a thorough evaluation found that Acopia was the only product that worked as advertised. The ARX solution met or exceeded our expectations with regard to its NetApp compatibility, functionality, scalability, and stability.’ Cabot uses the system to identify inactive files on its high-end NetApp FAS960 FC-AL systems.

Transparent

These are moved transparently, on-the-fly to more economical Serial-ATA storage. Burch added, ‘We have over 30 terabytes of file data spanning eight volumes, and the amount of data is growing fast. We need a technology that will grow with us—Acopia has delivered that solution.’ Cabot’s future plans for Acopia include use in conjunction with its Tier 0 high-performance application and load balancing initiatives.


Microsoft partner Meridio’s eDRM sees oil and gas take-up

CITGO, Statoil and ConocoPhillips have deployed the .NET electronic document management system.

Enterprise document and records management (eDRM) software house Meridio reports oil and gas sales to CITGO, Statoil and ConocoPhillips. Meridio’s Microsoft .NET-based technology adds ‘robust’ document management to Microsoft’s Office suite, supporting compliance, process improvement and ‘litigation avoidance.’

Liability

Meridio’s solutions help control the costs and liabilities associated with e-mail, providing an enterprise class compliance infrastructure. Integration with third party tools leverages .NET and a web service toolkit. Integration with SharePoint Server enables world-wide collaboration, adding traceability and secure access to the information lifecycle. Links to SAP allow unstructured content to be managed through the familiar Microsoft Office interface.

Thomas

Meridio senior VP Becky Thomas said, ‘Oil and gas has unique eDRM requirements. The combination of flexibility, integration with Microsoft Office and our service-oriented architecture has made Meridio a strategic choice for oil and gas companies.’


SPE, WPC, AAPG and SPEE release new reserves classification

New system addresses technological progress and growth of unconventional reserves.

The Board of Directors of the Society of Petroleum Engineers (SPE) has approved a new petroleum resources classification system following two years of collaboration between the SPE, the World Petroleum Council (WPC), the American Association of Petroleum Geologists (AAPG) and the Society of Petroleum Evaluation Engineers (SPEE).

OGRC

Coordinated by the SPE Oil and Gas Reserves Committee (OGRC), the new ‘Petroleum Resources Management System’ consolidates and replaces previous classification systems from the above organizations.

Ritter

OGRC chair John Ritter explained, ‘By late 2004, it became clear that current guidance failed to meet the requirements of industry stakeholders due to advancements in technology, internationalization and the growth of unconventional resources. The 2007 system provides a high level of consistency in estimating resource quantities, incorporating best practices identified in other international petroleum and mineral classification systems.’ The fruits of the OGRC’s effort are available at www.spe.org/reservesdef.


Web posting mashes up SAP with SCADA, maps and BPEL

SAP guru Krishna Kumar describes new web technology as ‘visualization ecosystem.’

In a recent SAP Info web posting, ‘How to Visualize a Business Ecosystem,’ Krishna Kumar shows how service-oriented architectures (SOA) can be used to create a new visualization ‘ecosystem.’ Cross-application ‘mash-ups’ allow users to visualize business information on a map or leverage Business Process Execution Language (BPEL) to select data from disparate service providers including Yahoo, Google, Amazon and corporate SAP databases.

Enterprise Horizons

One application, Magma Ecosystem, from Enterprise Horizons, uses BPEL to orchestrate dataflows from sources such as SCADA sensors with web services calls to ESRI’s demographics engine. The results are visualized in a 3D geographic display and can be used in utility planning and scheduling.
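BPEL itself is verbose XML; as a rough, language-neutral sketch of the orchestration logic described above (the service stubs are invented, not Enterprise Horizons’ actual interfaces), the flow looks something like this.

```python
# Rough sketch of the mash-up orchestration; read_scada and get_demographics stand in
# for the web service calls that BPEL would orchestrate.

def read_scada(sensor_id):
    """Stub for a SCADA web service call returning a current reading and its location."""
    return {"sensor": sensor_id, "flow_mcfd": 1250.0, "lat": 29.76, "lon": -95.37}

def get_demographics(lat, lon):
    """Stub for a demographics web service keyed on location."""
    return {"population_within_5km": 48000}

def mash_up(sensor_id):
    reading = read_scada(sensor_id)                      # step 1: pull the real-time reading
    demo = get_demographics(reading["lat"], reading["lon"])  # step 2: enrich with demographics
    return {**reading, **demo}                           # step 3: merge for the 3D map display

print(mash_up("PIPE-17"))
```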


New publications address well test analysis and phase behavior

Just published: ‘Dynamic Flow Analysis’ (Kappa) and ‘Phase Behavior of Petroleum Reservoir Fluids’ (Calsep).

Kappa Engineering has just released an online resource, ‘Dynamic Flow Analysis’ (DFA), as a free, downloadable set of PDF documents. DFA, which includes well test interpretation and pressure transient analysis, is well-illustrated with screen shots from Kappa’s software. But the material is largely software independent, teaching techniques rather than software usage. The introduction traces DFA’s evolution from the Horner plots of yore to today’s compute-hungry algorithms. Later chapters include up to date information on permanent downhole gauges, formation testers and DTS. Highlighted sections indicate optional reads for those seeking full equation development. For us, the only downside was the awkward multiple PDF format.
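As a reminder of where it all started (standard well-test textbook material, not reproduced from the Kappa volume itself), the classical Horner analysis plots shut-in pressure against the log of the ‘Horner time’ ratio, with the slope yielding permeability-thickness.

```latex
% Classical Horner buildup analysis (oilfield units): p_ws is shut-in pressure,
% t_p producing time, \Delta t shut-in time; extrapolation to a unit Horner ratio gives p*.
\[
  p_{ws} = p^{*} - m \,\log_{10}\!\left(\frac{t_p + \Delta t}{\Delta t}\right),
  \qquad
  m = \frac{162.6\, q B \mu}{k h} \quad \text{(psi per log cycle)}
\]
```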

Calsep

In the context of Petroleum Engineering training we have also received a review copy of Karen Pedersen and Peter Christensen’s new book, Phase Behavior of Petroleum Reservoir Fluids. The authors are also founders of Danish software house Calsep. Phase Behavior is a thorough textbook treatment of the subject with an up to date account of modern petroleum engineering techniques applied to producing oil and gas in ‘challenging conditions.’ Kappa’s book can be downloaded (after registration) from www.kappaeng.com. Phase Behavior is published by Taylor and Francis—ISBN 0824706943.


Yokogawa control system passes Wurldtech’s cyber security test

Rigorous testing now mandatory as process control deperimeterization sees more open IT.

Process control and engineering house Yokogawa commissioned Vancouver-based Wurldtech to test the resistance of its Centum CS 3000 R3 production control system and ProSafe-RS safety instrumented system to cyber attack. The tests were performed in February 2007 using Wurldtech’s Achilles Assurance Platform (AAP). The tests determined the controllers’ robustness to a variety of Ethernet, TCP/IP and Vnet/IP hacks.

Ethernet

Yokogawa’s controllers are deployed in oil and natural gas infrastructure which is increasingly migrating from traditional closed process control systems to more open IT and Ethernet-based systems, in a process known as ‘deperimeterization’ (OITJ Jan 06). The move to internet technologies brings a potential threat of cyber attack—hence the need for rigorous testing.

Pass

The Wurldtech tests determined that the Yokogawa controllers had stable and robust network stacks and that they passed all Achilles V1.2 tests. Wurldtech’s Achilles offers security and quality assurance tests that address network vulnerabilities in control systems.


Enterprise project management solution for Marathon’s Brae assets

UK-based Program Framework gets design and implementation contract.

Marathon Oil UK has awarded project management consultants Program Framework a contract for the design and implementation of an enterprise project management (EPM) solution to assist in scheduling, tracking and execution of maintenance and engineering activities across its North Sea Brae Asset. Brae comprises three production platforms and multiple onshore engineering teams.

Revamp

The deal is a component of a larger revamp of Marathon’s Brae planning process including new planning software interfaced to Marathon’s SAP-based management, scheduling and reporting tool.

Microsoft Project

The EPM solution is based on Microsoft Project Server 2003 and provides users with visibility of schedule and resource information across multiple project plans in a variety of locations. Program Framework will also provide training in Microsoft Project and the EPM solution.

Major

Program Framework director Paul Major said, ‘EPM is a critical component of the programmed maintenance that assures un-interrupted functioning of multi-billion pound assets. But EPM solutions are powerful and highly complex—tuning them to a company’s needs is critical to a successful implementation.’ More from christine.stone@programframework.com.


SolArc’s ‘RightAngle’ gets SAP NetWeaver seal of approval

Trading and risk management solution now allows for advanced interaction with SAP back end.

Houston-based trading and risk management specialist SolArc has received SAP’s NetWeaver certification for its RightAngle deal capture, inventory management, accounting, reporting and risk analysis package. RightAngle now leverages the NetWeaver Exchange Infrastructure (SAP NetWeaver XI) to communicate with the mySAP Business Suite. The certified integration promises faster implementations and lower integration costs.

Haynie

SolArc CTO Cynthia Haynie said, ‘Customers require flexible ERP integration strategies. By working with SAP we have leveraged our SAP integration experience to develop a standard suite of scenarios certified for the NetWeaver platform.’

Process integration

The system can be extended with the NetWeaver process integration functionality. Certified content integration lets users settle RightAngle transactions within SAP and enables other advanced interactions between RightAngle and SAP back-end systems. RightAngle users include Chevron and ConocoPhillips.


Invensys Operator Training system for Cheniere Energy

Sabine Pass liquefied natural gas (LNG) plant deploys SIM4ME-based system for terminal operations.

Cheniere Energy has awarded Invensys Process Systems a contract for the implementation of the Operator Training Simulation (OTS) system for its Sabine Pass liquefied natural gas receiving terminal currently under construction in Louisiana.

DynSim

The OTS will be used for training control room operators, supervisors and plant equipment operators at the terminal. The OTS also offers checkout tools for the distributed control system (DCS) and emergency shutdown system (ESD). Cheniere is also to use the system for process engineering and testing, non-control system equipment simulation, and training workers outside the control room.

SIM4ME

The OTS leverages Invensys’ SIM4ME architecture which integrates process, controls and equipment simulation in what is described as an open and scalable architecture. SIM4ME’s functions are based on a shared set of dynamic simulation principles and user interfaces.

TOMS

The new contract builds on Invensys’ previous work on the Sabine Pass project, including the Terminal Operations Management System (TOMS) which automates business processes across the facility. Some 20 software systems from Invensys and third parties will be integrated with Cheniere’s enterprise-level business systems.

Race against time?

Investment in LNG facilities may prove to be a race against time according to a recent white paper from IBM’s Global Business Services division. The report ‘A High Stakes Race Against Time’ asks if LNG capacity will be brought on stream in time to benefit from the current gas price bubble. Download the IBM study from www.ibm.com. For more on the Sabine Pass LNG development see OITJ March and April 06 issues.


Abu Dhabi Marine Operations’ EPC deploys plant design management system

National Petroleum Construction Co. implements Aveva PDMS on Umm Shaif redevelopment.

Abu Dhabi-based National Petroleum Construction Co. (NPCC) has selected Aveva’s Plant Design Management System (PDMS) for use on its part of Abu Dhabi National Oil Co. unit ADMA-OPCO’s Umm Shaif redevelopment project. Aveva PDMS will initially be used for two new wellhead towers and two new 24" subsea pipelines, each running from a new wellhead tower to the existing Umm Shaif Super Complex (USSC). Aveva PDMS will support the project from front-end design through construction.

Kurdali

NPCC engineering manager Ahmad Kurdali said, ‘We will be using PDMS throughout all phases of the project. After the 3D modeling and design, the design drawings and isometrics will be generated by PDMS. We will also be integrating the PDMS model with the structural steel detailing software to optimize the production of fabrication drawings.’

Statoil

The first edition of Aveva’s customer magazine, ‘Pipeline,’ covers a flagship PDMS deployment on Statoil’s Statfjord A platform. Statfjord was laser-scanned in 2002 and tied to a 3D PDMS model, and the company has now standardized on PDMS for all new projects. PDMS underpins Statoil’s ongoing maintenance and plant upgrades, with 3D PDMS model data maintained for the complete lifecycle of each project. Statoil also leverages Aveva’s ‘Global’ solution for multi-site concurrent working. More from becky.stevens@aveva.com.


Weatherford acquires geomechanical boutique

Software and consulting house Advanced Geotechnology Inc. will team with Hycal and Omni Laboratories.

Weatherford’s Canada Partnership unit has acquired petroleum geomechanical specialist Advanced Geotechnology Inc. (AGI). AGI was founded in 1994 by Pat McLellan and has since grown from a one-man show to a multi-disciplinary consultancy with two flagship software products.

StabView

The deal brings Weatherford’s world-wide support and marketing effort to bear on AGI’s StabView and RocksBank products. StabView applies geomechanical theory to the well planning process enabling risk analysis of wellbore stability, lost circulation and fracturing.
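
To give a flavor of the kind of analysis involved, the sketch below implements a textbook Kirsch-type screening check for a vertical well: effective hoop stresses at the borehole wall are compared against rock strength to flag breakout and drilling-induced fracture risk. This is a generic simplification for illustration only, not AGI’s StabView algorithm, and all input values are invented.

    # Textbook Kirsch-type screening for a vertical wellbore; illustrative only,
    # not AGI's StabView algorithm. Stresses and pressures in MPa.
    def stability_check(sh_max, sh_min, p_mud, p_pore, ucs, t0=0.0):
        """Flag breakout and tensile fracture risk from effective hoop stresses."""
        hoop_max = 3.0 * sh_max - sh_min - p_mud - p_pore   # wall azimuth 90 deg from SHmax
        hoop_min = 3.0 * sh_min - sh_max - p_mud - p_pore   # wall azimuth along SHmax
        return {
            "breakout_risk": hoop_max >= ucs,          # compressive failure of the wall
            "tensile_fracture_risk": hoop_min <= -t0,  # drilling-induced tensile fracture
        }

    # Example: 35/25 MPa horizontal stresses, 15 MPa mud, 10 MPa pore pressure, 60 MPa UCS
    print(stability_check(35.0, 25.0, 15.0, 10.0, 60.0))

A full wellbore stability study would also account for deviated trajectories, rock anisotropy, mud chemistry and time-dependent effects, which is where commercial tools earn their keep.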

Omni Labs

Weatherford’s rock mechanical and acoustic property testing services are performed in its Hycal (Calgary) and Omni (Houston) laboratories. More from Pat McLellan, mclellan@advgeotech.com.


Invensys announces Material Balance Module for real-time process data

New tool turns raw process data into ‘consistent and reliable’ information, eliminating errors and losses.

Invensys Process Systems unit SimSci-Esscor has introduced a new simulation and modeling package that transforms raw process data into ‘consistent and reliable information’ that can be leveraged to enhance asset performance. The new Material Balance Module (MBM) for SimSci-Esscor’s Advanced Real-time Performance Modeling (ARPM) suite is a flowsheet-based solution that uses mass and volume reconciliation to reveal error sources that could affect real-time data accuracy.

Gulati

Product director Harpreet Gulati said, ‘Consistent and reliable plant data is essential for operations and maintenance activities, but raw process data as gathered from plant devices can be subject to routine errors that may go unnoticed. By reconciling the mass and volume of process streams, MBM identifies errors or material losses so that these can be eliminated through tuning, maintenance, or other corrective measures.’

IT infrastructure

MBM interfaces directly with a plant’s IT infrastructure to automate data reconciliation without additional routine data entry. It combines advanced reconciliation methodologies within a flowsheeting tool to automate creation of daily material balance reports for each major unit, identify bad flow instrumentation, and aid in pinpointing material loss locations.
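
As an illustration of what flowsheet-based reconciliation does, the sketch below applies the classical weighted least-squares technique: measured flows are adjusted as little as possible, weighted by measurement uncertainty, so that they exactly close a unit’s mass balance. This is the generic textbook method shown with invented flows and constraints; it is not claimed to be the algorithm inside MBM.

    # Generic weighted least-squares data reconciliation; not necessarily MBM's method.
    import numpy as np

    def reconcile(measured, sigmas, A, b=None):
        """Minimize (x - m)' V^-1 (x - m) subject to linear balances A x = b."""
        m = np.asarray(measured, dtype=float)
        V = np.diag(np.asarray(sigmas, dtype=float) ** 2)   # measurement covariance
        A = np.asarray(A, dtype=float)
        b = np.zeros(A.shape[0]) if b is None else np.asarray(b, dtype=float)
        lam = np.linalg.solve(A @ V @ A.T, A @ m - b)        # Lagrange multipliers
        return m - V @ A.T @ lam                             # reconciled flows

    # One unit, two feeds and one product: feed1 + feed2 - product = 0
    flows = [100.0, 50.0, 147.0]          # measured rates, e.g. t/day
    sigma = [1.0, 1.0, 2.0]               # measurement standard deviations
    A = [[1.0, 1.0, -1.0]]                # mass balance constraint
    print(reconcile(flows, sigma, A))     # -> [99.5, 49.5, 149.0], which closes the balance

Adjustments that are large relative to their standard deviations are the statistical fingerprint of the ‘bad flow instrumentation’ referred to above.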

Historian

The scalable module can be used to create a simple material balance representation of a production system which can be extended for more rigorous heat and material balance, performance monitoring and decision support for closed loop optimization and real-time enterprise control. MBM interfaces with PHD and PI Historians, as well as Invensys’ own InFusion Historian.


IHS integrates data and interpretation suite, adds A&D decision support. Well test dataset acquired

Enerdeq data connection for Petra, Acquisition Screener announced as RapiData enters IHS fold.

The latest version of IHS’ Petra geological interpretation package includes the Enerdeq Direct Connect functionality announced last month (OITJ March 07), a services-oriented architecture for automated data access and update. The hosted data and software combo should ease data management for Petra users.
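
The following sketch illustrates the service-oriented pattern in general terms: a client pulls well headers for an area of interest from a hosted web service and maps them into a local project. The URL, query parameters and XML element names are hypothetical and do not describe the actual Enerdeq interface.

    # Hypothetical service call for illustration; not the actual Enerdeq API.
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    def fetch_well_headers(county, state, base_url="https://data.example.com/wells"):
        """Pull well headers for an area of interest from an assumed hosted service."""
        query = urllib.parse.urlencode({"county": county, "state": state})
        with urllib.request.urlopen(base_url + "?" + query) as resp:
            tree = ET.parse(resp)
        return [
            {"api": w.findtext("ApiNumber"),
             "lat": w.findtext("Latitude"),
             "lon": w.findtext("Longitude")}
            for w in tree.iter("Well")
        ]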

Acquisition Screener

IHS also announced a new ‘Acquisition Screener’ (AS) to provide decision support to upstream acquisition and divestment (A&D) teams. AS helps identify potential acquisition targets and validate economic assumptions, providing information on operator rankings, valuations, activity, operating expenses and reserves.

Rose

IHS senior VP Mark Rose said, ‘We have added structure calculations and logical navigation to the numerous datasets that we know are used by A&D teams, which can now focus on evaluations, negotiations and networking rather than on data collection and manipulation.’ More from Neil.Job@ihs.com.

RapiData

In what has been a busy month, IHS also announced the acquisition of the RapiData Well Test, Pressure and Completions Data for Western Canada from Calgary-based Rapid Technology Corp. RapiData is already available through IHS’ AccuMap and Enerdeq products. Rapid Technology Corp. is now a holding company with an investment interest in the publicly traded Rapid Solutions Corporation, which is unaffected by the sale. More from Bob.meyer@ihs.com.


Landmark announces pre-release of ‘next generation’ interpretation suite

DecisionSpace R5000 release includes SDKs for infrastructure and engineering. ‘Classic’ OpenWorks option to stay.

Landmark has just pre-announced the ‘R5000’ release of its DecisionSpace upstream application suite with the ambitious tag line ‘2007: The Year of DecisionSpace.’ R5000 is Landmark’s first synchronous release of DecisionSpace with much of its software portfolio upgraded to a common platform of operating systems, processors, third-party software and graphics cards.

New SDKs

DecisionSpace promises multidisciplinary application and data integration—of Landmark’s ‘classic’ products, client and third-party applications and new DecisionSpace optimized apps. The R5000 release introduces software development kits (SDK) for infrastructure and the Engineer’s Data Model. The OpenWorks SDK is also to be ‘refreshed’ for the new environment.

Workflows

The veil will lift on R5000 with ‘workflow previews,’ running on the current R2003 platform, prior to the full R5000 release later this year. Initial workflows include the ‘Geoscience Desktop of the Future,’ offering multi-scale seismic interpretation, geocellular model building and subsurface interpretation. Other workflows address optimized reservoir decision making and well planning. R5000 implements data model changes to support high-end workflows like prestack seismic data on the desktop. By supporting both ‘classic’ and ‘built for DecisionSpace’ environments R5000 preserves clients’ technology investment. More from www.lgc.com/decisionspace.


MetaCarta unveils oil and gas geographic intelligence portal

Web-based GeoIntel for Petroleum service is deployable inside firewall and as free public website for trial period.

MetaCarta is now offering its oil and gas specific geographical lexicon as an online geographic search service. The new ‘GeoIntel for Petroleum’ (GIfP) service (geointel.metacarta.com) claims to be a ‘simple, cost-effective way’ to geographically search and discover web-based energy-related intelligence.

Public information

According to MetaCarta, public information on the web is leveraged by managers and geoscientists for new ventures, competitive analysis, HSE and political risk assessment. GIfP scrapes some 3,500 web sites and 9 million web pages that contain energy industry-relevant information. Content is under continuous review by a MetaCarta ‘content curator.’
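
The essence of geographic search is that documents are indexed not only by keywords but by the coordinates of the places they mention. The sketch below shows the idea in miniature, assuming a geoparsing step has already resolved place names to latitude and longitude; the tiny gazetteer and sample documents are invented and this is not MetaCarta’s implementation.

    # Minimal bounding-box search over geoparsed documents; illustrative only.
    GAZETTEER = {"Sabine Pass": (29.7, -93.9), "Umm Shaif": (25.1, 53.2)}  # approximate

    documents = [
        {"title": "LNG terminal update", "places": ["Sabine Pass"]},
        {"title": "Offshore redevelopment", "places": ["Umm Shaif"]},
    ]

    def geo_search(docs, lat_min, lat_max, lon_min, lon_max):
        """Return titles of documents mentioning a place inside the bounding box."""
        hits = []
        for doc in docs:
            for place in doc["places"]:
                lat, lon = GAZETTEER.get(place, (None, None))
                if lat is None:
                    continue
                if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
                    hits.append(doc["title"])
                    break
        return hits

    # Approximate Gulf of Mexico coast bounding box
    print(geo_search(documents, 25.0, 31.0, -98.0, -88.0))   # -> ['LNG terminal update']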

geOdrive

For current geOdrive users, the subscription-based service extends search beyond the firewall to access industry-specific web sites and aggregate results with internal data sources. For energy researchers who are not current MetaCarta customers, GIfP is available as an on-demand service.

Hutton

MetaCarta VP Rick Hutton said, ‘Pertinent information that can impact billion-dollar decisions may exist on industry web sites. GIfP locates such data and, for geOdrive users, assimilates it with internal information.’ More from energy@metacarta.com.

