Oil IT Journal met with Charles Karren this month, Oracle’s new Director of Oil & Gas Strategy & Marketing, to hear about Oracle’s revitalized marketing effort in the oil and gas vertical. Oracle’s Digital Oilfield (ODO) offering spans upstream ERP and geotechnical data.
The idea is simple enough. Most oil country applications run an Oracle database of one sort or another and Oracle itself has a range of federating technologies in the form of data warehousing and business intelligence. So why not bring all this together with a data warehouse and master data management combo?
Oracle wants to provide companies with a global E&P data hub, a ‘single source of truth’ integrating geophysical, geological, engineering, operations and financial data. By centralizing data access, Oracle plans to support real time business analytics, simple query and reporting, compound E&P workflows and data cleansing.
Back to the future?
Old timers will remember previous attempts to federate oil and gas data and to provide a ‘single source of the truth.’ Indeed Oracle’s very own Project Synergy (OITJ Feb 99) crashed and burned despite strong support from Statoil. But technology moves on. In particular, there is better understanding of how to address data federation today—through data warehousing and master data management—and this is the approach that ODO has taken.
Rather than trying to model the whole of the upstream, Oracle is working with the PPDM Association to develop a master data management system leveraging Oracle’s Data Warehouse Application Console. The PPDM Data Model is to assure master and spatial data management for the ODO across the ‘smart well’ data lifecycle.
eOil and Gas
Other components, notably Siebel’s eOil and Gas, offer production and HSE reporting. Poster child for the ODO is Petrobras which has deployed a central Oracle database platform leveraging much of Oracle’s technology—but not the PPDM data model.
Oracle is also entering the SOA space with its ‘Fusion Application Stack,’ a ‘pure JAVA/XML’ SOA infrastructure. Oracle plans to componentize its JD Edwards package into oil and gas-specific applets. Fusion Middleware is compatible with IBM, TIBCO and interfaces with .NET. The proof of concept Well Data Hub is in test with a supermajor which is using it to manage oil country tubular goods (OCTG). Three different naming conventions are managed in an Oracle Data Warehouse. According to Karren, ‘oil and gas is five years behind financial services in data warehousing and master data management.’ More from www.oracle.com/oilandgas.
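To make the master-data idea concrete, here is a minimal sketch of the kind of cross-reference a hub like the Well Data Hub maintains: each source system keeps its own code for an item, and the hub maps every local code to a single master record. All system names, codes and fields below are invented for illustration—this is not Oracle’s implementation.

```python
# Toy master-data cross-reference for oil country tubular goods (OCTG).
# Three source systems use different codes for the same casing size;
# the hub maps each local code to one master record.

MASTER = {
    "OCTG-0001": {"description": '9 5/8" casing, L80, 47 lb/ft'},
}

# Cross-reference: (source system, local code) -> master id
XREF = {
    ("ERP",         "CSG-9625-L80"): "OCTG-0001",
    ("DRILLING",    "9.625_L80_47"): "OCTG-0001",
    ("PROCUREMENT", "P-113378"):     "OCTG-0001",
}

def resolve(system: str, code: str) -> dict:
    """Return the master record for a source-system code."""
    master_id = XREF[(system, code)]
    return {"master_id": master_id, **MASTER[master_id]}

# The same physical item resolves identically, whatever the local name
assert resolve("ERP", "CSG-9625-L80") == resolve("DRILLING", "9.625_L80_47")
```

Reconciling three naming conventions then reduces to populating the cross-reference table—which is where the data cleansing effort actually goes.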
Arcapita has turned around its $200 million purchase of Roxar (OITJ Feb 06) with an agreed $380 million sale to Norwegian flow assurance specialist Corrocean. The deal brings together Corrocean’s subsea corrosion and sand monitoring technology with Roxar’s multiphase and wetgas metering. Financing for the acquisition is through a new share issue—most of which has been pre-subscribed—and a NOK 1,100 million loan from DnB NOR and Fokus Bank. A Prospectus has been issued for the unsubscribed share capital.
The acquisition accentuates the division between Roxar’s high-end subsea metering and its software arm. Corrocean’s prospectus claims a 50% market share for Roxar’s Irap RMS 3D reservoir and geological modeling package—the ‘product of choice’ for customers with large and challenging reservoirs.
Last year we speculated on a possible split of Roxar’s two somewhat disparate businesses. The Prospectus states that ‘Roxar’s software division is well positioned for further growth and for realization of the full value potential.’ The deal is expected to close in July.
Next month sees the annual roll-over of sponsors to the online edition of Oil IT Journal—www.oilit.com. For 2007-2008 we have support from the following companies ...
We thank the eleven companies who renewed their support for another year and extend a special welcome to our new sponsor
Eleven out of twelve renewals is a pretty good showing and reflects the steady rise of oilit.com’s popularity and effectiveness as a vehicle for getting the message out to the oil and gas software community. Activity on the www.oilit.com website reached positively paroxysmal levels in June with an average 2,550 ‘visitors’ per day (3,350 max—see graphic below) and around 15,000 hits per day. This is pretty good traffic for a website that only gets updated once a month. In fact the visitor count is actually pretty constant throughout the month.
Ten years free!
This reflects the fact that there are now ten years’ worth (up to June 2006) of Oil IT Journal freely available online—along with headlines for the current year. There is a lot of useful information in these back issues for marketing departments, researchers and other knowledge workers. We have it on good authority that the online edition of Oil IT Journal is de rigueur for ‘onboarding’ new hires—especially those who, as is increasingly the case in the oil and gas sector, are brought in from other industries and who are expected to get up to speed in oil and gas terminology and folklore in short order.
These numbers are as reported by our ISP’s Urchin (now Google Analytics) tracking software and while they should be taken with a pinch of salt, they show very considerable year on year growth. In fact this time last year I was reporting, with some satisfaction, that we were getting 1,600 visitors per day. And already, in 2005, a survey for POSC (now Energistics) by Houston-based Spur Digital described www.OilIT.com as ‘the top website for energy professionals.’
A degree of over-counting is inevitable: since we added RSS feeds to the website we get a considerable number of visitors of the robotic variety. But that in itself is no bad thing as the robots are no doubt going forth and carrying the oilit.com message to the more remote parts of Cyberspace. Or perhaps that should read the Blogosphere because the robots translate Oil IT Journal into a blog format. We now have a hundred or so ‘subscribers’ from the blog community.
Another measure of a website’s popularity is given by its Google PageRank—a rather confusing reference to Google founder Larry Page’s ranking algorithm. Imagine the confusion though if Larry’s parents had called him Webster—but I digress. We had a bit of a fright earlier this year as our PageRank went down from five (where it had been for a couple of years) to zero. But this proved to be an artifact as Google’s algorithms beavered away on our voluminous logs. They finally came back with a new rank of seven, which I think is pretty respectable. This is the same PageRank as the AAPG and the SEG and bests the EAGE (PageRank 6).
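For the curious, Larry Page’s ranking idea reduces to a simple fixed-point computation: a page’s rank is a damped sum of the ranks of the pages that link to it. A minimal power-iteration sketch on an invented three-page web:

```python
# Minimal PageRank power iteration on a toy three-page web.
# Link structure is invented; 0.85 is the customary damping factor.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
pages = sorted(links)
d = 0.85
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):
    new = {p: (1 - d) / len(pages) for p in pages}
    for src, outs in links.items():
        # Each page shares its rank equally among its outbound links
        for dst in outs:
            new[dst] += d * rank[src] / len(outs)
    rank = new

# C is linked to by both A and B, so it ends up ranked highest
assert max(rank, key=rank.get) == "C"
```

Real PageRank runs this over billions of pages (with extra handling for dangling links), which is why recomputation over ‘voluminous logs’ takes a while.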
Well that’s more own-trumpet blowing than we have indulged in for a while—so it behooves me to offer you some pithy editorializing by way of a reward for having read this far. How about some fascinating facts? In his keynote address to the 2007 PNEC conference in Houston this month (a report of which will appear in the July-August issue of Oil IT Journal), Information Week editor John Soat spoke about data management at large. The first fascinating fact is that, according to an IDC study, there will be 988 billion gigabytes online by 2010. Perhaps this fact is more fascinating for its spurious precision than for the number itself. OK, try this one instead. Would you believe that programmed trading (buying and selling shares by computer) is now so prevalent that the speed of light is the limiting factor? According to Soat, banks and trading houses now co-locate their hardware inside NASDAQ’s computer facility. The shortened electronic pathways give them considerably more than a microsecond advantage over the poor day trader watching the ticker jump around on his screen at home checking for ‘momentum.’ Not sure how you interpret that, but I can’t see this as anything more than a twenty-first century version of insider trading! Can somebody explain please?
If you are involved in process control, whether in a refinery or on the ‘digital oilfield,’ you should read the new book from the ISA, ‘Alarm Management: Seven Effective Methods for Optimum Performance’ by Bill R. Hollifield and Eddie Habibi of People and Asset Solutions (PAS) of Houston.
Operator action required!
Alarm Management is an entertaining easy read crammed full of information and insights. For instance, in Chapter 4 ‘the most important chapter in the book,’ an alarm is defined as a means of alerting an operator that action is necessary. Control systems manufacturers have made alarm systems so easy to deploy that they are often used inappropriately. Examples of inappropriate use are telling the operator stuff that is ‘nice to know,’ that the system is ‘working normally’ and so on. Alarm Management provides real world examples of how default alarm set ups can produce spurious information—or worse, how alarm logic, even in mildly complex situations, can combine to ‘obscure and interfere with the operator’s understanding of events.’
Programmer action required!
Alarm Management describes advanced process control and the ‘digitization’ of work processes as ‘the most significant breakthroughs in process automation.’ Unfortunately, all this has had an unintended consequence—information overload for the operator. Moreover, lack of interoperability between layered applications and a compartmentalized focus have created a silo effect and a ‘confusion of systems and applications.’
Alarm Management calls for a ‘breakthrough’ in automation technology and offers some thoughts on how this could be achieved. Chapters on alarm graphics and a sample alarm philosophy cookbook round off this timely book. More from www.isa.org/books.
At the quaintly titled ‘Semantic Days’ Conference in Norway earlier this year, Chevron researcher Frank Chum described what the W3C/Semantic Web community hopes will become the next ‘big’ application for Semantic Web technology (OITJ March 2004). Chum’s paper, ‘A Survey of Semantic Web Technology in the Oil and Gas Industry,’ outlined a future Open Oilfield Ontology Repository (O3R) project with backing from Chevron, Exxon and Total that sets out to ‘collect public oil and gas ontologies and make them freely available to the industry at large.’
Oil country data exists in both structured (database) and semi-structured forms (spreadsheets and documents). A new approach is needed to deal with this flood of information and its heterogeneous formats. For major capital projects information needs to be standardized and integrated across systems, disciplines and organizational boundaries. The semantic web promises a common framework that allows data to be shared and reused across application, enterprise, and community boundaries, leveraging ‘machine-operational declarative specification of the meaning of terms based on the Resource Description Framework (RDF).’
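Stripped to its essentials, the RDF idea is that facts from different silos, expressed as subject-predicate-object triples over shared identifiers, merge into one queryable graph. A toy sketch—no RDF library, and all identifiers invented:

```python
# Minimal triple store: facts from two 'silos' merge into one graph
# and are queried by pattern, with None acting as a wildcard.

triples = set()

# From a drilling database
triples.add(("well:A-12", "hasOperator", "org:Acme"))
triples.add(("well:A-12", "locatedIn", "field:Ekofisk"))
# From a production spreadsheet, using the same well identifier
triples.add(("well:A-12", "dailyOil_bbl", "1200"))

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None matches anything."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Everything known about well A-12, regardless of source system
assert len(match(s="well:A-12")) == 3
```

The hard part in practice is not the triple mechanics but agreeing on the identifiers and term meanings—which is precisely what the proposed ontology repository addresses.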
Oil country examples of semweb technology include Fluor Corp.’s Accelerating Deployment of ISO 15926 project targeting information handover of data throughout a plant’s life cycle, the Norwegian Daily Production Report project (currently under test on Hydro’s Åsgård field), the Active Knowledge Systems for Integrated Operations (AKSIO) project (knowledge management support for offshore drilling) and the Integrated Information Platform (IIP) project (OITJ Dec 04). For Chum, ontologies provide ‘a shared understanding of data within a domain, and allow for better interoperability of information systems.’
The mooted Open Oilfield Repository (O3R) portal will provide search, navigation and delivery of the underlying resources via a process called ‘ontology-driven information retrieval.’ This betters current search techniques by decoupling the portal navigation from the semantics. Ontology-driven navigation supports serendipitous discovery, advanced drill down and aggregation of structured and unstructured information. The O3R is to leverage semantic web services described with OWL-S.
Seattle-based storage specialist Isilon reports sales of its Isilon IQ system to Denver-headquartered Tricon Geophysics and Houston-based Seismic Exchange Inc. Tricon has deployed Isilon IQ and its Isilon OneFS operating system to unify its seismic data storage into a scalable shared pool—streamlining processing workflows.
Tricon IT director Bryan Matthey said, ‘We have seen a massive increase in the amount of raw and processed seismic data we need to store and access and Isilon IQ provides the scalability, performance and ease of use we require to keep pace with our business and data growth and speed our operations.’ Isilon IQ claims to provide ‘ubiquitous’ access to the rapidly growing stores of digital content and unstructured data, eliminating the cost and complexity barriers of traditional storage architectures.
Seismic Exchange, Inc. (SEI) has deployed Isilon IQ as the central repository for its proprietary 3D seismic data library—comprising over 30,000 square miles of 3D seismic data—replacing a ‘traditional’ SAN-based architecture. Isilon allows SEI to access and manage the data directly on its clustered storage, increasing productivity and project turnaround.
Petro-Canada is using Geovariances’ Isatis geostatistical package to model its oil sands mining property. Petro-Canada uses Isatis to create multiple realizations for uncertainty evaluation. Isatis provides stochastic modeling of large data sets and models containing tens of millions of cells.
Petro-Canada is applying increasingly sophisticated modeling techniques as its understanding of its oil sands property improves with ongoing drilling and data analysis. Isatis’ algorithms cover exploratory data analysis, data modeling, post-processing and visualization.
Petro-Canada plans to use its Isatis model throughout the project’s lifecycle. Usage is now extending to other oil sands and conventional oil and gas properties with the acquisition of a second Isatis license. More from Patrick Magne, firstname.lastname@example.org.
Landmark is developing a data-access adapter leveraging OpenSpirit’s integration framework to connect DecisionSpace to third party data stores. The adapter will allow Landmark’s products to update, read and delete data from data sources, including Schlumberger’s GeoFrame and Finder, PPDM-based data stores, Petris’ Recall, SMT’s Kingdom Suite, ESRI SDE and IHS’ Petra inter alia.
Landmark VP Doug Meikle said, ‘we understand that our customers’ environments include solutions from multiple vendors. By adding the OpenSpirit framework to the mix, we can help them preserve their IT investments and extract additional value from their existing data sources. It’s a big advantage when it comes to multi-vendor integration and openness.’
The OpenSpirit integration framework is a vendor-neutral integration solution that lets disparate applications and data stores work together, providing geoscientists and data managers with ‘out-of-the-box’ access to applications and data.
OpenSpirit president Dan Piette added, ‘DecisionSpace provides interoperability between multi-disciplinary Landmark applications and data stores and integration with clients’ in-house geotechnical environments. OpenSpirit will help DecisionSpace application users manage complex workflows and reach out to the other well, seismic, GIS and interpretation data stores.’ The adapter will be available in late 2007. More on DecisionSpace from www.oilit.com/ads/ds and on OpenSpirit from email@example.com.
A new release of P2 Energy Solutions’ (P2ES) Tobin GIS Studio offers direct map creation from P2ES’ Excalibur land data management system. Tobin GIS Studio Spatial Data Creator (TGS-SDC) creates maps and performs spatial analysis of land asset data.
TGS-SDC, the map-making module of Tobin GIS Studio, is an extension of ESRI’s ArcGIS Desktop. Any OLE-compliant database or ASCII data can be imported into the scalable, enterprise GIS solution for land asset management.
Creation of lease and contract maps leverages IBM’s UniQuery language (Excalibur is built atop IBM’s Unidata/DB2 database). The system allows for access and analysis of lease and contract information stored in the Excalibur database from the map interface.
P2ES executive VP Darrell Jones said, ‘Integration of Excalibur with TGS-SDC supports spatial data analysis of data in Excalibur without the need for an intermediate database. TGS-SDC is a cost-effective means of creating and managing oil and gas land spatial data.’ TGS-SDC has been certified with ESRI ArcGIS Desktop version 9.2 SP2.
MetaCarta has signed with IHS to add geographic and text-based search functionality to IHS’ oil and gas data. The MetaCarta/IHS Global Oil & Gas Geographic Data Module (OG-GDM) can now be used to search for energy-related information specific to a location.
GDM identifies and ‘disambiguates’ geographic references and assigns latitude and longitude coordinates to textual references. GDMs embed natural language processing that recognizes acronyms, jargon and information on geographic entities.
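In miniature, geographic disambiguation amounts to looking place names up in a gazetteer and picking the sense that best fits the surrounding text. A toy sketch—the gazetteer entries and the scoring are invented for illustration, not MetaCarta’s method:

```python
# Toy geotagger: find place names in text, disambiguate by context
# word overlap, and attach coordinates.

GAZETTEER = {
    "Aberdeen": [
        # Two senses of the same name, each with context clue words
        {"lat": 57.15, "lon": -2.09,   "context": {"scotland", "north", "sea"}},
        {"lat": 46.98, "lon": -123.82, "context": {"washington", "usa"}},
    ],
}

def geotag(text: str):
    words = {w.strip(".,").lower() for w in text.split()}
    hits = []
    for name, senses in GAZETTEER.items():
        if name.lower() in words:
            # Pick the sense whose context words overlap the text most
            best = max(senses, key=lambda s: len(s["context"] & words))
            hits.append((name, best["lat"], best["lon"]))
    return hits

tags = geotag("The rig sailed from Aberdeen into the North Sea.")
assert tags == [("Aberdeen", 57.15, -2.09)]
```

A production system adds statistical models, a much larger gazetteer and handling of multi-word names—but the disambiguate-then-assign-coordinates pipeline is the same shape.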
MetaCarta president and CEO Ron Matros said, ‘The alliance with IHS has allowed MetaCarta to develop an industry specific GDM incorporating IHS’ rich global knowledgebase. Combining the definitive source of E&P information with our search tools allows customers to more accurately pinpoint and collect location-specific information.’
IHS VP Tim Hopkins added ‘Energy companies face the challenge of leveraging accumulated insights that relate to a particular geologic province, an individual well or other asset. With the MetaCarta offering, IHS customers can search across the proprietary archives of a retired, 30-year veteran geologist and the latest press releases and choose from a list of geographically verified matching results.’
ModViz has announced new graphics streaming products, StreamPlay and StreamPlayer, which deliver application-independent 3D data visualization display for peer review, presentation, archiving and workflow based training. StreamPlay continuously records the output stream from a 3D graphics application. The results can be captured and replayed in the StreamPlayer client. StreamPlay offers 3D stream editing, interaction and lossless data compression. StreamPlay is compatible with any OpenGL application.
NVIDIA’s new ‘Tesla’ family of GPU computing products are set to transform workstations into ‘personal supercomputers.’ The top of the range Tesla GPU Computing Server houses up to eight NVIDIA Tesla GPUs—a total of over 1,000 parallel processors that add teraflops of parallel processing to clusters.
Roxar has announced a multi-year agreement with Calgary-based, Geomodeling Technology Corp., for the resale of its seismic interpretation software, VisualVoxAt. Roxar now offers a ‘full seismic through reservoir characterization interpretation solution.’
The latest release of Intervera’s DataVera adds a new ‘Match’ module. Match identifies duplicate records and applies business rules and matching algorithms to merge duplicates to a single record. Match can be used to create a master list of wells across different applications and databases.
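The matching idea can be sketched with nothing more than the standard library: normalize well names, score string similarity, and group records that score above a threshold. This is an illustrative stand-in, not Intervera’s algorithm; the names and threshold are invented:

```python
# Fuzzy well-name matching with difflib: group records whose
# normalized names are similar above a threshold, then each group
# can be merged to a single master record.
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.8) -> bool:
    # Crude normalization: lowercase, unify dashes and '#' shorthand
    norm = lambda s: s.lower().replace("-", " ").replace("#", " no ")
    return SequenceMatcher(None, norm(a), norm(b)).ratio() >= threshold

records = ["Smith #1", "SMITH No 1", "Jones 2-A", "JONES 2A"]

groups: list[list[str]] = []
for rec in records:
    for group in groups:
        if similar(rec, group[0]):   # matches an existing master?
            group.append(rec)
            break
    else:
        groups.append([rec])         # start a new master record

assert len(groups) == 2   # two wells, despite four spellings
```

Real matching engines layer business rules (field, operator, location) on top of string similarity to avoid merging genuinely distinct wells with similar names.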
Impress Software’s new ‘EPM’ release introduces support for Microsoft Project 2007 and ‘packaged processes’ to support a combined SAP plant and project integration scenario. Impress for EPM integrates SAP with Microsoft Project and Primavera, ‘eliminating custom coding.’
Encom’s ModelVision Pro 8.0 is a general purpose model-based potential field geophysical interpretation system. ModelVision Pro provides a wide range of import and export formats, utilities for gridding, filtering and numerical manipulation. An airborne survey can be created from a digital terrain grid or a set of synthetic drillholes created in a simulated geological model.
The new version of Mercury Computer Systems’ Open Inventor 3D graphics toolkit provides an easy-to-use API, extensible architecture and a large set of advanced components for rapid prototyping and development of 3D applications.
SensorTran has announced its DTS 5100-M18 long-range distributed temperature sensor (DTS) hardware which provides accurate temperature measurements over distances of up to 18 kilometers. The DTS product line includes an Ethernet-connected computer that is pre-loaded with DTS management software.
Sun Microsystems is to release its Solaris Cluster source code to the High Availability clusters community on the OpenSolaris site. Sun is releasing the source code for its Solaris Cluster Automated Test Environment (SCATE) along with agent related documentation. Sun also announced the commencement of ‘Project BlackBox’—a stand alone, customizable, container-housed high performance mobile data center for use—well, anywhere you like really!
Seismic Micro-Technology (SMT), which now claims to be ‘the global industry leader for Windows-based geophysical and geological interpretation software,’ has received a cash injection from Technology Crossover Ventures (TCV) and JMI Equity (JMI). The investment is SMT’s first institutional funding and will be used to enhance product development and expand international operations. SMT founder Tom Smith’s family will retain a ‘substantial’ stake in the company. SMT’s Kingdom software portfolio spans geophysical and geological interpretation, solid modeling and reservoir simulation.
Tom Smith, SMT founder and CEO said, ‘As the software needs of the upstream oil exploration and production industry evolve, we are committed to retaining our leadership position. TCV and JMI are experienced growth investors who understand our business and will help build on our legacy of success. This investment allows us to accelerate our international growth strategy and develop our E&P software products.’
TCV partner Jake Reynolds added, ‘This market continues to offer significant opportunities for innovative, best-in-class technology.’ Reynolds and other TCV/JMI partners are to join the SMT board.
UK-based HRH recently announced a new service, ‘Geostream’ a.k.a. ‘Geology for the Digital Oilfield.’ The wellsite and software service combines HRH’s wellsite geologists with HRH’s flagship Gravitas software. Data from multiple service providers is captured using the Wellsite Information Transfer Specification (WITS) and the Gravitas WinDART data acquisition module. Raw data is cleansed and quality-controlled by the rigsite geologist before batch transmittal to a Gravitas database at the clients’ location.
The office version of Gravitas offers password protected, multi-user access to geologists, geoscientists and other stakeholders for use in their databases and interpretation software. The Gravitas Winlog module provides lithology, core logs, multiwell correlations and composite logs. A ‘Repgen’ module adds custom prognoses, daily, summary and final well report templates all directly populated from the Gravitas database.
The floor plan of your regular trade show has the major vendors occupying the plushy carpeted central ‘high ground’ and the assorted minnows, dot orgs and consortia stuck out around the exhibit floor’s periphery. At the 2007 European Association of Geoscientists and Engineers (EAGE) Conference, held this month in London’s Docklands, the major vendors were joined by seven major oils—playing a newly ostentatious recruitment role. The oil and gas high ground is very busy these days. But you have to ask, how many really big stands should a trade show allow?
In the recent past, pre-stack data was confined to the ‘silo’ of the processing house. Seismic processors would take the vast data volumes recorded in the field (some 300 terabytes (TB) for a Gulf of Mexico 3D survey) and process it down to a few hundred GB for interpretation. But interpreters interested in amplitude vs. offset, or other pre-stack indicators of the presence of hydrocarbons, are increasingly leveraging pre-stack data—stressing data storage, network bandwidth and project data loading times. According to a NetApp estimate, there are about 70 petabytes (PB) of upstream data stored on spinning disks today. Experience suggests this is likely already an underestimate—and if it isn’t, it will be real soon!
Accessing considerable data volumes is not just a pre-stack issue. For performant visualization of a few hundred GB of stacked data, considerable hardware gymnastics are required. One such offering was on show on the Paradigm booth. French start-up Scalable Graphics was showing a cluster-based data service for visualization of 400 GB datasets distributed across eight machines. GOCAD connects to the cluster which renders the data and serves it up to the client workstation over 100MB Ethernet. A 64-node machine with 1.2 TB bandwidth is in development. Doing the same kind of thing on a single client requires data decimation. The cluster solution works at full data resolution and offers a roaming frame rate of 60 frames per second. The overall result is performance akin to GeoProbe on a large shared memory architecture—without the data to memory load times.
Hats off to Schlumberger for a ‘vendor-independent’ presentation, by Statoil’s Cathrine Gunnesdal, of the use of ProSource Results Manager (RM) for capturing interpretation results from applications including Landmark’s OpenWorks along with Schlumberger’s own Eclipse reservoir flow modeler. Statoil uses RM to capture projects at ‘decision gates’ such as a recommendation to drill, following a basin modeling exercise or prior to a license application. A high level Statoil governance ruling obligates knowledge workers to clean up their projects before storage in RM. According to Gunnesdal, standard nomenclature has been the key to success. Project data is kept for a year before deletion. ProSource is also used to create and QC OpenWorks projects and particularly, to track which seismic interpretation goes with which Eclipse model—this is ‘impossible in normal workflow.’ The benefits to Statoil include an ‘awareness of doing things right,’ assuring data management and quality through a proactive approach and a strict nomenclature.
It’s nearly three years since Dalton Boutte made the bold claim that seabed logging ‘could replace seismics’ (OITJ Oct 04) and we thought that we’d see how things were progressing. EMGS claims market leadership with five crews and 250 projects to date, ‘more than all our competitors combined.’ This survey count leaves the technology some way behind seismics today. But that didn’t stop PGS from picking up UK-based MTEM just after the show for a cool $275 million.
Geocap, first announced in 2000, was founded by Olav Egelend, formerly with Technoguide/IRAP. The geo-visualization toolbox has gained some traction with use by the UN in law of the sea arbitration. Geocap offers map calibration, color table management and customization through scripting. Visualization leverages the open source Kitware data model. Geocap’s seismic display capability was put to good effect by partner Roxicon whose ‘Seismic Super Survey’ sets out to emulate PGS’ Mega Survey success in the Norwegian OCS.
SEG D Rev 3
The SEG standards committee met at the EAGE to progress the SEG-D Rev 3 tape standard. This is looking beyond conventional seismics to new data types including passive ‘interferometry’ and seabed EM. The committee is planning a web service for software conformity testing. SEG-D has not embraced XML, but with more rigorous lock down of bit positions it should be easier to translate header information to a tagged format.
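The point about locked-down bit positions can be illustrated generically: once a header field’s byte offset, length and encoding are fixed, translating the binary header to a tagged format is mechanical. The field table below is invented for illustration and is not the actual SEG-D Rev 3 layout:

```python
# Generic fixed-position binary header -> XML translation.
# The field table (tag, byte offset, length) is illustrative only,
# NOT the real SEG-D Rev 3 header layout.
import struct
import xml.etree.ElementTree as ET

FIELDS = [           # (tag, byte offset, length in bytes)
    ("fileNumber",   0, 2),
    ("formatCode",   2, 2),
    ("channelCount", 4, 2),
]

def header_to_xml(raw: bytes) -> str:
    root = ET.Element("generalHeader")
    for tag, off, length in FIELDS:
        # Fixed positions make extraction a simple slice-and-decode
        value = int.from_bytes(raw[off:off + length], "big")
        ET.SubElement(root, tag).text = str(value)
    return ET.tostring(root, encoding="unicode")

raw = struct.pack(">HHH", 42, 8058, 240)  # a made-up 6-byte header
xml = header_to_xml(raw)
assert "<channelCount>240</channelCount>" in xml
```

The looser the binary specification, the more per-vendor special cases such a translator needs—hence the committee’s interest in rigorous lock-down.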
New OpenWorks data model
Landmark is counting down to the OpenWorks R5000 release which includes a new data model. This will bring new efficiencies for data managers and in particular, heralds enhancements to the treatment of coordinate reference systems—which has been an ‘area of frustration’ for users. The new OW data model introduces multi-project management, security and the ability to subset data for distribution to partners. Landmark acknowledges that this will be a major disruption and that customers will need (and get) help with migration. Landmark has also come fully into the OpenSpirit fold in order to extend its DecisionSpace infrastructure and dev kit to third party data store access (see page 4).
IBM had an impressive smorgasbord of IT hardware on display—from its 15 teraflop Blue Gene ‘petascale’ machine to its Deep Computing Visualization (DCV) offering. This, like the Scalable Graphics solution above, offloads graphics to dedicated hardware and now supports a Windows XP client so that Petrel users can benefit. IBM was also showing a rather obscure technology leveraging the Cell Broadband Engine (BE) as used in the new Sony Playstation. The ‘evolutionary computing’ technique is used in seismic analysis and pattern recognition with the system ‘writing its own software.’ Chevron and Statoil are said to be trialing the box. IBM also expects to have a GPU-based compute offering next year. By heading the TOP500 list (see page 9) IBM has a good claim on HPC leadership. By building high-end interconnect into the hardware IBM claims very high sustained performance—280 TF out of a theoretical 360 TF. Commodity-based clusters usually max out at around 10% of their notional peak.
Ovation’s Data Stewardship program, which hosts and refreshes companies’ E&P data, has met with modest success, with five US clients signed up and management of CGG’s 120 TB multi-client library.
NetApp is evolving its offering from hardware to application support. Upstream applications run across Oracle and flat files which NetApp consolidates to a single system, simplifying backup procedures. NetApp clients include Shell (4 PB), Aramco (2½ PB) and Petrobras (5 PB). NetApp systems are also sold by IBM.
Headwave is now selling its pre-stack data access technology independently of Petrel as a stand alone ‘pre-stack for interpreters’ system. The compression-based technology offers access to terabyte size data sets without requirement for a storage cluster. The system targets pre-stack interpretation or processing.
Zeh is expanding its software lineup with Horizon’s ‘GIMS’ geological data management package. GIMS originated as a front end to Fugro-Robertson’s SE Asia dataset. The data viewer and service combo displays well spots with drill down to a PDF of, say, a palynology report. ‘Worksets’, collections of links to data, are used to create farmout CD distributions. Zeh is working to integrate GIMS with its SeisInfo package.
Sun reforms oil and gas unit
Sun has reformed its recently disbanded oil and gas unit and is getting back market share with its x86 workstations and the ‘Thumper’ 24 TB storage system. Dual AMD processor workstations are used by Devon, BP, ConocoPhillips, Chevron and WesternGeco. The top-flight U40 workstation with dual NVIDIA 5600 is in the process of certification with Landmark. Sun’s acquisition of SeeBeyond has given it an SOA offering. SeeBeyond is used by BP for global identity management.
This article is a summary of a longer, illustrated report produced as a part of The Data Room’s Technology Watch Service. For more information please email firstname.lastname@example.org.
Energistics (formerly POSC) held its EU Regional Meet last month in Total’s Paris offices. CEO Randy Clark stated that the rebrand was initiated because POSC’s market identity had faded. Energistics now has 73 member companies. Focus remains with its ‘signature’ brands WITSML and PRODML. There is a belief that billions of dollars can be saved through standardization. Even a small amount of take-up equates to ‘billions of dollars of value.’ Data volumes, silos, quality and complexity make for a need for ‘interface standardization.’ There is a sense of urgency—it is not about the ‘Field of the Future,’ but about the Field of Today! Time to market is critical and companies should get involved early in the development process. ‘If we don’t do it, Oracle, SAP and Microsoft will do it for us!’ Energistics’ plan for 2007 and beyond is to build a ‘value based business model’ leveraging member resources and community knowledge. ‘Priorities, initiatives and solutions are in your hands—it’s your community.’
Philippe Chalon (Total) has a long history of involvement with POSC—he was project manager of the Epicentre data model in the 1990s, and Elf, Chalon’s employer at the time (now Total), was a POSC founder. The original plan was to develop a ‘plug and play’ software integration platform (SIP) to enable off-the-shelf software and a ‘buy not build’ model. This approach failed because the SIP’s scope was too ambitious. Today, Total’s focus has moved on from data integration to data sharing across multiple data sources. This requires semantic equivalence of objects, data quality, availability, training and global agreement with suppliers and governments. POSC failed to address the vendor problem. Halliburton and Schlumberger must be on board any new standards initiative.
Rick Morneau (Chevron) provided an update on ProdML. Originally focused on production optimization, ProdML has also proved useful in motivating vendors, which face their own integration issues as they acquire more software companies. Problems amenable to a ‘community solution’ like ProdML are typically those concerning daily decision making—somewhere between real time/SCADA and longer term reservoir modeling. ProdML will be integrated into Chevron’s global ‘Jupiter’ standards in 2008.
Ron Montgomery briefly outlined the Norwegian Integrated Information Project as leveraging XML schemas, the ISO 15926 reference data library and a whole smorgasbord of standards—WITS, ProdML, ISO 15926, ISA, Mimosa, IEC 61970/61968 etc. IBM’s offering in this space is the Chemicals and Petroleum industries model-based, service-oriented architecture. By mapping to a reference semantic model it is possible to visualize all enterprise data without ‘large, bulky data models.’ A ‘piecemeal’ naming convention allows ad-hoc attachment of documents, linking asset maintenance management systems through the Mimosa standard.
Montgomery cautions, ‘IBM is always raving about SOA, but without semantic relevance to your industry it is just another point to point solution.’ SOA addresses taxonomy with enterprise namespace management and industry specificity. One pitfall is to work with a single vendor solution because ‘nobody has a decent answer in house.’ IBM is ‘agnostic’ when it comes to portal and database selection which are ‘religious decisions.’ IBM spent $350,000 with standards bodies to link ISA 95 and 88—creating an explicit model, ‘BatchML’.
Google has acquired PeakStream, closed the Peakstreaminc.com website and removed all content from the Google cache! PeakStream made a brief foray into seismics at last year’s SEG, claiming a 20 fold improvement in seismic imaging on its GPU and Cell BE-based clusters.
Terri Ivers has joined AMEC Paragon as president. Ivers was previously COO of Alliance Wood Group Engineering.
Rob Glasier has been promoted to Executive VP and Head of The Americas for AVEVA with responsibility for AVEVA’s software and services.
Stephen O’Rourke has been appointed as president exploration for BHP Billiton Petroleum’s global petroleum business. O’Rourke was previously with Shell.
BP has awarded a five-year, $7.5 million grant to Stanford University’s Program on Energy and Sustainable Development to support research on modern energy markets.
Greig Henderson has joined Cairn Energy as Geoscience Data Co-ordinator. Henderson hails from Perigon Solutions. CGGVeritas unit Veritas Caspian has opened a seismic data processing centre in Almaty, Kazakhstan.
Willy Simons is MD of ESRI’s new Eastern Africa distributorship. Clive Ondimu is sales and marketing manager.
Stuart McGill is to retire as senior VP of ExxonMobil after 38 years of service.
Mohammad Enteshami is now VP of GE’s Oil and Gas Engineering business. Enteshami has been with GE for 22 years.
George Sarkisian has joined the American Gas Association as VP communications and marketing. Sarkisian was previously with the Salt River Project.
Alisa Osemwengie has joined Geotech as a software developer.
Hess Corporation announced that Bill Drennen has been appointed Senior VP, Global Exploration and New Ventures. Drennen was previously VP Americas with ExxonMobil. Drennen replaces Bob Strode who is retiring.
Ikon Science has appointed Peter Dolan as Chairman. Dolan was founder and director of JEBCO Seismic, Dolan & Associates, IKODA and Fusion Investments Limited. David Gawith and Pamela Gutteridge also join Ikon as principal geoscientists.
Liquid Computing announced the appointment of Nick Weston to lead the company’s World Wide Energy Sales team. Weston was previously worldwide sales director for HP’s oil and gas unit, prior to which he spent 18 years with Sun Microsystems. The company also announced the appointment of Michael Bohlig as VP of Service Provider Sales. Bohlig also hails from Sun Microsystems.
OpenSpirit has hired Nick Cabot as its new Business Manager for Europe, and Sebastien Ferriera as its new Middle East/Asia Account Manager. Cabot comes from Geotrace and Ferriera from Schlumberger. OpenSpirit president Dan Piette is now on the board of PGS.
Paradigm has appointed Andrew Stein as chief marketing officer to join its management team. Stein joins Paradigm from Geomagic, and before that was vice president of marketing, product and business development for Leica Geosystems.
Christian Giraud, formerly of CapGemini, is to head up the Eurostep standards body’s new French subsidiary. Eurostep also announced the appointment of Stephen Daimler as president of Eurostep America.
George Dames, managing senior VP, has been elected to the Ryder Scott board.
SAIC has named Amy Alving Chief Scientist and Greg Henson Head of Business Development.
Sensornet has been awarded a multimillion dollar contract by a major Middle East operator for the provision of a distributed temperature sensor (DTS) system to monitor a waterflood program on a large offshore field.
Artyom Sibgatullin has joined Techsia Middle East.
Statworks is to distribute Tecplot products in the Southeast Asian region.
Verano has changed its name to Industrial Defender.
Visean has rebranded its internet-based real-time data visualization solution as ‘PulseOmni.’
Aspen Technology has acquired its Australasian alliance partner Plant Solutions Pty.
IHS Inc. has acquired the inventory and assets of Geological Consulting Services–Houston.
The Houston-based Gas Certification Institute is offering Sarbanes-Oxley-compliant standard operating procedures (SOP) for gas measurement.
The ISA’s electronic device description language (EDDL) has been adopted as American National Standard IEC 61804 for device integration, establishing an OS-independent language for parameters, functions, graphical representations and interactions with control devices. EDDL creates descriptions of intelligent devices for integration with control systems and with handheld devices for field use.
de Groot-Bril has just announced a new release of its collaborative seismic integration platform, OpendTect. The major upgrade adds new 2D functionality, wavelet support, flattened scenes, horizon snapping, filtering and interpolation and several data management features.
The company also reports that Sinogeo, a Chinese E&P contractor, has joined phase II of dGB’s sequence stratigraphic interpretation system (SSIS) consortium. SSIS displays seismic in a Wheeler diagram or chronostratigraphic framework. SSIS II partners include Wintershall, ENI, BG Group and TNO.
Sinogeo president Jinming Zhou said, ‘We first introduced sequence stratigraphy into China doing the work manually. While successful, this proved time consuming because of the lack of proper software. We joined the SSIS II consortium so that we can use the software and steer future developments to fit our needs.’
dGB is also planning to integrate the open source ‘Madagascar’ seismic processing project with OpendTect. Users will be able to launch a Madagascar processing job from within OpendTect with input and output files in either OpendTect or Madagascar formats. The project will span pre and post stack workflows. dGB is inviting interested parties to join the project.
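Madagascar processing flows are built by chaining command-line programs (sfbandpass, sfwindow and the like) over RSF files. A hypothetical sketch of how an OpendTect plugin might assemble such a pipeline before handing it to the shell (the wrapper function and file names are illustrative, not dGB code):

```python
# Hypothetical helper: chain Madagascar programs into a single shell
# pipeline string. sfbandpass and sfwindow are real Madagascar
# utilities; this wrapper is illustrative only.
import shlex

def madagascar_pipeline(infile, outfile, steps):
    """Build 'cmd1 | cmd2 | ...' reading infile and writing outfile."""
    cmds = ["< %s %s" % (shlex.quote(infile), steps[0])]
    cmds += steps[1:]
    return " | ".join(cmds) + " > %s" % shlex.quote(outfile)

cmd = madagascar_pipeline("input.rsf", "output.rsf",
                          ["sfbandpass fhi=60", "sfwindow n1=100"])
print(cmd)  # < input.rsf sfbandpass fhi=60 | sfwindow n1=100 > output.rsf
```

Because input and output are plain files in a documented format, the same mechanism works whether the data starts life in OpendTect or Madagascar format, as the project envisages.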
Rob Brook (ESRI) sees GIS as federator of exploration, land, pipeline, and regulatory data. GIS offers intuitive access to work order, maintenance, CAD, environment, engineering, cost and HSE data. GIS underpins pipeline asset management and high consequence area (HCA) analysis and reporting. GIS is evolving and converging with IT. Standards from the W3C, ISO and OGC are making for ‘open and interoperable GIS’. Enterprise GIS needs a methodical approach—such as ESRI’s!
Leith McDonald described how GIS is used for integrity management of BP’s Mardi Gras Transportation Management System (MGTS). MGTS is the largest and most complex deepwater system in the world. The MGTS Data Management System was implemented in an Enterprise Geodatabase leveraging the Pipeline Open Database Standard (PODS) with offshore extensions for facilities, flexjoints, strakes and fairings and physical inspection data. The aim is for ‘a level of access to pipeline design, environmental and integrity data that will set the standard in the offshore pipeline industry.’ The system offers integration with video feeds, BP’s Documentum repository, inline inspection data and hurricane forecasts. A traffic light dashboard flags issues according to their severity. Raster imagery of landfalls and facilities, sidescan sonar surveys and ROV inspection videos are all accessible through the GIS front end. New Century Software’s PODS data browser is used for web-based data visualization. Lloyds List also participated in the development.
Scott Hills (Chevron) demonstrated the interrelationship between pipeline incidents, regulation and GIS. Onshore US incidents have halved since 2001 as successive hazardous liquids and pipeline mapping regulations came into force. US regulations are now influencing safety initiatives in other jurisdictions. The link with GIS is underlined by a quote from US Department of Transportation program director Jeff Wiese—’Our intent in the pipeline regulations was not to require the use of GIS, but I frankly don’t see how any operator can meet the requirements without GIS.’ A sentiment echoed in Chevron’s own HSE/integrity initiatives. Chevron Pipeline Co. has built its pipeline integrity management system around the PODS standard alongside the ubiquitous ESRI ArcSDE spatial data store.
Wetherbee Dorshow (Earth Analytic) outlined a lightweight GIS field office solution developed for EnCana. This leveraged the ESRI geodatabase and ESRI’s ArcGIS for Petroleum data model. The presentation also covered field GPS survey data integration and pitfalls and use of the US Natural Resource Conservation Service’s soil survey geographic database for corrosion risk assessment. Geogathering presentations are available on www.geogathering.com.
Deloitte Canada, on behalf of the Energy Council of Canada, has just released a report on the ongoing ‘talent’ shortage in the Energy sector. Canada is on the verge of ‘a serious talent shortage that is expected to last for decades,’ with the energy sector expected to be hit particularly hard.
Over the past 15 years, career opportunities and market growth in oil and gas have stagnated and young people have chosen other careers. General interest in skilled trades has declined as more pursue white collar jobs. The result is a chronic shortage of qualified workers—and the problem is going to get worse.
A 2006 study by the Conference Board of Canada predicted, with astonishing precision, a labor shortfall of 332,000 workers by 2025. Respondents to the Deloitte study identified the three most critical ‘people issues’ as the difficulty of attracting specific types of labor, attracting new talent and the retirement of the baby boom generation.
Deloitte partner Dick Cooper gives the following advice, ‘Talent programs need to move from expensive band-aid solutions to becoming long-term solutions that will revive the industry.’ The report compares the current situation in the Canadian energy sector with the dot com boom of the late 1990s but warns against the ‘failed strategy’ of large bonuses and salaries. Pressure is on organizations to find more permanent and sustainable solutions such as Deloitte’s ‘Develop-Deploy-Connect,’ a ‘new model for talent management.’
The 29th ‘TOP500’ list of the world’s fastest supercomputers has just been released—with the largest turnover among list entries in the list’s history. IBM’s BlueGene/L system at the Lawrence Livermore National Lab keeps the No. 1 spot with 280 TeraFlops. Systems from Cray came second and third.
The fastest European supercomputer is the IBM JS21 cluster at the Barcelona Supercomputing Center—used by Repsol for its Kaleidoscope seismic project (OITJ December 06). This comes in at No. 9 with 63 TFlops. 58% of the TOP500 systems use Intel processors, 21% use AMD Opterons and 17% use IBM Power processors. IBM is the clear leader among the Top 50 systems, with 46% of the systems and 49% of performance. HP is currently absent from the Top 50. More from www.top500.org.
The Society of Petroleum Engineers’ first R&D Conference’s theme was ‘The Third Trillion* and Beyond—The R&D Challenges to Meeting Expanding Energy Needs.’ In the keynote, BP technology manager Tony Meggs considered the key technologies and skills required to access the next trillion barrels and how the industry’s R&D landscape needs to change to deliver them. Meggs offered three routes to the Third Trillion—enhanced recovery of already discovered reserves, more conventional exploration and new, unconventional sources. The fourth way, energy conservation, was considered off topic for the SPE. A conservative 5% hike in recovery worldwide would add 300-600 billion barrels. In Prudhoe Bay, horizontal and coiled tubing drilling, miscible gas EOR, and gas cap water injection have increased the recovery from 40% to over 60%.
4D seismic surveying and ‘massive digitization’ will make the oil field a ‘digital virtual reality’ and ‘greatly enhance our ability to optimize reservoir depletion and field management.’ BP also expects that the novel technique of reducing residual oil saturation (for instance by lowering injected water salinity) may add up to one billion barrels to BP’s proved reserves. CO2 sequestration will help oil recovery as large scale carbon capture leads to massive supplies of CO2 for EOR purposes. Meggs sees the future as digital as ‘distinctions between classical science, engineering and the IT department will disappear as these skill sets cohabit in a digital world.’ The R&D landscape needs to change ‘to engage in cross industry collaboration to accelerate new solution development.’
Muhammad Saggaf, manager of Saudi Aramco’s EXPEC R&D, described ongoing research into ‘extreme reservoir contact’—very long producing intervals in horizontal wells, and the ‘i-field’—with smart wells and real time reservoir management. Aramco is also digitally extreme with ‘gigacell’ simulation. A current model comprises 172 million active cells, 3,000 wells and 60 years of history. A match is achieved in a couple of days on a 340-node Linux cluster.
Schlumberger’s MEA chairman Mohamed Awad addressed the ‘people’ side of the R&D equation. With the current stretched employment situation, there is a ‘global need for people and education. No country, nationality or gender has the monopoly on creativity and innovation.’ Schlumberger offers equal opportunity training and job progression and a uniform compensation and benefits package for international careers. On the educational side, Awad cited the Middle East & Asia Learning Centre in Abu Dhabi, a $100 million investment, and Schlumberger’s ‘Ambassador Program’ for university relations. This includes regular meetings with faculty and students, scholarships, hardware and software donations and R&D internships.
* The third trillion refers to current estimates that put world historical consumption to date at one trillion barrels with another trillion already found. The ‘third trillion’ refers to the remaining ultimate reserves that are believed to exist.
Amata Inc. is to provide a state-of-the-art security system for Pacific Texas Pipeline & Transportation Company. The $152 million contract includes integrated electronic security systems, electrical distribution and generator backup for two major pipelines and four tank farms. The two year project spans 900 miles of new pipeline in the western US.
Pacific Texas Chairman Cecil Owens said, ‘We are looking for best of breed in physical security and electrical distribution. Amata has the flexibility of a small company but the technological expertise to provide cutting edge security solutions for large installations. The project will serve as a model for future pipeline construction.’
Amata president Shawn Wurtsmith added, ‘Amata protects national and international pipeline infrastructure with state-of-the-art technology.’ A layered approach begins with physical barriers and surveillance with day/night thermal imaging cameras. Detection systems monitor motion and pressure while tracking systems leverage radar and ‘intelligent video’ with facial recognition and an ‘interactive data base.’ Other Amata high-end solutions include infrared pulse intrusion detection.
OSIsoft and Enspiria Solutions have teamed to offer real-time reliability and asset management, linking OSIsoft’s PI System data historian with Enspiria’s asset management and business intelligence solutions. Enspiria is to integrate real time data in PI System with business systems to support reliability and risk management, operational data management for AMI, Key Performance Indicator (KPI) analysis and enterprise integration.
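KPI analysis of the kind described typically reduces a stream of time-stamped historian readings to a single reliability figure. A minimal sketch, with made-up data and function names (this is not OSIsoft PI API code), of computing an equipment availability KPI from status readings:

```python
# Illustrative sketch: compute an availability KPI from time-stamped
# equipment status readings such as a data historian might serve.
# All names and data are hypothetical.
from datetime import datetime

def availability(readings):
    """Fraction of elapsed time between consecutive readings during
    which the equipment status was 'RUN'."""
    up = total = 0.0
    for (t0, status), (t1, _) in zip(readings, readings[1:]):
        dt = (t1 - t0).total_seconds()
        total += dt
        if status == "RUN":
            up += dt
    return up / total

readings = [
    (datetime(2007, 6, 1, 0, 0), "RUN"),
    (datetime(2007, 6, 1, 6, 0), "STOP"),
    (datetime(2007, 6, 1, 8, 0), "RUN"),
    (datetime(2007, 6, 2, 0, 0), "RUN"),
]
print(round(availability(readings), 3))  # 22 of 24 hours running -> 0.917
```

The integration work described above is then largely about moving such derived KPIs from the historian layer into back-office business systems.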
OSIsoft director Patricia Garner said, ‘This partnership will help customers maximize the value of their PI System data. Enspiria combines proven software and technology components and frameworks to deliver complete business solutions to its clients.’
Enspiria president Ivo Steklac added, ‘As we expand our services offering we recognize the importance of integrating real-time data with traditional back-office systems. Our partnership with OSIsoft means we can address the reliability and asset management issues our clients are facing.’ Last year Enspiria provided a GIS-based pipeline integrity management system to Piedmont Natural Gas.
Hot on the heels of the deal with IBM (OITJ March 07), Aker Kvaerner has extended its oil country condition-based maintenance (CBM) offering with a partnership with Sweden-based bearing specialist SKF. CBM leverages real time equipment monitoring to plan just-in-time maintenance activity.
The initial focus of the partnership will be to secure customers and installations in the North Sea and onshore in Norway, with plans to expand internationally. The agreement integrates Aker Kvaerner’s engineering and maintenance expertise with SKF’s knowledge of rotating equipment, condition monitoring and analysis.
SKF’s Ole Kristian Joedahl said, ‘Both our companies have global presence and provide complementary technology and customer portfolios. By joining forces we can provide cross border services and augment customer value.’ SKF has a 100 year history of bearing manufacturing experience and now provides CBM solutions across a range of industry segments.
Petris Technology has acquired the software and support assets of Houston-based Production Access (PA). Petris assumes worldwide responsibility for marketing, development and support of PA software. Last year (OITJ Nov 06) PA’s flagship Operations Center was combined with PetrisWINDS DrillNet—combining Petris’ technical expertise management with PA’s financial and operational data management. Operations Center streamlines the acquisition and management of drilling and production operations information and incorporates it with financial data.
Petris CEO Jim Pritchett said, ‘With the synergies between the PA software and PetrisWINDS DrillNet, we see a great future for this combination. Our PetrisWINDS Enterprise integration platform will further enhance interoperability, performance and ease-of-use.’ Petris’ application and data management footprint now includes geoscience, drilling, production and pipelines. Most PA employees are to join Petris.
BP has signed a six-year extension to its contract with AspenTech for the deployment of the AspenOne engineering solution as a global standard across its E&P and refining business units. The agreement extends BP’s commitment to AspenTech’s engineering solutions for process simulation, design and performance optimization. BP uses AspenOne to optimize the design and efficiency of production processes at new plants and in existing facilities worldwide.
Paul Maslin, Technology Vice President with BP said, ‘We plan to continue with our use of AspenTech’s engineering software which has delivered significant value in many areas of BP’s business, improving operational efficiency and providing assurance on new plant design.’
AspenTech senior VP Manolis Kotzabasakis added, ‘BP’s continuing commitment to our engineering solutions is a reflection of the value that these integrated applications can deliver to leading process companies throughout the plant lifecycle. Our Process Engineering suite helps companies streamline their engineering workflows and establish best practices across their organization based on consistent data and processes.’ AspenOne supports engineering simulation, costing and sizing activities, and aids decision making by providing insights into plant and equipment behavior. More from www.aspentech.com/oilitjournal.
In its annual report for 2006, the International Accounting Standards Board outlines an ongoing research project that considers accounting issues that are unique to upstream extractive activities in the minerals and oil and gas industries. The research is being undertaken by a team of national standard-setters from Australia, Canada, Norway and South Africa. The team’s preliminary findings include ‘suitable measurement bases’ for accounting for assets including minerals and oil and gas reserves and resources.
The IASB team also worked with members of the Committee for Mineral Reserves International Reporting Standards (CRIRSCO) and of the Society of Petroleum Engineers Oil and Gas Reserves Committee to identify the potential for achieving greater convergence or common understanding between minerals and oil and gas reserve and resource definitions. When completed, this review will assist the Board’s future deliberations on the use of the reserve and resource definitions in international financial reporting. IASB oil and gas members include Gaz de France, Nippon Oil, Petrobras, RWE and Shell.
Speaking at the World Oil Visualization meet in Barco’s offices at Kuurne, Belgium last month, Holografika CEO Tibor Balogh showed a prototype holographic display, the HoloVizio System (HVS) used as a front end for Shell’s in-house developed 123DI seismic interpretation package.
HVS is a hardware/software combination that offers very high resolution, glasses-free stereoscopic visualization for a number of verticals including oil and gas. Top of the HVS range is the HoloVizio 640RC, 72 inch 16:9 ratio screen with a 50 megapixel 3D resolution and dual gigabit Ethernet.
The HVS has been successfully integrated with the Dynamo reservoir simulator and Shell’s 123DI package using Holografika’s OpenGL interface. General purpose visualization of CAD models in OpenInventor format is also possible using the IvTuneViewer and an OpenGL wrapper.
Holografika is now working on a networked holographic audio-visual platform for multi-user, geographically distributed teams. Next generation displays of upwards of 100 megapixels will leverage LED technology and 3D image transmission (IPTV). More from email@example.com.
Landmark and Statoil unveiled a three year, $13 million project at the EAGE this month that sets out to build a comprehensive basin scale interpretation system. Statoil’s Jon Reidar Granli sketched out the project’s objectives to enhance Statoil’s E&P portfolio through cross discipline, interactive modeling of petroleum systems, expanding reservoir modeling to basin-scale. The system will support prospect evaluation over very large study areas as part of Statoil’s goal of leveraging technology to gain a competitive advantage.
Landmark president Peter Bernard described the 50/50 joint venture as Landmark’s largest to date. Technology to be embedded in the Statoil system is to include basin modeling, seismic processing and stratigraphic modeling—along with a comprehensive visualization and a data management upgrade. New workflows will support basin architecture and basin fill, as well as standardized-mapping workflows for ‘yet-to-find’ and ‘sweet-spot’ reserves.
Landmark’s DecisionSpace framework will be at the heart of the new solution and will be used to integrate software functionality across multiple disciplines. More from www.oilit.com/ads/ds.
The FIATECH standards body, based at the University of Texas at Austin, has just unveiled a roadmap for the next three years of its activity in support of capital intensive projects. FIATECH provides standards used by owner/operators, engineering, procurement and construction (EPC) companies and others involved in major construction projects including oil and gas facilities. FIATECH members share in project deliverables including intellectual property (IP) in the form of technology forecasts, XML schemas for data interchange and reports of on-site tests of new technologies.
A key objective of the 2007-2010 Roadmap is the ‘acquisition and creation’ of IP such as its Smart Chips RFID project (OITJ April 07), XML reference data libraries used for the procurement of equipment such as hydraulic pumps and control valves, and reference implementations leveraging the ISO 15926 life cycle data exchange spec.
An area of special focus is integrated, automated procurement and supply networks with focus on interoperability technology that relieves constraints and bottlenecks in the supply chain. Another project targets Automating Equipment Information Exchange.
UK-based startup Digital Earth is to use Kadme’s search and mapping technology to power its global energy industry information portal. In return, Kadme will have access to Digital Earth’s (DE) ‘social search’ and unstructured data solutions for its own clients. DE’s game plan is to develop ‘next generation’ search and collaboration tools for the energy industry. DE sources its information with a network of regional scouts, the internet, telephone calls, email enquiries, personal visits and trade shows.
Digital Earth chairman John Redfern (formerly president of IHS Energy) said, ‘Kadme’s software provides the functionality we need for our global energy-centric search portal and will accelerate our time to market. Kadme’s domain experience with oil company data and national archives will help create a data service that brings public and enterprise data together.’
Kadme MD Gianluca Monachese added, ‘DE is the perfect showcase for our software. The Amazon-derived engineering knowledge is the ideal complement to our own development and will sharpen our e-commerce capability.’