The rapidly rising sensor count on new offshore developments is having some exciting spin-offs in the maintenance sector. Sensors are primarily deployed to monitor production and other key performance indicators but, coupled with a decent historical data set and some data mining, maintenance of the ‘digital oilfield’ is now moving from a set interval-based model to a ‘just in time’ condition-based model.
Condition-based maintenance (CBM) is the focus of a new alliance between Aker Kvaerner and IBM’s Norwegian unit, targeting both offshore and onshore oil and gas production and processing facilities. CBM leverages monitoring to minimize plant downtime, and can also avert serious incidents that a scheduled maintenance program might fail to catch.
The agreement integrates Aker Kvaerner’s (AK) competence in engineering, planning and maintenance with IBM’s expertise in data acquisition, integration and analysis. The initial focus will be on installations in the North Sea and onshore in Norway, with plans to expand internationally. AK claims leadership in the provision of operation and maintenance services in the North Sea and has long term contracts with many Norwegian operators. The company also provides operation and maintenance services in the UK, Canada and in the United States.
Center of Excellence
The alliance will leverage work done in IBM’s Stavanger ‘Center of Excellence’, a joint venture with Statoil, ABB, Aker Kvaerner and SKF that opened last year with a three-year, €15 million budget. R&D at the center targets Statoil’s use of new technology to ‘extend oilfield lifespan, increase safety and reduce environmental risks.’
The CBM project is one of the first fruits of the Center’s work on ‘smart sensing’ and integrated operations. The Center is one of five such IBM facilities around the world – the most recent, the IBM Oil Sands Centre, opened in Calgary this month (see back page). According to the Norwegian Oil Industry Association (OLF), it will be possible to ‘extract increased revenues of NOK 250 billion’ through the use of integrated operations, a significant part coming from efficiency gains in the maintenance process.
AK President Tore Sjursen said, ‘This deal implies outstanding business opportunities in our home market and internationally.’ For more on the AK/IBM alliance and on CBM’s potential read our interviews with AK’s Arne Bjorlo and IBM’s Arild Kristensen on page 3 of this issue.
E&P Data Management consultants Stephenson and Associates (S&A) have just announced a re-write of their PSApphire data access tool in PHP. The open source GIS toolset CartoWeb/MapServer provides entry-level GIS access to E&P data.
Currently the system is deployed on Oracle’s free 10g ‘Express Edition’ but an abstraction layer provides database independence. Data is stored in its original format and coordinate reference system. Re-projections leverage the authoritative EPSG coordinate database.
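Such a database abstraction layer is what makes the claimed engine independence possible: callers talk to a thin wrapper rather than to Oracle directly. A minimal sketch of the idea in Python (PSApphire itself is written in PHP, and the class and table names here are hypothetical), using sqlite3 as a stand-in engine:

```python
import sqlite3

class WellStore:
    """Thin abstraction over a DB-API connection: callers never see
    which engine (Oracle XE, SQLite, ...) sits underneath."""
    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS wells (uwi TEXT, lat REAL, lon REAL)")

    def add(self, uwi, lat, lon):
        # Parameterized insert: no SQL built by string concatenation.
        self.conn.execute("INSERT INTO wells VALUES (?, ?, ?)", (uwi, lat, lon))

    def in_box(self, lat0, lat1, lon0, lon1):
        """Entry-level spatial query: wells inside a lat/lon bounding box."""
        cur = self.conn.execute(
            "SELECT uwi FROM wells WHERE lat BETWEEN ? AND ? "
            "AND lon BETWEEN ? AND ?", (lat0, lat1, lon0, lon1))
        return [row[0] for row in cur]

store = WellStore(sqlite3.connect(":memory:"))  # in-memory stand-in engine
store.add("15/9-19", 58.4, 1.9)
store.add("34/10-2", 61.2, 2.2)
print(store.in_box(58.0, 59.0, 1.0, 2.5))  # -> ['15/9-19']
```

Swapping engines then means changing only the connection handed to the wrapper, though in practice a real abstraction layer must also paper over differences in SQL dialect and parameter style between drivers.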
S&A also offers a range of media transcription and storage services including scanning, vectorizing and indexing. The solution targets smaller companies and countries embarking on data room promotional activity.
S&A clients include Amerada Hess, the African Petroleum Producers Association (APPA), BHPBilliton, BP, ExxonMobil, IHS Energy, Maersk, Marathon, Hydro, Petronas, Premier Oil, Schlumberger/PDVSA, Shell, Total, Unocal and Woodside.
The Microsoft-sponsored survey by Gulf Publishing unit Gulf Research* that we report on page 8 found that ‘Microsoft operating systems dominate high performance computing (HPC) in upstream oil and gas.’ More precisely, 96% of the respondents reported HPC use of Microsoft Windows ‘on a daily basis,’ while 73% of the sample reported that they ‘never used’ Linux for HPC.
When I read this, I thought I had woken up from a dream to find myself in a parallel universe! My impression, having tracked this issue for some time (see our Technology Watch report from the 2000 Microsoft in HPC meet, now in the public domain on www.oilit.com), was that Microsoft was in the position of a ‘wannabe’ trying to gain ground from Linux’ huge success in this space.
Only last December, we reported from the High Performance Computing session at the SEG (Oil ITJ Vol. 11 N° 12), where IBM HPC guru Earl Dodd stated that ‘Commodity Linux clusters have repealed Moore’s Law and now dominate the HPC landscape.’ Dodd noted extremely high take-up in oil and gas, the second HPC vertical after government.
Reading, as I do, press releases on the results of surveys on this, that and the other, one becomes inured to the bold forecasts made for, say, the take-up of RFID technology over the next 10 years, the monetary benefits of full deployment of ‘digital oilfield’ technology, or even how many engineers the oil and gas business will employ a few years down the road. Generally, such burning issues are addressed by ‘researchers’ phoning round some folks who a) have a job to do and b) don’t know the answer to any of the above questions. The results are predictably random. You might as well ask people to give their opinions on successive digits in the expansion of Pi, although it might be harder to monetize the results.
But the Microsoft survey is not a forecast, but ostensibly a survey of the status quo. I had to check my facts. When the old brain is called into question, I turn to Wikipedia, font of a lot of, if not all, knowledge and particularly well informed in IT matters. What does Wikipedia have to say about HPC? I quote, ‘The term high performance computing (HPC) refers to the use of (parallel) supercomputers and computer clusters, that is, computing systems comprised of multiple processors linked together in a single system with commercially available interconnects. [...] Because of their flexibility, power, and relatively low cost, HPC systems increasingly dominate the world of supercomputing. Usually, computer systems in or above the teraflop-region are counted as HPC-computers.’
Well there’s a clue to the mystery. If you phoned around folks engaged in upstream IT, how many would have a Teraflop at their disposal? How many would know what a Teraflop was even? But I digress. Returning to my Wikipedia oracle, I asked, ‘where do I go to learn more about HPC and what operating systems are used?’ The response was, ‘The most powerful high performance computers can be found on the Top 500 list—www.top500.org.’
Hey, if it’s that easy, I thought, why didn’t Microsoft just publish the URL of the Top 500 and save all that money on the ‘Survey.’ The answer could be that the Top 500 list paints a very different picture of HPC. The latest survey, carried out in November 2006 (the list is updated every six months) confirms Linux’ domination of HPC. Approximately 75% of the machines on the TOP500 list are running one flavor or another of Linux. The only other systems that rate anywhere are IBM’s AIX and HP-UX with 8% and 5% respectively. Microsoft does not even make it onto the list!
Compute Cluster Server 2003
This got me wondering if Microsoft’s definition of HPC was somehow different from the TOP500 list. So I went onto Microsoft.com and did a search for ‘HPC’. As I expected, this brought up a list of references to Windows Compute Cluster Server 2003. So HPC for Microsoft is about clustered COTS** computing—just like the TOP500.
Oil and gas
The TOP500 list is self-reported and probably does a poor job of counting industrial deployment of HPC. Many companies consider the details of their HPC deployment commercially sensitive and keep the numbers to themselves. Could oil and gas be different from the Top 500? Certainly not seismic processing, which has spearheaded the move from the large ‘shared memory’ architectures of the past to clustered COTS architectures running Linux. Reservoir engineering has seen a less spectacular paradigm shift, as flow modeling is less ‘embarrassingly parallel’ than seismic processing.
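The ‘embarrassingly parallel’ point is easy to see in code: each seismic trace (or shot gather) can be processed with no communication between traces, so the work divides cleanly across cluster nodes. A toy Python sketch, where the gain function is a hypothetical stand-in for a real filter:

```python
def scale(trace):
    """Toy per-trace gain: normalize to peak amplitude.
    Stands in for any real per-trace filter (deconvolution, NMO, ...)."""
    peak = max(abs(s) for s in trace) or 1.0  # avoid divide-by-zero
    return [s / peak for s in trace]

gathers = [[0.0, 2.0, -4.0], [1.0, 0.5, 0.25]]

# map() over traces: no inter-trace communication, so the same loop
# splits trivially across cluster nodes (via MPI, a job scheduler, etc.).
normalized = list(map(scale, gathers))
print(normalized[0])  # -> [0.0, 0.5, -1.0]
```

A reservoir flow simulator, by contrast, must exchange pressures and saturations between neighboring grid cells at every time step, which is why it parallelizes less cleanly than the per-trace workload above.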
Microsoft could have solved the mystery by instructing its researchers to ask the simple question, ‘How much of your computing is performed on Windows Compute Cluster Server 2003 (CCS)?’ With the ‘march of the Petrel’ into reservoir modeling, and the enthusiasm for Microsoft’s solutions demonstrated by some major oil and gas accounts, there must be good reasons for departmental-level HPC leveraging Windows CCS. It would have been more credible for Microsoft to have reported even modest numbers on such possible deployments, rather than the marketing delirium of the ‘Survey.’
** Commercial off-the-shelf i.e. commodity hardware—a large number of interconnected PCs.
Bjorlo—This deal is a component of our modification and maintenance strategy. We believe that with the increasingly instrumented plant, the possibilities for condition-based maintenance (CBM) are growing. New facilities are extensively monitored for equipment performance—and we are now in a position to leverage this information in maintenance.
OITJ—Isn’t this done already?
Bjorlo—CBM is performed on heavy rotating equipment, but we want to extend this to the rest of the plant. Downtime is often related to static equipment maintenance or failure. We need to devise methods to monitor tanks and piping. This is a promising new business for us.
OITJ—How does CBM work?
Bjorlo—Often operators engage niche companies for online monitoring of heavy rotating machinery. Data from Statoil’s producing assets is analyzed onshore in Trondheim. Vibration patterns are monitored to detect abnormal events which can be precursors of failure. This allows for timely intervention.
Kristensen—Today most maintenance is schedule-based. You may take down a pump four times a year to find that it only required two overhauls in that time frame. This is in the context of downtime being the biggest cause of lost production. We plan to leverage sensor measurements to significantly reduce maintenance downtime, maybe to the two interventions that were really needed. CBM is increasingly relevant as sensor count rises, both onshore and on new offshore fields like Gjoa (Statoil/GDF) and on the BP/Statoil Valhall new development.
Bjorlo—As much as 80% of scheduled maintenance interventions could have waited another year. But you never know for sure. Now we have other monitoring technology—ultrasound, tank radar and temperature, which can be used to monitor sediment build-up in tanks and pipes—enabling true CBM.
Kristensen—The Norwegian industry body OLF’s report on Integrated Operations* found potential savings of 250 billion NOK in four areas, one of which was maintenance. Today, niche vendors like SKF do a great job. But with the expanding potential of CBM, companies are looking for a single point of contact. This is why AK will be acting as a ‘prime integrator,’ not necessarily doing the work. AK has experience of maintenance, but needed an IT partner to help take data from sensor streams. Which is where IBM came in. For us, this is a classical infrastructure play calling for a SOA feeding data to AK’s or customers’ systems.
OITJ—That’s quite a leap to go from SCADA to a ‘classical infrastructure’ play!
Kristensen—It is, but we are working in several verticals to connect SCADA to SOA. We are also partners in Statoil’s TAIL project and have developed a WebSphere-based platform for integrating telemetry and SCADA data into enterprise asset management systems, mapping industry standards like ISA 88/95 and Mimosa. We are also closely involved with the Integrated Information Platform (OITJ Vol.10 N°6). IBM is a big proponent of semantic technology.
OITJ—So far no mention of Maximo!
Kristensen—IBM works with both Maximo and SAP and, in Norway, SAP is by far the larger business. AK is on SAP.
OITJ—Now that you map SCADA to SOA, this opens the way to many other applications.
Kristensen—Yes, this is a major play in integrated operations. IBM has activity in drilling, in process (with ABB and Honeywell) and we are working with real time all over the world.
OITJ—You mentioned semantics. Do you leverage the Semantic Web?
Kristensen—Yes we see it as key future technology. The IBM Semantic Engine is just out of research.
OITJ—So you are putting RDF tags in your standards so that they can be consumed by future semantic services?
Kristensen—[Laughs] I hope so!
IHS has just announced Enerdeq Web Services (EWS), a service-oriented architecture (SOA) for access to IHS data. EWS is a software development kit (SDK) for creating ‘live’ data links between interpretation projects, modeling software, corporate data stores and current IHS data. SOA creates dynamic links to IHS’ huge dataset of production data from hundreds of thousands of active wells across the USA.
To date, companies have had to develop their own tools for re-populating spreadsheets, updating models, or refreshing a well data repository with a combination of proprietary data and IHS’ data. The resulting interfaces have proved hard to maintain. EWS reduces development time and ensures that the data seen by an application is the current information on IHS’ servers.
Early adopters of EWS include the Timoney Group which has integrated IHS data from the Jonah Gas Field in Wyoming into Google Earth. The prototype can be seen at www.jonahgas.com. Timoney Group president Brian Timoney said, ‘This project shows one of the emerging trends in the mapping industry, letting users decide what interface they want to use. EWS lets developers serve different audiences through a single API.’
IHS VP Rich Herrmann said, ‘We’re providing flexibility to tailor data access to customers’ processes. We use EWS to integrate our data with our own software, PowerTools and Petra. We’re leading the industry in the application of web services to streamline our clients’ management of project data.’ IHS uses EWS to serve data into its Petra applications. The latest Petra release includes a ‘Direct Connect’ feature, developed with the EWS SDK, to create or refresh interpretation projects with the current IHS data.
You’d think that, in the run-up to a major corporate merger, software decisions might be on hold, pending tussles over rationalization and future toolset ‘standardization.’ Not so for (ex-Norsk) Hydro, which has embarked on a veritable software shopping spree since the announcement of its merger with Statoil. The acquisition program has been accompanied by a vociferous campaign of press releases.
Latest in the series is the announcement of a ‘global software agreement,’ a $13 million deal that establishes Petrel as Hydro’s subsurface reservoir characterization standard. Hydro chose Petrel for seismic interpretation and well log correlation and as a complementary application for reservoir modeling.
Justin Rounce, Schlumberger Information Solutions VP Software, said, ‘The ability to share knowledge and update models with new data allows Hydro to optimize exploration and development plans based on the analysis of numerous scenarios. Petrel’s model centric workflow also supports greater collaboration than has traditionally been possible.’
In a separate announcement, Statoil presented its view on the synergy potential from its merger with Hydro’s E&P arm. Merger synergies are expected from cost reduction, higher revenues from better resource deployment and new growth opportunities thanks to larger organizational and financial clout. Overall cost synergy potential is estimated at NOK 4 billion per year before tax.
Textual and geographic information blending specialist MetaCarta now supports Google Earth’s data format in its Geographic Text Search (GTS) and geOdrive products. Users can now search text documents for geographic locations and visualize references within Google Earth.
Google Earth (GE) uses a stripped-down geographic data format, Keyhole Markup Language (KML), named for Keyhole, the original developer of GE, which Google acquired in 2004. KML has a tag-based structure with names and attributes used for specific display purposes.
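A flavor of how little KML it takes to place a point: a minimal Placemark built with Python’s standard library (the well name and coordinates are illustrative; the namespace shown is the current OGC KML 2.2 one):

```python
import xml.etree.ElementTree as ET

# A KML document nests Placemarks inside a root <kml> element.
kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
placemark = ET.SubElement(kml, "Placemark")
ET.SubElement(placemark, "name").text = "Discovery well"
point = ET.SubElement(placemark, "Point")
# Note KML's ordering: longitude,latitude[,altitude] -- the reverse
# of the lat/lon convention most people speak in.
ET.SubElement(point, "coordinates").text = "2.0,58.0,0"

doc = ET.tostring(kml, encoding="unicode")
print(doc)
```

Saved with a .kml extension, a file like this opens directly in Google Earth, which is what makes KML such a convenient target for tools like GTS and geOdrive.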
GTS identifies geographic references in text documents such as RSS news feeds, Microsoft Office, PDF and HTML, assigning geographic references that can be viewed on a map. Other MetaCarta GIS interfaces exist for ESRI ArcGIS Explorer and NASA’s World Wind.
Saudi Aramco has weighed into the standards arena by teaming with the UK-based Energy Institute (formed in 2003 by the merger of the Institute of Energy and the Institute of Petroleum) as a new technical partner. Aramco’s Netherlands-based unit Aramco Overseas Company (AOC) is to be involved in and support the work of the EI in ‘developing industry standards, guidance and codes of practice.’
more standards ...
Saudi Aramco organized its first International Standards forum and workshop last November in Saudi Arabia. The knowledge sharing event provided participants with an opportunity to broaden their understanding of the International Organization for Standardization (ISO) and its activities relating to the oil and gas industry. The meeting was attended by experts from 34 engineering standards committees.
Earlier last year Qatar Petroleum held yet another standards meet in the Gulf, this time with help from the International Association of Oil and Gas Producers (OGP). The workshops focused on standardization and cooperation between operators and standards bodies in the Middle East and Gulf region.
E&P data quality specialist InnerLogix has signed up Houston-based Newfield Exploration as its latest client, following a pilot test of its QCPro data quality solution for upstream data and information.
Newfield Geoscience Application and Data Coordinator Jim Day said, ‘QCPro provides an efficient way of assuring data quality. Our geoscientists have already seen improvement in both data validity and availability. We now plan to utilize InnerLogix’ solution to apply advanced business and quality rules for maintaining consistency between data stores.’
InnerLogix’s proprietary software suite, QCPro, automates assessing, correcting and exchanging upstream data. QCPro simultaneously supports a combination of multi-vendor and proprietary data sources through a large set of plug’n play adaptors.
InnerLogix CEO Jack DeLage added, ‘This initiative will ensure that Newfield’s data is consistently managed through its entire life cycle, from corporate to project data.’
OpenSpirit has announced enhanced workflows for Petrel including improved data management, a ‘refresh’ option for existing data, explicit indication of data tied to external datastores and use of more OpenSpirit data types, including 2D seismic horizons. Petrel projects can be tagged with original units and coordinate system, reducing the chance of data busts, although the onus is still on the Petrel operator to select the correct CRS.
Interactive Network Technologies (INT) has released GeoToolkit.NET (GTK.NET) 2.2, a package of C# libraries for seismic, contour, well log, and well schematic displays. GTK.NET provides multi-language support, simple web deployment and sophisticated vector object handling. Scaled display and hardcopy (including CGM output) are included. WellSchematic libraries now read and export SVG image files. Windows XP and Vista, 32bit or 64bit versions are available.
Tecplot has negotiated exclusive worldwide distribution rights from Harmonic Software for its O-Matrix analysis package. O-Matrix is an integrated analysis environment for scientific and engineering analysis and simulation. O-Matrix performance is claimed to better that of C++, Fortran and integrated math and engineering software tools.
Iron Mountain Digital has extended its eRecords management portfolio with a new ‘Active Archiving Service’ (AAS) for email. AAS is a comprehensive solution for managing email storage, retention and legal discovery, reducing the size of email stores by as much as 80%. AAS completes Iron Mountain’s Total Email Management suite of email storage, discovery, disaster recovery and security solutions. AAS, delivered through a partnership with MessageOne, includes a ‘continuity service’ that can be rapidly activated following a network interruption, facilities outage or disruption. This package is claimed to help users satisfy new ‘eDiscovery’ regulations concerning retention policies and access to emails.
The Transport Department of the Valencian Government has released gvSIG (www.gvsig.gva.es/index.php?id=gvsig&L=2), a free open source application for spatial data management. gvSIG can display local and remote data in the same window, leveraging standards such as WMS, WFS, WCS and JDBC. A coordinate reference system database module embeds the EPSG, IAU2000 and other transformations. Maximum precision is claimed for gvSIG CRS transforms.
ER Mapper has just released its Image Compressor, a low-cost, effective desktop application for high-speed JPEG 2000 or ECW compression of geospatial imagery. Using the open JPEG 2000 format ensures maximum data interoperability between applications and organizations. JPEG 2000 is said to be attractive to government agencies wanting to hold public imagery assets in an open and accessible format. Image Compressor provides color balancing, mosaicing, cropping and reprojection. A trial version can be downloaded from www.ermapper.com.
As this issue has a distinct GIS bias, we thought a mention of a new book from O'Reilly Media would be appropriate. Andrew Turner’s Introduction to Neogeography is a bang up to date exposition on the topics of GIS, GeoRSS, KML and other ‘microformats.’ ‘Neogeography combines the complex techniques of cartography and GIS, placing them within the reach of users and developers.’ The O’Reilly ‘Short Cut,’ (ISBN 9780596529956) is a $10 download from www.oreilly.com/catalog/neogeography.
Phase II of de Groot-Bril’s (dGB) OpendTect Seismic Stratigraphic Interpretation System (SSIS) project has secured funding from ENI, Wintershall and BG International with technical support from TNO and Brad Macurda of The Energists.
Phase II is an 18-month project which will extend SSIS with support for 2D seismic data, better 3D chrono-stratigraphic horizon tracking and a new tool for manual interpretation of sequence boundaries. SSIS II will also introduce calibration with respect to absolute geologic time.
The first SSIS phase saw the release of the OpendTect SSIS plug-in, now nominated for the Lillehammer 2007 award for Eureka projects. dGB is seeking additional sponsors for the project. The official start date is 1 April 2007.
UK-based data management specialist DPTS has been awarded a contract from Ghana National Petroleum Company (GNPC) to assist with the transcription of its data archive in Tema, Ghana. DPTS has been working with GNPC since 2003 when the company supplied a tape transcription system and its Diplomat software for GNPC’s own use.
A second contract was awarded in 2005 for the transcription of 6,500 9-track tapes which were stored in the UK. Two members of GNPC staff worked closely with DPTS on this project, gaining experience in tape transcription methods and procedures.
The current phase adds a further 13,500 tapes and cartridges and involves secondment of a DPTS project supervisor to Ghana along with an additional transcription system. Project duration is five months.
Now in its 17th year, the ESRI Petroleum User Group (PUG) is quite a phenomenon. BP’s Charles Fried, PUG chairman, announced a record registration of 1,300 individuals from 350 organizations and 28 countries. The PUG’s active working groups, covering 3D, metadata, geodetics and more, feed into the PUG ‘LIST,’ which has now expanded to a full-day session.
ESRI founder and President Jack Dangermond spoke of the ‘openness’ of knowledge sharing through geographical information systems (GIS). GIS impacts every facet of the petroleum business, from E&P to environment and facilities. GIS is emerging as a core IT system in oil and gas, moving from the desktop to the department and the enterprise. For Dangermond, GIS’ ‘business model’ of the world is ‘our best shot at integrated knowledge.’ GIS provides an ‘intuitive and analytical framework’ that defines interconnections between things leading to ‘oh now I understand’ moments. ESRI is moving towards services-oriented architecture (SOA). SOA fits the GIS data delivery paradigm with a multiplicity of publishers and shared interconnections.
Google Earth has opened the world’s eyes to the possibilities of GIS. Individuals are now both authors and consumers, sharing knowledge with a whole new world of users thanks to SOA. In oil and gas, land, upstream and other departments will all be able to leverage each others’ data thanks to GIS. The IT stack is ‘evolving nicely’ and web services (XML, SOAP) are gaining acceptance. The move to a SOA is the ‘icing on the GIS cake.’ ESRI has worked hard on openness and interoperates with Autocad and Google Earth. Dangermond remarked, ‘As Google opens their content and I’m sure they will, you’ll be able to access more of their content from within ArcGIS.’
ESRI’s latest 9.2 ArcGIS release adds several functionalities that originated as PUG List requests. 3D contouring has been enhanced with management of triangulated irregular networks (TIN) surfaces of elevation data. Image and raster capabilities have been improved with the new ArcGIS Image Server. A new ArcGIS relational schema offers better integration with data stored in Oracle—allowing for direct SQL access (as opposed to SDE). New API functionality lets developers blend ESRI, CAD and other data formats. The new technology platform is set to ‘do for enterprise what Google Earth has done for consumers.’
ArcGIS Explorer a.k.a. ‘GIS for everyone’ is a ‘free’ Google Earth-like client. The ‘globe’ interface and ‘swipe’ tool comparator were used to good effect in a variety of oil-related demos including the geoprocessing of a real time weather data feed to plot a plume of fumes from a fire.
But the star announcement was ArcGIS Online, ESRI’s answer to the server side of Google Earth, to be released this month along with a high resolution set of US imagery. There will be a fee for some usage types but it will be ‘unbelievably cheap.’ A deal with PennWell adds pipeline data to the mix and the inclusion of National Geographic data was mentioned. Layers can be imported and saved to the desktop. And in a ‘couple of mouse clicks’ you have a map (applause).
Keith Everill traced BP’s history of GIS. BP started with ArcInfo in 1989, but has struggled to promote use beyond specialists and small, unsupported communities. GIS take-up has been handicapped by multiple toolsets and deployment by discrete business units, leading to duplication and little data sharing. BP successfully deployed a single version of Microsoft Office, but there is no similar ‘common operating environment’ for GIS.
This changed when a BP senior manager saw the GIS light and initiated the ‘GIS Appraise’ project, ‘a high level strategic statement to govern and articulate the vision going forward.’ In September 2006, a study determined that BP has been ‘slow to realize the value of GIS.’ There was also a realization that putting a dollar value on the benefits of GIS ‘was no longer the driving force.’ BP is now in the final stages of specifying its solution for a ‘full business lifecycle GIS.’
Keith Fraley (Shell) believes that the current situation regarding the consumer and corporate approaches to GIS is both paradoxical and ironic. Google Earth has heralded the dawn of ‘geospatial enlightenment’ and the recognition that geography is fun. On the heels of GE and its vector brother, Google Maps, comes the ‘mashup,’ an ad-hoc combination of spatial data layered onto a map (e.g. www.wikimapia.org). Wikis (and GE) contrast with the ‘complex’ windows, icons, menus, pointer (WIMP) paradigm of corporate geospatial solutions. This more specialist approach ‘inhibits mass adoption.’ Silo-oriented software solutions compound the problem. Engineering chooses Autocad, the business goes for Mapinfo and geotechnical deploys ESRI. This leads to a ‘redundant, asynchronous, heterogeneous geospatial technology landscape.’ Naturally, everyone wants to consolidate to their own toolset. This becomes a ‘very political and costly trek towards the enterprise geospatial paradigm.’ A general lack of understanding between GIS and the IT department compounds the problems.
The answer is the services-oriented architecture (SOA), with geospatial web services serving a ‘true enterprise geospatial infrastructure.’ Using slideware from IBM and ESRI, Fraley showed how ESRI’s latest marketing aligns with IBM’s view of SOA (‘an awesome slide’). Arc 9.2 serves catalogs, maps etc. as XML/SOAP services. Google Earth and Autocad can integrate with this framework.
One of BP’s GIS implementations was the subject of Nisha Punchavisuthi’s presentation. BP’s strategic planning and integration tool (SPIT) ‘facilitates strategic thought and enhances visualization with an integrated platform for exploration data.’ Dynamic data is served with Microsoft SharePoint. SPIT includes BP’s prospect inventory and spreadsheets for data input. These are dynamically linked to an Access database whose tables are recognized in Arc GIS. End users see a simplified map view in ArcReader with a tree-view of layers, allowing drill down to OilOnline, and other data sources.
Mark Dumka (Talisman) and Kenyon Waugh (Valtus) described how Talisman has moved to a hosted paradigm for its spatial and image data. Data such as 1m resolution satellite imagery, infrared, Landsat 7, Canadian NTS scanned topographic maps and USGS DRGs are all served from Valtus in a ‘one stop shop.’ One client has 1½ petabytes of data stored with Valtus. Valtus has relationships with aerial and satellite data collectors. By using Valtus, Talisman has greatly reduced its internal infrastructure requirements and is leveraging Valtus’ experience of ‘troublesome’ raster datasets. The hosted model also makes it easy to deploy new storage and delivery technologies.
A presentation by Brett Vidican (Universal GeoSystems) showed how a nifty device, the ‘Dodeca’ video camera from Immersive Media, is transforming pipeline project management. The 11-lens camera is slung beneath a helicopter and records continuously along a proposed route. After processing, the flight can be viewed as a VRML file with zoom and pan interaction.
Immersive Media’s Dodeca Videocam
Geodetics, datums ...
No PUG would be complete without warnings about correct use of geodetic datums and transforms. While not actually a PUG presentation, we were pointed to a telling argument in favor of doing your geodetics right. This is from the US National Imagery and Mapping Agency, ‘When a geodetic datum is changed, coordinates of a point will usually change too. In some cases, the differences can be as large as 900 meters. Why is this important? If a soldier calling for close air support has his coordinates with respect to one datum and your coordinates are with respect to a different datum, you could fire at the targeted location and miss the requested location by hundreds of meters. The most severe consequence of your action being friendly fire!’ Now do you understand?
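The size of the warning is easy to quantify: to first order, a datum change is a rigid shift of the Earth-centred coordinate frame. A back-of-envelope Python sketch, using illustrative three-parameter values of the same order as published ED50-to-WGS84 shifts (not an authoritative transformation):

```python
import math

# WGS84 ellipsoid constants
A = 6378137.0                  # semi-major axis, metres
F = 1 / 298.257223563          # flattening
E2 = F * (2 - F)               # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h=0.0):
    """Latitude/longitude (degrees) to Earth-centred XYZ (metres)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    return ((n + h) * math.cos(lat) * math.cos(lon),
            (n + h) * math.cos(lat) * math.sin(lon),
            (n * (1 - E2) + h) * math.sin(lat))

# Illustrative 3-parameter datum shift (metres), of the order of the
# published ED50 -> WGS84 values.
DX, DY, DZ = -89.5, -93.8, -123.1

x, y, z = geodetic_to_ecef(58.0, 2.0)   # a North Sea location
moved = math.dist((x, y, z), (x + DX, y + DY, z + DZ))
print(round(moved, 1))  # a displacement of the order of 180 metres
```

Quote coordinates without their datum and that entire displacement silently becomes position error, which is exactly the close-air-support scenario above.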
This article has been abstracted from an illustrated report from the PUG produced as part of The Data Room’s Technology Watch reporting service. You can now view the full text of our report from the 2005 PUG in the TechWatch section of www.oilit.com. For more on this subscription-based service please visit the Technology Watch home page or email firstname.lastname@example.org.
The Abu Dhabi National Oil Company (ADNOC) has teamed with Schlumberger to launch the Schlumberger Middle East and Asia Learning Center in Abu Dhabi, United Arab Emirates. The Center will provide advanced training for the oil industry through state-of-the-art facilities, including a custom-built training rig.
Omair bin Youssef
Officiating at the inauguration, ADNOC CEO Youssef Omair bin Youssef said, ‘ADNOC takes great pride in having this facility in Abu Dhabi where our engineers and technical staff can acquaint themselves with latest technology.’ The final cost of the Center, when completed in 2008, will be in the order of $100 million.
The facility will provide Schlumberger and ADNOC professionals, along with students of the Abu Dhabi Petroleum Institute, with training in data services, software, seismic, reservoir evaluation, cementing, stimulation, directional drilling, measurements while drilling and artificial lift.
Schlumberger chairman and CEO Andrew Gould who also officiated at the ceremony added, ‘People are one of Schlumberger’s key values. Their motivation and dedication to customers are our greatest strength. Every year, we hire many newly-qualified engineers and technicians who have yet to acquire industry experience. Our goal is to ensure that they have access to the same training and development opportunities regardless of where they come from and where they are going to be assigned.’
The Schlumberger Middle East and Asia Learning Center (MLC) is the newest and largest multi-disciplinary oilfield services training center in the world. The MLC is the fourth such facility, joining existing Centers in the USA, the UK and France.
In his address to the UK-based Energy Institute last month during IP Week, Andrew Gould explained why training has come to play such a critical role for Schlumberger and for the industry at large. Noting the ‘extraordinary’ turnaround that has taken place over the last two years, Gould opined that, ‘The only serious constraint to a smooth, steady increase in new supply is in the availability of people with proper experience and education.’
Gould noted a personnel shortage ‘at almost all levels of our industry,’ the result of 20 years of under-investment in new talent. During the period from 1993 to 2000, Western oil companies shed some 200,000 jobs. In 1984, some 1,500 petroleum engineers graduated in the USA. By 2000 this had dwindled to 260!
A study by Schlumberger’s Business Consulting arm determined that ‘although the supply of technical professionals may well be sufficient to meet demand at a global level, major shifts in recruitment patterns will be needed. These shifts present challenges for today’s competency development and career models.’ Gould noted that attracting the best talent will require acceptance that career advancement be open to all nationalities, ‘something that all companies in the industry need to take very seriously.’
Oil and gas still operates by taking the expertise to the problem rather than bringing the problem to the expertise. But today’s information and communications technology (ICT) is increasingly making remote job monitoring a possibility. Remote drilling operations centers have multiplied drilling engineers’ productivity two or three-fold, measured by the number of wells that they can supervise simultaneously. Similar productivity gains are likely to accrue from the ‘digital oilfield.’
Remote supervision does not mean doing everything from Houston. Aside from the Abu Dhabi training center, Schlumberger has opened R&D centers at Moscow State University and in Dhahran close to the King Fahd University. Schlumberger’s primary research center is moving to Boston, Massachusetts, next door to MIT.
Japanese auto industry
Gould concluded with a warning to a complacent West, ‘Demographics and the need for producing nations to access technology could well shift a large part of the technology advantage to those producers. I am sure the skeptics out there are thinking ‘impossible’. But I would remind them that the US automobile industry did not take the Japanese seriously until it was too late.’
WellPoint Systems is to acquire South African software house iSoft Technologies. iSoft develops enterprise asset management solutions based on Microsoft Dynamics.
Verano has appointed Todd Nicholson and Chris Martin to its marketing team. Nicholson hails from EMC and Martin, from AccuSoft.
Toronto-based Geosoft has launched a Global Solutions Group, headed by John Mertl, to deliver IT solutions to the mining and exploration industries.
Teledyne is to acquire the assets of electrical and fiber optic interconnect specialist DG O’Brien.
The company to be established through the merger between Hydro’s oil and gas activities and Statoil will be named StatoilHydro.
Rick Robinson has joined Ryder Scott as a petroleum engineer from Exxon Mobil. The company also reports the death of Harry Gaston, president Emeritus. Gaston started with Ryder Scott in 1967 and pioneered the use of computers in engineering, developing a cash-flow program that became a company standard.
The Petroleum Technology Transfer Council has promoted Lance Cole to Executive Director following the departure of Don Duttlinger.
Pitney Bowes is to acquire MapInfo Corp.
Petrolink now offers a real-time data hub in Perth, Australia.
Paras Consulting has appointed Michael Woodward (ex-CapGemini), Phil Challis, Carol Dye and Vicky Garrard (ex-Tribal Technology).
Oracle is to acquire Hyperion.
Pioneer and Denbury Resources have signed with supply-side e-commerce grouping OFS Portal.
Knowledge Reservoir (KR) has teamed with Modern Petrotech and opened a new office in Muscat, Oman. Radix Technologies is to represent the company in Mumbai, India. Ian Lilly has joined the company as Region Manager, Asia Pacific located in Kuala Lumpur, Malaysia. Chad Brown has been appointed to KR’s new Midland, TX office.
Ikon Science has acquired Edinburgh-based Anitec, a seismic anisotropy specialist. Anitec founders Colin MacBeth and Phil Wild are to join the Ikon team.
David Liddle has been appointed Production and Facilities Technology Manager for the Industry Technology Facilitator (ITF) in Aberdeen. Liddle was previously chairman of the Aberdeen branch of the Society for Underwater Technology.
Justin Barr is to head up ISS Group’s new Adelaide office and Chief Executive Officer Abe Shasha is relocating to the new Houston office.
Tim Conn is to head up GeoMechanics International’s (GMI) new office in Calgary, Alberta. GMI has also named David Bowling as business development manager in Kuala Lumpur, Malaysia. Conn hails from CoreLab and Bowling from RPS Energy. GMI has also added Dan Jezerinac (Dresser) and Randy Keys to its board.
Fugro is to acquire aerial survey specialist MAPS Geosystems.
Brian Boulmay has joined facilities GIS specialist Telvent Miner and Miner. Boulmay was previously with ESRI.
Assistant Secretary for Fossil Energy Jeffrey Jarrett has resigned from the Department of Energy.
Divestco is to acquire BlueGrouse Seismic Solutions of Calgary.
ConocoPhillips has given $6 million to the University of Oklahoma’s School of Geology and Geophysics.
Kevin Colburn is now sales manager at CEI/Ensight’s Houston office.
Aspen Technology’s ‘filing delinquency’ has been rectified and the Nasdaq’s hearing request has been canceled.
Marilyn Bier has been named Executive Director of ARMA International.
Andrew Soto has joined the American Gas Association as senior managing counsel of regulatory affairs.
A2D Technologies’ LogLine online database now holds 3.5 million US logs. LogLine made its first sale over the internet 10 years ago.
Perficient has acquired E-Tech Solutions for approx. $12 million.
Energistics (formerly POSC) has removed the $10 price tag from its comprehensive units converter which will likely be aligned with the POSC/WITS/PRODML units at a future date. More from www.energistics.org/posc/UniversalUnitsConverterV11.asp?SnID=994389.
A Microsoft-sponsored survey by Gulf Research* has found that ‘Microsoft operating systems dominate high performance computing (HPC) in upstream oil and gas.’ The survey, carried out last month, quizzed 104 users in service companies, oil and gas companies, consultants and academia. A ‘95% confidence level’ for the results was claimed.
Apart from the staggering claim of ‘dominance’ in HPC, the study found that Microsoft ‘applications’ were used most often for data manipulation and reporting with 50% of the sample using internally or externally developed bespoke applications. About half of the sample spent 35% or less of their time on ‘high-performance technical computing functions.’
Users reported ‘satisfaction’ with their current level of access to compute power, most having ready access to sufficient compute power on their desktop. Data integration was mostly provided by systems within the company. Half the sample spent less than 20% of their time manipulating data and preparing final reports.
Users of custom or internally developed software were more satisfied than users of third party software, with the highest satisfaction reported from users of volume interpretation, geological modeling, mapping and well planning applications. Lowest satisfaction was reported from uncertainty management, rock physics, and data integration applications.
In answer to the question, ‘Which operating system do you use for your HPC environment?’, 96% reported use of Microsoft Windows, ‘on a daily basis.’ Moreover, some 73% of the sample reported that they ‘never used’ Linux for HPC.
The results from Microsoft’s promotional study fly in the face of reality. HPC is a huge Linux success story to the virtual exclusion of Microsoft’s technology. For more on how to get the answers you want from a survey by asking the wrong people the right questions, see this month’s editorial.
Houston-based Technical Toolboxes has just released the 2007 version of its Pipeline Toolbox, an integrated pipeline industry software package with some 60 modules designed specifically for the pipeline professional. Pipeline Toolbox 2007 (PT 2007) comes in Gas, Liquid and ‘Enterprise’ (Liquid & Gas) versions.
PT 2007 embeds a customizable database of commonly transported liquids which includes multiple property information and calculations. Computations embed physico-chemical properties from standards bodies including the API, ASTM and IAW. The new release adds computation for maximum impact load and penetration depth, enhanced design of uncased crossings and an external corrosion direct assessment toolset to estimate the remaining life of corroded sections. Technical Toolboxes software is available as a hosted service from Petris.
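Technical Toolboxes does not publish its internal algorithms, but as an illustration of the kind of engineering calculation such packages bundle, here is a minimal sketch of Barlow’s formula for a gas line’s maximum allowable operating pressure. The design factors shown follow the familiar ASME B31.8-style pattern and the pipe parameters are invented for illustration; they are not taken from Pipeline Toolbox itself.

```python
def barlow_maop(smys_psi, wall_in, od_in, design_f=0.72, joint_e=1.0, temp_f=1.0):
    """Maximum allowable operating pressure (psi) from Barlow's formula,
    P = (2 * S * t / D) derated by design, joint and temperature factors."""
    return 2 * smys_psi * wall_in / od_in * design_f * joint_e * temp_f

# Hypothetical X52 line pipe: 52,000 psi SMYS, 0.375 in wall, 16 in OD,
# with a 0.72 design factor typical of a class 1 location
maop = barlow_maop(52000, 0.375, 16.0)
print(f"MAOP: {maop:.0f} psi")
```

A package like PT 2007 wraps dozens of such calculations, each parameterized by pipe schedule, material grade and code-mandated factors.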
Satellite logistics specialist Blue Sky Network (BSN) has selected Upland Consulting Nigeria (UCN) as its local reseller. The partnership enables Blue Sky Network to expand its business in this important developing region, and bring a new level of security and logistics to fleet operators in the field.
Upland Consulting initiated the sale of BSN’s D2000MD to a major upstream Nigerian oil company, which is to deploy the satellite-based tracking equipment to address the present Nigerian security situation and allow for accurate vehicle, marine and personnel tracking. BSN’s SkyRouter asset management web portal lets users track and communicate with remote land and sea-based assets and workers. Movements and mayday alerts of vessels and personnel can be tracked from any of the oil company’s dispatch centers in Nigeria.
UCN president Bola Awobamise said, ‘BSN’s equipment is important in challenging regions of the world like this. This deployment showcases the safety and operations optimization that this solution brings to a fleet of remote mobile assets.’
BSN’s SkyRouter leverages Google Earth to link people and assets anywhere on earth over the Iridium satellite network. FAA-certified data and voice products enable users to customize features including safety and event reporting.
Shell’s North American unit is to deploy Oakland, CA-based ModViz’ Virtual Graphics Platform (VGP) 2.0 package to visualize complex or very large 3D data sets. The agreement enables users of Shell’s in-house developed 123DI seismic interpretation package to leverage multiple graphics processors and multiple CPU cores in a single workstation. 123DI is utilized throughout Shell on the desktop, in collaboration sessions or ‘Team Rooms’ and in Visualization Centers.
ModViz VP sales, Richard Thomas, said, ‘The Shell agreement was the culmination of successful evaluations by Shell’s 123DI Development Team and 123DI end users. The initial evaluation concluded that VGP enables them to work interactively with much larger objects, thus greatly improving day-to-day productivity and gaining better perspectives on their data.’ ModViz’s OpenGL-based software transparently virtualizes 3D graphics-intensive applications across multiple GPU/CPUs.
ModViz CEO Tom Coull added, ‘Oil and gas has some of the most demanding 3D visualization needs of any industry. VGP adds cutting-edge virtualization technology customized for today’s complex architectures to bring the power of such advanced hardware to users of mainstream and custom applications.’
Originally launched at the 2006 SPE ATCE (we missed it!), Knowledge Reservoir (KR) is reviving its ‘RightTime’ Analysis Services (RTAS), a customizable software and services package for the analysis and interpretation of production data. The advent of affordable surface and downhole gauges and sensors means that huge volumes of data are being collected, often in recognized industry standard protocols that enable software interoperability and data sharing. A large offshore field can produce 5 million data points per day.
KR CEO Ivor Ellul said, ‘We recognize the need for ongoing reservoir surveillance as important events can occur later in the life of the reservoir. Our RTAS offering builds on our formalized, proven workflows to unlock the secrets of production data anomalies.’ RTAS is the first of a series of ‘right time’ solutions.
The University of São Paulo, Brazil reports a ‘100 fold’ speed increase since it replaced its home-grown compute cluster with a 16-processor SGI Altix shared memory machine. The computer system is used by the university and by Petrobras to speed analysis and development of offshore oil and gas production systems.
The SGI Altix has 16 Itanium 2 processors and runs Novell SUSE Linux Enterprise 9 and SGI ProPack 4. USP also deployed a 2.4 TB SGI InfiniteStorage disk array. The system will be housed at the ‘Numerical Offshore Tank’ (TPN) laboratory at the University whose main focus is the development and analysis of Petrobras’ deepwater offshore production systems.
TPN system engineer Antonio Augusto Russo said, ‘We need very large amounts of memory to run our codes which scale very well on the Altix’s shared memory architecture. We also selected the Altix because of its OpenMP API which helped a lot with code development. We even use SGI’s robust storage for our financial data.’
Researchers in computational fluid dynamics (CFD) use the Moving Particle Semi-explicit (MPS) Linux toolkit to solve problems like sloshing water and fluid-solid interactions. The TPN ‘tank’ extends the MPS code to simulate floating oil production platforms.
Invensys Process Systems has teamed with Pleasanton, CA-based Transpara to offer clients visual key performance indicator (KPI) technology for remote monitoring of process variables from mobile communications devices. The deal means that Invensys is now a global reseller and certified implementer for Transpara’s Visual KPI operations intelligence package. Visual KPI delivers on-demand, business-critical data about process status to handheld devices and desktop browsers.
Invensys VP Chris Lyden said, ‘Transpara’s innovative Visual KPI technology will provide incremental value for our large installed base. This new technology will enable personnel in our customers’ process and utility plants to remotely monitor, in near-real-time, both process variables and custom-defined KPIs from their familiar handheld mobile computing or communications devices, including both PDAs and smartphones.’
Composite source data for the KPIs can come from a variety of different data sources including OSIsoft’s PI System, Invensys’ own InSQL or InFusion plant historians as well as other automation, information, or computerized maintenance management (CMMS) platforms. Visual KPI offers insight into organizational performance by delivering actionable decision-support information ‘directly to process owners.’
Spectra Energy, which was spun out of Duke Energy’s gas businesses earlier this year, has just gone live with a state-of-the-art SCADA system from CygNet Software of San Luis Obispo, CA. The deployment is to be the primary SCADA system for Spectra’s Western Canada transmission and processing system.
Spectra’s integrated gathering, processing, and transportation system stretches from Fort Nelson, in northeast British Columbia, to the British Columbia/U.S. border at Huntington/Sumas. The system includes 3,000 kilometers of gathering lines connecting some of the most active areas of exploration and production activity in western Canada, 2,600 kilometers of transmission pipeline and five world-scale natural gas processing facilities with a capacity of more than 1.8 billion cu. ft./day. CygNet’s SCADA products and application suite are tailored to the production and pipeline business. The company claims to be the only ‘fully distributed network centric SCADA system’ offering ‘guaranteed access to real-time data anytime, anywhere.’
The European Petroleum Survey Group has released Version 6.9 of its geodetic parameter dataset which is available for download from www.epsg.org. The new release includes transforms, datums and coordinate reference systems from the US National Geospatial Intelligence Agency for some oceanic islands not previously available. Records for maritime countries based on ISO 3166 have been clarified.
The new release is available as both a Microsoft Access database, with some data reporting capabilities, and as a series of SQL scripts for populating other relational databases. It is our understanding that the EPSG geodetics database will also be available as a web service at a future date.
The EPSG has been reformed as the Surveying and Positioning Committee of the International Association of Oil and Gas Producers (OGP). The former Geodesy Working Group is now the Geodesy Subcommittee of the OGP Surveying and Positioning Committee. The Geodesy Subcommittee will continue to maintain the EPSG dataset.
M2M Data Corp. has announced an expansion of its hosted services for operators of remote assets. The new ‘iServices’ offering is a suite of web-based, hosted operational services that allow subscribers to monitor and control their assets wherever they are located. The iServices suite comprises: iSCADA, the remote monitoring and control service, iPM, for preventive maintenance, iTrac, the resource optimization service, and iPortal, a customer-specific web portal providing access to iServices.
M2M Data CEO Donald Wallace said, ‘smart services respond to customer demand for higher uptime and improved service. iServices provide a quickly deployed, low risk, secure, outsourced implementation of smart services.’
iServices offers ‘multi-waveform’ communications through internet, wireless, satellite, smart field devices, and industrial M2M technologies. The turnkey service is available with minimal or no capital expense. iServices offers an outsourced alternative to the traditional internally-developed and maintained systems.
Chicago-based SmartSignal has announced EPI*Center, a predictive analytics package that provides early warning of equipment and process throughput issues. Condition-based maintenance (CBM) eliminates unnecessary maintenance and ensures timely intervention in the event of imminent failure.
SmartSignal CEO Jim Gagnard said, ‘EPI*Center will enhance competitiveness with better support for ‘lean’ initiatives and help with aging workforce issues. SmartSignal is committed to helping companies close the gap between current performance and corporate expectations by reducing operational constraints due to equipment problems.’
EPI*Center analyzes sensor data to identify emerging problems that traditional monitoring systems cannot detect. A ‘WatchList’ provides an exception-based list of systems that show anomalous behavior, enabling companies to optimally allocate their maintenance effort.
A ‘rules engine’ identifies impending faults and users can compare behavior across different assets or aggregate sensor information for greater precision. EPI*Center integrates with third party solutions like OSIsoft’s RTPM. A WorkBench environment provides users with a ‘build once, deploy to many’ functionality.
Qatargas has named Emerson Process Management (EPM) as its ‘preferred supplier’ of digital automation solutions to be deployed at its oil, gas and liquefied natural gas (LNG) facilities. The Qatargas-Emerson alliance builds on a four year collaboration between the two companies on automation of six multi-billion dollar facilities in Ras Laffan. The deal includes Emerson’s PlantWeb digital plant architecture, Foundation Fieldbus communications, integration with third-party equipment and the provision of information technology services.
Qatargas COO Jacques Azibert said, ‘Qatargas pioneered LNG in Qatar, now home to the world’s largest processing facilities. We originally selected Emerson as our main instrumentation and controls supplier because of its leadership in automation and its business model that emphasizes both project and operations needs, including continuous improvement and long term operational support. The new alliance secures our future success through best in class automation.’
EPM president John Berra added, ‘We are committed to continued, effective and efficient project development of Qatargas 2, 3 and 4, as well as long term project and operational excellence at all Qatargas’ facilities.’
Upstream ERP software specialist WellPoint Systems has received certification from Microsoft’s Industry Builder program. WellPoint’s financial management product will be published on Microsoft’s price lists in the USA, Canada and the UK, and be available for resale by Microsoft and the Microsoft partners in those regions under the brand name ‘Energy Financial Management for Microsoft Dynamics AX.’
WellPoint President Tom Mawhinney said, ‘Our next initiative is to ensure that we align with the global partners best suited to successfully implement this solution in our focus regions. With appropriate training, we expect our partners to capture significant market share whether in a competitive region such as the USA or in an untapped market like China.’ WellPoint now has Microsoft ‘preferred partner’ status in oil and gas for its energy solutions that leverage Microsoft Dynamics (formerly Axapta) and claims 250 clients in sixty countries. Last month WellPoint announced a marketing agreement with Deloitte Touche Tohmatsu.
IBM has opened an ‘oil sands center of excellence’ located in its Calgary offices. The $2.6 million facility is to assist operators test and use new technologies to ‘lower costs and make oil recovery easier, more efficient, and more intelligent.’
IBM Canada president Dan Fortin said, ‘IBM has been active in Alberta’s oil patch for over 40 years and now has over 2300 employees. The Centre of Excellence will accelerate the development and adoption of innovative technologies and business strategies.’
Technologies on offer include radio frequency identification, 3D data visualization and software integration targeting construction and project management, productivity, business processes and environmental management. The Centre is the fifth IBM oil and gas facility. The others are located in Stavanger, Abu Dhabi, Beijing and Moscow.
The American Petroleum Institute (API) has just released a new recommended practice specification (API RP 1165) for Pipeline SCADA Displays. The document covers design and implementation of pipeline control center displays and provides guidance to pipeline companies or operators selecting or upgrading their SCADA systems.
The specification derives from work done by the API’s Cybernetics Subcommittee on human machine interfaces used to display control information from pipeline systems. The RP considers ‘human factors’ like short-term memory limitations, and fundamental human information processing capacity limits. Display design also needs to allow for real world environments and eye scan patterns.
The RP works through considerations of display layout and hierarchy, number of screens, number of items per screen, menus, navigation techniques and input devices. Color usage, animation and recommended fonts are also covered, and an appendix of sample displays shows typical deployment in pumping unit activity, valve status, set point status, alarms, event summaries, detail point data and drill down trends. Samples of station displays combining all of the above are also provided. The 58 page document is available from http://www.api.org/Publications/new/rp-1165-pa.cfm at a cost of $135.
Innotec’s Brazilian unit has struck a ‘multi million dollar’ deal with Petrobras for the supply of its life-cycle asset information management software, Comos. Comos targets the economics, planning and maintenance of a facility, collecting all relevant data and documents into a single data repository, allowing for consistent data administration and a high degree of transparency for all stakeholders.
Innotec MD Jochen Schüler said, ‘Operators are using our state-of-the-art asset information management system to address the challenges of plant design and construction as well as economic operations throughout the plant life-cycle. Comos maintains all relevant data and documents in a single database—this is our key to success.’ Comos covers the whole engineering activity spectrum from development, piping layout, automation and instrumentation/control planning, electrical, measuring and control engineering, maintenance, plant downtime planning, inspection including reconstruction planning, documentation and project management. Petrobras is to deploy the complete Innotec portfolio including Comos FEED, P&ID, E&IC, piping and document management. Support is to be provided from Innotec do Brazil’s São Paulo-based experts. More from email@example.com.
BP International awarded UK-based Tadpole Technology a contract for the development of a ‘proof of concept’ application utilizing Google Earth (GE) to assist BP Group Fire Advisors’ management of system integrity and regulatory compliance across its worldwide assets. The project builds on Tadpole’s ongoing relationship with BP in the UK, particularly with its iPlan asset management system. GE provides a world map display of the location of BP sites across the globe. Sites on the map link to ‘virtual filing cabinets’ displaying up to date site-specific data and compliance documentation.
BP Group Fire Advisor Kevin Westwood said, ‘Tadpole’s expertise in developing innovative geospatial solutions enabled us to make use of leading edge internet technology to create a detailed map of our sites, linked to compliance data. The system will be accessible by authorized BP personnel from virtually any location and will let us remotely manage compliance and integrity data.’
Tadpole’s James Blackwood told Oil IT Journal, ‘Traditional geospatial applications run into trouble because of their complex user interface. GE removes this complexity, distilling essential functions for standard users. We supply BP with a kmz file which is loaded by users who then access the public raster data over the internet.’
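For the curious, a kmz is simply a zipped KML document. The sketch below, using only the Python standard library, builds the kind of placemark-plus-documentation-link payload such a file might carry; the site name, coordinates and URL are invented for illustration and are not Tadpole’s or BP’s actual data.

```python
import io
import xml.etree.ElementTree as ET
import zipfile

# Hypothetical site record -- all values are illustrative only
site = {
    "name": "Example site",
    "lon": -1.85,
    "lat": 57.2,
    "docs": "https://example.com/compliance/site-001",
}

# Build a minimal KML placemark with a link in its description balloon
NS = "http://www.opengis.net/kml/2.2"
ET.register_namespace("", NS)
kml = ET.Element(f"{{{NS}}}kml")
doc = ET.SubElement(kml, f"{{{NS}}}Document")
pm = ET.SubElement(doc, f"{{{NS}}}Placemark")
ET.SubElement(pm, f"{{{NS}}}name").text = site["name"]
ET.SubElement(pm, f"{{{NS}}}description").text = (
    f'<a href="{site["docs"]}">Compliance documentation</a>'
)
point = ET.SubElement(pm, f"{{{NS}}}Point")
ET.SubElement(point, f"{{{NS}}}coordinates").text = (
    f'{site["lon"]},{site["lat"]},0'
)

# A kmz is the KML zipped up, conventionally as doc.kml
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
    z.writestr("doc.kml", ET.tostring(kml, xml_declaration=True))
print(f"kmz payload: {buf.getbuffer().nbytes} bytes")
```

Once distributed, the file needs nothing but Google Earth on the user’s desktop — which is precisely the user-interface simplification Blackwood describes.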