Once again, the pundits have been wrong-footed by events. Not long ago the buzz was that oil and gas was a shambles when it came to new technology and needed to sharpen up its image to survive. Worse, it was populated by a bunch of bumbling old baby boomers who were technophobic and, at the same time, possessors of corporate knowledge that was about to be lost as they all cashed in their 401(k)s. Then the oil price went up and suddenly it wasn’t new technology we were short of, it was people! How were young enthusiasts to be encouraged to join a ‘technophobic’ industry? A quick change of game plan was needed—enter the ‘digital engineer,’ the ‘digital oilfield’ and so on. And bring back the baby boomer retirees, quick! Now that the oil price is down, the message seems to be that we don’t need all those people any more. So let’s fire some! We didn’t really need all those ‘trained’ folks anyway, because we have the wisdom of crowds.
I confess that I have not actually read James Surowiecki’s book of that name*. But from Wikipedia*, plethoric blog postings and references in various talks, I gather that a relative of Charles Darwin, Francis Galton, noted that at a country fair, the crowd collectively guessed the weight of an ox with more accuracy than the individual experts. Galton’s somewhat provocative** conjecture has not been backed up by subsequent scientific investigation. Attempts to derive the 657th digit of pi by surveying very large crowds have failed, as has the notion that revisions to the standard kilogram mass should be based on a poll of passers-by.
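For what it’s worth, the statistics behind Galton’s observation are unremarkable: averaging many independent, unbiased guesses cancels noise, which is why it can work for estimating a weight and not for questions where the crowd carries no signal, like the 657th digit of pi. A minimal sketch, with all numbers invented:

```python
import random
import statistics

random.seed(42)

TRUE_WEIGHT = 543.0  # invented 'true' weight of the ox

# Each fair-goer makes an independent, unbiased guess: truth plus noise.
guesses = [TRUE_WEIGHT + random.gauss(0, 50) for _ in range(800)]

crowd_error = abs(statistics.mean(guesses) - TRUE_WEIGHT)
typical_individual_error = statistics.median(abs(g - TRUE_WEIGHT) for g in guesses)

print(f"crowd mean error: {crowd_error:.1f}")
print(f"typical individual error: {typical_individual_error:.1f}")
# The crowd's error shrinks roughly as 1/sqrt(N); an individual's does not.
# If the guesses shared a common bias, the crowd would converge on the bias.
```

The cancellation only works when errors are independent and unbiased—which is precisely the condition that leading questions and herd opinion destroy.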
I heard somebody on the radio this month bemoaning the fact that none of the silk-suited CEOs of British banks who were apologizing for having run the country down the tubes actually had any banking qualifications. Someone commented that businesses today are more often than not run by MBAs rather than by internal promotion of people who understand the business. It struck me that this is a kind of ‘wisdom of (MBA) crowds’—the idea that we don’t actually need to know how a business works, just apply the collective wisdom of the pundits.
A related phenomenon is the current enthusiasm for ‘collaboration.’ This is a motherhood and apple pie sort of notion. I mean you can hardly be against it. But ‘collaboration’ of wise crowds can end up being a way of trashing the hierarchical model of doing business—with more experienced managers telling younger hires what they should be doing.
The IT community, ever a proponent of ‘fear, uncertainty and doubt’ (FUD), is hard at work pushing the opposing notion: that it is the young hires who should be educating the old farts about collaboration technologies.
At the risk of old fartism, I submit that the current enthusiasm for Web 2.0, Web 3.0 and Web x.0 has much more to do with FUD than with any technology breakthrough. The thesis, for which disbelief has to be suspended, is that young hires need Twitter to work, that old farts haven’t ‘got it’ and that the ‘cool stuff’ is key. Web x.0, even though it is really about having fun, has been repackaged as ‘collaboration.’
I despair! Observe the twitterers, bloggers and vanity publishers. They chat to each other about the color of their neighbor’s clothes or who Britney Spears is dating. In an enterprise context, people at conferences and in meetings are futzing away on their BlackBerries and other ‘smart’ devices. Is the corporation getting full value from all this intellectual Brownian motion? I don’t think so. Web x.0 technologies are not an unequivocally good thing.
The ‘wisdom of crowds’ notion also underpins many of the self-fulfilling ‘surveys’ that vendors push in our direction to prove something or other. Why waste time seeking out expert advice when there is a crowd out there to ‘survey’? Why bother testing the statistical significance of the results, or controlling them with a third-party audit? All you need is a ‘crowd’ and a set of leading questions. For instance: ‘Do you use product X?’ Answer, ‘No.’ ‘Well, do you think that product X would help you do your job?’ Answer, ‘Maybe.’ This then generates a statistic that X% of those interviewed believe that product X could help them with their job. It’s interesting too that when confronted with a problem, the consultants survey the very people who are asking the question in the first place! When a new technology hits the market, everyone wonders what its impact will be. Well, just survey the wonderers, find out what the ‘crowd’ thinks and bingo, you have your wisdom.
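The arithmetic behind such a headline figure is easy to reproduce. The sketch below uses entirely made-up responses and shows how folding the ‘maybe’ answers into the favorable column manufactures an impressive percentage:

```python
# Invented responses to the leading question
# 'Do you think product X would help you do your job?'
responses = ["no"] * 40 + ["maybe"] * 45 + ["yes"] * 15

# The press-release trick: count every 'maybe' as a believer.
favorable = sum(r in ("yes", "maybe") for r in responses)
headline = 100 * favorable / len(responses)

print(f"{headline:.0f}% of those surveyed believe product X could help them")
# Counting only the unambiguous 'yes' answers would give 15%.
```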
It seems to me that many of the ‘surveys’ that back up studies in the Harvard Business Review raise similar questions about the integrity of the samples and the method. Management surveys just seem so sloppy, even when they are not overtly self-serving!
But let’s step back a bit. What is management really about? It is, at least in part, about people who know telling people who don’t know what to do. In some fields the knowledge of the knowers virtually defines the company. In an exploration start-up, this might be the geologist with an intimate knowledge of a play, or a new idea that turns out to be right. In an unconventional gas play, it might be the drillers or engineers who know just how to frac that horizontal well and get the gas flowing. In a big company with fingers in many pies, it means having a bunch of experts who know lots of stuff, and structured education programs for the new hires. It might even involve some of the Web x.0 stuff, but this has to prove its worth. Slow take-up of new technology may just reflect a healthy distrust of snake oil.
Keeping experts on through the downturn is likewise a statement that the wisdom of crowds is fine for the competition: ‘We’ll just stay with the facts and the domain knowledge, thank you very much.’
I have to conclude with this snippet from a blog posting that I stumbled across while writing this editorial. After expounding on Surowiecki’s theories on the wisdom of crowds, the blogger, who shall remain anonymous, concluded, ‘This is an illustration of the fact that a collective judgment may be better than the judgment of an individual expert, something which appears to be true in financial markets, for example.’ I couldn’t have put it better myself!
Stavanger, Norway-based Roxar’s board of directors has unanimously recommended that shareholders accept an offer for the company made this month by Emerson’s wholly owned Norwegian unit Aegir Norge Holding. The Roxar board opined that the deal will make Emerson a leader in topside and subsea instrumentation and strengthen Emerson’s position in decision support software and upstream data acquisition. The offer price represents a 49% premium on the February 27 closing price. If the deal completes, Emerson is to redeem NOK 1.5 billion ($212 million) of Roxar debt.
The deal already has acceptance from 41% of Roxar’s shareholders. The acquisition will be financed by Emerson through existing cash balances and is not subject to any financing condition. The voluntary offer is expected to close in April 2009.
Roxar has two distinct lines of business: as a supplier of measurement technologies—especially high-end multi-phase flow meters—and as a provider of upstream software, notably its flagship Irap-Reservoir Modeling System (RMS). If the deal goes ahead, Roxar will become part of Emerson Process Management (EPM).
EPM executive VP Steven Sonnenberg said, ‘With its position in offshore metering and monitoring equipment and well optimization software, Roxar is a strong complement to Emerson. Our oil and gas customers will benefit from the more complete product and service offering that will result from this strategic combination.’
Two companies that are likely to do well from the deal are FMC Technologies and Kongsberg, which acquired 25 and 28 million Roxar shares respectively last November. Both currently hold about 10% of the company.
Comment—The ‘digital oilfield’ lies at the intersection of upstream engineering and process control. When we interviewed Roxar’s erstwhile president and CEO Sandy Esslemont back in 2004 he said, ‘We are a product company, not a service company like Schlumberger.’
Well, this will change if Roxar, with 2008 revenues of $193 million, becomes part of Emerson, with fiscal 2008 sales of $24.8 billion!
We asked Emerson what it was planning for Roxar’s software division but it declined to add to its public statements. It could be that Emerson is simply buying Roxar for its metering business (73% of 2008 revenues).
But if, as the press release seems to imply, Emerson is planning to integrate RMS with its PlantWeb platform, then we may be witnessing the birth of a third major contender in the upstream service sector as the ‘digital plant’ meets the ‘digital oilfield.’ Delivering on this kind of a vision will likely require a few more acquisitions!
UK-based Common Data Access (CDA) has awarded Schlumberger a contract to build and operate its new seismic data store, a centralized repository for seismic data from the UK continental shelf (UKCS). Since the inception of the CDA well data store in 1994, the extension to seismic data has been the subject of some controversy, particularly regarding its effect on competition (OITJ August 2000).
CDA chief executive Malcolm Fleming now believes that, ‘In the current economic climate, UK oil and gas explorers must use their resources in the most efficient way possible. The seismic data store enables users to download quality-assured seismic data in days or hours compared to the current performance of weeks or even months. This will significantly increase the productivity of expert resources.’
Schlumberger UK chairman Gordon Ballard added, ‘Reducing the time and cost of retrieving seismic data will enhance prospect development and improve data quality. The seismic data store will speed response times to license rounds and asset purchase opportunities and preserve the scientific heritage of the UKCS.’ CDA is a wholly-owned subsidiary of industry trade body Oil & Gas UK.
Oil IT Journal visited with The Information Store (iStore) CEO Barry Irani (previously with Ensearch Exploration) and CTO Oscar Teoh (formerly with POSC). We asked Irani how the oil price collapse was affecting the software house.
Irani—Actually, the current low oil price is forcing efficiency, so this is a good time for iStore. Our clients are not necessarily targeting cost cutting. iStore is a bit like an oil company with domain knowledge as opposed to oil and gas reserves. We have spent years developing solutions for upstream business problems, originally developing our tools along with services. We noted that about 80% of our activity involved fixing the same problems, with the remaining 20% of the effort on customization. Today, customizing our products is very quick. It involves identifying what is specific to the client’s setup. The guiding principle which we learned from our experience with POSC is, ‘don’t move data around!’ Data needs to be left in its system of reference although this is not cast in stone. A flexible approach means that data can be copied or moved if operational requirements make it necessary.
When do you move data?
Teoh—Data in the process historian is a good example. The historian provides a moving window on operations data that needs to be captured to a database. Project data may likewise benefit from copying. We bring all these disparate data sources together using our federated data model.
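Teoh’s historian example is worth unpacking: a process historian typically retains only a rolling window of recent samples, so anything of lasting value must be copied out before it scrolls off. A toy illustration—the class, tag name and values below are mine, not iStore’s:

```python
from collections import deque

class Historian:
    """Toy process historian retaining only a moving window of samples."""
    def __init__(self, window: int):
        self.buffer = deque(maxlen=window)  # oldest samples are discarded

    def record(self, tag: str, value: float) -> None:
        self.buffer.append((tag, value))

archive = []          # long-term database capture
hist = Historian(window=3)

for value in [101.2, 101.9, 103.4, 99.8, 100.5]:  # invented pressure readings
    hist.record("WHP-42", value)       # hypothetical wellhead pressure tag
    archive.append(("WHP-42", value))  # copy out before the window drops it

print(len(hist.buffer))  # 3: the historian keeps only the window
print(len(archive))      # 5: the capture database keeps everything
```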
What is your relationship with Microsoft?
Teoh—Two years ago we ported our solutions to Microsoft Office SharePoint Server (MOSS), opening the way to rolling in other Microsoft applications and user-developed solutions. Previously all development was on Oracle and Unix, and we still deliver on and integrate with these environments. The MOSS migration was the result of a push from clients to add business-focused software to Microsoft’s offerings.
Irani—This has given a significant boost to our business and we are opening an office in Dubai for the Middle East market. Other major accounts are Pemex, Shell Nigeria and RasGas (Qatar).
What about your project with Marathon?
Irani—Yes, I loved the ‘quick failure or quick success*’ quote and the fact that Marathon’s pilot selected a ‘difficult’ business unit. Marathon, like most of our clients, realizes that some development is always involved in a project, and prefers not to waste time educating service providers about ‘what is a well.’ iStore’s domain knowledge is generally very well received. The Marathon proof of concept was finished in three months, and one week into the ‘six month’ trial period Marathon said, ‘OK, we’ll take it!’
Are you still on an 80/20 product-to-service ratio?
Irani—Yes. Over time we have gotten very keen on project planning; in fact, one of our engineers is on this pretty much full time. Our clients know what the deliverables are and when they will arrive.
Your solutions are about data browsing.
Irani—Yes. But ‘browsing’ understates the power of data collection. It’s more than just visualization. It’s about enabling ‘intuitivity’ through time-line displays of multiple production parameters, all available as pre-wired solutions. Traditional systems are too cumbersome. We can easily add in knowledge management-type tools—notes that pop up warning of paraffin build-up in wells—producing the ‘aha’ moment! This avoids data overload and helps answer the questions that engineers will be asking tomorrow.
Teoh—The Petrotrek Digital Oilfield is now all running on MOSS with AJAX-enabled interaction (mouse-over, changing time scales and generally customizable displays). Production data can be easily exported to Excel and we can tie in to financials in SAP, JD Edwards etc. All are WebParts-based so it’s easy to add KPIs, dials and gauges.
How do you tie all the different databases together?
Teoh—We use techniques, like well alias tables, that were developed in POSC. It’s curious that although these issues were fixed a long time ago, it’s only now that the business intelligence community has got in on the act, with new terminology. So now we read about ‘master data’ and think, ‘Oh, yes—that’s what we are doing!’
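A well alias table of the kind Teoh mentions is conceptually simple: a mapping from each source system’s local identifier to a single master well identifier, on which records from different databases can then be joined. The sketch below is a hypothetical illustration, not the POSC or iStore schema—all identifiers are invented:

```python
# Hypothetical alias table: (source system, local id) -> master well id
ALIASES = {
    ("finder",    "W-1042"):       "UWI-300123456789",
    ("historian", "PLT3/WELL_42"): "UWI-300123456789",
    ("sap",       "EQ-778812"):    "UWI-300123456789",
}

def master_id(source: str, local_id: str) -> str:
    """Resolve a source-specific well name to the master identifier."""
    try:
        return ALIASES[(source, local_id)]
    except KeyError:
        raise KeyError(f"no alias registered for {local_id!r} in {source!r}")

# Records from different systems can now be joined on the master id.
assert master_id("finder", "W-1042") == master_id("sap", "EQ-778812")
```

This is essentially what the business intelligence world now calls master data management: one golden identifier per real-world entity.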
There have been reports of poor SharePoint scalability.
Teoh—We have been working with Oracle since the days of dial up modems. Achieving scalability across different architectures is part of our ‘secret sauce.’ We use this knowledge to fix SharePoint deficiencies.
Can you fire up applications in context?
Teoh—Yes, this can be done, but it is not our focus. Petrotrek is primarily a tool for the non-specialist. It is not really intended for project building or interpretation—it is more about data integration, such as data viewing in Microsoft Virtual Earth. Our technology also knows about entitlements and will integrate with whatever is already deployed—single sign-on, security and identity systems. We can even limit access to individual wells and by producing/depth interval.
Do you host client’s data?
Irani—Not all of it. We host Petroleum Geo-Services’ non-exclusive seismic index. Chevron’s Mid-Continent unit hosts its joint venture data with iStore technology—this involves access by around 100 JV partners.
What is iStore’s financial structure?
Irani—We are a privately held, employee-owned company. We have no debt and our operations fund everything.
* At the 2009 Microsoft GEF—more in next month’s Oil IT Journal.
Stafford, TX-based seismic boutique Seismic Ventures Inc. (SVI) has selected technology from Seismic Micro Technology (SMT) and Geomodeling to support its processing, AVO and direct hydrocarbon detection effort. SVI is an early adopter of SMT’s Kingdom 1D Forward Modeling (1DFM) tool which will be used to model lithologic and reservoir fluid properties to show the effect on seismic traces and tie geology and seismic. 1DFM is said to offer easy to use AVO modeling and ‘fluid workshop’ capabilities.
SVI has also bought a license to VisualVoxAt from Geomodeling Technology. The ‘workflow-oriented’ application provides 3D visualization, calibration, interpretation and advanced analytical capabilities. SVI VP Hank Saunders said, ‘VisualVoxAt helps us identify those attributes that impact hydrocarbon distribution, reducing our customers’ decision-cycle time and improving well placement.’ More from firstname.lastname@example.org.
Stavanger, Norway-based data management and software house Kadme reports progress on a major contract with the ANH, Colombia’s National Hydrocarbons Agency, for the development of a new front-end for ‘EPIS,’ the Colombian Petroleum Data Bank (epis.com.co). EPIS is operated by Schlumberger with data stored in Finder, SeisDB, LogDB and AssetDB. The current front-end is a customization of Schlumberger’s DecisionPoint.
Kadme’s ‘Whereoil’ solution has been selected by the ANH for the new EPIS front-end. Kadme will deploy Whereoil to provide a ‘modern, fast and easy-to-use’ system that will allow current operators and potential new investors to search and order E&P information available in the country. Data loaded by the EPIS operator into the various databases will be indexed via Whereoil’s crawlers and presented via the search and map interfaces. Whereoil understands entitlements and allows approved users to preview and download data. Kadme is also to provide its crawling/browsing technology to the ‘ArcticWeb’ joint industry project, which sets out to provide a ‘geoportal’ for offshore Arctic areas. More on ArcticWeb in next month’s Oil IT Journal.
Writing in Geovariances’ client newsletter, Rita Parisi Conde Pozzi, a geologist working with Brazilian state oil company Petrobras, reports that Geovariances’ Isatis geostatistical package is used throughout Petrobras’ reservoir characterization workflow, especially by development geologists and geophysicists. Isatis is used from exploratory data analysis to time-to-depth conversion, and from facies or petrophysical modeling to the simulation of permeability for uncertainty analysis. It is also used to create quality net-to-gross maps based on depth-converted seismic imagery. Lately, Isatis has been used to incorporate 4D time-lapse seismic data into the geological model.
Petrobras rates Isatis as an ‘extraordinary’ geostatistical educational tool. Over 100 Petrobras geoscientists and reservoir engineers have been trained on the tool in the last four years. A good example of Isatis as used by Petrobras is SPE paper 94913, ‘Using seismic attributes to estimate net thickness on the Marlim deep water oilfield, Campos basin.’ More from Patrick Magne, email@example.com.
Speaking at the 2009 SMi Data and Document Management Conference in London this month (report in next month’s Oil IT Journal), Schlumberger’s Eric Abecassis sang the praises of search over indexing/tagging by way of an introduction to a new Petrel component, ‘Data in Context.’ Petrel Data in Context (PDiC) is provided as a dual hardware appliance installation. A MetaCarta appliance provides automated location/place name-based document indexing, while a second PDiC appliance crawls structured data sources such as GeoFrame, OpenWorks or Finder. Both appliances share the same business objects taxonomy of wells, leases etc.
PDiC search can be launched from the Petrel window either in spatial proximity to a well or lease boundary or as a search for documents and data of relevance to a Petrel project. Documents and data can be displayed in Google Earth or Microsoft’s Virtual Earth. Documents can be displayed on a time line showing date of creation. PDiC offers read-only access to information; no changes to documents are allowed. A PDiC ‘HealthCheck’ application verifies that everything is running. PDiC is a component of the Petrel 2009 release, which also heralds a new native 64-bit Vista version of Petrel.
The Society of Exploration Geophysicists (SEG) Advanced Modeling (SEAM) Corporation has awarded Phase I of the SEAM simulation contract to Tierra Geophysical. The SEAM project (OITJ September 08) involves the computation of a full azimuth pre-stack synthetic seismic dataset generated from the SEAM’s sub-salt model. SEAM chair Arthur Cheng said, ‘This is the first of many planned simulations of challenging geophysical data sets. The simulation technology required is on the leading edge of what is possible today.’
Tierra CTO Christof Stork added, ‘Large-scale forward modeling can significantly reduce risk in exploration and development. Modeling can optimize acquisition, processing, and interpretation for subtle features.’
SEAM’s sub-salt investigations are of relevance to exploration hot spots such as the deepwater Gulf of Mexico, West Africa, offshore Brazil, the Red Sea and the Caspian. SEAM Corporation will make these datasets freely available to universities for research and education purposes and to industry in general after an initial two years of proprietary usage by the participating companies. SEAM Phase I project currently has 24 participating companies. SEAM is a wholly owned subsidiary of the SEG.
The Information Store (iStore) has released a new component of its PetroTrek/digital oilfield Web Parts toolkit, ‘Time Chart.’ Time Chart is a web-based visualization tool for Microsoft Office SharePoint Server 2007 that connects to production, operational and safety data, allowing for ‘mashups’ of past and current well events. Time Chart ‘understands’ allocated production, down time, choke size, well head and annulus pressure and other ‘events.’ Web Parts allows developers to assemble charts and other PetroTrek components into their own web pages.
PetroTrek’s Web Parts utilize Microsoft’s Silverlight rich browser technology.
The 2009 release of Roxar’s Irap-RMS offers a new GUI, easier navigation for new and occasional users and a new local property model update. Real-time well monitoring heralds a new vision of ‘model-based geosteering,’ with WITSML-based communications of bottom hole assembly and logging data. Real-time alarms can flag proximity to other wells and to geologic features such as faults.
Landmark has rebranded its ‘DeepStor’ storage solution as ‘PetroStor.’ The combination of Storwize’s hardware compression technology and NetApp’s disk-based storage is designed to replace tape-based solutions at under $1,000 per terabyte.
Petrosys has released a plug-in for ESRI’s ArcGIS that adds gridding, contouring and volumetric calculations. The plug-in reduces the complexity of data exchange between technical and GIS systems, providing integration with E&P data in OpenWorks, GeoFrame, Finder, SMT, PPDM and other data sources.
AMEC’s Paragon American oil and gas business unit now provides clients with ASD Global’s OptiPlant 3D modeling software for front-end engineering design and estimating. OptiPlant automates the production of 3D models of layouts of pipe and other plant or offshore equipment.
The 3.1.3 release of OpenSpirit’s interop framework introduces a ‘LoadIT’ WITSML data transfer utility. LoadIT connects any OpenSpirit-enabled data store to multiple real-time WITSML data sources.
eSimulation has announced V2.0 of eSimEvaluator, an application that enables midstream natural gas processing businesses to optimize margin positions. The new release improves model construction and maintenance, automates flowsheet construction and provides a contract model view of the ‘big picture’ of contract information, residue/NGL production data, producer payment obligations, and margins.
Geosoft’s 2008 release of Montaj and Target 7.0 now embeds ESRI’s ArcEngine technology, enhancing connectivity between Geosoft and ESRI environments.
V7.0 also adds subsurface displays including fence diagrams, 3D geological voxels and automatic creation of geological surfaces.
Geotrace has been granted a US patent for ‘technology related to the matching of synthetic data derived from well-log information and processed seismic data.’ The ‘Non-Linear Seismic Trace Matching to Well Logs’ patent was granted to Geotrace and programming geophysicist John Dubose.
P2 Energy Solutions has released Enterprise Land 3.0 including a new Lease Acquisition module for capture and reporting of lease, mineral and surface ownership information based on legal land descriptions. The new module provides ease and flexibility in leasing activity, budget tracking, costs and payment of drafts, integrating data gathered by brokers or land agents at the field level.
Paradigm’s 2009 release of its Skua 3D modeling environment extends interpretation capabilities by linking the interpreter and modeler roles. ‘Interpretation Modeling’ (previously Prospect Architecture) adds integrated workflows between seismic and geologic interpretation and modeling. The new release also introduces Stratigraphic Interpretation Modeling, a new tool for concurrent interpretation, geochronological modeling and 3D paleo-restoration.
Speaking at the 2009 Microsoft Global Energy Forum (GEF) in Houston this month (more in next month’s Oil IT Journal), Mike Hinkle described Shell’s early trials of social networking. The idea is to ‘do more with less’: to reduce travel and to connect with the ‘deep expertise’ of an ageing workforce. There is also pull from young ‘digital’ hires and the need to support pockets of useful activity spread throughout the company. Enterprise social networking (ESN) is a ‘new and different way’ of digitizing informal information flows that may already exist. ESN establishes connections quickly and effectively, linking to the right people and sharing ideas and solutions broadly. Tools include team and project websites and document repositories, portal personalization, wikis, blogs, RSS feeds, discussion groups and forums (the last two being the most successful). ESN is also about a richer way of connecting through instant messaging, presence management and ‘perhaps even Twitter-like services.’ Content is the fuel of ESN, enabling an ‘attention economy.’ Companies need to make it easy and rewarding to publish content, which should in turn be easy and interesting to find. You shouldn’t assume that all information flows will come from inside the enterprise. Competent staff and strict policies are required. Regarding tools, SharePoint and Office Communications Server offer a good set of core features, to which Shell has added NewsGator, MediaWiki and Fast for search. Shell currently has around 50,000 content publishers and expects SharePoint use to grow as Office 2007 and Vista are deployed.
At the GEF, Microsoft and Accenture unveiled the results of a survey they jointly commissioned from PennEnergy and the Oil & Gas Journal Research Center into oil and gas collaboration. The survey found, unsurprisingly, that over 70% of respondents believed that collaboration and knowledge sharing are important. ‘In spite of this,’ the survey found, organizations are still using ‘older means of collaboration’ like face-to-face meetings, e-mails and conference calls.
Around 40% believed that they could save at least an hour every day by using social networking tools. Accenture’s Claire Markwardt said, ‘Companies have an opportunity to supplement their existing capabilities with tools such as podcasts and social networks to accelerate the sharing of knowledge, increase teaming and augment communication between their workforces in different regions.’ More on the Microsoft survey at www.microsoft.com/oilandgas.
Around 50 attended the 2008 Plant Tech conference and exhibition held late last year in The Hague. Plant Tech lies at the intersection between upstream/construction, supply chain/e-commerce and process and chemicals. A panel session discussed the possibility that the piping and instrumentation diagram (P&ID) could be considered as a ‘single source of truth’ throughout a plant’s lifecycle. The consensus was that the P&ID was more of a ‘view’ onto properly databased information. Regarding standards, this community has settled on the ISO 15926 suite of plant data standards, but, as we are at an intersection, there are other contenders—as witnessed by presentations from the European CEN supply chain organization and the German process industry standards body, Prolist. Operators will likely have to satisfy multiple standards in the real world and mapping from one to the other is increasingly important.
Is plant data management worth the effort? According to Pearson-Harper’s Mike Moroney, even a ‘best in class’ handover today achieves only a 50% data handover, the typical employee spends 20% of his or her time looking for data, and spare parts data is often thrown away—‘rationalized!’ Clean-up is always worth it, ‘ask a lawyer!’ ‘There is never enough time and money to do it right, but there is always enough time and money to do it again!’ Peter Zgorzelski (Bayer and Prolist International) noted that despite talk of digital engineering workflows, today’s reality is ‘paper, paper, paper.’ On the positive side, Fluor Corp’s Project of the Future has resulted in faster design, ‘one time’ data entry and ‘tool independent’ data. Aker Solutions also reports success from its implementation of the COMOS project data hub. Eldar Misund’s (Shell) talk on the special-case Ormen Lange handover from StatoilHydro shows that it can be done. Shell’s three data priorities? Quality, quality and quality!
Even if the above makes it sound as though the data interoperability problems are still some way from being ‘fixed,’ there is a strong feeling that progress is being made. Is this a ‘standards success’? In part perhaps but, in a reflection of what is happening in the upstream, it is also due to companies turning away from Excel ‘hell’ and returning to a central database.
Eldar Misund presented a special-case ‘handover’ of the Ormen Lange production facility from StatoilHydro to Shell following a change in operatorship. The complex includes an onshore processing plant and a 1,200 km pipeline to the UK passing through Sleipner. Ormen Lange’s 150,000 data tags are all documented in databases. Shell’s lifecycle information (LCI) system covers safety, operations, maintenance, inspection, redesign and modifications. This leverages an Aveva PDMS model, one of the largest in the world. Handover was entirely electronic and was data-, not document-, centric. The ‘as built’ handover went into Shell’s applications, which include the Shell Engineering Data Warehouse, LiveLink and SAP. Shell specified three main data requirements, ‘quality, quality and quality,’ along with a focus on ‘as built’ data. Specifications were derived from Hydro’s Documentation for Operations (DFO) and translated to Shell’s Engineering Information Specification (EIS), where they will be the basis of the next 40 years of operations. Shell’s EIS was established in 1999 based on STEP and POSC. 90,000 documents were loaded in the last year by one person, including reviews and verification at many stages. Shell’s IM toolset is built around the Assai DCMS, a LiveLink document repository and an AHA4P engineering data warehouse. SAP is used for the 90,000 equipment items. Misund described the Ormen Lange paperless handover and data-centric approach as a success.
Pearson-Harper’s Mike Moroney, who is also CIO of the Karachaganak, Kazakhstan project, presented a keynote on ‘making engineering data work.’ Moroney listed data problems associated with major capital projects, noting that on handover, only 40% of the data collected is actually passed on to the operator. Even a ‘best in class’ handover would involve only around 50% of the data. For owner-operators, this makes for sub-optimal asset lifecycle management. Frequently, low cost, sub-optimal solutions are deployed. Companies are still cost cutting, even on maintenance. Plants may be designed to be safe, but are all the safety information and HAZOP plans handed over? Will safety information still be there on decommissioning, in 30-40 years’ time?
The information needed to build a plant is a small fraction of that required to operate it. Projects don’t understand the value of information. Information is ‘sliced and diced’ into silos of process, piping, electrical, managed by different contractors. But maintenance cuts across the silos. Change management is essential. A plant starts to change the day operations begin. But operators have a hard time keeping documentation up to date. It can take three years for a change to be reflected in an updated document. The corporate memory is impoverished. This is aggravated by the fact that personnel move from project to project and may experience only one shutdown in their whole career.
For Moroney, the answer is to ‘tag’ to maintenance level—i.e. all flanges, instruments and valves—as is being done on Karachaganak. This means that data (especially maintenance data) can be assigned to everything. This needs to be done by contract, to avoid ‘black box’ tag information. All EPC contractors now have this kind of contract. There has to be a single master reference list, belonging to the OO or EPC. Asset numbers should be assigned up front so that equipment is tagged in the factory. ‘Oddball’ documents, especially Excel spreadsheets, are popular, but they are also abused and hard to read. Tag ids on 2D CAD drawings can be problematic. Brownfield projects require a search for original documents, which can be ‘smart scanned,’ converting 2D drawings from Microstation into intelligent P&ID diagrams. ‘You need to find out what people are hoarding!’ Clean up is always worth it, ‘ask a lawyer!’ ‘There is never enough time and money to do it right, but there is always enough time and money to do it again!’
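Moroney’s single master reference list amounts to one authoritative register of equipment tags, with documents and maintenance records hung off each tag so that cross-silo queries become simple lookups. A minimal sketch of the idea (the class, tag ids and document names are invented for illustration and are not Karachaganak’s actual system):

```python
# Hypothetical sketch of a master tag register: one authoritative list of
# maintainable items (flanges, valves, instruments), with documents and
# maintenance records attached per tag.

class MasterTagRegister:
    def __init__(self):
        self._tags = {}  # tag id -> record

    def assign_tag(self, tag_id, description, discipline):
        """Assign an asset tag up front, before the item leaves the factory."""
        if tag_id in self._tags:
            raise ValueError(f"duplicate tag {tag_id}: the register must stay unique")
        self._tags[tag_id] = {
            "description": description,
            "discipline": discipline,   # process, piping, electrical...
            "documents": [],            # datasheets, P&ID references
            "maintenance": [],          # work orders, inspections
        }

    def attach(self, tag_id, kind, item):
        """Attach a document or maintenance record to an existing tag."""
        self._tags[tag_id][kind].append(item)

    def by_discipline(self, discipline):
        """Maintenance cuts across silos: query the register, not the silo."""
        return [t for t, r in self._tags.items() if r["discipline"] == discipline]

reg = MasterTagRegister()
reg.assign_tag("20-PV-1001", "Pressure relief valve", "piping")
reg.attach("20-PV-1001", "documents", "P&ID 20-D-0042 rev C")
reg.attach("20-PV-1001", "maintenance", "WO-5531: annual inspection")
print(reg.by_discipline("piping"))  # ['20-PV-1001']
```

The point of the single list is the uniqueness check: once every tag exists exactly once, maintenance data has an unambiguous home regardless of which contractor created the item.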
Ashish Shah’s talk on Fluor Corp.’s engineering ‘project of the future’ described how the engineering behemoth is dealing with project execution challenges, shrinking resources, recruiting problems and capex ‘migration’ from the US to the Middle and Far East. The ‘Project of the Future’ (PoF) task force was set up to make a ‘quantum leap’ in work process improvement. The PoF identified a need for a corporate knowledge base to centralize data from tools like IPE/Kbase, InTools, 3D CAD, InVision, CompleteIT and other generic tools. These were evaluated alongside Fluor’s ‘Master Plan’ toolset of AspenPlus, HySys, OptimEyes, SmartPlant P&ID, CMMS, Aspen Zyqad etc. PoF sought to identify which tools had the most impact, particularly on front end engineering automation. Design has been consolidated into Zyqad, with output to SmartPlant P&ID. In Fluor, Zyqad, known as FrontRunner, is used to generate data sheets along with support for units conversion. Shah concluded that the PoF has resulted in faster design, single data entry and ‘tool independent’ data.
Aker Solutions’ Jann Slettebakk described how plant data quality is key to Aker’s provision of engineering project screening and selection (projects include Kashagan, Dalia, Ormen Lange and Blind Faith). Aker’s systematic approach to project engineering builds on a project execution model (PEM). Drilling down from the PEM shows, for instance, a piping discipline flowsheet and activity description. The PEM is used from feasibility studies through definition, fabrication, assembly and completion, with a focus on quality. Further drill down brings up an execution plan, activity and quality descriptions and project management. To evaluate the quality of a drawing, Aker embeds ‘control objects’ in P&IDs and isometrics. These connect to checklists and data model verification for control and checks on drawings. For instance, a 3D model of ‘piping and layout’ shows control objects for pipeline, valve, pipe support and access/escape routes. The PEM supports concurrent engineering with quality management across the workflow—all in a database-centric project management and execution environment. The COMOS project data hub (PDH) is considered such a 3rd generation, project-centric data repository. ISO 15926 is used for data handover to clients and as a subcontractor interface. Aker’s focus is on how to use ISO 15926. The standard has started to mature and now supports the smooth exchange of quality information. However, we are ‘still struggling with data exchange today.’ The combination of COMOS PDH, Documentum/ProArc PDMS, SAP etc. is ‘all coming together in a 3rd generation repository for systems engineering.’ Knowledge-based engineering (KBE) and design also helps standardize the process. The KBeDesign tool provides rapid re-use and scaling of detailed design data, along with automatic generation of 3D models, leveraging Aker’s knowledge base of engineering rules.
This article is an abstract of one of The Data Room’s subscription-based Technology Watch reports. More from www.oilit.com/tech or email firstname.lastname@example.org.
Despite the economic climate, the 2008 Petex Conference held in London late last year had a record turnout just short of 3,000. A keynote by Dave Campbell, BP’s VP for North Sea ‘renewal,’ noted that the North Sea still has some 25 billion barrels to play for and is still very much a part of BP’s plans. On the IT/IM front, Campbell vaunted BP’s 3 highly immersive visualization environments (HIVE) and 11 advanced collaborative environments (ACE). The ACEs provide remote engineers with real time interaction with offshore platforms over fiber optic networks that are shared by other operators.
UK Energy Minister Mike O’Brien noted that oil and gas is central to energy needs representing one sixth of UK industrial investment and some 450,000 jobs. O’Brien wants banks to sustain oil and gas investments. If the banks don’t play ball, the UK government wants to know about it! The government is also working to ‘make the environment right for industry.’
Senergy’s James McCullen noted that ‘finding smaller fields demands the same amount of equipment as larger fields,’ that exploration is getting harder and the cost of discovery is increasing. Development costs are up to around $25 per barrel. Capex rose during the days of high oil prices as ‘we got lazy, nobody questioned performance—now we must focus on the essentials.’
Dynamic Graphics’ Brian Lynch gave a spectacular presentation in the 3D Visualization Theatre showing how 4D visual data fusion can enhance reservoir management. Dynamic Graphics’ ‘co-visualization’ toolset was put to good effect, blending multi-discipline data including 4D time lapse seismics and real time data. Reservoir simulation grids, color coded wells, logs and core photos were pulled up on screen in a compelling demonstration spanning micro to macro scales. Lynch made a good case for the use of multi-discipline data visualization and the analysis of complex data relationships.
Rob Lee presented Shell’s Brent Field surveillance program. This was implemented to maximize recovery prior to decommissioning, a decade or so from now. A greatly increased surveillance program has shown that reservoir dynamics can be very unpredictable. Traditional ‘rule of thumb’ approaches to well intervention and forecasting are simply not good enough. ‘Monitoring is everything.’ Cased hole log data interpretation and intimate knowledge of reservoir properties and dynamics are required, a specialized skill set that is ‘a scarce commodity within surveillance teams in the industry.’
Mark McAllister (Fairfield) presented another keynote, from the viewpoint of a relative newcomer (the UK independent is three years old). McAllister recommends ‘putting assets into the hands of those who are willing and able to invest.’ Many small companies are willing but not able! For Fairfield, it’s not about reducing costs, it is all about reserve optimization through ‘subsurface excellence.’ Here ‘there are no short cuts.’ A strong seismic understanding is required. Assets are modeled in Petrel along with full field Eclipse simulations. ‘Infrastructure led’ exploration is a necessity, but ‘geoscience is the core of what we do.’
Deirdre O’Donnell (Working Smart) noted that with the retiring baby-boomers, 50% of E&P staff in the UK will be leaving in the next ten years. Oil and gas is losing the competition for geoscientists, not because of a shortage of graduates, but because of bad press about oil running out. The global oil industry does not have a consolidated public relations machine counteracting the Greens’ arguments!
Aveva has appointed Derek Middlemas as group operations director. Middlemas was previously head of business strategy.
Baker Hughes has named Martin Craighead senior VP and COO. Charlie Ransford is to head up Caprock Communications’ new regional office in Singapore. The company has also named Philip Harlow as CTO.
The OPC Training Institute has launched an online resource center focused on standards-based OPC technology used in process control and manufacturing automation.
Greg Aliff heads-up Deloitte’s new Center for Energy Solutions in Houston, home to its 500-strong team of energy professionals.
The Gas Certification Institute has named George Brown as Executive Director. Brown was previously with CenterPoint Energy.
Geomodeling Technology has appointed Kevin Donihoo as VP sales and services. Donihoo was previously with GX Technology.
Luc Durant is now supply chain director for Geoservices. Durant was previously global account manager with Expeditors International.
General Robotics Ltd. (GRL) has opened an ROV pilot support and assessment centre in Aberdeen. The center deploys GRL’s simulators and assessment metrics software to assess pilot performance.
The Houston Advanced Research Center (HARC) and Texas A&M University have teamed on an environmentally friendly drilling R&D program.
IDG Global has named Paul Hughes as oil and gas advisor. Hughes hails from Doral Energy. IDG provides counterfeiting, loss prevention and anti-theft solutions.
Guoping Li is to head up Ingrain’s latest ‘digital rock physics’ lab in Calgary.
Khaled Abu-Nasrah has been named senior VP of worldwide downstream sales and marketing for KBR.
MicroSeismic has promoted Malcolm Macaulay, VP International Sales, Peter Morton, manager data analysis and Leo Eisner as chief geophysicist. The company has also hired John Ughetta as VP US sales. Charles Stevens, Stephen Chelette, Mindy Manning and Don MacNeil also recently joined the company.
Ali Ferling, Microsoft’s MD for oil and gas, is relocating the company’s worldwide oil and gas industry unit to Dubai, UAE.
In our December ‘Industry at Large’ piece we incorrectly attributed Lesley Stahl’s report on Saudi Aramco to CNN. This was in fact a CBS report as should have been clear to those who followed the link.
Regarding our report from the World Business Research Digital E&P Conference in last month’s issue, Endeca points out that ‘Devon is not an Endeca client. Jerome Beaudoin made it clear that Devon implemented its E&P Portal Solution internally. Endeca was presented as a new technology that appeared to offer an alternative to the custom coded application.’ Our apologies to Jerome Beaudoin, Devon and Endeca.
Folks at the Industry Technology Facilitator were understandably upset when we referred to them as the ‘Industry Task Force’ in our January ‘Folks, Facts’ section. In the same issue we also made multiple errors in our article ‘ITF reports successful 2008.’ We received the following corrections from ITF. ‘ITF is a not for profit organization therefore the phrase “ITF poneys up 9 million” is inaccurate. ITF facilitates technology development by securing funding from its members. Also, as a global organization, ITF’s remit and objectives go far beyond the UK. Finally, the RFP for solutions to enhance reservoir understanding and development was not an ITF call. It was made by the TSB [the Technology Strategy Board—not as we gaffed, the Trustee Savings Bank!]. ITF was merely supporting the TSB to promote their event to developers.’ Our abject apologies to ITF and the TSB!
Quality and safety solutions provider Intertek Group has acquired the Wisco Group, specialist in third party inspection, expediting and coordination services for on and offshore facility construction and development. Wisco’s services span engineering, procurement, construction and maintenance phases. Wisco’s Incor database application supports inspection, order tracking and provides web based orders for clients. Wisco has 120 full time staff in the USA, Europe, the Middle East and Asia and over 600 freelance contractors. Wisco president Larry Wiseman said, ‘Becoming part of Intertek will bring new benefits to our valued client-base and staff. By combining our global resources, we will strengthen our services offering.’
El Paso Energy Service Company has signed an eCommerce agreement with OFS Portal. The agreement provides El Paso with the data protection required to do business with other OFS Portal members. OFS Portal leveraged the API’s Petroleum Industry Data Exchange (PIDX) standard for electronic business. El Paso owns North America’s largest interstate natural gas pipeline system, transporting approximately 25 percent of the natural gas consumed in the United States each day. Its E&P division ranks among the top 10 domestic independent natural gas producers.
GE’s Energy Financial Services unit is investing $150 million in a partnership with Houston-based ATP Oil & Gas Corp. The deal sees GE acquiring a 49% stake in a floating oil and gas production facility, a first for GE. The ATP Innovator facility processes 20,000 barrels of oil and 100 million cubic feet of natural gas per day. Paul Bulmahn, ATP chairman and CEO said, ‘Partnership with GE, especially in today’s troubled economic times, will ensure continued production of oil and natural gas and will allow ATP to maintain our development program. We depend on investors like GE, which has the financial strength, energy expertise and long-term vision necessary for an investment of this magnitude.’
The Research Partnership to Secure Energy for America (RPSEA) has awarded Houston-based Knowledge Reservoir a contract for the provision of management and technical services on an RPSEA ultra-deepwater (UDW) program project. Deliverables include a research report and a characterization database of deepwater and ultra-deepwater assets in the Gulf of Mexico. The report is to investigate drivers for, and identification of, improved recovery techniques (IOR/EOR). Other project participants include Louisiana State University (LSU) and Anadarko.
Looking ahead to 2009, RPSEA reports a good response to its 2008 requests for proposals. A total of 92 proposals were received for UDW and another program targeting unconventional gas and ‘other’ petroleum resources as authorized by Title IX, Subtitle J, Section 999 of the Energy Policy Act of 2005. Funding amounts to $13.9 million for the unconventional resources program, with focus on gas shales, coalbed methane produced water, and tight gas sands. A further $3.2 million is available under the small producer program for advancing technology on mature fields. The programs are funded from lease bonuses and royalties on federal lands.
An implementation case study from the Professional Petroleum Data Management Association (PPDM) provides an update on Hess Corporation’s deployment of the PPDM data model as a master well repository (OITJ December 2008). Fred Kunzinger, E&P data manager for Hess Corporation said, ‘To compete with the national oil companies and majors, we had to be faster and smarter.’ This meant an overhaul of Hess’ ‘antiquated’ database and dispersed, variable quality proprietary data located in offices in Houston, Kuala Lumpur and London.
Three teams were set up to address raw data, interpreted data and knowledge capture. PPDM was selected as the primary well database. Kunzinger said, ‘PPDM is straightforward and flexible and does what you want it to do without overdoing it. It can accommodate all interaction with the database in a clear, straightforward manner.’ Hess allocated $1 million per year to data management in a 12 man-year effort which is now nearly complete.
Today, Hess’ master well repository in Houston is serving data to all regional offices. MoveIT from Ipswitch is used to compress and encrypt data transfers. The project is now being extended to include documents such as DST, core analyses, well completions and image logs. Hess now plans to include production data in its master database.
Palantir Solutions and Tibco’s Spotfire unit have teamed on a new visual analytics tool for oil and gas portfolio analysis. The new tool, PalantirPlan (PP) combines Palantir’s portfolio analysis tool with Spotfire’s visual analytics. PP allows planners to visualize and analyze data and to define targets, model project interdependencies and change timings, all with instant graphical update.
Palantir’s software provides economic evaluation, planning, consolidated project financial statements and a data integration/workflow solution. Palantir MD Jason Ambrose said, ‘Using the visual analytics capabilities of Spotfire, we were able to quickly bring PalantirPlan’s analysis to life. Often, energy companies and their advisors are able to understand and analyze data but then struggle to communicate it clearly and coherently among colleagues involved in the decision making process.’
Tibco’s Bill Doyle added, ‘PP will enable front-line users to get more out of the information they use each day and allow oil and gas companies to manage risk, respond to change and grow asset value through better decision making and faster data analysis.’
Berkana Resources and CIDG Corp. have announced CSACS, a comprehensive security and compliance solution for the security assessment of SCADA systems in oil and gas. CSACS’ methodology examines control systems, identifying and prioritizing vulnerabilities. A workflow manager assists operators with the remediation process, and a business continuity and disaster recovery planning tool and knowledgebase are there for the worst case scenario.
CSACS embeds Modulo’s Risk Manager tool into a proprietary approach to security assessments that has been jointly developed by Berkana, CIDG, and Joyce & Paul. Risk Manager assures compliance with PCI DSS, SOX, ISO 27002, ISO 27001, COBIT, and more. Modulo’s ‘Metaframework’ update feature keeps asset owners up to date on the latest changes to industry standards, guidelines, best practices and regulatory requirements.
Total’s Houston Geophysical Research Group (HGRG) reports on the deployment of an AMD Quad-Core/InfiniBand-based cluster supplied by Appro International. The Linux cluster provides ‘up to’ 1,000 cores and a Lustre file system. Software includes a PathScale compiler, CentOS, OFED and MVAPICH. The system was temporarily housed off-site by Cyrus One during the upgrade. Algorithms developed at the HGRG are run ‘at scale’ on another 10,000 core machine at Total’s Parisian HQ.
Pemex has signed with Fugro-Jason for ongoing training in seismic inversion and reservoir characterization with Fugro-Jason products. The deal involves two five-month training sessions for 12 students.
BP Angola has awarded Halliburton ‘up to’ four contracts for a standardized development process of its deepwater Angolan fields. The deal, worth up to $600 million, includes completions equipment and downhole flow control.
Hess Corp. has commissioned San Francisco-based communications agency Organic Inc. to design and develop its new ‘public-facing’ website, www.hess.com. Organic is also building an intranet site for Hess including the social media tools to allow Hess’ global workforce to share knowledge and collaborate virtually. ‘Digital billboards’ at Hess facilities will keep employees abreast of the latest happenings inside the company.
Midstream company Enogex has extended its contract with eSimulation for use of its eSimOptimizer application to optimize operations of five of its cryogenic gas plants. The contract represents a new business model for eSimOptimizer deployment. A ‘value capture program’ jointly developed by the companies means software usage can be tuned to operating conditions.
Chevron and ConocoPhillips have signed commercial agreements to use Ingrain’s digital rock physics technology and services. The companies will use Ingrain’s 3D imaging and computation of rock physical properties and fluid flows.
Gazprom Neft’ has signed a multi license agreement for Paradigm’s SKUA seismic interpretation and reservoir characterization software. The deal includes training and consulting services. Along with SKUA 2008, Gazprom Neft’ is to deploy SeisEarth, Geolog, Stratimagic and GoCad at its new St. Petersburg Reservoir Centre for Science and Engineering.
Petrobras’ engineers have selected VRcontext’s ‘Walkinside’ 3D virtual reality package for safety analysis, design reviews and asset life-cycle management at its offshore exploration and production business units. The software will be deployed through Petrobras’ ‘Proteus’ web-portal. Petrobras’ project automation manager Paulo Roberto Oliveira de Araujo said, ‘Bi directional links between Walkinside and applications such as Gexcon’s Flacs CFD simulator let us optimize evacuation routes and enhance personnel safety by simulating emergency situations.’
Petrobras has also chosen SPT Group to develop and install a real-time production monitoring and management system (PMMS) based on EDPM and OLGA. The PMMS will be installed for the P-35 Marlim field to optimize production and address key flow assurance challenges.
Baker Hughes has licensed Techsia’s Techlog Interactive Suite to support its worldwide geosciences services. Techlog will be incorporated into Baker Hughes’ service offering in support of directional drilling, formation evaluation and wireline.
Energistics has had a busy month with announcements covering national data repositories, ProdML and Rescue. The national data repository work group was set up, targeting regulators of upstream oil and natural gas activity and information management of seismic, drilling, production and reservoir data. The work group is to assist emerging nations in the collection, maintenance and delivery of quality oil and gas data. Work group chair Martin Peersmann of the Netherlands topographic base-map project LSV GBKN said, ‘After a decade of successful meetings around the world, we decided it was time to formalize this effort and make sure we communicated our data management standards message to all nations with hydrocarbon reserves.’ The NDR group is led by the UK, Norway and Netherlands. Other members are Australia, Canada, India, Kenya, New Zealand, South Africa and the United States. Schlumberger, Halliburton, RPS/Paras and Pennwell are also on board ‘to ensure that data quality and regulatory standards deployment meets the needs of all users.’ The next NDR meeting will be hosted by India’s Directorate General of Hydrocarbons (DGH) from 31 August to 3 September of 2009 in New Delhi.
Energistics’ PRODML special interest group has prepared a proposal to bring service company data acquisition into the scope of the standard. The team is soliciting interested parties from energy and service companies to join a new sub-team that will identify, prioritize, and describe requirements for the receipt of service company production-related data. The effort targets activities such as artificial lift design, PVT and gas analysis, flowing and static pressure surveys. The idea behind the initiative is to standardize on ProdML data recording at the point of origin of these data types, facilitating subsequent data flow through analytical, storage and reporting systems.
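The point-of-origin idea is that a measurement is captured once, in a standard self-describing format, and then flows unchanged through analytical, storage and reporting systems. A toy sketch of such a record in XML (the element and attribute names here are invented for illustration and do not follow the actual ProdML schema):

```python
# Illustrative only: a simplified XML production-volume record built at the
# point of data origin. Element names are invented for this sketch and do
# NOT correspond to the real ProdML schema.
import xml.etree.ElementTree as ET

def production_record(well, date, oil_bbl, gas_mcf):
    """Serialize one day's production for one well as a self-describing record."""
    root = ET.Element("productionVolumeReport")
    w = ET.SubElement(root, "well", name=well)
    period = ET.SubElement(w, "period", date=date)
    # Units of measure travel with the values, so downstream systems
    # never have to guess what the numbers mean.
    ET.SubElement(period, "oil", uom="bbl").text = str(oil_bbl)
    ET.SubElement(period, "gas", uom="mcf").text = str(gas_mcf)
    return ET.tostring(root, encoding="unicode")

xml_doc = production_record("A-12", "2009-02-01", 1500, 4200)
print(xml_doc)
```

Because the record is standardized at creation, every consumer (allocation, storage, regulatory reporting) parses the same structure rather than a service-company-specific file format.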
Energistics is also bringing the new RESQML standard for reservoir models into the fold as a special interest group (SIG). The RESCUE SIG will open up access to RESCUE making it freely available to industry. Membership is required to participate in development and definition of long-term direction of the standards.
The American Petroleum Institute’s e-business PIDX division is surveying members and non-members to evaluate global business practices. Companies are invited to complete the survey on oilit.com/links/09_02 before 31st March 2009. PIDX’ global business practices work group captures and publishes country-specific best practices and legal, fiscal information as country profiles.
In his new year message to clients, Industrial Defender (ID) president and CEO Brian Ahern contrasted the economic downturn with growing cyber security risks and vulnerabilities. ID is part of a global ecosystem of critical infrastructure industry companies, government agencies and suppliers that is working to secure mission critical industrial control and SCADA systems against cyber attack. According to Ahern, 2008 was a year of ‘unprecedented growth’ for ID, with notably the signing of Royal Dutch Shell.
According to Ted Angevaare, Shell global technology leader process automation, ‘Royal Dutch Shell has selected Industrial Defender as its global process control cyber security monitoring solution. Following a detailed evaluation, Shell concluded that the ID solution will mitigate cyber security threats to Shell’s production process and that the product is mature enough to be installed globally.’
ID is to hold its first cyber security users group next May in Boston. The company claims some 8,000 global deployments in securing critical infrastructure systems, over 3,000 mission critical SCADA deployments and some 100 process control/SCADA cyber security assessments. ID also provides managed security services for 160 process control plants in 21 countries.
The Oil & Gas RFID (OGR) Solution Group, a joint venture between Texas A&M Engineering and industry partners has outlined its 2009 plan for advancing radio frequency identification (RFID) technology. A 52 acre ‘live-lab’ facility has been opened on the Texas A&M campus for testing field equipment and applications in simulated oil, gas and petrochemical environments. Director and founder, Ben Zoghi of Texas A&M said, ‘Our objective is to develop a roadmap of RFID adoption that bridges industry standards, exposes pain points, and enables business optimization.’
The live-lab’s simulated environments include production facilities, storage tanks, rigs, pipelines and rail cars and allows for low-risk testing in a controlled environment. Zoghi added, ‘In 2009 we want to reach out to the end-user community, in exploration, drilling, production or refining and enhance their business with the systems that OGR and its partners have developed.’ One strategic initiative for 2009 is international expansion to the Middle East. OGR members include Motorola, Dow Chemical, BP, University of Houston, and EPC Global.
Tokyo-based process engineering behemoth Yokogawa has announced a new manufacturing execution system (MES) platform, ‘Real-time Production Organizer’ (RPO) targeting refining and petrochemicals. RPO plugs the gap between enterprise resource planning (ERP) and MES systems. RPO builds on Yokogawa’s VigilantPlant infrastructure and the ISA-95 standard for the integration of enterprise and control systems. RPO components expose web services for workflow (using Business Process Modeling Notation), production coordination, work order tracking and key performance indicators (KPI) monitoring.
Yokogawa also announced a new version of its Plant Resource Manager (PRM R3.03) solution for centralized device management. The new release offers enhanced integration with the Stardom network-based control system and Windows Vista support. PRM now supports the Electronic Device Description Language (EDDL) for graphs and charting. PRM is used to manage field devices in oil and natural gas production and other downstream activities.
Oklahoma City-based Universal Well Site Solutions (UWSS) has announced ‘UniSkid’, a ‘plug-and-play’ platform for coal bed methane (CBM) fields. UniSkid offers ‘instant’ CBM well-site hookup, with all equipment communicating immediately, reducing set-up time and ensuring reliable, real-time production and operational information. UniSkid components include remote monitoring, measurement, communications and metering.
Cathy Conner, UWSS chair and CEO said, ‘UniSkid was designed from more than a dozen years of field experiences, not in an engineer’s test area. We work with client engineers and field personnel to customize a product and assure a low cost alternative for installation, portability, communications and functionality.’
UniSkid customers can determine what additional well site equipment is required including separators, gas measurement, water measurement and communication devices. A programmable operator interface and flexible communications options offer remote data access. UniSkid is claimed to be a ‘profit-maximizing’ alternative to old well site trenching, integration and installation of equipment that previously required individual set-ups and costly, on-site project management.
Speaking at the 2009 IBM ‘Pulse’ IT automation and industrialization event in Las Vegas this month, Jim McHugh described BP’s global E&P Backbone, built around IBM’s Maintenance, Repair and Operations (MRO) flagship, Maximo. McHugh described the status quo ante as comprising, ‘Too many customized instances of materials management, work management, procurement and financial tools. Such multiple systems create complexity.’ BP’s answer is the E&P Backbone, spanning multiple ‘themes’ of work management (Maximo), materials management, purchase to pay and finance (SAP). The backbone sets out to provide ‘consistent end-to-end system responsiveness and stability for the end user experience.’ A ‘virtual team’ from BP and IBM performed an analysis of work management system (WMS) performance to identify issues and constraints and to determine and test corrective actions. System components including Maximo (application plus database), servers, BEA WebLogic were studied along with networks, firewalls, encryption and operating systems. Performance was compared across Houston-based users of WMS (Maximo), other applications and other BP Maximo deployments.
The study resulted in a hardware ‘landscape’ comprising IBM Blade servers in a 2 node failover cluster. Maximo databases run on Oracle 10g on Unix. Actuate and Business Objects are used for reporting. Maximo data loaders and Microsoft Project interface servers are also deployed. Network latency was addressed by implementing compression (GZip) and browser file caching (MaxAge). BP now plans to automate performance monitoring and is re-appraising its tool set to include third-party providers.
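The two latency mitigations mentioned, response compression and browser caching, both work by cutting bytes and round trips on the wire: gzip shrinks each transfer, while a Cache-Control max-age lets the browser reuse static files without re-fetching. A generic illustration with standard HTTP headers (hypothetical server-side code, not BP’s actual Maximo/WebLogic configuration):

```python
# Sketch of the two latency mitigations: gzip-compress the response body and
# set a Cache-Control max-age so the browser can reuse static files.
# (Hypothetical illustration, not BP's actual deployment.)
import gzip

def build_response(body: bytes, cacheable: bool):
    """Return (headers, payload) for an HTTP response with both mitigations."""
    headers = {"Content-Type": "text/html"}
    compressed = gzip.compress(body)  # fewer bytes on the wire, less transfer time
    headers["Content-Encoding"] = "gzip"
    headers["Content-Length"] = str(len(compressed))
    if cacheable:
        # The browser may reuse this file for an hour with no network round trip.
        headers["Cache-Control"] = "max-age=3600"
    return headers, compressed

headers, payload = build_response(b"<html>" + b"x" * 10_000 + b"</html>", cacheable=True)
print(headers["Content-Encoding"], len(payload) < 10_000)
```

Compression helps most on highly repetitive content such as HTML and JavaScript, while caching eliminates the request entirely, which matters most on high-latency links between regional offices and a central server.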
Computer Sciences Corporation (CSC) and Oracle have teamed on a business intelligence (BI) initiative for the upstream. The new Petroleum Enterprise Intelligence (PEI) embeds Oracle’s tools and analytics into petroleum industry workflows offering a ‘holistic’ approach to business intelligence and data integration for the digital oilfield. Oracle’s BI applications deliver role-based intelligence for all stakeholders. Component applications include Siebel, Oracle’s E-Business Suite, PeopleSoft and third party systems.
Houston-based Information Store (iStore) has contributed software to the PEI. Jane Howell, marketing strategist with CSC told Oil IT Journal ‘We are working with iStore on our PEI solution. This provides executives, geoscientists, engineers, landmen and other stakeholders with access to data and BI functions for decision support.’ The PEI brings together geological and engineering data, financial and operational information and unstructured data. The solution provides oilfield alerts, daily reports and real time costs. CSC has just signed a major outsourcing deal with BHP Billiton [more next month]. Other CSC clients include BP, Conoco, Chevron and ExxonMobil.
Houston-based NuPhysicia has announced ‘InPlace’ Medical Solutions, a new technology that brings doctor services to offshore and other remote locales. The telemedicine service lets patients see and talk to a physician guiding their examination while an onboard medic operates the digital diagnostic instruments. The real-time, two-way videoconferencing makes it possible to diagnose injuries and illnesses remotely, reducing the need for helicopter evacuations.
The onsite system comprises a digital stethoscope sending unit, a document video camera with fluorescent front lighting, and back lighting for radiographic films and x-rays and a multi-purpose otoscope, dermoscope and laryngoscope.
Privately held NuPhysicia markets solutions developed by the University of Texas Medical Branch. A NuPhysicia webcast this month showed a live video examination of a patient located on a rig offshore Malaysia.
Invensys Process Systems has just unveiled an immersive virtual reality (IVR), ‘next-generation’ human machine interface (HMI) that it claims will ‘revolutionize’ the way engineers and operator trainees see and interact with the plant and the processes they control. IVR creates a 3D computer-generated representation of a real or proposed process plant. A stereoscopic headset provides users with an immersive environment in which they can move through the plant in any direction. The virtual environment is rendered at 60 frames per second, providing a realistic user experience.
IPS director Maurizio Rovaglio said, ‘The ability to simulate complex processes in connection with virtual actions allows the user to experience an environment that changes over time, making it more effective at transferring skills learned in training to the work environment. Rarely performed, volatile tasks such as plant shutdowns can be rehearsed in a stable, realistic environment, allowing users and trainees to learn and make mistakes without putting themselves, the community or the environment at risk. Computer models of real equipment allow for experimentation without taking equipment off line and impacting production.’ Invensys’ DynSim software emulates the plant environment, linking process simulation models with physical models to create real world scenarios.