January 2008


PPDM for Chevron

Chevron declares the Public Petroleum Data Model its ‘preferred upstream enterprise data model.’ Following multiple pilots, PPDM 3.8 forms the master data backbone of Chevron’s upstream architecture.

Speaking in Calgary at the annual meeting of the Public Petroleum Data Model (PPDM) Association late last year, Chellie Hailes revealed how the PPDM data model underpins Chevron’s information architecture. Chevron’s first brush with PPDM was back in 1993, when its Canadian unit implemented a PPDM 2.3 database from Applied Terravision Systems.

North America Upstream

More recently, Chevron North America Upstream unit’s field data architecture (FDA) project also chose PPDM for a proof-of-concept that ‘integrates diverse sets of information into a holistic view of asset performance.’ FDA provides data for production operations. The project has defined standards for data flows and processes and a technique for integrating data from operational ‘systems of record’ (SoR). FDA proposes a simplified, ‘plug and play’ data architecture for applications.

Vendor-neutral

Chevron wanted a standards-based infrastructure that could support field data in a ‘vendor-neutral’ context. FDA used an Oracle instantiation of the PPDM 3.8 data model. Sample well and spatial data from field SoRs was mapped and loaded to the new database. The successful pilot led Chevron to recommend PPDM as the FDA data model.

Reservoir management

Another Chevron project, the Energy and Technology unit’s international Reservoir Management Information Architecture (RMIA) initiative, has likewise opted for a PPDM foundation. RMIA sets out to ‘identify and affirm’ Chevron-wide applications, data and work processes for effective reservoir management. RMIA includes a blueprint information architecture, a roadmap for implementation, data management best practices and a pilot deployment. Again, PPDM 3.8 on Oracle was selected as the database of choice. Another project, the i-field architecture, also converged on PPDM as the model of choice.

Upstream Architecture

Following the RMIA success, Chevron established a global upstream architecture team to ‘adopt and translate the enterprise architecture principles, guidelines and standards [for] use at the enterprise level.’ This has resulted in the recognition of PPDM 3.8 as the preferred upstream enterprise data model, the establishment of a PPDM steering committee and the development of a global upstream master data management solution based on PPDM.

Vision

Chevron’s vision is now a PPDM-based master data management system underpinned by standard taxonomies, hierarchies and rules for data governance and quality. Deliverables include a global upstream enterprise data integration hub, master data management, unique identifiers and standard reference values. The solution, which embeds Chevron’s upstream data dictionary, will displace multiple existing well header systems.
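
By way of illustration, the sketch below shows how such a hub can resolve source-system well identifiers against a master record. The table layout, identifiers and use of SQLite are hypothetical: PPDM 3.8 defines its own, far richer, well and cross-reference tables.

```python
# Minimal sketch of master-data identifier resolution. Illustrative only:
# table and column names are invented, not PPDM 3.8.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Master well table: one row per well, keyed by a unique well identifier (UWI).
cur.execute("CREATE TABLE well (uwi TEXT PRIMARY KEY, well_name TEXT)")

# Cross-reference mapping each source system's local ID to the master UWI.
cur.execute("""CREATE TABLE well_xref (
    source_system TEXT, source_id TEXT, uwi TEXT,
    PRIMARY KEY (source_system, source_id))""")

cur.execute("INSERT INTO well VALUES ('100/01-02-003-04W5/0', 'Example 1-2')")
cur.executemany("INSERT INTO well_xref VALUES (?, ?, ?)", [
    ("production_sor", "W-0042", "100/01-02-003-04W5/0"),
    ("land_system", "LSE-17", "100/01-02-003-04W5/0"),
])

# Resolve a local identifier from one 'system of record' to the master record.
cur.execute("""SELECT w.uwi, w.well_name FROM well_xref x
               JOIN well w ON w.uwi = x.uwi
               WHERE x.source_system = ? AND x.source_id = ?""",
            ("production_sor", "W-0042"))
print(cur.fetchone())  # ('100/01-02-003-04W5/0', 'Example 1-2')
```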


Marathon’s new ECM

Marathon cites integration with SharePoint and Exchange in choice of Open Text-based enterprise content management solution.

Marathon has chosen Open Text enterprise content management for its company-wide document and records management. The software will be used by 20,000 employees around the world to improve business processes and share documents, records, email and information in Microsoft’s SharePoint Server.

Couch

Doug Couch, Marathon’s program manager said, ‘The objective of our Enterprise Content Records Management (ECRM) program is to make sure that Marathon personnel are accessing relevant, up-to-date and trusted information. This includes the ability to identify, capture, preserve, and classify records from across the enterprise. Integration with SharePoint means that employees can continue to work in Microsoft interfaces, while we implement the lifecycle management capabilities we need in Open Text.’

Search

Federated search spans multi-site SharePoint documents as well as content in Open Text’s ECM repositories. Marathon leverages Open Text’s email management solution for Microsoft Exchange—combining ‘foundational’ email archiving with records management capabilities.


SOA near and yet SOA far!

Oil IT Journal editor Neil McNaughton contrasts the service-oriented approach with the ‘command and control,’ top-down architecture of the process industry—concluding with more bad puns.

It is common these days, in a talk about this or that ‘technology,’ to say ‘of course it’s not really about the technology, it’s about the people and the process.’ The ‘people and process’ mantra has been repeated so many times that it has achieved motherhood and apple pie status. How could it be any other way?

P ’n P’

Well let me start by observing that if I call a plumber to fix a tap, the last thing I want to get into is ‘people and process.’ In fact if ‘P ’n P’ or virtually any form of discourse arises during the intervention, it is likely a sign that things are going wrong. The plumber (if you can get one!) is performing a nicely encapsulated service of a near algorithmic nature. Tap dripping -> call plumber -> tap fixed.

SOA

Until now I assumed that service-oriented architecture was more about the encapsulated ‘plumber’ service than ‘P ’n P.’ I mean, the ‘services’ in SOA are web services, right? And these are SOAP, REST, HTTP or whatever, carrying unambiguous messages and fully qualified data with context and all that good stuff.

Carter

Well yes and no—according to IBM SOA evangelist Sandy Carter. In a keynote to the Object Management Group meeting in Burlingame, CA last month, Carter asked what was standing in the way of SOA deployment. An IBM survey found that the main lacuna was not IT but business skills—that the shortage was of ‘enterprise architects,’ not technologists.

T-shaped

Seemingly the N° 1 requirement for an SOA project is a combination of skill sets to ‘articulate and communicate’ across business and IT. This requires a ‘T-shaped person’ with a skill set that enables them to identify a business service and communicate it to the developers. To achieve this, Carter advocates a ‘mini MBA’ course for IT grads, and IBM has launched a range of initiatives to bridge the gap and develop the ‘T-shapers.’

Wrong?

Now while there is nothing wrong with this, it did give me some cause for reflection. On the one hand, communicating the business problem to the programmer has been an ‘issue’ since the dawn of computing. On the other hand, the more T-shaped people you have running around doing the ‘P ’n P’ stuff, the more you run the risk of your project suffering from some form of paralysis by analysis. I really didn’t see what SOA had to do with the MBA folks at all.

What is SOA anyhow?

I consulted the Wikipedia oracle for a ruling. Like all oracles, Wikipedia provides answers to suit all. On the one hand, SOA is about architecting ‘intrinsically unassociated units of functionality, which have no calls to each other embedded in them’ in other words, ‘call the plumber.’ On the other hand, the article specifically refers to XML, WSDL and SOAP—which points to a fairly low level skill set.

Which SOA?

So is SOA about high level architecture and design? Or is it about low level web services? If it is about the former, then I would submit that it is a matter of faith that the ‘services’ approach will improve on the three tier architecture, on business objects or any other past paradigm. If it is about the latter, then at least there is a chance of solving the interoperability issues that plagued previous generations of IT silver bullets.

Selling SOA

Without being too cynical, one can see the push for SOA as an object lesson in IT marketing. First get the developers hooked on some new coding paradigm—SOAP, .NET—whatever. Then heat up the rarefied air in the boardroom with talk of SOA as a new business paradigm. So when the top brass meets a coder at the coffee machine—they can play buzzword bingo and give each other warm feelings. SOA near and yet SOA far!

Pai

In our interview with Schlumberger’s Satish Pai (page 3 of this issue) we discussed the relative under-representation of the process control community in the ‘digital oilfield.’ Pai also stated that the downstream is ahead of upstream in the automation stakes. This was somewhat comforting as Oil IT Journal has been pushing the boat out in the direction of engineering and process control for some time now.

BP chemicals

In this context, I would encourage you to read the latest issue of BP’s Frontiers magazine* (summarized on page 12 of this issue). This describes in detail how BP optimizes its chemicals operations with a top down hierarchical control system operating at three different granularities and time scales.

Command and control

I know it sounds crazy, but I can’t help trying to squeeze out a pithy conclusion by contrasting BP’s ‘command and control’ with the notion that we should train the MBAs in SOA. On the one hand, as a ‘renaissance man’ myself, I am all in favor of more freedom and interchange as making for a more stimulating workplace. But are we really going to teach the MBAs engineering, then SOA, then SOAP?

Data management

I think the answer is probably yes to a degree. At least it would be a good idea to train engineers in architectural principles—but with an emphasis on data management rather than on services. A little knowledge of plumbing is after all good for us all.

ExxonMobil

I’m not sure that was very pithy so I’ll try again. Contrast ‘command and control’ with ‘renaissance persons.’ Now ask yourselves—which major oil company’s organization lies on the command and control end of the spectrum? And which ‘Irving-based behemoth has just set new ceilings for annual and quarterly profits ever earned by a US company?’** I REST my CASE.

* www.oilit.com/links/1004.

** Houston Chronicle.


Oil IT Journal interview—Satish Pai, Schlumberger

We spoke to Satish Pai (President of Schlumberger’s Europe & Africa Oilfield Services division) in his role as organizer of the upcoming Intelligent Energy conference. Pai sees other industries (automobile, manufacturing—even the downstream oil industry) as ahead of the upstream in digitization. But Pai has faith in the new ‘Nintendo’ generation and in automation—up to a point.

OITJ—What led to your involvement in the Intelligent Energy* event?

Pai—I see the conference as an opportunity to take stock of where we are as an industry in relation to digital technology, automation and information management (IM), a kind of ‘reality check’: no more hype, no more ‘what’s the vision?’ A lot of us are convinced that the technology for the digital oilfield exists but that the bottleneck now is people and process. The theme of the conference is to see how we can take the processes and digitally enable them with existing technology. Other industries are way ahead of oil and gas in digital enablement.

OITJ—Could you give some examples of industries that are ahead of the oil and gas vertical?

Pai—Manufacturing, aeronautics, automobile—a modern car assembly line is all robotics, all computerized. Airline pilots are trained on simulators—oil and gas is way behind here. Even though everybody wants to hire folks with 15 years’ experience—maybe simulators are a way to accelerate this.

OITJ—Although operator training simulators are widely used in the downstream.

Pai—Yes it’s interesting that downstream is ahead of the upstream in automation—although I’m not sure why. Perhaps a low margin business makes for more tuning of processes and automation. I was discussing this with Don Paul (Chevron CIO) who confirmed that downstream is ahead of us.

OITJ—How is the conference addressing these issues?

Pai—We have a great opportunity to benchmark the upstream in this context at the ‘scene setting’ session. Here there will be a live real-time link to BP’s center in Baku. We will then be moving over live to a Shell facility in Holland. Next we’ll make a virtual visit to a StatoilHydro platform in the North Sea and finally back to our own real time operations center in Aberdeen. All this in real time with live data links. We really wanted to go beyond the PowerPoint presentation!

OITJ—Since production platforms are more like refineries, that makes them closer to the process control community than the upstream?

Pai—That’s why we want to go see what’s happening in Baku and Snøhvit where, by the way, StatoilHydro has been automating and moving people onshore for safety reasons.

OITJ—What is the conference doing to address the people challenge you mention?

Pai—We have tried to make the show attractive to younger engineers—with a ‘young professionals’ track where the event will be blogged in real time. This reflects a generational change, contrasting the apprehension of older engineers with the ‘Nintendo generation’ which has no fear of technology. This is a real ‘fault line’ in the industry where new technology is enabling new ways of working for those who are familiar with the tools. We also want to find out what young people think of the industry—especially regarding automation—and the fact that there will be fewer people doing more with technology in the future.

OITJ—What do you think the answer is?

Pai—It’s a big question! People today attribute a number of failures on rigs to inexperienced people on the rig. We need to take advantage of the new technology and make it more amenable to young engineers.

OITJ—But surely the issue is more with engineering knowledge than a ‘Nintendo’ interface...

Pai—Yes, but in the old days a driller would work with four or five parameters; now there are many more.

OITJ—So we automate?

Pai—I am wary of automation—especially of the kind that leads to stock market crashes! We need automation combined with human intelligence. I’m not in favor of total automation. Mother Earth is too unpredictable. Today real time sensors offer data at your fingertips—automating routine tasks.

OITJ—So you still need the experts?

Pai—Experience is important but we need to achieve the 20-25 years experience level with 5-10 year people. This could be achieved with more automation and feedback—but I don’t believe that even a 90% automation level is appropriate for the upstream.

OITJ—If you go stand on a platform and look around, you see stuff built by EPCs** like AMEC and KBR and instrumentation and control systems vendors like Emerson or Yokogawa. There really isn’t much of a contribution from the upstream. Are the EPCs and process control folks who build the digital oilfield at the show?

Pai—I’m not sure that they are and I agree that you have a point. Maybe subsea facilities are a better example of upstream-focused engineering. These guys really are ‘linked-in.’

OITJ—What about standards like WITSML and PRODML, are they a help or a hindrance?

Pai—They are a tremendous help. XML data streams have greatly improved data sharing. A byte has a hard time getting from well to workstation—crossing firewalls and bypassing incompatible standards etc. Bytes lead miserable lives! Data standards are a big help.

OITJ—To what extent is the upstream digitally enabled today?

Pai—You have to put all this into context. High tech still has limited penetration. Rotary steering makes up maybe 15 to 20% of the market and intelligent completions as little as 1%. To get to the level of integration I’m talking about we need more intelligent completions (or vice-versa!). Clients are still concerned with what goes downhole.

OITJ—We’re back to the old ‘No jewelry in my well’ syndrome!

Pai—It’s still a problem. I would love to know how many intelligent completions there are in the world today!

OITJ—Are we really drowning in data?

Pai—Some seem to think we are. In process control, the data historian already filters data before storage. This contrasts with the seismic industry, which has been crying for more and more data for years—and managing it successfully. There really should be no complaints about ‘too much data.’ Let’s just focus on turning data into high value actionable information. A terabyte is no longer a big deal.

* www.intelligentenergyevent.com

** Engineering and procurement contractor.


Future of GIS—internet, ArcGIS and Petroleum User Group

Jack Dangermond, ESRI founder and president, talks on ‘most important technology of our time.’ Petroleum User Group meets this month.

Speaking this month to students from Beijing Normal University, where 3,000 students study geography and GIS, ESRI president Jack Dangermond offered some insights into the future of geographic information systems (GIS), ‘one of the most important technologies of our time.’

Internet

‘GIS and the internet will become inexorably entwined—GIS technology is evolving on the Web, making geographic knowledge easier to access and more available. As our planet becomes more ‘connected,’ we will see new geographic information services and communities of users who incorporate these services into their daily decision making.’

Students

Despite the interest that GIS technology and tools evoke, Dangermond advises students, including those who aim to become GIS professionals, to pursue a ‘well-rounded’ education. GIS has geography at its core—understanding the science behind the technology is essential for analysis, geospatial application building and GIS software development. ‘It’s the computer engineer who thinks spatially who will advance geospatial technology into the future—this field needs very creative people.’

ArcGIS future

Future releases of ESRI’s flagship ArcGIS product will target the key areas of cartography, server and mobile GIS, internet-based GIS and geodata management. A ‘robust’ server platform will deliver geographic information to heterogeneous clients including wireless technology for the mobile workforce. New tools will extend the geodatabase functionality and geospatial data management capabilities.

PUG

ESRI, founded in 1969, is now the fifth largest privately held software company in the world. The ESRI Petroleum User Group meets this month in Houston, with notably a ‘GIS Leaders’ panel to be moderated by Oil IT Journal editor Neil McNaughton. More from www.esri.com/pug.


Digital Record Center hosts scanned images, digital assets

Iron Mountain leverages IBM Content Manager to offer outsourced document management.

Information management specialist Iron Mountain has teamed with IBM to provide a hosting service for essential documents. Iron Mountain’s ‘Digital Record Center’ (DRC) is a secure repository for scanned images of physical records, PDFs and other digital documents. The DRC extends Iron Mountain’s (IM) portfolio of document management with a ‘complete solution’ for information lifecycle management.

Churchill

IM VP Chris Churchill said, ‘We’ve helped our clients manage physical records and convert paper documents to digital files for a long time. Now, the DRC brings a cost-effective repository and a comprehensive, single-source solution for secure information access.’ The DRC uses ‘sophisticated’ indexing and search tools for rapid document retrieval. The scalable solution has no capital costs and allows authorized users to quickly access records on a 24/7 basis via the Internet. The DRC also offers uninterrupted access to key records after a disaster and assures regulatory compliance through documented records management best practices.

IBM

The DRC is powered by IBM’s DB2 Content Manager OnDemand solution that provides high-volume capture of computer output and automated storage management of archived documents. IBM’s enterprise content management platform is also available in a tailored solution for chemicals and petroleum operations with functionality tuned to E&P information management, reserves, compliance and plant life cycle information management.


Oil and Gas UK unveils ‘Step Change in Safety’ website

New website shares asset integrity best practices and lessons learnt across the UK offshore industry.

The UK trade organization Oil and Gas UK (O&GUK—formerly the UK Offshore Operators’ Association) has just launched a new website in support of its ‘industry-wide asset integrity program.’ The ‘Step Change in Safety’ website aims to increase the sharing of best practices, improve companies’ internal learning processes and enhance communication within companies and across the industry on the key issues surrounding process safety and asset integrity.

Allen

O&GUK’s HSE director Chris Allen said, ‘The website is the first in a number of initiatives planned for 2008, launching a unique, industry-wide effort to enhance asset integrity on offshore installations. Obviously individual companies are working hard and investing significantly to maintain the integrity of their own installations for the longer futures which now lie ahead of them. The role of Step Change is to ensure that lessons learnt and good practices developed are widely shared so that the overall Industry effort is as effective as it possibly can be.’

Work Group

The O&GUK’s Asset Integrity Work Group is also developing a one-day interactive workshop on asset integrity management. This workshop is intended to equip company leaders with enough knowledge to enable them to ask the right questions about asset integrity, and thus ensure that they are sufficiently well informed when setting priorities and making decisions.

Forrest

John Forrest, who heads up the Asset Integrity Work Group, added, ‘A big attraction of the new website is that an interested individual can access full details of a best practice example or a lesson learnt from anywhere in Industry almost immediately. This means that the new idea can be very quickly translated into an improved work practice.’ Check out the new website on www.oilit.com/links/1005.


Software, hardware short takes ...

Palisade, AVEVA, Tecplot, Heliosoft, Energy Navigator, ESRI GIS appliance, Mercury, Imation.

Palisade has just released a new version of its @RISK decision support package. @RISK 5.0 offers ‘total’ Excel integration, improved graphics and model sharing and more analysis including Six Sigma design of experiment and value at risk functions. @RISK Library is a new SQL database for sharing of probability distributions, model components and simulation results with other users.

~

AVEVA has announced PDMS 12.0, a new version of its plant design package. The new release is said to be easier to use with rule-based design, automation and easier deployment. Enhancements include applications for designing equipment, piping, ducting, structural steel and supports with a Microsoft Office ‘look and feel.’ Design rule technology adds rule-based, automatic pipe routing and quality checking. New object associations in the plant model allow key relationships to be defined and monitored as the design evolves. IPR protection allows for data sharing with sub-contractors and partners, reducing the risk of copying or modification.

~

Tecplot RS, a joint Tecplot/Chevron development, is a post-processor for oil and gas reservoir modeling. The new RS 2008 release manages data from multiple reservoir simulation software applications, as well as observed data such as production rates and formation tests. New features include project-oriented I/O with one-step project loading, templates for recreating specific plots, a loader for VIP and NEXUS solutions saved in the VDB database format and interactive plot editing.

~

Heliosoft’s SeisMaster Pro 6.0 offers an improved graphical user interface and new modules for horizon tracking on multiple 2D data, horizon flattening or static shifts and automatic gain control. A new ‘generic application framework’ enables researchers and developers to create their own applications within the SeisMaster environment.

~

Energy Navigator has announced version 6.0 of its AFE Navigator spend management package. The new release improves end user functionality and integration with other systems. A single sign-on logs users into AFE Navigator automatically. AFE-related documentation can be retrieved through a web browser front-end to corporate document management systems. Key AFE events can now be ‘pushed’ to third-party applications with a user-defined data integration tool.

~

ESRI has announced an ArcGIS ‘Data Appliance’ to provide terabytes of ‘ready-to-use’ data to ArcGIS Server users. The data appliance serves pre-rendered, pre-cached geographic data previously only available from ESRI’s hosted ArcGIS Online service. The appliance is built around a network attached storage solution from Inline Corp. Data collections include 15-meter worldwide satellite imagery, a world street map with shaded relief imagery, physical and topographic maps and a world political atlas. A USA ‘Prime Imagery’ collection offers eight terabytes of data including a mosaic of 1-meter or better aerial imagery and detailed topography.

~

Open Inventor 7 from Mercury, a new version of the cross-platform 3D graphics toolkit, adds new functions including ‘ReservoirViz’ for very large, 3D structured meshes and ‘ScaleViz’ for cluster-based rendering. A new function is available for GPU-based geometry shading and a range of compressed formats has been added. A Frame Buffer Object allows for GPU-based number crunching (GPGPU).

~

Imation’s new LTO-4-based ‘adjacent-track’ tape technology increases the capacity of conventional data storage tape to 10,000 tracks per inch. This doubles current capacity to around 1.6 terabytes on a conventional LTO4 cartridge.


‘Information Engine’ bundles master data management and BI

Kalido’s new business intelligence solution ‘streamlines how key questions are asked and answered.’

Kalido has announced a new ‘Information Engine’ solution to ‘streamline how key performance questions get asked and answered.’ The Kalido Information Engine (KIE) bundles Kalido’s Business Information Modeler, Dynamic Information Warehouse, Master Data Management and Universal Information Director products. KIE cohabits with existing transaction systems, data warehouses and business intelligence front-ends addressing what Kalido describes as enterprise computing’s ‘inconvenient truth,’ that, despite the billions invested to date, BI’s promise remains ‘largely unfulfilled.’

Morris

According to IDC Senior VP Henry Morris, ‘BI can be time-consuming to implement and costly to modify when business needs change. KIE enables business and IT to work together to maintain a high-level model that reflects current requirements, manages metadata and handles changes over time.’ KIE helps customers design, develop and deploy a BI infrastructure faster than traditional hand-coding or ETL-based methods.

Hewitt

Kalido CEO Bill Hewitt said, ‘Companies are challenged with the complex transformation of corporate data to actionable information. The problem’s not with BI, but rather with the data and processes that are managed on spreadsheets, reliant on custom coding and subject to misinterpretation. KIE makes your infrastructure ‘intelligent,’ automated and business model-driven.’

Visual Business Modeler

Kalido has also just announced a ‘visual modeler’ for BI—a graphical design environment that introduces ‘gesture-based’ modeling. The modeler streamlines data warehouse development and change management, enabling BI directors and data architects to keep their infrastructures in step with changing business requirements.

Sarbanoglu

Kalido’s Hakan Sarbanoglu told Oil IT Journal, ‘KIE powers many projects in the oil and gas industry, across a range of applications, including well master data management, pipelines, asset integrity management, financial reporting and more.’ Kalido’s oil and gas customers include Shell and BP. More from david.winterhalter@kalido.com.


PESGB Data Management SIG 2007, London

The Petroleum Exploration Society of Great Britain’s biennial data management conference heard of a new oilfield history project from Aberdeen University, of the pros and cons of Web 2.0 from Tribal Tech, of a new ‘business and IT’ modeling tool from Stroma Software, of the Avatar-m project for long term archival of seismics and more from Exprodat on the ‘GIS in E&P’ study.

Siobhan Convery (Aberdeen University) introduced the ‘Capturing the Energy’ (CTE) project which follows along the lines of the Norwegian State Archive (Statsarkivet—see Oil IT Journal September 2007). The aim is to preserve what may become historically significant records of business decisions taken in the development of the UK’s oil and gas fields. Aberdeen University, with support from Total E&P UK and Business Archive Scotland will host the resulting industry archive.

Frigg

The Frigg Field, which spans the UK and Norwegian boundary, was developed in the 1970s following a landmark international legal agreement. Aberdeen University approached the operator, Total, with regard to documenting UK-specific parts of the field, including the MCP01 gas compression platform. This is to include information on machines, equipment, engineering and the 32” gas pipeline to St. Fergus. The resulting Frigg Archive is the first CTE project (sponsored by Total and Gassled) and includes engineering drawings, photographs and reports. These have been captured with Ex Libris’ ‘DigiTool’ digital asset management package. A related ‘Lives in the Oil Industry’ project is to capture oral history. More from www.capturing-the-energy.org.uk.

Web 2.0

Paul Duller’s (Tribal Technology) talk on the hardcopy ‘nightmare’ was subtitled ‘from Gutenberg to an e-mess!’ How can organizations dispersed around the world get people to work together as though they are in the same location? For Duller, the answer lies in ‘Web 2.0’ with its social networking tools. Bulletin boards provide answers to simple questions, social networking can be used to locate skills and blogs to publish material. Wikis have been used in-house for electronic communication policy development. Tagging has its role, as shown by Flickr’s user-generated tags—and the tools that scan tags and create tag clouds or ‘folksonomies.’ Google Blog Search was recommended as a high quality search engine. RSS offers automated update—unfortunately, ‘most in oil and gas don’t know about RSS and constantly go back and check websites.’ Other Web 2.0 tools offer ‘zero footprint’ on the user’s PC—no local software or storage—which can cause headaches for IT managers. Should they ban Web 2.0 tools or perhaps study why they are so popular? Microsoft has been a bit slow with Web 2.0 but now offers a more integrated system—a ‘transformed’ Office with SharePoint and Groove for project planning.

Workforce planning

Tim Doel described a project that Venture Information Management conducted for an unnamed client which was finding it hard to manage its business critical Excel spreadsheets in a multi-user environment. These were used for workforce planning and touched on business issues such as HR, skills, aging workforce, local/expatriate mix and so on. The project presented multiple data management challenges as roles and skills changed over time. Some 4,500 roles, 2,000 staff/contractors and multiple projects were involved. Consolidating to Excel was ‘a nightmare,’ with intermediate output of around 500,000 rows before pivoting on time. Excel was cumbersome, slow and causing data loss. Venture’s ‘pragmatic solution’ to the Excel hell was ... an Access database with workforce plans exported to Excel for ‘instant analysis.’ Contrary to popular belief, Access is good for a network database on a shared drive—and without IT involvement! The result was ‘network-enabled’ spreadsheets. Some issues remain—it is easy to corrupt data with remote synchronization, especially on wireless networks.
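
For readers wondering what ‘pivoting on time’ involves, the sketch below reproduces the transform with the pandas library. Column names and figures are invented; Venture’s actual solution was Access-based.

```python
# Sketch of the 'pivot on time' step; hypothetical columns and figures.
import pandas as pd

# Long format: one row per role per month, as might be exported from a
# workforce planning database.
rows = pd.DataFrame({
    "role": ["driller", "driller", "geologist", "geologist"],
    "month": ["2008-01", "2008-02", "2008-01", "2008-02"],
    "headcount": [12, 14, 3, 4],
})

# Pivot to one row per role and one column per month, the wide format
# planners expect to see in a spreadsheet.
plan = rows.pivot(index="role", columns="month", values="headcount")
plan.to_excel("workforce_plan.xlsx")  # 'instant analysis' (needs openpyxl)
print(plan)
```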

Avatar-m

Charlotte Norlund (University of Southampton IT Innovation Centre) presented the Avatar-m project that is investigating ways of storing digital audio/visual and seismic data. The project sets out to address some of the ‘interesting challenges’ that archivists will face in the next 10-20 years: growing data sizes, disruptive technologies, storage obsolescence, economic and ecological issues. The project has support from the BBC and UK DTI. Seismics and video share common issues such as obsolete media, lost assets and compliance (data must be kept in perpetuity). Data management for field and stack data is generally satisfactory but intermediate stage processing data capture (management of processing parameters) ‘could be improved.’ Processing knowledge is not ‘actively managed.’ Looking to the future, ‘market forces are driving seismic data on line.’ Governments are pushing for centralized archiving of seismic data. The future is a services-oriented architecture and a ‘workflow engine.’ Avatar-m sees the future for seismics as online services. There was considerable interest, and not a little surprise, from the audience regarding the £3 million awarded to the project partners, which include Ovation Data. A straw poll established that no one present had been asked to tender on the DTI’s data archival project.

Business and IT mapping

Fergus Cloughley (Stroma Software) unveiled a new modeling tool that creates ‘Business and IT’ (B&IT) diagrams that serve as a ‘common language’ for engineers and IT. Stroma was developed for BP’s Grangemouth refinery. During the Y2K period, BP discovered a communications gap between IT and business. There was no ‘big picture’ of business information flows. BP asked Stroma to build a link between information models and CAD, LIMS, PI, optimization and simulation. The result is a ‘dynamic view’ of process and IT infrastructure. Stroma’s B&IT diagrams show the relationship between pumps and the IT systems that look after the plant. B&IT shows data flows from tank through various business processes and owners into a spreadsheet in the accounts dept. Recent developments include ‘swim lane’ diagrams (beyond Excel and Visio) that are inventing ‘new ways to relate to information.’

GIS in E&P

Chris Jepps presented the results from Exprodat’s multi-client survey of the Role of Geographical Information Systems in E&P (OITJ Dec. 07). High end GIS holds out the promise of competitive analysis—although the data management required to achieve this can be hard to realize. Although GIS has been around for a while, it is a ‘young’ business in oil and gas. Some 70% of GIS workers have only been in the business since 2000. A migration of GIS was observed—from geotechnical support to IT. Usage has risen significantly in the last three years and is expanding across the organization. Most GIS use is in ‘business services’ (data management), new ventures and exploration—less in development. The report found a ‘sweet spot’ of support to staff ratios of 10:1 or less. No companies used a formal system of metrics to measure GIS service quality. The main GIS support issue was poor integration with other systems (ironic, as GIS is marketed as an integrator). 90% of GIS users have not built basic data management structures—this is ‘surprising.’ Standards used were PPDM (36%), PODS (9%) and APDM (9%), with little other standards use reported. 82% of respondents were ESRI users. In general, E&P companies do not use geospatial IT standards. This means they are missing out on systems and data interoperability and fail to ‘unlock spatial data from isolated GIS applications and leverage IT investment in unforeseen and effective ways.’

Records management

Veronica Gordon (Iron Mountain) reminded those present that under US legislation you must comply with records management requirements, ‘or you will go to jail.’ Sarbanes-Oxley is driving record keeping even though many organizations are still ‘in denial.’ Some use ‘arbitrary’ destruction programs—for instance, everything over five years old is trashed. The information management playing field is particularly uneven when it comes to digital records. These are ‘C-level’ issues. The American Records Management Association’s (ARMA) list server has an ongoing debate on topics like ‘what is a record?’ ‘What is a vital record?’ No firm conclusions have been reached to date. But there is a financial carrot to good records management—one Iron Mountain client implemented an RM policy including destruction and reported a 43% ROI over a three-year period.

Legacy data management

Tarun Chandrasekar (Neuralog) believes that both ‘corporate’ and ‘project’ database paradigms have been shown to work. Issues remain with legacy unstructured data such as paper, image, reports and rasters. Neuralog has been working with Pemex, ‘cleaning gunk off Mylars’ prior to scan. This enables ‘hybrid’ data analysis as available in NeuraSection—allowing for interpretation of calibrated raster logs. Log data management can be complex—requiring interoperability with industry and horizontal applications such as SharePoint, WebParts and Informatica. Chandrasekar distinguishes two cultures—‘enterprise’ data management and ‘Google’ usability. The ideal is a blend of both with added security serving Web 2.0-ish apps. Pemex uses a quality/certification process for approved data. SQL Server Express is deployed for remote workers—a ‘mini me’ database that can be disconnected for field work and synched on return.

This article is based on a longer report from The Data Room. More from info@oilit.com.


PPDM 2007 AGM and fall user meeting, Calgary

PPDM update—educating the landmen, the business case and the pitfalls of deviation data.

The PPDM Association’s member count now stands at 112, with 23 organizations from outside North America. CEO Trudy Curtis outlined recent developments including a PPDM wiki, education and certification programs and the imminent release of the 3.8 Alpha database. PPDM’s latest release now sports 37,000 columns and 1,650 tables. Current projects include metadata, spatial and records management. The latter includes thesauri and glossaries—leveraging Dublin Core and ‘faceted taxonomies.’ A user survey of work group relevance found data content as number one, followed by a sample dataset load (of the RMOTC’s Teapot Dome data) and ‘data management.’ Devon is backing a workgroup to study coordinate reference systems and EPSG integration.

CEAMS

Kevin McFarlane introduced the Centre for Energy Asset Management Studies. CEAMS evolved from the Canadian Association of Petroleum Land Administration (CAPLA). CEAMS’ mandate is to promote the development of integrated learning strategies and career paths for Energy Asset Management personnel (a.k.a. landmen). The Southern Alberta Institute of Technology (SAIT) has been chosen as education partner.

Business case

Bob Faught (Sierra Systems) did some soul searching regarding ‘what honest statements can we make about the usefulness of PPDM?’ The ‘classic’ arguments for PPDM are sometimes greeted with skepticism. Projects suffer from ‘creep,’ old versions stay around too long and users often just don’t care. Other issues include the ‘tug of war’ with commercial applications and what can be a complex implementation. But Faught concludes that PPDM remains relevant in managing information as an asset that spans many different disciplines. A proper PPDM implementation helps cut costs without destroying quality, leveraging man-years of effort, removing redundancies and supporting multiple E&P workflows. PPDM can be an ‘open and common base to build on.’

Open system

Sean Udell (geoLOGIC) and Sherry Sturko (Petroleum Place Energy Solutions—P2ES) offered a fairly compelling business case for PPDM in the form of a project that teamed QByte’s PetroLAB front end with geoLOGIC’s PPDM 3.7 database. Leveraging PPDM’s ‘openness,’ P2ES was able ‘to recreate 10 years’ work on a data model in less than a year.’

Directional surveys

James Stolle (P2ES) described the pitfalls of managing deviated wells and directional surveys. Bad data, poor or missing reference information, can produce an erroneous ‘Central European’ location for a Barnett Shale well! For Stolle, storing data to PPDM should always be considered as a data quality exercise. Data in application stores should be considered ‘at risk.’ It is preferable to collect and store surveys and metadata in a PPDM datastore suitably extended to handle metadata. Stolle asked, ‘Why would a company pay very large sums to improve databases that they are already licensed to?’ The simple answer is that this is the only way to eliminate risk and loss from bad data. ‘Dry hole odds and costs are high enough as it is already!’
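
Stolle’s point is easy to demonstrate. Wellbore positions are computed by integrating survey stations from a surface reference, usually with the industry-standard minimum curvature method sketched below, so a bad surface location or datum shifts every computed position down the hole. The code is a generic illustration, not P2ES’s implementation.

```python
# Minimum curvature method: wellbore position offsets from a directional
# survey. A standard calculation, sketched to show why bad reference data
# corrupts every derived location.
import math

def minimum_curvature(stations):
    """stations: list of (md, inclination_deg, azimuth_deg) tuples.
    Returns (north, east, tvd) offsets from the surface reference, in the
    same length unit as measured depth (md)."""
    n = e = tvd = 0.0
    for (md1, i1, a1), (md2, i2, a2) in zip(stations, stations[1:]):
        i1, a1, i2, a2 = map(math.radians, (i1, a1, i2, a2))
        # Dogleg angle between the two stations.
        cos_dl = (math.cos(i2 - i1)
                  - math.sin(i1) * math.sin(i2) * (1 - math.cos(a2 - a1)))
        dl = math.acos(max(-1.0, min(1.0, cos_dl)))
        rf = 1.0 if dl < 1e-9 else (2.0 / dl) * math.tan(dl / 2.0)
        dmd = md2 - md1
        n += dmd / 2 * (math.sin(i1) * math.cos(a1) + math.sin(i2) * math.cos(a2)) * rf
        e += dmd / 2 * (math.sin(i1) * math.sin(a1) + math.sin(i2) * math.sin(a2)) * rf
        tvd += dmd / 2 * (math.cos(i1) + math.cos(i2)) * rf
    return n, e, tvd

# Example: vertical to 500 m, then building angle toward the northeast.
survey = [(0, 0, 0), (500, 0, 0), (800, 30, 45), (1100, 60, 45)]
print(minimum_curvature(survey))
```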


Folks, facts, orgs ...

PSE, Acceleware, Aker, Allegro, AMR Research, Chevron, FIATECH, Shell, Chiyoda, IADC, EMS, Energistics, GE, Geomodeling, GMI, GTS-Geotech, HTC, KBR, Noble, OFS-Portal, PGS, SGI etc.

Sang Phil Han is to head up Process Systems Enterprise’s new offices in Daejeon, Korea.

Shawn Lorenz is now VP sales with oil and gas HPC specialist Acceleware.

Simen Lieungh is returning to Aker Kvaerner as president and CEO following a short stint with Arne Blystad. The company is also changing its name to Aker Solutions ASA.

Trading and risk management software boutique Allegro has appointed Kyle Bowker as senior VP, sales. Bowker was previously CEO of Nextance.

AMR Research has hired Phil Fersht as research director. Fersht comes from Deloitte Consulting.

Apache has named Alex de Alvarez as VP security. De Alvarez was previously with the US Department of Energy.

Chevron has joined the FIATECH standards body.

Shell Global Services has signed an agreement with Chiyoda Corp to market its Maintenance Enhancement Reliability Improvement Team program in Japan.

John Lindsay has been elected 2008 Chairman of the International Association of Drilling Contractors. Claus Hemmingsen is Vice Chairman.

Budd Melvin has been appointed director of operations and maintenance specialist EMS’ Canadian unit.

Total and Saudi Aramco have joined Energistics.

Dick Mitchell is to head up GE’s new sensing and inspection technologies facility in Skaneateles, N.Y.

Geomodeling has appointed Mike Odell as CEO, succeeding Renjun Wen, now CTO and chairman. Odell was previously with MetaCarta.

GeoMechanics International has promoted Peter O’Conor to VP global sales.

Shell Information Technology International has awarded a multi-year, renewable contract to GTS-Geotech’s Scout Recruitment unit for the provision of IT contract labour. Project scope includes IT project and program management, networks and systems, applications development and support.

The Houston Technology Center has hired Maryanne Barker as associate director, Energy. Barker previously headed up the UK’s Energy Team in Houston.

Brad Lankford is now senior VP sales with KBR’s upstream business. Lankford has been with the company since 1980.

Noble Corp. has named David Williams chairman, CEO and president.

Dave Wallis is now Europe-Africa-Middle East representative for OFS Portal.

Svein Rennemo is to retire as president and CEO of PGS in the second quarter of 2008.

Irene Qualters has returned to SGI as senior VP software after a period with NASA, Merck and AGEIA.

Neil Meldrum is to head up Sensornet’s new Aberdeen office.

Robert Hobbs is now COO of TGS-NOPEC. Hobbs was previously with Marathon.

A special session on ‘Free and open-source geospatial software in the earth sciences’ is to be held at the 33rd International Geological Congress in Oslo next August—more from www.33igc.org.

Correction

Charlotte Norlund has pointed out an error in our editorial last month. ‘The purpose of Avatar-m is not to compete with the NHDA or any other national archive. Instead, it is aiming to develop tools for the long-term storage of digital content.’ Our apologies for the misrepresentation. More from www.oilit.com/links/1007.


ArcGIS Pipeline Data Model meeting, Houston

Case histories from Enbridge Pipeline, Colonial and Questar and Dig-Smart’s ‘one call’ system.

The ESRI-backed ArcGIS Pipeline Data Model (APDM) was initiated in 2002 from earlier work by M.J. Harden (now GE Energy). The highlight of the recent APDM user group meeting in Houston was John Linehan’s presentation on Enbridge Pipeline’s Data Management System (EPDMS). Like other pipeline operators, Enbridge’s mapping program was accelerated by the 2002 ‘192 Gas Rule’ regulations that define High Consequence Areas (HCA) and mandate pipeline integrity risk assessment. The EPDMS program kicked off in 2004 with the aim of creating a ‘maintainable and dynamic mapping database based on ESRI’s ArcGIS.’

PODS vs. APDM

APDM was chosen over the PODS database as filling Enbridge’s requirements and supporting customization. A third party application was acquired for data editing and risk evaluation and the APDM 2 spatial database was instantiated on Microsoft SQL Server. The database was populated with centerline and attribute data, and HCAs were located using public imagery and NPMS data for risking.

Phase II

A review in 2007 led to a number of changes including acquisition of recent aerial photography, digitization and identification of all structures near pipelines for DOT and HCA classification and sub-meter GPS verification of known HCA areas. The HCA and risk ranking tools were also upgraded and the database migrated to APDM 4.0.

One Call

Jim Schoenberg (Dig-Smart) advocates a GIS-centric ‘One Call’ system to automate the ‘call before you dig’ process and thereby avoid, for example, a backhoe rupturing a gas transmission line. Automating a one call system requires cross referencing between an address database and the one call system. Geocoding translates street addresses to actionable locations for the mobile workforce. Just as companies made the move from paper maps and records to GIS and digital data management, paper one call tickets are now moving to real time-enabled ‘geographical ticket databases.’
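
The geocoding step can be sketched in a few lines. The example below uses the open source geopy library and the public Nominatim service purely for illustration (Dig-Smart’s own stack is not specified), with a hypothetical ticket address and pipeline location; it needs network access to run.

```python
# Geocode a one-call ticket address and check proximity to a pipeline point.
# Illustrative only: geopy/Nominatim stand in for whatever Dig-Smart uses.
from geopy.geocoders import Nominatim
from geopy.distance import geodesic

geolocator = Nominatim(user_agent="one_call_demo")  # hypothetical app name

ticket_address = "901 Bagby St, Houston, TX"  # hypothetical dig site
location = geolocator.geocode(ticket_address)

# Hypothetical pipeline centerline vertex near the dig site (lat, lon).
pipeline_point = (29.7600, -95.3700)

dist_m = geodesic((location.latitude, location.longitude), pipeline_point).meters
if dist_m < 500:
    print(f"Ticket within {dist_m:.0f} m of pipeline: dispatch a locator.")
```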

Questar

Ted Peay (Questar) described data improvement as ‘infinite and daunting, requiring ongoing attention and unlimited resources to accomplish!’ Questar uses tornado plots to rank risk parameter sensitivity for compliance with the HCA rules. Improving pipe grade and wall thickness information was top priority. A workflow assured bi-directional data exchange between APDM and subject matter experts’ toolsets. Peay concluded that GIS is the only way to comply with the new integrity management regulations and that ‘APDM is serving us extremely well.’

Colonial

Colonial Pipeline is testing a new hybrid feature/event-based model and working to reconcile pipeline centerline data with high accuracy GPS. NiSource Gas Transmission and Storage showed off its web-based GIS Information Portal, developed with ESRI’s GeoPortalToolkit. The APDM, an ESRI geodatabase implementation, can be downloaded from the ESRI website. More from www.apdm.net.


Security specialists team on cyber risk mitigation

Industrial Defender teams with RuggedCom and CSE-Semaphore on critical infrastructure defense.

Industrial Defender (formerly Verano—OITJ July 2007) and RuggedCom are to provide a ‘comprehensive cyber security solution’ for critical infrastructure protection. RuggedCom manufactures networking devices such as switches, routers and wireless devices for harsh environments along with ‘Gauntlet,’ a user authentication solution for cyber assets.

CSE-Semaphore

ID has also partnered with CSE-Semaphore, an IP-based SCADA RTU provider. CSE’s Kingfisher G3 Remote I/O Module, announced this month, enables wireless monitoring and control of end devices that would prove too difficult or costly using traditional connections. Interest in critical infrastructure protection (CIP) has been raised as utilities face mandatory infrastructure protection compliance by mid-2008. CIP standards have been designed with help from the American National Standards Institute. The US Department of Homeland Security lists energy and chemicals as two of the 17 sectors covered by its sector-specific ‘National Infrastructure Protection Plans.’


OSIsoft upgrades analysis framework and data directory

‘Major evolution’ of PI-System heralds shift from ‘tag-centric’ to ‘asset-centric’ data infrastructure.

OSIsoft’s Asset Framework (AF) 2.0 is a major evolution for the PI-System process and plant data infrastructure. AF 2.0 is a component of OSIsoft’s ‘Data Directory’ strategy that is moving the user interface from its current tag-centric focus to an asset-centric approach. AF promises a consistent representation of assets that can be leveraged in simple or complex analyses to provide ‘actionable information.’ Designers can identify components that make up a process and associate real-time or relational data with them.

Data management

Data management is enhanced by a common data access infrastructure with named assets and processes across the entire plant or enterprise. AF scales to ‘millions of assets,’ each with hundreds of properties or attributes.
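
The tag-centric versus asset-centric distinction boils down to a layer of named equipment whose attributes map onto underlying historian tags. The generic sketch below illustrates the idea; it is not the AF API, and all names are invented.

```python
# Generic sketch of an asset layer over a tag-based historian. Illustrative
# only; not OSIsoft's AF API.
from dataclasses import dataclass, field

# Tag-centric view: the historian knows only opaque tag names.
historian = {"FI-1047.PV": 312.5, "TI-2210.PV": 88.1}

@dataclass
class Asset:
    name: str
    # Attribute name -> historian tag, e.g. 'flow_rate' -> 'FI-1047.PV'.
    attributes: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

    def value(self, attribute):
        return historian[self.attributes[attribute]]

# Asset-centric view: named equipment with meaningful attributes.
pump = Asset("Crude charge pump P-101",
             {"flow_rate": "FI-1047.PV", "bearing_temp": "TI-2210.PV"})
unit = Asset("Crude unit", children=[pump])

# Query by asset and attribute instead of by raw tag.
print(pump.value("flow_rate"))  # 312.5
```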

Kennedy

OSIsoft CEO Pat Kennedy said, ‘The combination of AF and our new PI Notifications provides an enterprise-wide solution for integration, exposing PI System data in various contexts for novel analysis. The asset-centric view of real-time information marks the next evolution of real-time data provision.’


Oracle Crystal Ball for oil and gas user group

Case histories from Spectra Energy, Catheart Energy and ‘uncertainty enlightenment’ from Rice.

In January 2007, Hyperion acquired Crystal Ball, the Monte Carlo risk analysis plug-in for Microsoft Excel. A couple of months later, Hyperion itself was picked up by Oracle Corp. This means that Crystal Ball (CB) for oil and gas is now a component of Oracle’s ‘digital oilfield’ effort. CB’s oil and gas users met in Houston last month to hear how CB is used in risk analysis, simulation and optimization and to learn about ‘communicating’ risk and how to get the most from their forecasts and analyses.

Spectra Energy

Ken Jeans described how Spectra Energy is engaged in a large number of capital projects as it expands its natural gas facilities. Crystal Ball is considered key to Spectra’s analysis of project schedules and costs. Spectra is now on its third generation of CB models. These are now aligned with the project execution plan and integrated with capex and schedule models. Optimization now supports risk-based mitigation planning and efficient capital to revenue expenditures. The modeling process includes rigorous validation and discussions with a focus on risk mitigation and optimization. CB’s OptQuest goal seeking function is widely used. Notwithstanding the science, Jeans wonders if there is a ‘conspiracy of optimism,’ suggesting that all levels of uncertainty need to be managed, that expectations should be set early and that knowledge about risk should be shared through complete disclosure.

Catheart Energy

Robert Merrill (Catheart Energy) explained that shale gas resource evaluation depends on a range of shale parameters including density, thickness, total organic carbon, hydrogen index and maturity. CB is used to evaluate recoverable gas, providing outcome probabilities and sensitivity analyses. Investment decisions can then be made in the light of statistical estimates of recoverable resource and with an understanding of the associated risks.
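
Such an evaluation reduces to a few lines of Monte Carlo. The sketch below draws invented parameter distributions, applies a simplified volumetric formula and reports P90/P50/P10 recoverable gas; it is not Merrill’s model, which runs in Crystal Ball against real data.

```python
# Monte Carlo sketch of shale gas resource evaluation. Distributions, values
# and the simplified volumetrics are illustrative only.
import random

N = 100_000
results = []
for _ in range(N):
    area_acres = 640.0                               # one section, held fixed
    thickness_ft = random.triangular(150, 400, 250)  # net shale thickness
    density = random.uniform(2.4, 2.6)               # bulk density, g/cc
    gas_content = random.lognormvariate(4.0, 0.4)    # scf/ton, TOC-driven
    recovery = random.triangular(0.10, 0.35, 0.20)   # recovery factor

    # Rock mass: ~1,359 tons per acre-ft per unit density (g/cc).
    tons = area_acres * thickness_ft * density * 1359.0
    results.append(tons * gas_content * recovery / 1e9)  # recoverable, Bcf

results.sort()
# Industry convention: P90 is the conservative case, exceeded 90% of the time.
p90, p50, p10 = (results[int(N * p)] for p in (0.10, 0.50, 0.90))
print(f"Recoverable gas (Bcf): P90={p90:.1f}  P50={p50:.1f}  P10={p10:.1f}")
```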

Rice

Susan Peterson’s (Rice University) presentation on ‘uncertainty enlightenment’ in field development showed how the decision making process for a marginal offshore field development was facilitated with CB. Issues resolved with probabilistic modeling included arbitrating between work-overs and new wells, and drilling sequencing in the light of an FPSO hookup schedule and the nearby discovery of a satellite tie-back. Lease or buy options for the FPSO were analyzed in terms of likely field life and various commercial terms. Probabilistic full field modeling allowed discussions to be centered on quantifiable risks and uncertainties and allowed management’s development decisions to be based on the resulting ‘uncertainty enlightenment.’

Hoye

Steve Hoye (CB) showed how time series analysis can be used to anticipate future oil prices (it’s not called Crystal Ball for nothing!) by breaking up historical data into periods of relative stability. Price forecasting is then possible using ‘Gaussian mean reversion with jumps.’ CB Predictor, correlation, and distribution fitting tools were used.
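
For the record, a mean-reverting price process with jumps can be simulated in a few lines, as in the sketch below. Parameter values are illustrative, not Hoye’s fit.

```python
# Sketch of a Gaussian mean-reverting price path with Poisson jumps.
# All parameter values are illustrative.
import random

def simulate_price(p0=90.0, mean=85.0, speed=0.3, sigma=5.0,
                   jump_rate=0.1, jump_scale=15.0, years=5, steps_per_year=12):
    dt = 1.0 / steps_per_year
    p, path = p0, [p0]
    for _ in range(years * steps_per_year):
        drift = speed * (mean - p) * dt          # pull back toward the mean
        diffusion = sigma * random.gauss(0, 1) * dt ** 0.5
        # Rare, large shocks: a jump arrives with probability jump_rate * dt.
        jump = random.gauss(0, jump_scale) if random.random() < jump_rate * dt else 0.0
        p = max(0.0, p + drift + diffusion + jump)
        path.append(p)
    return path

# Many paths give a forecast distribution, e.g. of the price in five years.
finals = sorted(simulate_price()[-1] for _ in range(10_000))
print(f"5-year price, 10th/50th/90th percentile: "
      f"{finals[1000]:.1f} / {finals[5000]:.1f} / {finals[9000]:.1f}")
```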


CITGO deploys Commodity Management/SAP combo

Triple Point Technology’s ‘pre-integrated’ trading solution includes SAP’s ERP flagship and joint support offering.

PDVSA unit CITGO Petroleum Corp. has selected Triple Point Technology’s (TPT) Commodity Management solution. Commodity Management (CM) is a joint SAP and Triple Point development for trading, risk management and operations.

Armstrong

TPT president Peter Armstrong said, ‘CITGO is the fourth organization and second major oil company to choose the joint SAP and Triple Point Commodity Management solution in the last 60 days. The market is clearly stating that for SAP customers, the pre-integrated SAP/TPT solution provides both the best function and business value.’

ERP

CM embeds TPT’s Commodity SL within SAP’s ERP platform. The tightly integrated package is tested by SAP and comes with a joint support program. CM is a component of CITGO’s ‘mid-office’ infrastructure—a project that aims to improve data integrity and create transparency across the value chain. CM incorporates physical and financial trades into one system, integrating front and mid-office with existing SAP back-office systems.

Coon

CITGO IT manager Gina Coon said, ‘As an existing SAP customer, it makes sense for us to adopt the only oil trading and risk management software solution endorsed by SAP. Also the integration effort, cost and ownership is moving from CITGO to SAP and Triple Point, lowering our risk and lifecycle cost.’ Other TPT customers include Engen Petroleum, Hess, Tesoro, OMV, ConocoPhillips, Petronas, Oxy, BG Group and Anadarko. More from info@tpt.com.


InFusion deployed at ExxonMobil’s Port Allen lubricants plant

Invensys’ enterprise control system to become template for other ExxonMobil facilities.

ExxonMobil has just switched on a new enterprise control system (ECS) at its Port Allen Lubricants Plant in Louisiana. The ECS, built on Invensys Process Control’s InFusion platform, spans the plant’s SAP enterprise resource planning system (inventory, order/shipments, etc.), batch process control and final packaging and shipping operations.

Template

The original aim of the ECS was to replace the ageing existing control system. But the new system was found to bring extra benefits in production flexibility, work flow and scalability. The ECS has now been adopted as a template for other ExxonMobil lubricant facilities. The ECS was installed without incident.

Le Sueur

Invensys marketing director Grant Le Sueur said, ‘This shows how efficient the InFusion platform is for plant-wide integration. ExxonMobil’s requirements were met through a combination of technology and delivery expertise, resulting in enhanced operations and improved supply chain performance.’


Done deals, acquisitions, mergers and more ...

AGR Group, Avalon, Emerson, FMC, Intellection, Nokia, National Oilwell, RPS, SARS, SensorTran

AGR Group has completed its acquisition of Australia-headquartered Upstream Petroleum, through a final payout to the five original owners.

~

Avalon has filed an S-1 form for the flotation of its Oiltek unit on the OTC Bulletin Board. Last year Avalon licensed technologies for production enhancement, intelligent drilling and completion, real-time reservoir monitoring and leak detection for hazardous gas pipelines to Oiltek in return for 80% of Oiltek’s capital.

~

Emerson has acquired The Automation Group (TAG) of Houston. Terms of the deal were not announced. TAG provides process automation/control system engineering, instrument and electrical design, and project management services to the refining and petrochemical industries.

~

FMC Technologies has upped its stake in CDS Engineering BV from 91 to 100%. CDS sells gas and liquids separation technology and equipment for onshore and offshore.

~

Australia-based Intellection has merged with X-Ray Mineral Services (XMS) of Wales and embarked on an expansion of its UK operation with the backing of a £240,000 Regional Selective Assistance grant from the Welsh Assembly Government. Intellection’s flagship solution is QEMSCAN, for advanced automated analysis of minerals.

~

Nokia has acquired Trolltech for approximately $150 million in cash. Nokia is committed to honoring Trolltech’s open source engagements including the KDE Free Qt agreement. Nokia has applied to become a Patron of KDE.

~

National Oilwell Varco has entered into a merger agreement with Grant Prideco in a cash and paper transaction that puts a $32 billion value on the merged companies. Grant Prideco is a high tech drill pipe manufacturer whose products notably include the IntelliPipe drill string telemetry system.

~

RPS Group has acquired JDC, an environmental consultancy, for an $11 million cash consideration. JDC provides consultancy services to the petroleum refining industry and other verticals.

~

SARS Corp. has closed financing rounds totaling $13.3 million. The funds will boost SARS’ sales and marketing efforts and accelerate commercialization of its global asset tracking technology.

~

SensorTran has raised $8 million in a round of venture capital financing co-led by Advantage Capital Partners and Expansion Capital Partners. The cash will help SensorTran expand its distributed temperature monitoring product line and boost its sales reach and global service network.


New ‘ISASecure’ seal of approval for process control systems

New report highlights benefits of ISA99 standard for cyber security in automation and critical infrastructures.

The Instrument Society of America (ISA) has released Part 1 of a new cyber security standard for industrial automation and control systems. ISA99, a.k.a. ANSI/ISA-99.00.01-2007, covers terminology, concepts and models and is the first in a series of ISA cyber security standards. ISA, via its Security Compliance Institute, is working to identify and promote security standards-compliant products, which will receive the ‘ISASecure’ seal of approval.

New report

A new report on ISA99 has also been released, providing an assessment of current cyber security tools, mitigation countermeasures and technologies. The report includes a survey of control systems as deployed in several industries and critical infrastructures. The pros and cons of different cyber security products are discussed in relation to anticipated threats and known vulnerabilities, with recommendations for countermeasures. More from www.oilit.com/links/1006.


ScadaLynx—entry level automation and data management

Stripper well operators and pipeline owners benefit from low cost monitoring solution.

eLynx has announced several successful deployments of its entry-level remote monitoring solutions this month. Permian Basin stripper well operator MWS Producing has deployed eLynx’ remote tank level monitoring solution—showing that remote monitoring can be cost-effective on wells producing only ‘a barrel or two’ per day.

Swinson

MWS owner Mike Swinson said, ‘With eLynx we can measure tank levels to one-sixteenth of an inch. This means increased production due to less downtime and increased operating efficiencies. We save $195 per site per month, as fewer trips to the field are required.’

Davis

Another happy customer is Todd Davis, SCADA/Measurement Manager for Hanna Oil and Gas—‘Before we started using eLynx, we were driving to every location every day, sometimes several times, to get readings. This was very costly and inefficient. Now we can operate by exception and go where we need to go first thing in the morning.’ Hanna uses ScadaLynx to remotely measure oil and gas production.
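
As a toy illustration of this ‘operate by exception’ pattern, the Python sketch below flags only those sites whose remotely read tank levels fall outside preset limits, so the morning route covers exceptions first. The site names, limits and readings are invented for the example.

    # Toy 'management by exception' check. Site names, limits and readings
    # are invented for illustration.

    tank_limits = {"Site-A": (5.0, 20.0), "Site-B": (5.0, 20.0), "Site-C": (5.0, 20.0)}
    readings = {"Site-A": 12.4, "Site-B": 21.1, "Site-C": 3.8}  # fluid level, feet

    def sites_needing_visit(readings, limits):
        """Return only the sites whose level is outside its low/high band,
        rather than scheduling a drive to every location every day."""
        flagged = []
        for site, level in readings.items():
            low, high = limits[site]
            if not low <= level <= high:
                flagged.append((site, level))
        return flagged

    print(sites_needing_visit(readings, tank_limits))  # [('Site-B', 21.1), ('Site-C', 3.8)]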

Pipeline

ScadaLynx was also successfully deployed by Houston-based OGS Pipeline to monitor a 16-mile section of pipe in a remote area of West Texas. OGS president Roy Brehm said, ‘We have found all of our lost and unaccounted-for gas, reduced our weekend on-call staff from 3 to 1 and can now identify problem areas without spending hours behind the windshield.’


Honeywell for Woodside’s Pluto Western Australian LNG plant

Operator training system, data historian and process knowledge system for A$11.2 billion project.

Woodside Petroleum has chosen Honeywell for the integrated control system on its A$11.2 billion Pluto liquefied natural gas (LNG) project. Pluto is Western Australia’s first new LNG plant in over twenty years and will produce 4.3 million tons of LNG per year. Pluto includes sub-sea wells, an unmanned wellhead, a 180 km trunk line and the onshore LNG plant. Honeywell’s UniSim operator training simulator will also be used for process design verification.

Experion

Honeywell’s Field Device Manager and Asset Manager applications will help operators monitor and diagnose field devices with real-time data captured to the Uniformance PHD historian. Honeywell’s Experion Process Knowledge System gives operators the ‘big picture’ of what is happening in the plant. Honeywell will also design and build emergency shutdown and fire and gas systems. The project is scheduled for completion in 2009 with production beginning in 2010.

Perdido

Honeywell also announced a project with Shell Perdido in Louisiana for 59 Excel Optima short-range detectors and accessories for its Fire and Emergency Equipment Systems project. The project will help Shell standardize its offshore oil and gas production platforms in the Gulf of Mexico region.


Barco launches control room software and 3D workstation

CMS-100 visualization solution targets collaboration across multiple video and data sources.

Barco has released a new control room management package and 3D workstation for multi-channel desktops and smaller video walls. The visualization solutions target collaboration and information sharing between control room operators. The new ‘CMS-100’ management suite lets operators visualize multiple data sources on any combination of LCD displays, video walls or workstations. Operators can customize the display space and share user-defined views with colleagues.

Buijsse

Barco product manager Karel Buijsse said, ‘Today’s control rooms are complex networked environments handling an ever-increasing number of video and data sources. Efficient collaboration and decision-making is only possible if operators and decision-makers have easy and timely access to this information. This new network-centric solution will enhance operators’ ability to work with large amounts of information.’

PWS-101 3D workstation

The CMS-100 software is compatible with Barco’s new PWS-101 controller. The 3D workstation can be equipped with two Quad-Core Intel Xeon CPUs, high-end 3D graphics acceleration and media integration capabilities.


AspenTech—‘real’ real-time optimization for BP Chemical

‘Equation-oriented’ model-based optimization brings ‘significant’ commercial gains.

A report in BP’s Frontiers magazine* provides insights from BP chemicals units that might interest the digital oilfield community. The report by Michelle Brown describes how optimization in a chemical plant happens at different granularities and time scales. BP deploys the AspenPlus process simulator from AspenTech to simulate its processes, leveraging ‘equation-oriented modeling’ to simulate the entire plant. The model-based optimization is performed several times a day.

Significant gains

The technique was first tried in 2000 at BP’s Decatur chemicals plant in Alabama and is now in use worldwide. ‘Significant’ commercial gains from the technique are described as ‘too commercially sensitive to report!’ A layered approach to optimization is used. The foundation is the distributed control system (DCS), operating at fine granularity with control of individual equipment set points. The second tier is ‘multi-variable control,’ using AspenTech’s DMCplus; here, dynamic, non-steady-state optimization works on groups of equipment at a time scale of minutes. The top tier optimizes steady-state operations at a timescale of hours. Once a solution has been obtained, DMCplus moves the key variables throughout the plant towards their new targets. The approach has enabled BP to push process units to ‘safe operating conditions that previously were thought to be technically impossible.’
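
For readers curious how such a cascade hangs together, the minimal Python sketch below illustrates the three-tier idea using invented names and toy dynamics (it is not BP’s or AspenTech’s code): an hourly steady-state optimizer computes targets, a minute-scale multivariable layer moves setpoints gradually toward them, and the DCS regulates at the base.

    # Minimal sketch of a three-tier optimization cascade. All names and
    # 'dynamics' are invented for illustration.

    def steady_state_optimizer(plant_state):
        """Top tier (hours): compute economically optimal steady-state targets.
        A toy rule stands in for the equation-oriented plant model."""
        return {var: value * 1.02 for var, value in plant_state.items()}

    def multivariable_controller(setpoints, targets, step=0.25):
        """Middle tier (minutes): move groups of setpoints a fraction of the
        way toward their targets each cycle, limiting the size of any move."""
        return {var: sp + step * (targets[var] - sp) for var, sp in setpoints.items()}

    def dcs_regulate(setpoints):
        """Base tier (seconds): the DCS holds each item of equipment at its
        setpoint -- modeled here as perfect regulation."""
        return dict(setpoints)

    # One cycle: hourly targets, then ten minute-scale control moves.
    state = {"reactor_temp": 180.0, "feed_rate": 50.0}
    targets = steady_state_optimizer(state)
    for _ in range(10):
        state = dcs_regulate(multivariable_controller(state, targets))
    print(state)  # setpoints now close to the new steady-state targets

Running the loop shows the setpoints converging on the new targets without any single large move, which is the point of the middle tier.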

* www.oilit.com/links/1004.


Kongsberg Process Simulation gets Chevron Frade contract

Process and pipeline simulator and multiphase modeler control subsea unit, gathering and FPSO.

Chevron has selected an operator training simulator from Kongsberg Process Simulation for its Brazilian Frade subsea development. The Frade field is located in the Campos Basin in a water depth of 3,500 feet, 75 miles northeast of Rio de Janeiro. Horizontal production wells and vertical water injectors on the field are tied back to a floating production storage and offloading vessel. Kongsberg’s D-SPICE process and pipeline simulator and the OLGA multiphase modeler provide models for training, operations and production optimization. Partners in the Frade field are Chevron (operator), Petrobras and an Inpex/Japan Oil joint venture.

Tecnicas Reunidas

In a separate deal, Spanish engineering contractor Tecnicas Reunidas (TR) has ordered a dynamic simulation study of critical depletion compressors from Kongsberg on behalf of Petroleum Development Oman (PDO). The study will assist TR’s execution of the $600 million engineering contract for the Saih Rawl gas compression project.

ASSETT

Kongsberg will model a section of the plant using its ASSETT dynamic simulator, testing alternative scenarios to optimize compressor performance. Kongsberg has previously supplied operator training simulators to PDO.


Ansys toolset for Petrobras’ simulation-driven process development

Following extensive trials with CENPES R&D unit, CFD modeler now used across the board.

Petrobras has expanded its technology agreement with Pennsylvania-based Ansys for the provision of simulation and modeling applications. Petrobras is to use the Ansys toolset in product and process design in its production, refining and processing units. Maucir de Almeida, refining process optimization manager at Petrobras said, ‘Simulation-driven product and process development is helping us to improve the performance of refining equipment. It is a key element of our innovation strategy. The breadth and depth of the Ansys product portfolio, with its modeling and high-performance computing capabilities, makes it suitable for our applications that require rapid, high-fidelity simulations.’

CENPES

Ansys is currently in use within Petrobras’ CENPES R&D unit for cluster-based computational fluid dynamics studies of product flow to optimize refining processes. Other Ansys multiphysics applications in oil and gas include support for development decisions on deepwater fields, assessing feasibility and reducing risk.

ESSS

Ansys’ South American channel partner, Engineering Simulation and Scientific Software (ESSS), is providing support to Petrobras on the project.


BP Norway awards Valhall data cleanup to Sharecat Solutions

Product data specialist gets 12 million NOK supply chain data contract. ISO 15926 compliance announced.

BP Norway has awarded Sharecat Solutions, a unit of Tektonisk, a 12 MNOK contract on the Valhall redevelopment project running to 2010. Sharecat is to gather and validate supply chain technical data, connecting main contractors and vendors. A common view of data and documents will facilitate inter-partner workflows aligned with industry processes. Earlier this year, BP awarded Tektonisk another project on the Skarv development, and StatoilHydro awarded it a 1.8 MNOK contract for technical services relating to material master data classification and cleanup. Tektonisk now has 25 MNOK of BP Norway projects.

ISO 15926

Sharecat has also announced that it can now deliver structured equipment datasheets to the ISO 15926 standard. This functionality resulted from Sharecat’s participation in the Norwegian Intelligent Data Set (IDS) initiative (OITJ May 07), established to put data in the ISO 15926 format to support data exchange and integration. ISO 15926 underpinned the construction of BP’s Greater Plutonio development (OITJ Sept. 06), a $4 billion project with six FPSOs and 80,000 equipment items. The system is integrated with BP’s SAP ERP package.
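
By way of illustration only, the Python sketch below shows the general shape of a ‘structured’ datasheet: typed, unit-qualified properties attached to an equipment tag and a reference-data class. The class and property names here are invented; real ISO 15926 exchange uses the standard’s reference data library and template mechanisms rather than this ad hoc structure.

    # Purely illustrative 'structured datasheet': typed, unit-qualified
    # properties instead of free text. Class and property names are invented;
    # real ISO 15926 exchange uses the standard's reference data library.

    from dataclasses import dataclass, field

    @dataclass
    class Property:
        name: str    # property label, e.g. 'design pressure'
        value: float
        unit: str

    @dataclass
    class Datasheet:
        tag: str        # plant tag of the equipment item
        rdl_class: str  # reference-data class the item is typed against
        properties: list = field(default_factory=list)

    pump = Datasheet(
        tag="P-1201A",
        rdl_class="CENTRIFUGAL PUMP",
        properties=[
            Property("design pressure", 25.0, "bar"),
            Property("rated flow", 120.0, "m3/h"),
        ],
    )
    print(pump.tag, [(p.name, p.value, p.unit) for p in pump.properties])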

