Following its return to the oil and gas market (OITJ Vol. 10 N° 11), Calgary-based Labrador Technologies Inc. (LTI) has now announced its flagship product, Labrador eTriever, a front end to existing commercial data sets. eTriever is the first commercial software in the upstream to embed Microsoft’s Virtual Earth (MSVE) mapping technology.
LTI president Ron Sterne told Oil IT Journal, ‘Today’s data access packages are software ‘battleships.’ They try to do everything. The E&P activity in Calgary today is frenetically energized and what today’s users need is a lightweight application that lets them go, grab the data they need and get on with their work.’
One early adopter is GeoLogic, also of Calgary, which will be demonstrating the new data access technology at its Data Center next month. Sterne said, ‘We have pitched eTriever’s introductory pricing and its contract terms such that potential clients can sign up for eTriever on the spot and then negotiate with the Data Center for their specific data needs.’
Currently, LTI uses MSVE as a ‘powerful visual aid’ rather than a full-blown Geographical Information System (GIS). eTriever is primarily marketed as a ‘quick and nimble query and reporting solution.’ MSVE’s satellite and roadmap coverage adds geographic context to a user’s area of interest.
LTI CTO Tim Breitkreutz explained, ‘Early versions of MSVE were interesting but Canadian coverage was poor. With version 2.0, released last month, MSVE now rivals Google Earth for its imagery. Most important to us though is the fact that Microsoft, unlike Google, allows ad-free commercial use.’ MSVE offers geocoding, IP-based and wireless access point location techniques for mobile workers.
LTI was an early adopter of the PPDM data model in its PetroLab product, sold to IHS Energy. LTI has been hampered by a five-year non-compete agreement which Sterne described as ‘death by strangulation.’ LTI plans to price eTriever at $1,000 per seat to encourage rapid take-up. More from www.etriever.com.
Chesapeake Energy Corporation is the latest company to sign up for Oildex’ SpendWorks hosted e-payment system. Chesapeake, currently the most active driller of new oil and gas wells in the US, will use SpendWorks to speed its accounts payable process. SpendWorks ‘software as a service’ (SaaS) offering replaces traditional paper-based invoices with digital data and workflows.
Oildex president Peter Flanagan told Oil IT Journal, ‘Studies have shown that paper based settlements in oil and gas typically cost in the $15-20 range. Our hosted service brings this down to around $5. But we also add value to the process through our industry-tailored workflows and soon, new in-depth business intelligence (BI) of a company’s spend. Unlike previous BI techniques, which produce results weeks or months after the fact, our SaaS offering supports near real time BI, providing analysis within a day or two of actual spend.’
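A back-of-envelope calculation using the figures Flanagan quotes shows why the economics are compelling. The invoice volume below is an invented example, not an Oildex or Chesapeake figure:

```python
# Sketch of the savings arithmetic: paper settlements at $15-20 each
# versus roughly $5 for the hosted service. Invoice volume is invented
# for illustration only.

def annual_savings(invoices_per_year, paper_cost, hosted_cost=5.0):
    """Savings from switching all invoices from paper to hosted e-payment."""
    return invoices_per_year * (paper_cost - hosted_cost)

# e.g. 100,000 invoices a year at the low-end $15 paper cost
print(annual_savings(100_000, 15.0))  # 1000000.0
```

Even at the low end of the quoted range, a high-volume operator saves on the order of a million dollars a year before counting any workflow or BI benefits.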
Oildex claims to be one of the energy industry’s largest Internet-based data exchanges serving over 1,000 companies and 10,000 users.
Oil IT Journal will be publishing an in-depth interview with Flanagan in next month’s issue.
In our interview with Paradigm CEO John Gibson at last year’s SEG (OITJ Vol. 10 N° 10), John expressed a degree of exasperation with the industry’s growing use of license management software, ‘Companies are over-focused on license management tools like FlexLM and GlobeTrotter. They are wasting everybody’s time. The last thing the industry needs is more licensing management software!’ The announcement (page 8 of this issue) of sales of OpenIT’s LicenseAnalyzer to ConocoPhillips and Marathon suggests that the trend isn’t going to stop any time soon.
For those of you who are not familiar with the practice, license management software allows a corporate user to monitor how often a software package is in use. If they have bought 100 seats for a particular package and discover that they are only using 50, then they can go back to the vendor and renegotiate. In the above circumstances, this would seem like quite a reasonable proposition. But things are not always so clear-cut. License management software is getting more sophisticated and can ‘multiplex’ licenses so that clients can better ‘align’ the number of licenses they buy with actual use.
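The core of such a tool is simple: from a license server’s checkout and checkin events, compute the peak number of seats in simultaneous use, and compare that with the number purchased. The event format below is invented for this sketch; real FlexLM logs are more involved:

```python
# Hypothetical sketch of license usage monitoring: given a log of
# checkout (+1) and checkin (-1) events, find the peak number of
# licenses in simultaneous use. The (timestamp, delta) format is
# invented for illustration, not FlexLM's actual log format.

def peak_concurrent(events):
    """events: list of (timestamp, delta) tuples, delta = +1 or -1."""
    in_use = peak = 0
    for _, delta in sorted(events):  # process in time order
        in_use += delta
        peak = max(peak, in_use)
    return peak

# Three users, but never more than two sessions at once
events = [(1, 1), (2, 1), (3, -1), (4, 1), (5, -1), (6, -1)]
print(peak_concurrent(events))  # 2
```

If this site had bought three seats, the log shows it could renegotiate down to two, which is exactly the conversation vendors dread.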
For some reason this reminds me of a story I read in the paper about a local burger chain. The manager was onto what he thought was a good idea when he decided to put his burger flippers ‘on holiday’ when demand slowed. Nobody in the shop? Take a couple of hours out of your annual leave—go for a nice walk in the park! I admit that the analogy is stretching it a bit. But taking things to the extreme, software license management is a little like asking your vendor to go on holiday between your mouse clicks. The trouble is that software companies, especially the smaller ones, are not terribly sophisticated when it comes to sales and pricing. The charges that a software house levies are, at least in the early days, a kind of pact between user and developer—‘We sell you the software and charge a maintenance fee. This feeds us while we write new code and fix bugs.’ It is not an exact science but one thing is for sure: license management systems suck money out of the system and it is not clear who is going to pay the tab.
While preparing this month’s issue I myself spent a few hours sucking money out of another system. Using Skype and our new SkypeIn phone number in the USA means that an hour-long transatlantic phone call costs either nothing or the cost of a local call. I like that. A decade ago, Frances Cairncross in her seminal article ‘The Death of Distance’ forecast the huge cost reductions that the telcos could achieve with new technology—anticipating a hundredfold reduction in prices. Well, it has happened now and it feels good.
As the graph below shows, our website oilIT.com has begun the year with a bang. We currently receive around one thousand visitors per day as evaluated by our Urchin site monitor. That does not include visitors to intranet-deployed mirrors. This is up from 400 per month in 2003—the last number I can find without some serious hacking.
For the last few months, we have been collecting and filing company financials without ever finding the time and space to digest them for you so I’ll have a quick bash in this last column. With oil ending the year in the $65 plus range, it is unsurprising to find that Halliburton’s revenues for 2005 were ‘The best in our 86-year history,’ according to president and CEO, David Lesar. Halliburton’s Digital and Consulting Solutions’ (principally Landmark Graphics) operating income for the full year came in at $146 million (up from $60 million in 2004) and ran at a princely $66 million in the fourth quarter. Schlumberger’s Oilfield Services revenue for 2005 came in at $12.65 billion, up 24% on 2004. Operating margins improved 4.6% showing ‘high demand, strong pricing and accelerated technology delivery.’ Q4 revenues, at $3.6 billion were up 30% year-on-year.
The only fly in the ointment is that, according to Schlumberger president Andrew Gould, offshore drilling is likely to be constrained by rig availability and that people and equipment shortages are likely to result in cost inflation and project delays. But perhaps the most interesting facet of the growth is the fact that the seismic business is finally coming out of the doldrums. Schlumberger reported revenues from its high bandwidth ‘Q’ technology at $399 million in 2005, more than doubling the $162 million achieved in 2004. A growth rate ‘higher than 80%’ is expected in 2006! If you thought data volumes were high and growing fast, you ain’t seen nothing yet!
Why did you take the job?
I have been interested and involved in oil and gas (and other) industry standards for a long time. From my previous work, I think that the lack of standards represents a significant barrier to the free flow of e-commerce. While the technology has stabilized, information exchange still suffers from inconsistent data formats and protocols. So when the opportunity arose to devote myself full time to information exchange standards in the oil and gas industry, I jumped at the chance.
What are your immediate plans?
I am looking for ways to add value to members’ business. The upstream is looking everywhere for added value—even if it is hard to identify. POSC’s focus has always been on the upstream with its special needs: unusual, high-volume data types and complex information. We need to support information availability and integration across the value chain and out to trading partners. I also want to build a community around the standards and to become a link between users and developers of many standards—not just POSC—anything that is needed to support business efficiency. I want to create a talent pool of thinkers and developers of standards—to focus development effort where we see a marketplace need.
Any immediate targets?
I want to identify targets through consultancy on information standardization in the short to mid term—to find out what the market needs. Potential targets today might include RFID, web services, catalogues of existing standards—and finding ways to make them work. A lot has already been done in the subsurface arena; more remains to be done in production and operations. There are big gaps in the hand-over from construction companies—there are lots of opportunities there. I also want to make POSC a community—a focus for information that people can’t get elsewhere. A home for successful practices and emerging best practices and a collaborative environment where people work online through threaded discussions, a ‘think tank’ if you like. Lastly I want to become a vocal and visible evangelist for standards in industry—especially re global trading partners—and to ensure that we are ‘plugged into’ horizontal and/or vertical industry standards—and to get a seat at the table of standards bodies.
Would that mean POSC joining the W3C?
That’s a possibility—but I’ll be looking at where POSC should be involved—where we can most usefully place our limited resources. I certainly want to track any initiatives that show value to the community.
What did you do with Baker Hughes?
Initially, managing the engineering and construction of oil and gas facilities. I was also involved in Project Renaissance—a big SAP and business process reengineering initiative. Later I reengineered the new product development process for Baker Hughes. I also worked on e-business at the corporate level—working in a group looking at e-procurement. Hence when the opportunity arose with Trade Ranger, I joined in the newly created post of VP community, developing a strategic-level community around the marketplace. In one year I had created a successful community.
What went wrong?
I wouldn’t characterize what happened to TR as ‘going wrong’. The marketplace scene evolved and some marketplaces were not successful. This was in part because of a degree of resistance from suppliers. There were organizational issues in procurement companies. Some buyers regretted the absence of the personal relationships they had built with the sellers. In other companies, the drive from the top was not borne out in the operating units. Organizational inertia was another barrier in some cases. In the end, excess marketplace capacity led to consolidation. Marketplaces serve a role—but they are not for all.
E-commerce also works independently of a marketplace, as PIDX has shown.
Yes—a lot of companies have built their own private exchanges—but these do not solve the procurement issue and the multiple connection problem. This has probably reached a plateau. My thought was that web-based e-procurement in the oil and gas companies has not been deeply driven into the organizations and only accounts for a relatively small portion of their spend at this point in time. It will certainly take some time to become ubiquitous in the industry, but ultimately that will happen. The continued development of information exchange standards will make that much easier.
Getting back to POSC—do you anticipate any big changes?
No, not in the short term. We are setting out on strategy planning and we want to listen to our membership—to learn what they want us to be doing.
At the 2005 AGM and Member Meeting in November, POSC chairman Herb Yuan spoke of a potential merger between POSC, PIDX and PPDM—do you have any news or thoughts on this?
I think all these organizations have been talking to each other and collaborating in some areas for a long time. Organizations like ours should always be looking to add value either by merger or by collaboration. Merger is a potential option—but there is nothing on the horizon.
What are the plans for Epicentre—POSC’s flagship data model?
We are going to take a serious look at our software portfolio and see what can be leveraged and what can be moved into a run and maintenance mode. I am a strong believer in lifecycle management of standards.
Does that run to actually retiring a standard that is no longer used?
Yes, that is exactly what I am saying. We should retire standards where there is no activity or support—or when they are obsolete.
Your work in construction with Baker must have exposed you to the ISO 15926—POSC CAESAR standard set. Do you see developments in this space within POSC?
I’m not sure what the relationship with POSC CAESAR is today. I’d like to re-engage with them—see where they are and what they are doing and how we can add value.
David Archer regretted that POSC was not more of a software company. What do you think?
I have no particular bias. Our focus should be on facilitating and authoring standards. If going further down the software route adds value, then OK—but I don’t have any feeling for this today. We do have a new website in development.
Germany-based Integrated Exploration Systems has announced a marketing and technology alliance with UK partners Midland Valley and Badley Geoscience. The alliance targets joint software services and data links between the members’ software tools. The companies are also to combine their marketing efforts with co-hosted booths at the Houston AAPG and the EAGE in Vienna. IES reports that 2005 was its best year ever with new sales to PDVSA, Petronas, PTT and Talisman.
Vancouver-based Safe Software has just announced a new release of its geospatial toolkit, the Feature Manipulation Engine (FME). FME 2006 enhances Safe’s Extract, Transform, and Load (ETL) tools for spatial data, in particular with enhanced raster and vector support. New features include Oracle GeoRaster support for loading raster data into an Oracle 10g database, and support for the KML spatial format, which lets users visualize their spatial information in Google Earth. Also new is support for the Geography Markup Language (GML 3.1.1), the standard XML-based group of formats for representing geographic features.
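For readers unfamiliar with KML, the format is plain XML, which is why ETL tools like FME can target it easily. The sketch below builds a minimal single-placemark document that Google Earth can open; the well name and coordinates are invented for illustration (note that KML orders coordinates longitude-first):

```python
# Minimal illustration of the KML format: one Placemark in a kml
# document. Name and coordinates are invented example values.
import xml.etree.ElementTree as ET

def placemark_kml(name, lon, lat):
    """Return a minimal KML document string for a single point."""
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    pm = ET.SubElement(doc, "Placemark")
    ET.SubElement(pm, "name").text = name
    point = ET.SubElement(pm, "Point")
    # KML coordinate order is lon,lat,altitude
    ET.SubElement(point, "coordinates").text = f"{lon},{lat},0"
    return ET.tostring(kml, encoding="unicode")

print(placemark_kml("Example Well", -114.07, 51.05))
```

Saved with a .kml extension, a file like this opens directly in Google Earth and zooms to the point.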
Fugro-Jason has just updated its Geoscience Workbench reservoir characterization package to Version 7.1. The new release adds functionality and enhancements across the board. All modules are now available on 64-bit Linux, for 64-bit Intel Xeon, AMD Opteron, AMD Athlon 64 and AMD Turion 64 processors. A new ‘volume data alignment’ tool has been added to enhance computational accuracy.
Calgary-based LogTech now has over five million log curves in its Digital Log Database. The data set is available online from LogTech’s LOGarc hosted data service. The company is currently adding data at a rate of 70,000 curves on 1,400 new wells per month. The six million curve mark is expected to be reached ‘real soon now’. LogTech claims nearly 1,000 users for LOGarc and that often, ‘hundreds of thousands of curves are retrieved per month’.
Calgary-based software house Geomodeling Technology Corp. has announced availability of Multi-Phase Upscaling (MPU), a new feature of its SBED geological modeling package. Multi-Phase Upscaling allows fine-grained physical properties such as relative permeability to be accurately ‘encapsulated’ in a coarser-scale model.
According to Geomodeling, conventional techniques extrapolate relative permeability measurements from core plugs to full-field reservoir models, without considering the impact of small-scale geological details. Understanding all of the processes involved in fluid flow through sedimentary rocks is ‘key to understanding contaminant migration and petroleum production.’
Geomodeling President and CEO Renjun Wen said, ‘By using multi-phase upscaling data, engineers can improve history matching and reserve forecasting and design better development plans. SBED models the small-scale bedding structures that impact fluid distribution. Integration of such effects decreases the uncertainty in reservoir predictions.’
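Geomodeling’s multi-phase algorithm is proprietary, but the basic single-phase idea behind upscaling can be illustrated simply: for flow parallel to horizontal layering, the effective permeability of a stack of laminae is the thickness-weighted arithmetic mean; for flow across the layers it is the harmonic mean, which is dominated by the tightest lamina. The values below are invented:

```python
# Single-phase illustration of upscaling (not Geomodeling's multi-phase
# method): effective permeability of layered rock.

def upscale_parallel(perms, thicknesses):
    """Thickness-weighted arithmetic mean: flow parallel to layers."""
    total = sum(thicknesses)
    return sum(k * h for k, h in zip(perms, thicknesses)) / total

def upscale_series(perms, thicknesses):
    """Thickness-weighted harmonic mean: flow across the layers."""
    total = sum(thicknesses)
    return total / sum(h / k for k, h in zip(perms, thicknesses))

k = [100.0, 1.0]   # permeability in mD, e.g. sand and shale laminae
h = [1.0, 1.0]     # equal layer thickness
print(upscale_parallel(k, h))  # 50.5
print(upscale_series(k, h))    # ~1.98 - the shale lamina dominates
```

The two answers differ by a factor of 25 for the same rock, which is exactly why ignoring small-scale bedding when populating a coarse reservoir grid can wreck a history match.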
First announced in March 2005 (OITJ Vol. 10 N° 3), IHS Energy’s Enerdeq service is now live and providing a ‘one stop shop’ to IHS’ comprehensive US oil and gas dataset of some three million wells and two million production entities. The first release of Enerdeq is a web browser version; later releases will offer more sophisticated interfaces supporting B2B data exchange via SOAP.
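The planned SOAP interface is not yet public, so the following sketch shows only the general shape of a SOAP 1.1 request envelope a B2B client would submit. The operation name, element names and API number are all invented for illustration and are not IHS’s actual web service:

```python
# Illustrative SOAP 1.1 request envelope for a hypothetical well-header
# query. All operation and element names are invented, not IHS's API.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def well_query_envelope(api_number):
    """Build a SOAP envelope wrapping a hypothetical GetWellHeader call."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    query = ET.SubElement(body, "GetWellHeader")  # hypothetical operation
    ET.SubElement(query, "ApiNumber").text = api_number
    return ET.tostring(env, encoding="unicode")

print(well_query_envelope("42-501-20196"))
```

The point of SOAP for B2B exchange is precisely this: the request is self-describing XML, so a partner’s purchasing or mapping system can generate and parse it without a human at a browser.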
IHS Energy president and COO Ron Mobed said, ‘We have a long history of providing data that many in the industry rely on for their business-critical decisions. Enerdeq’s user interface gives customers better visibility of our data, with more flexibility, reducing risk in E&P settings from new business development to infill drilling. As a platform, it enables us to rapidly deliver new types of data, as evidenced by our recent acquisition of 3-D seismic survey outlines, which can already be viewed in Enerdeq alongside our other data.’
IHS Energy is developing other versions of Enerdeq for different markets, Enerdeq Desktop for Canadian E&P data and an International edition for Petroconsultants data. Rich Herrmann, VP global product management added ‘Our goal is to support the broadest range of energy company strategies for optimizing the value of our information. For some, our US Browser may meet all their needs. For others, a combination of Intranet deployment, GIS extensions and/or Web services may be required. Enerdeq is designed to leverage the benefits the Internet offers to fundamentally improve the ways our customers interact with IHS content.’
Ultimately, IHS Energy’s ‘next-generation’ data access and integration platform will distribute global information to clients in a variety of ways—from smart client desktops to browser and web services-based delivery. Already, users can receive ‘pro-active’ e-mail notification of new data in their area of interest, greatly reducing the risk of key data being omitted in basemap construction, geological interpretation and other critical analysis.
The Petroleum Exploration Society of Great Britain’s Data Management Special Interest Group’s biennial meeting in London last month covered the UK’s various data repositories, DEAL, CDA and the NHDA in more depth than we can reasonably cover here. Suffice it to say that if a country’s E&P success was judged by the number of data initiatives, then the UK would be a world-leader.
Paras’ CEO Hamish Wilson’s thesis is that good data management (DM) underpins company performance management. Whether a company is trying to link well data across financial and production reporting systems or is focusing on production reporting and Sarbanes-Oxley (SOX) compliance, data management is proving ‘a tad more important than in the past.’ Wilson’s top down analysis starts by asking, ‘What does the business worry about?’ The answer is, HSE, production rates, reserves, OPEX and CAPEX. All of which link technical and business systems. Wilson uses a return on investment (ROI) tree to show how DM affects earnings. ‘It is all about making sure that capital is spent on the best opportunities.’ Here, accurate evaluations are ‘all about data,’ linking backward-looking financial systems with forward-looking production systems like Peep or Merak.
But even with good technology, management is frequently confronted with two sets of numbers, from finance and from production metrics. Pressure is also coming from the SEC which is now much more prescriptive about reporting numbers, cutting into the reserve calculation data flows, from the subsurface to the ‘Excel domain’. While it may be possible to answer the SOX ‘show me?’ question with production decline data it may be harder to track which simulator run, seismic line and well picks were used, although it may be a moot point whether we will be asked for such information. Wilson concluded that DM’s profile has been raised by SOX. ‘Without data you are just another guy with an opinion.’
Oil IT Journal
Oil IT Journal editor Neil McNaughton traced some of the major upstream standards initiatives of the last few years including the ‘soap operas’ of POSC vs. PPDM, data model wars, business objects and OpenSpirit. This active field has known failure (Epicentre, Project Synergy) but has also been a proving ground for new technologies, from CORBA to XML. McNaughton warned against the ‘blind assumption’ that a standard is a good thing and that standardization projects fail ‘because of lack of take up.’ Standards are more likely to fail because they are not up to the job, or they are poorly planned or implemented. Most of all, standards need to fulfill a genuine need. Commenting on the long running battle between PPDM and POSC over a ‘reference upstream database,’ McNaughton concluded that while POSC won the communications battle, PPDM won on the technology front as its data model underpins many commercial and in-house projects.
Stuart Robinson, UK DTI
Robinson described the government’s past data management culture as ‘keep lots of data, never use it and don’t look after it.’ The situation has changed thanks to a variety of initiatives such as UK PILOT, DEAL, the UK Oil Portal and CDA. Robinson, now on the POSC board, believes that ‘e-Fields are happening. There is a big push to store and share production data,’ hence the WITSML-derived PRODML standard. Robinson also made passing mention of other e-government initiatives such as SeaFish, Vantage (helicopter seat booking), People on Board offshore installations and the environmental EEMS registry, which is ‘struggling to go live.’ Robinson encouraged data managers to get interested in and develop standards for the e-field—but warned that companies who boast of data as ‘a priceless asset’ are not telling the truth.
Ash Johnson, Geosoft
Shell has large amounts of data on Fieldbank (jointly developed by Ark Geophysics and the BGS), in Oracle and SDE. End-user tools include Geosoft’s Oasis Montaj, ArcGIS and seismic interpretation systems like Petrel. Geosoft’s data access protocol (DAP) is ‘a kind of OpenSpirit for potential field data.’ DAP helps organize geophysical data catalogs and large datasets. DAP is used across Geosoft Montaj data browser, ArcGIS, MapInfo and the acQuire API (a mining industry standard).
Paul Duller, Tribal Technology
The public has the right to ask for and receive ‘anything’ from the UK government except information whose disclosure would not be in the public interest or that is covered by privacy legislation. The US has had a FOI Act for several decades. Companies may be affected by this act in respect of environment issues. Because of the risk of disclosure, Duller advises circumspection as to what companies record in email and other systems and what is offered up to government bodies by way of reporting.
Muhammad Tahseen has developed a novel methodology for E&P data management (DM) in a project performed for the Libyan WAHA Oil Company. Tahseen’s brief was to investigate WAHA’s DM requirements and prepare an implementation plan. Tahseen has recommended a digitization program and the knowledge enablement of WAHA’s data for the widest possible audience. Tahseen argues against an ‘all-encompassing’ data model and in favor of simple files and a data hierarchy. Physical data is kept near to data owners and users. Data is owned and managed by the department that uses the data. Bulk data is stored nearline on a DVD robot that costs a tenth the price of tape robotics. It has taken about a year to give four departments a data ‘facelift’. A geoscience technology overhaul is also underway.
This article has been taken from a longer report produced as part of The Data Room’s Technology Watch Reporting Service. More from firstname.lastname@example.org.
According to Total CIO Philippe Chalon, Total’s new global security plan this year involves a change of strategy. ‘China Wall’ perimeter security can make life hard for joint venture partners, service providers and Total’s ‘nomadic’ staff. Total has opted for a three-component security architecture comprising secure data centers, secure desktops and encrypted data flows between them. There is no longer any difference (except for performance) between the internet and the secure company intranet. User access is granted according to a user’s profile and the trust level of a device. For instance, you can’t get at reserve data from the public internet in an airport!
Schlumberger Information Solutions president Olivier Le Peuch agreed with Chalon, ‘putting a firewall around every component is not a solution.’ You need to secure processes, not components. Le Peuch backed the suggestion made last January at the Houston SPE Digital Security Conference that an industry security body be set up. Reserves management requires a security-centric architecture, integrated standardized workflows, and identity and access management. One significant SIS initiative is a security-enabled workflow based on an infrastructure of identity mapping, a public key infrastructure (PKI), directory services and role designation. The emerging paradigm of ‘federated identity management’ (FIM) is a possible solution to access management. The idea is to create secure, streamlined collaboration and to eliminate risky third party ID management. Le Peuch suggests a technical working body to design and vet an industry-wide solution for collaboration security. SIS also intends to publish an open source API for petrotechnical application security.
Chevron CTO Don Paul described the increasing ‘digital intensity’ of the industry. A refinery produces 1TB/day of raw data from perhaps 30,000 I/O points and 75,000 model coefficients. A large offshore field produces ten GB/day. A subsurface project portfolio is contained in around 1,000 TB. Chevron totals some four million transactions and two million emails per day. Corporate data storage is growing at two TB/day—and this is ‘not nearly enough.’ Historically, enterprise, engineering and operations and R&D/technology were three different systems. But today users and data flows cross all three domains. Everything is connected. Everyone, CIO, CFO, CTO and the business ‘needs to get along to make security work.’ Doing nothing is not an option, ‘staying the same means declining security and increasing risk.’ Government R&D can help because government faces the same problems as industry. Paul described Chevron’s involvement in the ‘Linking the Oil and Gas Industry’ (LOGI2C) project, sponsored by the US Department of Homeland Security (DHS) Science & Technology Unit. LOGI2C sets out to improve security and reduce vulnerabilities of pipelines and facilities. The LOGI2C correlation engine has analyzed abnormal events over a 12 month period on a 10,000 mile Chevron Pipeline SCADA network. The project investigated a possible multi-pronged cyber attack conducted over a period of time with feints etc. The technique is applicable to digital oil fields now and ultimately, to SCADA systems that interface with the outside world as deployed in the downstream.
SCADA security in BP
Justin Lowe (PA Consulting) and Ian Henderson (BP) stated that improving the security of process control SCADA systems ‘is not a diet but a change of lifestyle.’ In the old days, process control systems (PCS) were ‘clunky’ but resilient, and there was no chance of hacking them. Today, DCS and SCADA are all implemented on Wintel and increasingly on Internet Protocol (IP) standards and share the security risks of such systems. Except that fewer security measures exist for this specialist hardware. Following the Nimda worm, BP now has a Chief Information Security Officer (CISO) responsible for digital security. To raise security consciousness at BP’s 400 process control sites, a group center of excellence was established. Today, BP has ‘built security into PCS engineers’ day jobs.’ In 2004, BP initiated PCS vendor accreditation for antivirus, security patches and secure, remote access methods. Today, anti-virus accreditation is widely accepted. It used to be said that you can’t patch SCADA; you can, and now maybe even faster than IT systems.
Secure Joint Ventures, Chevron
Mike Reddy, CIO Chevron International E&P spoke of the increasing demand to access Chevron IT resources by joint venture (JV) partners and other third parties. Current security methods are labor intensive and provide only ‘coarse-grained’ security. Reddy described a federated identity management technology proof of concept test undertaken this year by Chevron, Schlumberger, Sun, Microsoft and others. Federated identity is a standards-based means of sharing an identity and entitlements across different domains—as between JV partners and their contractors. A test was performed across 30 servers running Chevron’s web-based ‘Operational Excellence’ application and Schlumberger’s Petrel. The demonstrator showed that seismic interpretation could be performed across the firewall using Citrix thin clients. Standard Microsoft Office and web-based applications can be shared securely today. Computer Associates’ eTrust SiteMinder 6.0 was also used in the test.
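Stripped of the protocol machinery (SAML assertions, PKI and the rest), the core federated-identity idea is that the local domain never manages the partner’s accounts; it only maps entitlements asserted by a trusted partner onto its own roles. The toy sketch below illustrates that mapping step; all domain names, entitlements and permissions are invented:

```python
# Toy illustration of the federated-identity mapping idea: a partner's
# identity provider asserts an entitlement, and the local domain maps it
# to local permissions. All names here are invented for illustration.

# Local policy: (issuing domain, asserted entitlement) -> local permissions
FEDERATION_MAP = {
    ("partnerco.example", "geoscientist"): {"view_seismic", "run_petrel"},
    ("partnerco.example", "accountant"): {"view_invoices"},
}

def local_permissions(assertion):
    """assertion: dict naming the trusted issuer and asserted entitlement."""
    key = (assertion["issuer"], assertion["entitlement"])
    # Unknown issuers or entitlements get no access at all
    return FEDERATION_MAP.get(key, set())

perms = local_permissions({"issuer": "partnerco.example",
                           "entitlement": "geoscientist"})
print(sorted(perms))  # ['run_petrel', 'view_seismic']
```

The payoff Reddy describes follows directly: when a contractor leaves the JV, the partner deactivates the account at source and access everywhere else lapses with it, eliminating the risky business of managing third-party IDs locally.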
Disaster planning, Occidental
Don Moore, CIO Occidental, spoke about security and disaster recovery planning. Oxy has been brainstorming to identify potential threats such as a tornado in Tulsa, geological issues (volcanic activity in Ecuador, west coast earthquakes), geopolitics (guerilla activity) and terrorist threats (a dirty bomb in LA). Disaster planning is now a full-time job in IT. Hurricane Rita put Oxy’s planning to the test. With Rita, Oxy learned a lot about shut down, business recovery etc. Three million people tried to leave Houston at the same time. It was taking 27 hours to cover the 220 miles to Dallas. In general, while Oxy’s disaster plans worked, business continuation ‘did not work well at all.’
Identity management, Chevron
Edmund Yee described Chevron’s deployment of a common image for its Windows desktops with automatic update and an enterprise security architecture. This involves all users. Employees, contractors, third parties and JV partners all have managed identities as users or administrators. Devices and services (applications) also have IDs. All business processes use IDs (line of business, SAP, Oracle, network logon, applications etc.) The idea is to ‘unify and simplify physical and logical access with a single corporate ID card.’ This provides a single common process for authentication. The project also delivered enterprise single sign-on (ESSO) and web ESSO where needed. Chevron plans to get rid of passwords next year as the necessary IT components become available. Biometric authentication is available for special groups.
Smart card deployment, Shell
Ken Mann presented Shell’s IT Infrastructure Refresh Project which has reduced the cost of delivering a desktop by 50%. It is based around Windows 2000 and Active Directory. Email is encrypted on the fly depending on its confidentiality level. A smart card-based solution ‘gives preference’ to Microsoft-based products. The original goal was to build an out-of-the-box infrastructure ‘without engineering.’ But it ‘didn’t quite work like this,’ even though much functionality was already in the operating system. Shell is moving away from the ‘hard perimeter, soft interior’ security model. Schlumberger is to take the smart card management system (SCMS) to market. Microsoft is ‘pushing smart cards hard.’ Both Shell and Schlumberger are early adopters.
SCADA vulnerabilities, NISCC
According to Mark Logsdon of the UK Government’s National Infrastructure Security Co-ordination Centre (NISCC), commercial off-the-shelf (COTS) hardware and software have let hackers into water and electricity supply systems, notably with a denial of service (DoS) attack on Israel Electric Corp. In general, terrorism-related incidents are ‘probably under-reported.’ There are risks from hackers and politically motivated individuals. Today, PCS/SCADA vulnerabilities are ‘widely understood.’ NISCC has set up a number of information exchanges, with regular discussions of threats and vulnerabilities. Vendors allow NISCC to manage vulnerabilities in their software and to share the information with members. Companies should ask ‘would you recognize an attack?’ The answer is probably not.
Chris Wright (KPMG) described Sarbanes-Oxley (SOX) as ‘specific to a company’s controls over its financial reporting.’ SOX stipulates that IT shouldn’t have access to live financial data. So some companies have put monitoring in place and then had nasty surprises as to who could see their financial information. Business continuity management is specifically excluded from SOX because SOX is not concerned about the future value of assets. ‘SOX is about the 31st of December.’ It does cover a company’s ability to backup and restore financial data, to ensure that a transaction has completed, been properly recorded and authorized. Note that nine companies in the energy and utilities sector failed SOX. Some failed for ‘creative accounting,’ but others failed for inadequate IT controls—mostly over unauthorized access. Useful resources for SOX compliance include ‘The IT Executive’s Best Practice Guide to SOX,’ a Gartner White Paper. See also www.itgi.org and COBIT’s papers on SOX deployment. Wright also noted that SOX ‘has put an end to the way many people use spreadsheets.’
This article has been taken from a 10 page report produced as part of The Data Room’s Technology Watch Reporting Service. For more information on this service and to request a copy of the original report please email email@example.com.
A study group led by Dartmouth College, working under the US Department of Homeland Security-funded Institute for Information Infrastructure Protection (I3P) program, has just signed off on a study of ‘Trends for Oil and Gas Terrorist Attacks’. The study provides a statistical analysis of such incidents which is said to ‘lay a foundation for in-depth evaluation of the role of Supervisory Control and Data Acquisition (SCADA) in the disabling and rate of recovery of the oil and gas system.’ The report leverages the international terrorist attack database from the National Memorial Institute for the Prevention of Terrorism (MIPT).
The study determines that so far, while attacks have been of a physical, not cyber, nature, they may have affected SCADA systems, resulting in a knock-on impact on the rest of the network. But the authors warn that the potential for cyber disruption represents ‘a potentially destructive mode of attack for terrorists.’ While attacks on oil and gas infrastructure are a relatively small proportion of terrorist attacks overall, the data show that the sector is vulnerable. In particular, the report speculates that, ‘If terrorist groups feel that carrying out a physical attack within the United States is too difficult they could turn their attention to other vulnerabilities such as SCADA systems.’ The full report is available from http://www.thei3p.org/research/scada/i3presrep2.pdf.
The Petrotechnical Open Standards Consortium (POSC) has appointed Randy Clark as its new CEO and president. Clark was formerly with the e-commerce marketplace Trade Ranger (acquired by CC-Hubwoo last year). Clark also chairs the API’s Petroleum Industry Data eXchange (PIDX) subcommittee.
Fugro has appointed Paul van Riel and Arnold Steenbakker to its Board of Directors. Van Riel founded Jason BV in 1986. Before joining Fugro, Steenbakker was with Fluor Corp.’s oil and gas division.
The UK Offshore Operators Association has just announced an initiative to simplify the process of offering and accepting bids for work, between purchasers and suppliers. Details of the Model Invitation to Tender will be unveiled at the official launch of the new Supply Chain Code of Practice late this month.
David Zeh, founder of plotting software specialists ZEH Software died last month aged 72.
WellDynamics has acquired Wood Group’s Production Technology (Protech) Business for $24.5m. Protech’s permanent downhole, subsea and surface monitoring products will augment WellDynamics’ e-field offering.
AVEVA has appointed Otto Weiberth as head of sales for its U.S. and Latin America operations, and Emon Zaman as manager of its Canadian subsidiary. Weiberth was previously with Autodesk, Zaman with Aspen Tech.
Patrick Héreng has been appointed group CIO of Total. Héreng was previously CIO of Total’s downstream unit. Philippe Chalon moves over to Senior VP Finance Economics Information Systems.
Aspen Technology has appointed Chris Pike to its Board, replacing Douglas Kingsley, who has resigned. Pike was previously a partner at private equity firm Advent International.
Steve Slezak has joined CygNet Software as Senior Account Executive. Slezak was previously with Schlumberger and CASE Services.
Jeffrey Jarrett is the new Assistant Secretary for Fossil Energy at the US Department of Energy. Jarrett previously directed the Department of Interior’s Office of Surface Mining.
Energy Scitech has appointed John Merrell to provide ‘strategic counsel’. Previously, Merrell co-founded The Makers, a private equity company.
Mike Spradley has rejoined Fairfield Industries as Acquisition Marketing Manager.
Millennial Net has named John Biasi as VP Product Development and Sheng Liu as VP Technical Services. Biasi was previously with Brooks Automation. Liu co-founded Millennial Net.
OGRE Systems has appointed three new international agents as follows: for Nigeria, Bulwark Services Ltd, for India, Essem High-Tech PVT and for the Philippines, Panpisco Inc.
Paras has appointed John Packer to its team of consultants.
Vinay Vaidya is to manage Rock Solid Images’ newly opened office in Kuala Lumpur, Malaysia.
Claudia Collado is to manage Scandpower Petroleum Technologies’ new office in Torre Mayor, Mexico.
Pom Sabharwal has joined Schlumberger Information Solutions as Marketing Manager, Reservoir Characterization. Sabharwal was previously with Landmark.
Spectrum and Global Geo Services are to merge their business interests and data libraries.
Tadpole Technology has appointed Ross Coulman as Business Manager of Process Industries. Coulman was previously with Enron.
The World Wide Web Consortium (W3C) has appointed Steven Bratt to the newly-created position of CEO.
Aberdeen start-up consultancy Xodus has purchased a license of Scandpower’s OLGA 2000 simulator for transient flow for wells and pipelines.
Norway-based OpenIT has signed three new clients in the last month for its software license management tools. Houston-based Newfield Exploration has chosen OpenIT’s LicenseAnalyzer (LA) to meter its application use. LA generates reports on software license usage which can be used to ‘optimize’ the purchasing of software licenses.
ConocoPhillips has opted for OpenIT’s LicenseOptimizer (LO) package to optimize its portfolio of ‘high-end’ software applications. LO detects software licenses checked out, but not in active use and can make inactive licenses available for other users, reducing the number of software licenses needed to serve all users.
Marathon Oil is likewise using OpenIT’s LicenseOptimizer to ‘align’ geoscience software purchasing with its actual use.
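OpenIT does not publish LicenseOptimizer’s internals. The following Python sketch (all names and the idle threshold are hypothetical) illustrates the basic idea the product description suggests: scan checked-out licenses, reclaim those idle beyond a cut-off, and return the seats to the pool for other users.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Checkout:
    user: str
    feature: str              # licensed application feature (hypothetical name)
    last_activity: datetime   # last time the user actually touched the app

@dataclass
class LicensePool:
    feature: str
    total: int
    checkouts: list = field(default_factory=list)

    def reclaim_idle(self, now, idle_limit=timedelta(minutes=30)):
        """Free checkouts idle beyond idle_limit so others can use the seats."""
        active, reclaimed = [], []
        for c in self.checkouts:
            (reclaimed if now - c.last_activity > idle_limit else active).append(c)
        self.checkouts = active
        return reclaimed

    def available(self):
        return self.total - len(self.checkouts)

now = datetime(2006, 1, 1, 12, 0)
pool = LicensePool("appX", total=5)
pool.checkouts = [
    Checkout("alice", "appX", last_activity=now - timedelta(hours=2)),   # idle
    Checkout("bob", "appX", last_activity=now - timedelta(minutes=5)),   # active
]
reclaimed = pool.reclaim_idle(now)   # alice's idle seat is freed for reuse
```

Fewer concurrent seats are then needed to serve the same user population, which is the cost argument both ConocoPhillips and Marathon are making.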
A new Norwegian Daily Production Report (DPR) project has just been announced that sets out to ‘make production data more accessible’ for authorities and license partners. Project partners include DNV, TietoEnator and POSC, and the system is being tested on Hydro’s Åsgard field.
DPR leverages work done under another Norwegian IM initiative, the Petromake Integrated Information Platform (IIP—see Oil ITJ Vol. 10 n° 6). DNV leads the IIP project, whose partners include Statoil, Hydro, POSC and POSC Caesar. TietoEnator has developed the DPR package which is said to embrace the ISO 15926 ontology and to conform to the WITSML standard.
Pål Rylandsholm of DNV said, ‘The objective of the IIP project is to develop a common ontology for drilling, production and operations/maintenance. The ontology will be specified in the Web Ontology Language (OWL), based on Semantic Web technology, and will become part of an ISO 15926 Reference Data Library.’
See also our report on a similar 15926-based initiative from Shell on page 12 of this issue.
Brunei Shell Petroleum Co Sdn Bhd (BSP) has just announced the start of crude oil production from Phase III of its Champion West field located 90 kilometres offshore. The high tech field’s production came onstream two months ahead of schedule and at a record breaking flow rate of 16,700 b/d (2,650 m3/d). The field is also a gas producer and by 2010, will supply about 25% of BSP’s total gas production.
The platform is said to be one of the most technically advanced offshore facilities in the world, with permanent downhole pressure and temperature sensors pre-installed on a five kilometer fiber optic cable. Remote operation of downhole instrumentation is made possible by high bandwidth connections to the shore. Engineers in BSP’s head office can continuously monitor the performance of the offshore wells and facilities, improving both production and reserves recovery.
Shell’s joint venture with Halliburton, WellDynamics, installed its SmartWell intelligent completion technology on Champion West, including its Digital Hydraulics technology (DHT). DHT gives Shell remote, independent control of multiple reservoir intervals, providing real time data acquisition and ‘closed loop’ reservoir management capabilities.
DHT minimizes the number of control lines required to operate multiple devices. Interval Control Valves (ICVs) are deployed for remote control of each zone. Distributed Temperature Sensors (DTS) and Permanent Downhole Gauges (PDHG) are deployed for real time optimization and well performance monitoring.
BSP MD Grahaeme Henderson said, ‘Champion West is one of the largest undeveloped resources in Brunei, and will produce oil and gas for the next 20 years and beyond. BSP is a global leader in the application of Shell’s Smart Field Technology, and Champion West is a frontrunner in both the Shell Group and in the industry at large.’ Champion West was discovered in 1975 but production only became possible with the application of a number of cutting edge technologies. In particular, a horizontal ‘snake’ well was drilled through the sands with a tortuous trajectory.
Norway-based Roxar has signed a marketing agreement with Energy Scitech Ltd., a UK-based independent consultancy and software development company, to sell and support Scitech’s EnAble product in Latin America and the Asia Pacific. EnAble is a history matching and uncertainty estimation package that offers ‘better understanding and measurement of uncertainty in reservoir performance prediction.’
Scitech continues to own all rights to EnAble and will sell the package directly into other regions of the world from its UK headquarters. Scitech will also still provide all enhancement and software updates.
Roxar is also linking its IRAP RMS reservoir modeler and Tempest/MORE simulator to EnAble to offer a new commercial solution where multiple geological scenarios can be examined and history-matched to derive simulation models that are consistent with a geological interpretation.
Roxar has offices in Kuala Lumpur, Jakarta, Ho Chi Minh City, Perth and Beijing, and in Maracaibo and Puerto la Cruz, Venezuela. A new software training facility is due to open in Puerto la Cruz in early 2006.
IHS Energy is extending its Midstream database to support a greater depth of commercial planning and analysis. The new Midstream Extended Data Module (EDM) includes pipeline utilization rates and tariffs and gas distribution structures as well as data on large industrial customers and macro-economic and energy profiles for each region.
The new database will help decision makers assess opportunities arising in the current context of strong demand, high prices and de-regulation of markets where infrastructure information is frequently the key component of project viability. All information in the EDM is spatially enabled and can be accessed via a GIS front end.
Cynthia Poynter, Senior Manager, Midstream with IHS Energy said, ‘The EDM contains critical information for companies with gas interests. Monetization of gas reserves is closely tied to transportation and market options. EDM enables advanced analysis of key infrastructure issues and can identify ‘make or break’ project variables that will determine the timing as well as profitability of an oil and gas venture.’ The Midstream database is ‘fully compatible’ with IHS Energy’s E&P databases and is updated daily. The EDM and Midstream Database cover most countries outside of North America.
BP reports on progress of its deployment of P2ES’ Enterprise Upstream Volumes Management (EUVM) package at its deepwater Gulf of Mexico business unit. The phased implementation of EUVM began last year (OITJ Vol. 10 N° 2). Enterprise Upstream is a web-enabled application built with Oracle development tools and database technology. Applications and data can be centrally stored, eliminating the need for IT infrastructure in remote locations.
In the current implementation phase, BP is rolling out the EUVM Allocation Process Modeling (APM) module to manage complex allocations including subsea commingled wells and tiebacks, circulated volumes, disparate fluids and continuously changing operating conditions. Steve Fortune, Information Delivery Manager for BP, commented, ‘EUVM is ideal for meeting our deep water allocation requirements. We are very pleased with the implementation and have already begun to see a return on investment.’
BP has also improved its internal and external reporting capabilities, shortened the monthly close process and has been able to catch exceptions and issues earlier in the process. Mark Eikermann, Senior Vice President of Development for P2 Energy Solutions, added, ‘APM allows users to improve the accuracy and timeliness of production data and allocation results, allowing for proactive decision making and cost reduction.’
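P2ES does not detail APM’s algorithms, but the classic technique behind such allocations is proportional back-allocation: a volume measured at the commingled sales or export point is split across contributing wells in proportion to their theoretical (well-test) rates. A minimal Python sketch of that principle, with hypothetical well names:

```python
def allocate(measured_total, theoretical):
    """Proportionally allocate a measured commingled volume back to wells.

    measured_total: volume measured at the commingled sales/export point.
    theoretical: dict mapping well name -> theoretical (well-test) rate.
    Returns a dict mapping well name -> allocated volume.
    """
    basis = sum(theoretical.values())
    if basis == 0:
        raise ValueError("no theoretical production to allocate against")
    factor = measured_total / basis          # the allocation factor
    return {well: rate * factor for well, rate in theoretical.items()}

# 900 barrels measured at the sales point, split across three subsea wells
alloc = allocate(900.0, {"W1": 400.0, "W2": 300.0, "W3": 300.0})
```

Real-world allocation adds shrinkage, fuel and flare volumes, and multi-phase fluids, which is where a package like APM earns its keep, but the proportional factor is the core of it.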
Digital Oilfield has received ‘Powered by SAP NetWeaver’ certification from SAP for its OpenInvoice SupplierConnect e-business solution. Certification covers the content and portal integration of SupplierConnect, which now integrates with the SAP NetWeaver Exchange Infrastructure. The solution offers data exchange between SAP R/3 and SupplierConnect, a hosted application that allows suppliers to create and transmit field tickets, invoices and supporting documentation to operating companies. The integration provides for invoice routing from the Digital Oilfield supplier application directly into the oil and gas company’s SAP financial system. The operating company can then invoke its invoice approval workflows from within SAP. SAP users can now ‘seamlessly’ transact with Digital’s supplier network, claimed to be the largest supplier group in the energy industry with over 4,000 members.
Houston-based Technical Toolboxes has just released Pipeline Toolbox (PT) 2006, an upgrade of its integrated pipeline industry software package. PT comprises 60 modules designed for the pipeline professional, and is available in Gas, Liquid and Enterprise (Liquid & Gas) versions. The package includes a database of commonly transported liquids with their physical properties.
Wealth of information
The toolbox contains a wealth of information on just about every facet of pipeline engineering—from sizing of facilities, through hydraulic flow calculations, pipeline design and stress limits and corrosion analysis. A searchable database of DOT and MMS pipeline regulations completes the picture. More from www.ttoolboxes.com.
Speaking at Rice University, Houston last November, mathematician Henry Rachford (Advantica) explained why the modeling of gas flow in pipelines is both important and hard.
In the US, much electricity is generated by burning gas in power stations. The demands of the electricity market are for quite rapid intra-day changes in generation capacity. Since there is no storage capacity in the electricity grid, the transients are ‘passed on’ to the gas supplier, turning the pipeline network into de facto storage. Without careful planning, this can wreak havoc on gas transmission companies’ attempts to keep the pipeline supplied with gas and operating at safe conditions.
Rachford showed a simple transient scenario as flow changed from its initial state, taking some time to travel along the pipe. Controls to achieve one goal may have unexpected results elsewhere. Advantica’s modeling package helps operators evaluate pressure and power requirements at pumping stations along the pipe, respecting delivery and pipeline operating constraints.
Achieving such goals is a mixture of simulation, guesswork and experience. It turns out that subtle changes in controls can satisfy goals. Rachford’s software uses ‘smarts’ rather than brute force. Results are usually obtained within a couple of minutes of CPU time and if a task is not feasible, this is quickly flagged by the software. Further optimization includes uncertainty as to what users will take out of the system, identifying controls that ‘defensively position’ linepack to support delivery of different anticipated loads.
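To see why the pipeline acts as storage, consider linepack: the gas held in the pipe at operating pressure. The Python sketch below is a back-of-the-envelope ideal-gas estimate with an assumed compressibility factor, not Advantica’s model; it simply shows how raising average pressure ‘packs’ extra gas into the line for later delivery.

```python
import math

def linepack_sm3(length_m, diameter_m, p_avg_pa, p_std_pa=101_325.0, z=0.9):
    """Approximate gas stored in a pipe segment, in standard cubic metres.

    Ideal-gas style estimate: the geometric pipe volume scaled by the
    pressure ratio, corrected by an assumed compressibility factor z.
    """
    volume = math.pi * (diameter_m / 2) ** 2 * length_m   # geometric volume, m3
    return volume * (p_avg_pa / p_std_pa) / z

# a transient: raising average pressure from 60 to 65 bar packs extra gas
base = linepack_sm3(100_000, 0.9, 60e5)    # 100 km of 0.9 m pipe at 60 bar
packed = linepack_sm3(100_000, 0.9, 65e5)  # same pipe at 65 bar
```

Even this crude figure runs to millions of standard cubic metres per 100 km, which is the buffer that power-station load swings draw on, and that operators must ‘defensively position’ ahead of anticipated demand.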
Belgian gas transmission company Fluxys reports successful deployment of Energy Solutions International’s (ESI) GasLoadForecaster (GLF). GLF uses neural net technology to forecast volume and energy targets at supply points along the pipeline network. This neural network finds patterns in complex pipeline operations resulting from weather fluctuations, calendar information and other variables. The simulator is ‘trained’ regularly to fine tune prediction parameters.
Fluxys needed a forecasting tool to anticipate gas demand and help its operators react to demand fluctuations and transport constraints, as well as to additional shipper capacity requests.
Fluxys project leader Sophie Jehaes said, ‘We chose GLF because of its accuracy and its comprehensive administrative functions and overall ease of use. As an IT team member, I am interested in a solution that can be easily integrated with our other business applications but also can work standalone.’
ESI CTO John Sherman said, ‘GLF is a fast, flexible and easy-to-use tool. Users see immediate improvements in their forecasts after implementing this product. The multivariate, non-linear nature of gas market forecasting makes this a perfect application for neural networks.’
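ESI has not disclosed GLF’s network architecture. The principle, a small feed-forward network trained on weather and calendar features to predict demand, can be illustrated with a toy one-hidden-layer network in pure Python, fitted here to a synthetic heating-load pattern (demand falls as temperature rises). All sizes and the training data are illustrative assumptions.

```python
import math, random

def train_forecaster(samples, hidden=4, epochs=3000, lr=0.05, seed=0):
    """Fit a tiny one-hidden-layer network mapping features -> demand.

    samples: list of (feature_vector, target) pairs, values scaled to ~[0, 1].
    Returns a predict(features) function. Purely illustrative: a production
    forecaster would use many weather/calendar inputs and proper validation.
    """
    rng = random.Random(seed)
    n_in = len(samples[0][0])
    # hidden-layer weights (bias stored as last element) and output weights
    w1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(hidden)]
    w2 = [rng.uniform(-1, 1) for _ in range(hidden + 1)]
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))

    def forward(x):
        h = [sig(w[-1] + sum(wi * xi for wi, xi in zip(w, x))) for w in w1]
        return w2[-1] + sum(wi * hi for wi, hi in zip(w2, h)), h

    for _ in range(epochs):
        for x, t in samples:
            y, h = forward(x)
            err = y - t
            for j, hj in enumerate(h):
                grad_h = err * w2[j] * hj * (1.0 - hj)   # use w2[j] pre-update
                w2[j] -= lr * err * hj
                for i, xi in enumerate(x):
                    w1[j][i] -= lr * grad_h * xi
                w1[j][-1] -= lr * grad_h
            w2[-1] -= lr * err
    return lambda x: forward(x)[0]

# synthetic heating-load pattern: demand falls linearly as temperature rises
data = [([t / 30.0], 1.0 - t / 30.0) for t in range(0, 31, 3)]
predict = train_forecaster(data)
```

The regular ‘training’ Fluxys performs corresponds to re-running this fit on fresh history so the weights track seasonal drift.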
Last month a white paper from Microsoft Research unveiled a new API which simplifies programming of graphics processing units (GPU) for general-purpose computing. GPUs have already been deployed as computational devices in oil and gas software prototypes from SMT (reservoir simulation) and Finetooth (seismic data compression).
Designed for graphics display, GPUs are powerful, but hard to program for general-purpose use. Microsoft Accelerator provides a high-level interface to translate data parallel operations on-the-fly to optimized GPU pixel shader code and API calls. Accelerator-compiled code is typically within 50% of hand-written pixel shader code and up to 18 times faster than C code running on a CPU.
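The programming model can be caricatured in a few lines: whole-array operations are recorded as an expression tree rather than executed immediately, then evaluated in a single pass, the role the compiled pixel shader plays on the GPU. The toy Python sketch below illustrates this deferred-evaluation idea; it is not Microsoft’s actual API.

```python
class ParallelArray:
    """Toy deferred-evaluation array in the spirit of data-parallel APIs.

    Arithmetic builds an expression tree; evaluate() then walks the tree
    once per element, standing in for a single compiled GPU shader pass.
    """
    def __init__(self, data=None, op=None, args=None):
        self.data, self.op, self.args = data, op, args

    def __add__(self, other):
        return ParallelArray(op=lambda a, b: a + b, args=(self, other))

    def __mul__(self, other):
        return ParallelArray(op=lambda a, b: a * b, args=(self, other))

    def evaluate(self):
        if self.data is not None:          # leaf node: concrete data
            return self.data
        evaluated = [a.evaluate() for a in self.args]
        return [self.op(*vals) for vals in zip(*evaluated)]

a = ParallelArray([1.0, 2.0, 3.0])
b = ParallelArray([10.0, 20.0, 30.0])
expr = (a + b) * a          # no work yet: just an expression tree
result = expr.evaluate()    # one evaluation pass over the data
```

Deferring evaluation is what lets a system like Accelerator fuse several whole-array operations into one optimized shader instead of round-tripping each step through GPU memory.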
Imation Corp. recently unveiled its new ‘Ulysses’ Tape as Disk technology. Ulysses offers users of tape backup and storage robotics the possibility of replacing tape storage with hard disks. The solution runs without modification to existing hardware or software and ‘radically accelerates’ automation. The Ulysses units fit a 2.5” Serial ATA (SATA) disk drive into a standard LTO Ultrium tape cartridge form factor. According to Imation, this offers accelerated backup and restore performance—up to 10 times faster than a tape-only library.
In a separate announcement this month, Imation has acquired consumer-oriented storage specialist Memorex in a $330 million cash transaction.
Shell’s Global Solutions unit is partnering with the US-based FIATECH, Norwegian POSC/Caesar and Netherlands’ USPI-NL to develop a data integration strategy for the facilities industry that will ‘improve productivity of the capital project and facility life cycle.’ The initiative targets engineering, operations, and maintenance, and will allow contractors and suppliers to exchange data with owner operators.
At the heart of the project is the ISO 15926 Reference Data Library (RDL). This defines standard equipment classifications and equipment-naming conventions and facilitates the sharing of equipment information—particularly at the handover phases of a plant’s lifecycle. For a backgrounder on these issues please refer to our report from the 2005 USPI-NL meeting, OITJ Vol. 10 N° 4.
At a DuPont-sponsored Process Industry Data Integration Workshop last summer, data integration experts from 22 international companies and agencies achieved consensus to ‘drive toward’ a single process industry RDL, ISO 15926-4 TS. The RDL of FIATECH’s ‘Accelerating Deployment of Lifecycle Integration Technologies’ (ADI) project is being built by EPM Technologies using its Express data management system. The project sets out to demonstrate that ISO 15926 can be implemented in a ‘robust’ proof-of-concept. A use case will develop around a fictitious Fluor-Bechtel joint venture with ‘Facades’ for different data sources.
The software, which will be released to the public domain, is built with a variety of open source tools including PHP, PEAR, MySQL, AJAX and SVG. The data model and templates will be described in the W3C web ontology language, OWL. Facades are stored as n-triples and a SPARQL API will be provided.
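As a rough illustration of the storage model described, the Python sketch below (hypothetical URIs, not the project’s actual schema) parses simplified N-Triples and answers a SPARQL-style triple-pattern query, which is essentially what a SPARQL API does over an n-triple Facade.

```python
import re

# simplified N-Triples line: <subject> <predicate> (<uri> or "literal") .
TRIPLE = re.compile(r'<([^>]*)>\s+<([^>]*)>\s+(<[^>]*>|"[^"]*")\s*\.')

def parse_ntriples(text):
    """Parse a (simplified) N-Triples document into (s, p, o) tuples."""
    triples = []
    for line in text.splitlines():
        m = TRIPLE.match(line.strip())
        if m:
            s, p, o = m.groups()
            triples.append((s, p, o.strip('<>"')))   # unwrap URI/literal
    return triples

def match(triples, s=None, p=None, o=None):
    """Minimal triple-pattern query, SPARQL-style (None = wildcard)."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

doc = '''<http://example.org/pump1> <http://example.org/hasType> <http://example.org/CentrifugalPump> .
<http://example.org/pump1> <http://example.org/hasTag> "P-101" .'''
triples = parse_ntriples(doc)
```

A real implementation would use a proper RDF store and the full SPARQL grammar; the point here is only that equipment facts reduce to subject-predicate-object rows that pattern queries can traverse.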
Shell Petroleum Development Company of Nigeria (SPDC) monitors its Niger Delta and shallow offshore production via DCSs, Fisher ROC RTUs and Modbus-based PLCs. Data from Shell’s 3,000 kilometers of pipelines, 87 flowstations, 8 gas plants and 1,000 wells is concentrated at a central location via satellite and radio links. Shell has just announced the completion of a major transition from its legacy UNIX-based real time data collection system to a system supplied by MatrikonOPC, based on Microsoft Windows and the OPC data protocol.
SPDC’s Process Control Engineer Kikelomo Afolabi said, ‘MatrikonOPC proved to be a cost effective, easy to integrate and robust solution providing true interoperability that aligned well with Shell’s need for scalability.’ OPC (OLE for Process Control) has its origins in Microsoft’s Object Linking and Embedding technology. Today, OPC is a multi-vendor, published communication standard.
Matrikon, a charter member of the OPC Foundation, claims 100,000 installations worldwide for its 500 OPC-based products. SPDC’s system now comprises a MatrikonOPC Server for ROC to interface with the RTUs, and the MatrikonOPC Server for Modbus to interface with its PLCs. A Windows-based system collects field data via OPC and stores it in a local Process Historian from OSIsoft. Shell uses OSI’s PI ProcessBook for reporting and visualizing its production data in real time.
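The collect-and-archive pattern at the heart of such a system can be sketched in a few lines of Python. The tag names and the polling interface below are hypothetical stand-ins for real OPC server calls; a deployment like SPDC’s would read via the OPC servers and write to the OSIsoft historian instead.

```python
from collections import defaultdict

class Historian:
    """Minimal in-memory process historian: tag -> list of (time, value)."""
    def __init__(self):
        self.series = defaultdict(list)

    def record(self, tag, timestamp, value):
        self.series[tag].append((timestamp, value))

    def latest(self, tag):
        return self.series[tag][-1]

def poll(read_tag, tags, historian, timestamp):
    """One polling cycle: read each tag and archive its value."""
    for tag in tags:
        historian.record(tag, timestamp, read_tag(tag))

# stand-in readings; a real system would call an OPC server per tag here
readings = {"FLOW.STATION1": 412.0, "PRESS.WELL7": 98.6}
hist = Historian()
poll(readings.get, ["FLOW.STATION1", "PRESS.WELL7"], hist, timestamp=0)
```

The value of the OPC layer is that `read_tag` becomes a single vendor-neutral interface in front of the ROC RTUs and Modbus PLCs, so the historian and reporting tools never see device-specific protocols.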
As revealed in last month’s Oil IT Journal (Vol. 10 N° 12), CDA has awarded the next tranche of its Well DataStore contract to Schlumberger Information Solutions. The new, five-year contract is valued at $9 million. In our haste to go to press last month, we got one aspect of the story wrong—the technology stack that Schlumberger is to deploy in its new solution.
Schlumberger is to deploy a three tier, web services-based architecture for CDA. The foundation datastore is a brand new database developed from the Seabed configurable data model (Oil ITJ Vol. 9 N° 10). Seabed will offer a data catalogue, and ‘enabling capabilities with regard to entitlements.’ Data access will be via the Schlumberger Integration Engine (SIE) and the presentation layer will be a version of Decision Point customized to CDA’s workflows.
The CDA tender required a ‘forward-looking’ web-based infrastructure and Schlumberger has responded by extending its mainstream technologies with embedded Web Services for Remote Portlets (WSRP) from the OASIS organization, a protocol for aggregating content and interactive web applications from remote sources. Schlumberger is already in the process of migrating the legacy CDA data from TIF files.
The new services will be served from Schlumberger’s European Service Center (ESC) in Aberdeen, a redundant multiclient center with ‘resilient’ data storage and high availability. Data will be accessed across the Internet or Schlumberger’s own DexaNet. Switch-over to the new system is planned for June 2006 with a phased delivery of all components of the technology stack over the following year.