May 2011


Nagios, JTrac for Shell

PNEC presentation shows use of open source tools by Shell’s upstream data managers. Nagios provides ‘industrial strength’ network monitoring. Bug tracker JTrac repurposed as workflow engine.

Speaking at the 2011 PNEC conference in Houston this month, Randy Petit reported successful trials of two open source tools in Shell. First up was Nagios (www.oilit.com/links/1105_10), a cross-platform IT infrastructure monitoring tool that tracks multi-protocol network traffic and provides configurable alerts on failures, bottlenecks and other issues, allowing for timely remediation before outages affect the business.

Shell uses Nagios for ‘proactive’ monitoring of connections to its external Petrobank data server hosted by Halliburton’s Landmark unit. Nagios is used to report issues and plan downtime. Nagios also provides ‘critical infrastructure’ monitoring of Shell’s own key systems, applications and databases. Shell needs to know more than just ‘the Apache server is up.’
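By way of illustration only—this is not Shell’s configuration—a Nagios check is simply a script that prints a one-line status and returns a conventional exit code (0 OK, 1 warning, 2 critical, 3 unknown). The Python sketch below probes a hypothetical database listener; the host, port and latency thresholds are invented.

#!/usr/bin/env python
# Minimal Nagios-style plugin sketch (illustrative only, not Shell's setup).
# Nagios reads the one-line status message and the exit code:
# 0 = OK, 1 = WARNING, 2 = CRITICAL, 3 = UNKNOWN.
import socket
import sys
import time

HOST, PORT = "petrobank.example.com", 1521   # hypothetical Oracle listener
WARN_MS, CRIT_MS = 500, 2000                 # hypothetical latency thresholds

def check_tcp(host, port, timeout=5.0):
    """Return connection latency in milliseconds, or None on failure."""
    start = time.time()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.time() - start) * 1000.0
    except OSError:
        return None

latency = check_tcp(HOST, PORT)
if latency is None:
    print(f"CRITICAL - no response from {HOST}:{PORT}")
    sys.exit(2)
if latency > CRIT_MS:
    print(f"CRITICAL - {latency:.0f}ms response time")
    sys.exit(2)
if latency > WARN_MS:
    print(f"WARNING - {latency:.0f}ms response time")
    sys.exit(1)
print(f"OK - {latency:.0f}ms response time")
sys.exit(0)

Wired into a Nagios service definition, a check of this kind is what drives the dashboards, pager alerts and RSS notifications described here.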

Shell’s geomatics department provides a Nagios-powered RSS feed to keep users posted on network activity. Standard action plans have been developed providing who to inform and what actions to take in the event of an outage. All data issues can be reported to a single email alias. A dashboard shows system status with alerts sent out to pagers or Blackberries.

Nagios has extensions for license management, disk space monitoring and Oracle usage. The RSS feed is used to alert staff to ‘on demand’ events like data loads, Recall refreshes and data orders. RSS can be consumed by ‘just about any device’ and provides an entry point to Shell’s SharePoint portal.

The Nagios/RSS combo has freed up Shell’s data managers for ‘proper’ work while the single mailbox has reduced disruption and duplication of effort. Nagios is considered to be the ‘industry standard’ for monitoring network services.

Petit’s next presentation covered the use of JTrac (www.oilit.com/links/1105_11) in Shell. JTrac, an open source development by one Peter Thomas, is described as a web-based ‘issue (i.e. bug) tracking’ application. But Petit spotted the opportunity to extend its use to upstream workflow orchestration. Target workflows include well data tracking, seismic requests and compliance with company procedures in these fields. Shell’s previous workflow management attempts were spreadsheet-based and proved somewhat unsustainable.

JTrac plugs into existing systems and can perform Oracle or MySQL queries and inserts. JTrac provides notification to stakeholders at key junctures in the workflow with emails, and updates metadata as appropriate. JTrac manages roles and ‘who does what,’ passing information along to the ‘next allowed state,’ forcing completion and approvals. ‘States’ can be defined as mandatory or optional. The system generates a dashboard with ‘spaces’ for component workflows. Configurable KPIs provide management-level workflow tracking. Custom Java code allows interaction with the Landmark PowerHub API, linkage to the document management system and other data stores. Petit sees the JTrac workflow manager as a real ‘spreadsheet killer.’ Java makes the system multi-platform, allowing automated KPIs and metrics across Linux and Windows systems.
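The paper does not show JTrac’s internals; the sketch below (Python rather than JTrac’s Java, with hypothetical states and roles) simply illustrates the ‘next allowed state’ idea—roles, permitted transitions and notification at each step.

# Generic workflow state machine sketch -- conceptual only, not JTrac's API.
# States, transitions and roles are hypothetical examples of the kind of
# well data tracking workflow described in the presentation.
ALLOWED = {
    "requested":  {"data_manager": ["loading"]},
    "loading":    {"data_manager": ["qc"]},
    "qc":         {"geologist":    ["approved", "loading"]},   # rework allowed
    "approved":   {},                                          # terminal state
}

def advance(current, target, role, notify):
    """Move a work item to the next allowed state, or raise if not permitted."""
    if target not in ALLOWED.get(current, {}).get(role, []):
        raise ValueError(f"{role} may not move item from {current} to {target}")
    notify(f"Item moved {current} -> {target} by {role}")   # e.g. email stakeholders
    return target

# Usage: a data manager loads and submits for QC, then a geologist signs off.
state = "requested"
state = advance(state, "loading", "data_manager", print)
state = advance(state, "qc", "data_manager", print)
state = advance(state, "approved", "geologist", print)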

Comment—Nagios has good credentials amongst horizontal ‘blue chip’ users and oil and gas deployments including Landmark, Grant Prideco, Kelman, LMKR and Siemens. JTrac appears to be a more modest project. Perhaps this novel use in workflow orchestration will change that.


CygNet sold

Weatherford buys CygNet Software, adding SCADA system monitoring to Field Office. Deal extends CygNet’s client base beyond its domestic gas focus.

Switzerland-based Weatherford International has acquired SCADA software boutique CygNet. Transaction details were not disclosed. The acquisition follows a 2010 teaming between the companies whereby Weatherford operated a hosting service for CygNet’s Enterprise Operations Platform (Oil IT Journal November 2010), targeting shale gas producers. EOP collects, manages and distributes real-time and historical data from field, production and business systems, providing users with decision support for daily operations and strategy. EOP will be rolled into Weatherford’s ‘Field Office’ package to provide a ‘comprehensive set of tools for monitoring, analyzing and controlling production operations.’

The solution will be delivered both as a native installation, and as a hosted offering. CygNet claims over 70 customers and a ‘significant’ share of the US domestic gas production and transmission markets including ‘seven of the top 10 US gas producers.’

CygNet President and CEO Chris Smith said, ‘This acquisition will propel CygNet into international markets and liquids production.’ According to Weatherford, following the integration Field Office components will be used on over 350,000 wells. More from Weatherford on www.oilit.com/links/1105_9.


15 years of PNEC—Part 1, 1997-2001

Full PNEC report next month. But meanwhile we thought we’d celebrate the 15th anniversary of Phil Crouse’s data management conference with a three part walk down memory lane. Part 1 covers interoperability, outsourcing, ‘back door’ data management, e-commerce and application service provision and GIS-based data management. It all sounds rather modern in fact!

1997 Geoshare, although Schlumberger-sponsored, is a truly open environment. Development kits can be obtained by any third party for a modest sum. The essence of Geoshare is that an application can output or receive data through a ‘half link’ and an application-independent data model. Geoshare makes sense in today’s ‘best of breed’ multi-vendor environment.

Of course Geoshare is not magic. Conoco’s Jack Gordon emphasized the care necessary to ensure data integrity during transfer especially with topographic datum shifts, and when two applications have a fundamentally different view of data representation. The deployment and regular use of Geoshare is not for the faint hearted.

Cindy Pierce described the outsourcing program underway in Conoco. Conoco’s radical approach involves not only outsourcing the task, but the people too. Full time Conoco employees are taken on by the contractor (GeoQuest). The new look E&P department no longer considers activities such as data management as core business.

Janet Rigler (BP Houston) debunked a widely held belief concerning the way the E&P asset team does its business. The shared earth model should enable interpreters to include new information as new wells are drilled. Well, it doesn’t work like that. The limiting factor is data management. Applications are licensed on different machines and data must be moved around constantly.

1998 Mark Robinson (GeoQuest) estimated that around 90% of the oil produced today is managed through Excel!

Marion Eftink (Unocal) introduced the concept of ‘back-door’ data management. In theory, data should be managed from the instant it arrives in the exploration department, catalogued and cleaned-up so that everyone calls the same thing by the same name. In practice, power users in the asset teams often get first crack of the whip, and before you can say ‘data management’ there are multiple copies of the data, with different naming conventions in just about every system within the company. Enter ‘back door’ data management. Instead of attempting impossible policing of data up-front, Unocal’s system uses a GIS front end to the various data stores and implements a system of cross-referencing different appellations through look-up tables.
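The cross-referencing itself can be as simple as a look-up table keyed on application and local name; the sketch below uses hypothetical system and well names, not Unocal’s.

# 'Back door' cross-referencing sketch: map each application's local well name
# to a single master identifier via a look-up table. Names are hypothetical.
ALIASES = {
    ("SeisWorks", "EKO-4 ST1"):    "15/9-F-4 ST1",
    ("OpenWorks", "EKOFISK 4ST1"): "15/9-F-4 ST1",
    ("Recall",    "F-4ST1"):       "15/9-F-4 ST1",
}

def master_name(system, local_name):
    """Resolve a per-application well name to the corporate master name."""
    return ALIASES.get((system, local_name), local_name)  # fall through unchanged

assert master_name("Recall", "F-4ST1") == "15/9-F-4 ST1"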

1999 Rene Calderon presented a novel product, Petris Winds—a system that is configured to browse a corporation’s data in situ across a variety of data stores. Winds builds its own meta-data view of the enterprise by ‘spidering’ the data overnight.

Gayle Holzinger (Shell) described the deployment of PetroBank for offshore 2D data delivery to the workstation. This has cut the data loading cycle time from ‘up to’ six months down to a couple of days.

2001 PetroWeb’s David Noel traced the evolution of thinking on E&P data access, from data models to the present situation of multiple databases and multiple clients. Accepting the status quo, Noel advocates the use of ESRI’s ArcView as ‘doing a great job of integrating internal and external data sources.’ Non-proprietary map layers and external databases can be managed by third parties.

Vidar Andresen traced PetroBank’s history starting as IBM’s development for the Norwegian national data repository, Diskos, to its deployment, via PGS, at sites all over the world. Landmark is now re-branding PetroBank, which also underpins the ‘Grand Basin’ e-business unit.

Geodynamics’ GIS-based data management system was the subject of no less than three presentations, featuring authors from Kerr McGee Oil and Gas, BJ Services, and Enron. Enron’s Mark Ferguson described how GIS was used to match pipeline capacity to demand. Enron has integrated Petrolynx GIS software with security and access control through Sun’s Java Start plug-in and leveraging Tibco’s ‘hub’ middleware.

Innerlogix’ Dag Heggelund’s thesis is that the ‘next generation’ of web applications will be built using an XML-based object technology combining data and stylesheets for data translation into and out of proprietary environments. Heggelund downplays the importance of standards, saying these would be ‘nice to have’, but are unlikely to materialize. The key is the XML/XSL technology that allows for interoperability in a non-standard world. SOAP is also significant in that it circumvents the battle of the CORBA, COM and Java ORBs. Heggelund quotes—‘well logs should have portable web addresses,’ ‘today’s ‘fat’ applications know too much,’ ‘a data item should not know where it lives, have no knowledge of who uses it and not know where it comes from,’ and ‘Bill Gates is not going to implement Open Spirit!’
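The XML/XSL idea is easy to demonstrate with a toy transform: a stylesheet re-expresses one (hypothetical) well log markup in another form without either side knowing about the other. A sketch using Python and the third-party lxml library follows.

# Toy XML/XSL translation sketch (hypothetical log markup, not a real standard).
# Requires the third-party lxml package: pip install lxml
from lxml import etree

SOURCE = """<log well="34/10-A-12" start="1500" stop="2500">
              <curve name="GR" unit="gAPI"/>
            </log>"""

# Stylesheet that re-expresses the 'vendor' markup as a flat comma-separated line.
STYLESHEET = """<xsl:stylesheet version="1.0"
                    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/log">
    <xsl:value-of select="@well"/>,<xsl:value-of select="curve/@name"/>,<xsl:value-of select="curve/@unit"/>
  </xsl:template>
</xsl:stylesheet>"""

transform = etree.XSLT(etree.XML(STYLESHEET))
print(str(transform(etree.XML(SOURCE))))   # -> 34/10-A-12,GR,gAPI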

In the discussion on application service provision (ASP), Geonet’s Bill Micho described ASP as a ‘tough marketplace.’ Geonet provides ASP for some 50 applications. GeoQuest also offers an ASP-based monthly subscription service to its products, and Landmark’s Grand Basin subsidiary does likewise, although Graham Merikow said the move to ASP was not like ‘flipping a switch’.

Addressing the integration question, Bill Quinlivan (GeoQuest) observed that ‘life is ten times harder for a receiver than a sender of data.’ Also, the difficulty of a solution might be unrelated to the urgency of the problem. Geoshare, Open Spirit etc. may be ‘hammers looking for nails.’ Quinlivan advocates evaluating a link development project by comparing cost of development with cost of use. This is represented as a ‘cost-performance frontier.’ Further optimization is achieved by arbitration and selection of cost effective link technologies according to constraints such as development budget and/or cost of use. Visit PNEC on www.oilit.com/links/1105_43 and check out The Data Room’s coverage of 15 years of the show on www.oilit.com/links/1105_44.
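Quinlivan’s development-cost versus cost-of-use arbitration is simple arithmetic; the figures in the sketch below are invented purely to show the comparison.

# Cost-performance sketch for choosing a link technology (all figures invented).
# Total cost of a link = one-off development cost + per-use cost * expected uses.
candidates = {
    "hand-rolled ASCII dump": {"dev": 10_000,  "per_use": 400},   # cheap to build, costly to run
    "Geoshare half-link":     {"dev": 80_000,  "per_use": 50},
    "OpenSpirit adapter":     {"dev": 150_000, "per_use": 10},
}

def total_cost(option, uses):
    return option["dev"] + option["per_use"] * uses

for uses in (50, 500, 5000):
    best = min(candidates, key=lambda name: total_cost(candidates[name], uses))
    print(f"{uses:>5} uses -> cheapest option: {best}")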


Oil IT Journal interview—Dayapatra Nevatia, Wipro

Wipro energy head explains rationale behind acquisition of SAIC’s oil and gas consulting unit.

With oil and gas accounts like Shell, BP, ExxonMobil, Saudi Aramco and Petrobras, why did Wipro need to buy SAIC?

We have around seven years of activity in oil and gas and there are now around 2,300 people in the practice. Our work for NOC and IOC clients has led to us becoming a leading IT outsourcing service provider. We already develop and implement solutions, but we felt that we were lacking thought leadership and a deep understanding of our customers’ requirements. We really lacked the capacity to advise on major IT programs as compared to companies like SAIC, Booz or Accenture. So when the opportunity came along to acquire SAIC’s oil and gas practice, with its strong ‘digital oilfield’ involvement, we jumped at the chance to move up the value chain and into complex, multi-year engagements.

Can you give an example?

Yes. We have a Russian customer who is working on exploration and greenfield development. We already have people advising on a point solution for a collaborative work environment (leveraging previous experience with RTOS for other clients). But we are now looking to take on a complete instrumentation project—along with advice on strategy. The SAIC/Wipro combo will provide an end to end solution for such projects.

A kind of EPC1 for IT?

Absolutely. We now can offer strong engineering, global R&D, good domain expertise around telecoms and collaboration and data visualization deployment—and we can pull it all together with program management—which was previously lacking.

Why did SAIC sell?

SAIC’s main business is government and defense. Some capabilities, like visualization, have been successfully transferred to the upstream. SAIC has been working notably with BP for over 20 years and has gradually increased its footprint in the upstream. But the reality is that SAIC’s DNA is in the US. Its clients wanted multi-geography support with more offshore operations—things that the company was struggling to provide. The government/defense work meant less investment in oil and gas. Wipro is already strong in downstream and corporate functions—so everything is very complementary. There is a wide geographic spread of capabilities—SAIC upstream, Wipro downstream. Customer-wise, there are a couple of overlaps but in general we are very complementary and I expect this to generate significant growth opportunities.

What direction is the knowledge/information flow? From up to downstream or vice versa?

The two domains really are very different. Upstream is about finding new fields and investing significant amounts. Downstream is more about cost optimization and getting the most out of existing investments. Upstream spends more on collaboration technology and has significant new investment in challenging geographies. There is more technical innovation in upstream than in mature downstream.

There is a different IT culture too—IT on a plant/facility is rather special…

The technology is not so different—but the culture, yes. Upstream, IT is embedded in the business. Downstream it is much more in the IT organization. Upstream continues to invest through the business.

Is that good or bad?

It depends. The upstream has more regional business differences, so a global approach to IT is challenging. Government involvement may mandate local investment. On the other hand, optimization possibilities may have been missed because of this piecemeal approach. We need more globalization and optimization.

Do you see a Wipro-branded solution for the upstream?

We are aiming at thought leadership, and will be aggressively presenting our capability in forums and with white papers and articles. By this we hope to accelerate take-up of our templates and framework. But we will not be productizing a ‘solution.’

So this is an eclectic offering?

We call it agnostic. Our strength is as a strong independent expert. We are very good at testing software and we are the largest 3rd party provider of R&D for our customers. Elsewhere Wipro is strong in infrastructure design and R&D for telecoms and utilities. In fact 10% of our revenue comes from energy and utilities. We consolidate IT and subject matter expertise. A customer comes with a problem, we find a solution.

More from info@wipro.com.

1 Engineering prime contractor.


Book review—Software Testing

ISTQB study guide fails to provide ‘quick start’ for newbie testers.

The 3rd edition of Software Testing Foundations1 is a preparation for the ISTQB2 qualification. It is a decent introduction to the subject of software testing, terminology, psychology and more using the General ‘V-Model.’ This advocates concurrent development and testing as twin branches of a ‘V’ that converges on a finished, tested product.

The book works through the development of a ‘Virtual Showroom’ application illustrating use of static and dynamic testing, test and team management and strategies.

A section on ‘test tools’ was a disappointment, attempting to classify tools without naming any examples. The discussion is pitched at too high a level to be useful to anyone wanting to start testing quickly. No commercial tools are named, let alone compared.

If you are either a student, or a manager lumbered with a new responsibility for testing this is probably a good starting point. If you want to roll up your sleeves and get going fast, probably not. Read also the useful comments on earlier editions on Amazon. 

1 Spillner, Linz and Schaefer, Rockynook Press. ISBN 978-1-933952-78-9 and www.oilit.com/links/1105_45.

2 International Software Testing Qualifications Board, www.oilit.com/links/1105_66. 


BP cash for Gulf of Mexico R&D

$500 million program to investigate effects of Macondo blow out.

The BP-backed Gulf of Mexico Research Initiative (GRI) has issued a request for proposals, ‘RFP-I,’ with the aim of establishing ‘four to eight’ research consortia, each with a minimum of $37.5 million in funding. The consortia will be tasked with studying the effects of the Deepwater Horizon incident on the Gulf of Mexico. In particular, research will seek to understand the impact of oil and dispersant on ocean and coastal systems, and to establish how the marine environment responds to large accidental inputs.

GRI Chair Rita Colwell said, ‘GRI was created by BP and the Gulf of Mexico Alliance to develop an independent, merit-based process to identify and fund the best possible research into the fate and effects of oil and oil dispersants on the Gulf of Mexico.’ Last May, BP committed $500 million over a 10-year period to create the independent research program which is to be conducted at research institutions primarily in the US Gulf Coast States. 

Along with the impact assessment, GRI is to develop improved spill mitigation, oil and gas detection, characterization, and remediation technologies. The ultimate goal of the GRI is to improve society’s ability to understand and respond to the impacts of petroleum pollution and related stressors of marine and coastal ecosystems. The results of the research will be applied to restoration and to improvement of the long-term environmental health of the Gulf of Mexico.

RFP-I has five component areas, 1) physical distribution, dispersion and dilution of petroleum, 2) chemical evolution and degradation, 3) environmental effects, 4) response/mitigation technology and 5) public health. More from www.oilit.com/links/1105_35.


GE upgrades blowout preventer software stack

New ‘Drilling iBox’ monitoring application. Positive feedback on ram activity. Subsea ‘black box.’

GE Oil & Gas has improved its Hydril blowout preventer (BOP) and remote monitoring and diagnostics (RM&D) capabilities. The new ‘Drilling iBox’ (DiB) system, a hardware/software combo, leverages real time data, currently somewhat underutilized, turning it into usable products like reports and status updates. DiB feeds monitoring data into GE’s ‘Proficy’ suite for reporting and diagnostics. GE has also announced a subsea drilling data recorder (DDR), a.k.a. a ‘black box’ for BOP data logging. The DDR, like the flight recorder used on aircraft, captures operational data and stores it on a hardened system that can survive ultra deepwater depths and pressures in the event of an emergency.

Other BOP innovations include RamTel, which provides a direct indication of ram position and activity, and a system that allows remotely operated vehicles to access critical information when the main BOP control system communication path is ‘disrupted.’ The technology can be retrofitted to all GE BOP Multiplex (MUX) control systems. More from www.oilit.com/links/1105_12.


Houston Association of Professional Landmen

Geographic information systems increasingly important to landmen.

Speaking at the 2011 Houston Association of Professional Landmen’s (HAPL) Technical Workshop, Bill Gardner (GIS Data Maps) offered a primer on GIS usage. Today’s landmen are required to keep up with multiple larger prospect areas. Mineral leases are no longer ‘pristine’ and systems need to support JOAs, farmouts and top leasing. Clients are increasingly using GIS systems and storing their data in online databases as Shapefiles. GIS systems can then create maps on the fly, blending data from company material and from third-party data providers such as the landman’s favorite, Tobin.

The GIS market leader is (no surprises to Oil IT Journal readers) ESRI. But according to Gardner, some open source GIS systems are gaining in popularity—notably Quantum GIS (QGIS/www.oilit.com/links/1105_16) and the Geographic Resources Analysis Support System (GRASS/www.oilit.com/links/1105_15), originally developed by the US Army Corps of Engineers.

Gardner warned that clients are having a hard time keeping pace with the amount of GIS data coming in and now expect their service providers to supply GIS data. This means that smaller companies and independents will have to show that they know what they are doing before their data is accepted. This includes provision of appropriate metadata such as lessee, gross acreage and expiry dates. Projections are another key issue for GIS data, as are license restrictions on third-party data sets. Training is key—Gardner recommends ‘GIS for Dummies’ (www.oilit.com/links/1105_14) but notes ‘you didn’t become a landman overnight, don’t expect to become a GIS expert in a week!’ More presentations from HAPL on www.oilit.com/links/1105_13.
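A metadata check of the kind Gardner describes might look like the following sketch, using the third-party pyshp library; the file name and attribute fields (LESSEE, GROSS_AC, EXPIRY) are hypothetical.

# Lease shapefile metadata check sketch -- field names are hypothetical.
# Requires the third-party pyshp package: pip install pyshp
import shapefile  # pyshp

REQUIRED = ["LESSEE", "GROSS_AC", "EXPIRY"]   # hypothetical attribute fields

sf = shapefile.Reader("leases.shp")           # hypothetical file
names = [f[0] for f in sf.fields[1:]]         # first field is the deletion flag
missing = [f for f in REQUIRED if f not in names]
if missing:
    print("Shapefile rejected, missing attributes:", ", ".join(missing))
else:
    for rec in sf.records():
        row = dict(zip(names, rec))
        if not row["EXPIRY"]:
            print("Lease without expiry date:", row["LESSEE"])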


Sercel’s million channel system—‘real soon now’

428XL Giga Transverse add-on for seismic recording system redefines ‘exponential data growth.’

In The Data Room’s Technology Watch report from last year’s SEG (www.oilit.com/links/1105_33) we quoted University of Colorado seismic thought leader John Scales as saying ‘Seismics is a sampling problem, not optimization.’ Seismic equipment manufacturer Sercel (a wholly owned unit of CGGVeritas) seems to be thinking along the same lines with the announcement of a roadmap to a ‘million channel’ acquisition system. The new Giga Transverse add on for Sercel’s 428XL acquisition system targets the ‘super crew’ market for high-density wide-azimuth acquisition. Giga Transverse promises high data rates with optical fiber cable and a capacity of 100,000 channels per line.

Sercel CEO Pascal Rouiller reported an installed base of three million channels currently. Data managers wondering just how long ‘exponential’ data volume growth will continue had better plan for petabytes more of the same! More from www.oilit.com/links/1105_34.
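The arithmetic behind the data volume remark is straightforward; the acquisition parameters below are assumptions chosen for illustration, not Sercel figures.

# Back-of-envelope seismic data volume sketch -- parameters assumed, not Sercel's.
channels      = 1_000_000      # the 'million channel' target
sample_ms     = 2              # 2 ms sample interval (assumed)
bytes_sample  = 4              # 32-bit samples (assumed)
hours_per_day = 12             # recording time per day (assumed)

samples_per_day = channels * (hours_per_day * 3600 * 1000 / sample_ms)
bytes_per_day = samples_per_day * bytes_sample
print(f"~{bytes_per_day / 1e12:.0f} TB of raw samples per crew-day")   # ~86 TB

At that rate a single crew would approach a petabyte of raw samples in a couple of weeks.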


Software, hardware short takes

Baker Hughes, Petris, Blueback Reservoir, dGB, Ark, Emerson, Ensyte, Eurotech, Exprodat, Polyhedron, Geoforce, Schlumberger, P2 Energy Solutions.

Baker Hughes’ JewelSuite 2011 introduces new support for collaborative earth modeling, workflows for unconventionals and multithreading/multicore functionality. New workflow automation supports batch processing of property models, gridding and more. Communication between JewelSuite, SMT, and CMG software supports read/write, versioning, and audit capabilities—www.oilit.com/links/1105_50.

Version 7.5 of PetrisWinds Enterprise gets a revamped GUI derived from a usability study and the principles set out on UseIT (www.oilit.com/links/1105_51) and UsableWeb (www.oilit.com/links/1105_52). The result is a move from JavaScript/Google Web Toolkit to Microsoft Silverlight. The backend remains Java for platform independence—www.oilit.com/links/1105_53.

Blueback Reservoir has rolled out the Blueback Toolbox 2011.1, a suite of plug-ins for Schlumberger’s Petrel. New in this release is a Reservoir Engineering module—www.oilit.com/links/1105_54.

dGB Earth Sciences and ARK CLS have announced a direct data link between dGB’s OpendTect and Petrel. The connector provides access to OpendTect’s own plug-ins for attribute analysis, sequence stratigraphy, fluid migration, rock property predictions and velocity modeling—www.oilit.com/links/1105_55.

Emerson’s Roxar RMS 2011 edition includes new tools to model complex geologies and incorporate 4D seismic into the workflow, geological well correlation improvements, fracture modeling and usability enhancements. RMS 2011 operates on Linux 64-bit, Windows XP and Vista 32 and 64-bit platforms, as well as Windows 7 64-bit—www.oilit.com/links/1105_56.

Ensyte’s new Prophet XL release is a Microsoft Excel-driven economics and production forecasting package—www.oilit.com/links/1105_57.

New ruggedized computers from Eurotech target the oil and gas vertical. The Zypad BR2000 series leverages ‘power stingy’ Intel Atom processors. Designed for vehicle-mounted or ‘man-worn’ applications, the systems include high-speed I/O, multimedia and communications capabilities. The Zypad BR2000 is compliant with IP67 environmental standards and runs Linux, Windows Embedded or Windows 7—www.oilit.com/links/1105_58.

Exprodat has released Team-GIS KBridge for Esri’s ArcGIS 10 platform. KBridge provides bi-directional geodata integration between SMT Kingdom and Esri ArcGIS—www.oilit.com/links/1105_59.

Linux/Windows benchmarks by Polyhedron show an average 17% speed improvement of Linux over Windows across seven Fortran compilers—www.oilit.com/links/1105_60.

Geoforce has announced a new ‘remote worker safety’ solution for personnel on pipelines, refineries and platforms. The system uses satellite-based SPOT GPS personal tracking to provide a map showing worker and field equipment locations in real time—www.oilit.com/links/1105_61.

Schlumberger’s Techlog 2011 reports enhanced usability and a new module incorporating industry-standard methods to compute pore pressure and fracture gradients. The new release implements Elan functionality in Techlog for enhanced mineral solving capabilities—www.oilit.com/links/1105_62.

P2 Energy Solutions’ Excalibur ‘NexGen’ release brings a Windows Presentation Foundation GUI and adds flexible data grids, XPS printing and tight integration with Microsoft desktop solutions. Under the hood is P2 Analytics ‘powered by’ TIBCO Spotfire—www.oilit.com/links/1105_63.


ESRI 2011 Petroleum User Group

Shell’s GeoWeb, Apache’s Silverlight portal, seabed survey model, GIS on Ormen Lange, linear programming.

Keith Fraley described use of the oil and gas ‘GeoWeb’ in Shell. To find such services, use Google’s ‘inurl’ search to locate ArcGIS REST services and map servers. Once you’ve located your data, you can either mash up different sources or import raw data for processing in Python, ‘the ideal geospatial integration platform.’ Examples include ship track processing with the Spot Blue Skye network, ‘game changing’ opportunity tracking with IHS’ Enerdeq map and the NDBC GoM rig tracking service.
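Once an ArcGIS REST endpoint has been located, querying it from Python takes a few lines with the requests library; the service URL below is a placeholder, not one of Shell’s.

# Query an ArcGIS Server REST map service from Python (placeholder URL, not Shell's).
# Requires the third-party requests package: pip install requests
import requests

URL = "https://gis.example.com/arcgis/rest/services/Wells/MapServer/0/query"
params = {
    "where": "1=1",            # no attribute filter
    "outFields": "*",          # return all attributes
    "returnGeometry": "true",
    "f": "json",               # ArcGIS REST JSON response
}
resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()
for feature in resp.json().get("features", []):
    print(feature["attributes"])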

Carlos Sosa presented Apache’s new ‘Eclipse’ map portal, a Microsoft .NET/Silverlight developed three tier server, applet and viewer for GIS data. Users can subscribe to information feeds such as HSE, geology or land and mashup their own maps on demand.

Woodside’s Gareth Wright introduced the seabed survey data model (SSDM), developed in collaboration with Shell and the International Association of Oil & Gas Producers (OGP), the new home of the European Petroleum Survey Group. SSDM is an Esri personal geodatabase template, supplied with a data dictionary, stylesheets and user/implementation guides. Wright showed some compelling imagery from SSDM use cases including a 3D view of a site survey in IVS’ Fledermaus (www.oilit.com/links/1105_42). The SSDM also captures metadata for survey management, operator/contractor interactions and data handover.

Colonial Pipeline has been working with Cri+Igen on integrating GIS systems with computer aided design and asset management. Interoperability of Esri, IBM/Maximo and Autocad leverages a web services architecture.

Calum Shand’s presentation on the development of Shell’s Norwegian giant Ormen Lange (OL) field was a GIS spectacular. ESRI GIS was used to position seabed infrastructure in the rugged terrain of the Storegga slide, an 8,000 year old subsea mudslide involving some 3 million tonnes of rock! ArcMap has been used to spatialize engineering data in CAD systems for the field, gathering lines and twin pipelines—one 1,200km long connecting Norway to the UK and claimed as the world’s longest. For OL, WebGIS is now the ‘engineer’s desktop.’ GIS also underpins OL’s shore-based operations, providing situational awareness from multiple data feeds in the high tech control room.

Scott Oelfke (Geographix) showed how linear programming techniques developed for the rag trade are being applied to optimize pad placement in non-conventional ‘factory’ drilling. The method leverages geoprocessing to avoid hazards, quantify risks and ‘track costs spatially.’

Read these and other PUG presentations on www.oilit.com/links/1105_41.


SPE Digital Energy 2011

Total on IT-driven productivity gains. Baker Hughes’ reservoir simulation in the Azure cloud. Aera Energy’s ‘manufacturing mindset.’ Saudi Aramco and SAS on production data mining. Star of show award to Chevron’s 1960 movie. Shell on ‘end to end’ security and how IT helps reduce flaring.

Jean-François Minster (Total) observed that IT has been the main source of increased productivity in the last two decades and this is not going to stop soon. New tools are constantly invented; new human machine interfaces, processing, sensors and petaflop computing are all set to impact our way of working. Real time data, decision support centers, simulation, data mining and remote operations can improve production and increase safety. All of this is possible, provided that our data is reliable. Here standard formats can ease data flow between applications. But technology is only part of the picture. Today’s ‘inflection point’ is the human interface, which needs to accommodate multi-disciplinary teams. Other verticals have mastered this—witness the ‘system of systems’ of a large electrical plant.

Baker Hughes Incorporated (BHI) CEO Chad Deaton thinks that operations represent the most mature digital oilfield activity. Drilling undergoes continuous improvement thanks to real time data and a huge increase in logging complexity and sophistication. Global expert centers like BHI’s ‘Beacon’ centers work for ten rigs simultaneously. Deaton cited GDF Suez’ Gjoa field as digitally sophisticated and highly instrumented with all data replicated onshore. Cloud computing got an enthusiastic plug. This enabled BHI’s Russian unit to perform ‘500,000 simulations’ in the Microsoft Azure cloud. Macondo raised many issues—from government oversight, through uncertain communications, poor project history and data sharing, risk assessment and unlearned lessons. Almost all of these can be fixed with digital oilfield technology. We need more and better sensors, better alarms and automatic shutoffs and perhaps a black box for a drilling rig. Cyber risk has kept Deaton awake. A drilling rig may have a dozen contractors who may not all know everything they should be aware of. We need an IT security standard and better on-rig data management. A log of BHI’s firewall activity showed some 5.8 million attempted break-ins per day. While digital is changing our culture, the digital oilfield really is at an inflection point—a digital oilfield identity crisis that is making it hard to move forward.

Gaurdie Banister (Aera Energy) advocates a manufacturing mindset, ‘Think small, move grains of sand not mountains.’ Start with information quality—you need enterprise-wide quality. Next build an enterprise architecture spanning ERP, geoscience, wells and facilities. Then all this can feed the information ‘factory,’ a.k.a. the data warehouse ready for analysis, action and process optimization. This now supports aligning personnel’s competencies with job requirements or pump failure root cause analysis. People get involved and committed through transparent data. Everyone gets to play, everyone drives or as they say, ‘In God we trust, everyone else bring data!’ The factory approach is simple and effective—don’t get lost in the details!

Husam Madani described Saudi Aramco’s reservoir engineers as ‘swimming in data.’ Access to data is still hard with ‘at least 60% of your time spent gathering and prepping data.’ Hence the interest in an ‘integrated reservoir management portal’ (iRMP) populated with clean, consistent data. Aramco’s iRMP includes production data, well files, logs and PVT data along with workflows for depletion planning and well and field-level performance analysis. Working with SAS Institute, Aramco is now trialing statistics and data mining on the iRMP data. This includes decline curve analysis backed by a range of SAS statistical methods such as Stepar, least squares and residual covariance. iRMP is developed with the Siwz internet development framework (www.oilit.com/links/1105_39) and MyBatis (www.oilit.com/links/1105_40) for data mapping.
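Decline curve analysis of this kind boils down to a least squares fit of a decline model to rate/time data. The sketch below fits an Arps hyperbolic decline to synthetic data in Python, standing in for the SAS procedures cited.

# Least squares fit of an Arps hyperbolic decline curve -- illustrative sketch,
# not Aramco's iRMP code. Synthetic data; requires numpy and scipy.
import numpy as np
from scipy.optimize import curve_fit

def arps(t, qi, di, b):
    """Hyperbolic decline: rate as a function of time (years)."""
    return qi / np.power(1.0 + b * di * t, 1.0 / b)

t = np.linspace(0, 5, 60)                                      # five years, monthly
rng = np.random.default_rng(0)
q = arps(t, 1000.0, 0.8, 0.5) * rng.normal(1.0, 0.02, t.size)  # noisy synthetic rates

(qi, di, b), _ = curve_fit(arps, t, q, p0=[900.0, 0.5, 0.3],
                           bounds=([1.0, 0.01, 0.01], [1e4, 5.0, 2.0]))
print(f"qi={qi:.0f} bopd, Di={di:.2f}/yr, b={b:.2f}")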

Our ‘star of the show’ award goes to a movie shown by Chevron’s Deon Rae. This was made by Socal and IBM back in the 1960s to explain the use of digital technology to model and improve operations at Socal’s El Segundo refinery. Some giggled at the old punch card machines, but for us the Socal movie showed a) that ‘digital’ has a long and honorable history in oil and gas and b) marketing in the 1960s did not talk down to its audience. But we digress.

Rae traced his own history in refinery operations from the first digital control systems in the 1980s. In the 1990s these were extended with RS232 and ModBus to safety systems. The 2000s saw the FieldBus and ProfiBus wars and a move to Microsoft systems and OPC. Throughout this, the upstream stuck with pneumatic systems and local control. Even today, connectivity with remote installations is the challenge. The data landscape is complex with SCADA, DCS, telemetry and subsea equipment. The trend now is for integrated control systems and a main automation contractor concept with fewer suppliers and for the global deployment of a ‘standard reference architecture’ developed by IT and automation.

Shell’s Johan Krebbers agreed that ‘smart’ is not new, although the industry still suffers from ‘silo-based’ development. Real time data is now spreading through the enterprise from manufacturing to the upstream. Shell is building a common platform crossing up and downstream. This starts with a standard process control domain and includes end-to-end security to the office. While consumer IT has had limited impact, users now require multiple devices. Microsoft’s offering in the mobile space is limited and PCs are no longer the only client. Microsoft Windows’ position as the dominant operating system is being undermined.

Krebbers also thinks that application development is ‘too slow,’ and advocates a horizontal/vertical model with thick/rich horizontal services. Here data modeling is key to providing a single view of the enterprise.

Ron Cramer is using energy efficiency surveillance (EES) to minimize greenhouse gas (GHG) emissions at Shell. In the North Sea, EES translates real time data into EE and GHG KPIs, flagging the operators when something suspicious occurs. Data streams from SCADA/DCS to OSIsoft’s PI System. Calculations are done offline in the office domain. GHG monitoring is good for business. Alarms mean that operators investigate and fix faults early on. Cramer noted that older pneumatic instrumentation in refineries is powered by compressed air. But in the upstream, there is no compressed air available, so systems use methane—a GHG 20 times more potent than CO2. This practice has been ‘practically eradicated’ in Shell.
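The KPI calculation itself is simple once the real time tags are available; the sketch below uses invented tag values, an assumed methane fraction and the CO2-equivalence factor quoted in the talk—it is not Shell’s EES implementation.

# Vent gas GHG KPI sketch -- rates, fractions and baseline are invented,
# not Shell's EES implementation.
GWP_METHANE = 21.0              # CO2-equivalence factor quoted in the talk
BASELINE_T_CO2E_PER_DAY = 40.0  # hypothetical 'expected' emission level

def ghg_kpi(vent_gas_t_per_day, methane_fraction=0.85):
    """Convert a measured vent gas rate (t/day) into tonnes CO2e per day."""
    return vent_gas_t_per_day * methane_fraction * GWP_METHANE

reading = ghg_kpi(3.1)          # e.g. today's vent rate pulled from the PI System
if reading > 1.2 * BASELINE_T_CO2E_PER_DAY:
    print(f"ALARM: {reading:.0f} t CO2e/day - investigate venting/flaring")
else:
    print(f"OK: {reading:.0f} t CO2e/day")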


Digital Energy special session on oil and gas cyber security

Lockheed Martin on ‘advanced persistent threats,’ Chevron—‘everything is connected,’ Shell—‘Microsoft losing dominant client role,’ Oxy—‘physical separation of network and plant,’ Emerson—‘Don’t pick up USB sticks in the parking lot.’

Lockheed Martin’s Ken van Meter noted similarities between gas transmission and his specialty, the ‘smart’ electricity grid. While the smart grid is a necessity to replace the current ‘worn out’ system, the advent of around 440 million ‘hackable points’ by 2016 means that the new system will need serious protection against attacks like Stuxnet, the Slammer worm and the Aurora event. The recent hack of security solutions provider RSA shows the extent of the problem. RSA was the victim of an ‘advanced persistent threat’ that likely came from ‘a nation state or criminal.’ Utilities, used to the relative safety of legacy SCADA systems, are unprepared for this. But attackers like small, vulnerable entities where they can try stuff out before going prime time. Today, everything is connected—so if you hack a small utility this can be a route into much bigger targets. This is a ‘serious problem,’ but one that can be solved. The North American Electric Reliability Corp’s NERC-CIP (www.oilit.com/links/1105_17) is a good start. But van Meter also recommends separate routers for electricity and IT and real time monitoring and forensics. The Department of Homeland Security’s Defense Industrial Base (www.oilit.com/links/1105_18) proved a good forum for sharing information on threats.

Chevron’s Peter Breunig picked up on the ‘everything is going to be connected’ theme as geoscientists, traders and others increasingly push for access to ‘all the information all the time.’ But the connected enterprise, as well as exposing users to attack, allows IT to mine information and detect risks. IT ‘situational awareness’ includes probing your own systems and seeing how long it takes to recover from an attack. Better make it fast because ‘you will be hit!’ The balance between presenting and preventing access to information, just like connectivity, is a ‘risk game.’

Shell’s Johan Krebbers sees authentication as a key area—especially with the increasing use of services in the cloud. Windows is losing its role as the main client operating system as users bring in novel devices. You can no longer trust the ‘endpoint’ which may be privately or company-owned. Authentication needs to move to the cloud too, perhaps with a standards-based protocol like SAML (www.oilit.com/links/1105_19) and single sign-on procedures. Krebbers notes, ‘the last thing you want is to be in bed with Microsoft or another proprietary system.’ Joint venture entitlements need to be ‘application and data driven,’ rather than by a firewall whose perimeter may evolve. Data encryption will get far more important, likely leveraging the Oasis key management interoperability protocol (www.oilit.com/links/1105_36). Current systems are not up to scratch for logging and forensics—we need better real time complex event processing—going way beyond ‘just logging.’

Don Moore, Oxy’s ‘chief cyber security guardian,’ used to feel comfortable about IT security—now he ‘works constantly’ to improve it. There is ‘a lot of tension’ in the industry and in Oxy, and concern at executive and board level about how information is shared. Companies need to protect against ‘crazies,’ their own employees and ‘sophisticated nation state-based attackers.’ Oxy gets 400 million spam/virus-infected emails per year and up to 400,000 unauthorized access attempts per day. Cyber security needs a refresh as the world is full of smart devices, phones and iPads. Digital canopies and expert systems are deployed fast, thanks to $100 oil, ‘but you need to balance speed with security.’ As part of the US National Critical Infrastructure, oil and gas is being ‘leaned on’ by the Feds. The digital oilfield has ‘changed and raised’ the profile of cyber security. Moore proposes physical separation of network and plant. For SCADA systems, ‘80% of the payoff is from physical separation.’ Today, ‘data is flowing all over the company.’

Cynthia Johnson provided some more details of how Oxy is implementing its security. The key is network separation with de-militarized zones between plant and office and between office and internet. These can limit connections so that for instance, the SCADA delivers only 24 hours of data at 10pm to the office system. Even this involves complex data flows across many paths. But as Moore stated, the lion’s share of security is assured by physical separation. Authentication choices need ‘tuning’ to user accounts and devices. But these can be spoofed and need constant monitoring. User credentials can be constrained to place and time and correlated with other events to identify unusual patterns of use. Inside threats can be mitigated by constraining users to authorized activity. While it’s impossible to know where all attacks will come from, the above is a good starting point.

Peter Zornio described his company, Emerson, as operating in the ‘last mile’ of cyber space—where the valves and chokes that cause things to happen are located. In the old days, we had security by obscurity, with bespoke operating systems and applications. Since the mid 1990s, for better or worse, things have evolved to a prevalent ‘Wintel’ environment. Likewise, digital energy means that business value is derived from interconnection of business systems and the plant. Until last year these issues were discussed in automation forums. Then there was Stuxnet. Zornio had to write to his CEO to explain why ‘what happened to Siemens could not happen to us—although we are by no means invincible!’ How do you protect control systems from targeted viruses? By isolating them from the business network and through patch and device management and whitelisting of devices and applications. There is lots of good technology, but the reality is that ‘people are the weakest link.’ ‘Don’t pick up USB sticks in the parking lot!’ Read Ronald Krutz’ book on ‘Securing SCADA Systems’ (www.oilit.com/links/1105_37). And join the ISA-99 Control Systems Security group (www.oilit.com/links/1105_38). Zornio confessed surprise at the slow take-up of cyber security and at the fact that it is not used as a selection criterion.

For more on oil and gas cyber security, read The Data Room’s Technology Watch report from the 2008 API Cyber Security Conference (www.oilit.com/links/1105_47) and the 2005 SPE Digital Security Event (www.oilit.com/links/1105_18). 


Folks, facts, orgs ...

Kvaerner, AspenTech, Baker Hughes, Bracewell & Giuliani, Deloitte, Paradigm, DeLorme, Doyles, EnergyNet, ENGlobal, Fugro, PSE, GE, Helix, HP, Technip, MVE, TerraSpark, TGS, KBR ... more

Eiliv Gjesdal has been appointed CFO of Kvaerner.

Bob Whelan has been appointed to AspenTech’s board.

Baker Hughes’ board has approved the transition of Chad Deaton, chairman and CEO, to the role of executive chairman as of January 1, 2012 when Martin Craighead becomes CEO and president.

Kirstin Gibbs and Mike Brooks have joined Bracewell & Giuliani’s Washington energy practice.

John McCue has been named leader of Deloitte’s energy industry group in the US.

Paradigm has appointed Bruce Koch as CFO. He was formerly VP and CFO Nabors Industries.

DeLorme has appointed John Auble as VP Data Products. He hails from Tele Atlas and DigitalGlobe.

Doyles has hired Anthony Onorato as its VP operations. He hails from the Drilling Manifold Group at Cameron International.

Former Chevron executive John Munroe has joined EnergyNet as VP of government relations and engineering.

ENGlobal has named Steven Kelly as general manager, Houston automation, Shelly Leedy as executive VP of automation and Cynthia Southall as senior VP business development.

Arnold Steenbakker has been named chairman of Fugro’s new ‘board of management.’ Paul van Riel is vice-chairman, technology/innovation and geoscience director. Present CEO Klaas Wester retires next year.

Process Systems Enterprise has named Dale Curtis president of its Americas operation. He was previously with Freeslate, Inc.

GE Oil & Gas has opened a new wireline manufacturing and headquarters facility in Farnborough, UK. Former MD at Citigroup Global Markets, Ronnie Hawkins, has been named VP of business development for GE Energy, and Prady Iyyanki has been appointed VP for Oil & Gas, GE Energy.

Helix Energy Solutions has promoted co-founder of Canyon Offshore, Cliff Chamblee, to executive VP, contracting services.

HP has appointed Yves de Talhouët senior VP, enterprise business, and MD for HP (EMEA). He succeeds Jan Zadak, recently named executive VP enterprise business.

Pipeline, riser and subsea engineering and training company Jee has recruited Jonathan Lindsay to head its Aberdeen office.

Technip has two new directors, Maury Devine, formerly with Det Norske Veritas, and Leticia Costa, director of the Automotive Engineers Association in Brazil.

Jonathan McLaughlin has joined Midland Valley’s software development team. McLaughlin has a degree in computer games technology, AI and visualization.

TerraSpark Geosciences has appointed Jo Dominguez VP interpretation and consulting. He hails from Sherpa Energy Resources.

Rod Starr has been named senior VP EAME with TGS-Nopec.

Paul Tyree has been promoted COO of Total Safety.

John Derbyshire is now president of KBR’s Technology business unit, based in Houston. He succeeds Tim Challand who is retiring. Derbyshire was previously with Invensys and Aspen Technology.

Asbjorn Raeder and Peng Xu have started internships with Caesar Systems.

Crane Co. has appointed Max H. Mitchell as Executive Vice President and COO. He hails from Ford.

Stephen Johnson is now chairman of McDermott’s board and Bradley McWilliams is Lead Director of the Board, following the retirement of Ronald Cambre.

Former CIO of Sears Holdings, Karen Austin, has been appointed Senior VP and CIO of Pacific Gas and Electric.

Current COO Naveen Agarwal has been appointed CEO and Board member of Pricelock. Founder and former CEO Robert Fell has been named co-chairman along with Michael Bonsignore.  Agarwal was formerly president of E*Trade Capital Management.


Done Deals

Divestco, Frac Tech, Fugro, Kelman, Wood Group, IHS, IndexIQ, Seitel, TGS-Nopec, P2ES, Autonomy.

Divestco has successfully closed a $5 million subordinate bridge financing with Toscana Capital Corporation.

Chesapeake Energy’s net investment in Frac Tech to date is approximately $115 million. Following a recent recapitalization, the company believes its investment will be worth at least $1.5 billion by year-end 2011.

Fugro has signed a letter of intent to acquire Kelman Technologies’ seismic data processing business. Fugro is also acquiring units of JDR Cable Systems.

GE has completed the acquisition of John Wood Group’s Well Support Division in a transaction worth $2.8 billion.

IHS has acquired Chemical Market Associates, its fifth transaction this year.

IndexIQ is to introduce its IQ Global Oil Small Cap exchange-traded fund, a ‘pure play’ investment targeting the oil industry.

Non destructive pipeline tester Profile Technologies has filed for Chapter 11 protection.

Seitel has sold a minority equity interest to affiliates of Centerbridge Partners for $125 million in cash.

In its annual report, TGS-Nopec reveals that it paid $3.6 million to acquire P2 Energy Solutions’ directional survey business in 2010 including $300,000 for software and $320,000 for a non-compete agreement with two of P2ES’ key employees.

Autonomy Corporation is to acquire ‘selected key assets’ of Iron Mountain’s digital division including archiving, eDiscovery and online backup.


First PIDX International EU Spring Meet

The new e-business standards body hears from Shell and BP. New supplier KPI standards proposed.

PIDX International held its EU spring conference in London this month. EU director Dave Wallis stressed the PIDX goal of becoming the sole global ebusiness standard for the oil and gas industry—backing up this claim with a world map showing PIDX use in over 20 countries.

Shell’s Huibert Vigeveno noted that overcapacity was changing the refining industry footprint from multiple small local refineries to ‘world scale’ facilities. Pressure on margins is likewise driving supply chain optimization. Shell’s approach is ‘global, standard and simple,’ leveraging its ‘GSAP’ transaction processing and reporting system and ‘CAP,’ a constellation of interoperating applications. Interaction with third parties is being standardized, leveraging a ‘global standard for data exchange’ under PIDX governance.

Kay McDonald, materials process manager with Shell, provided some chapter and verse on the scale of the procurement problem. Her department has 2,600 staff managing 30,000 contracts from 120,000 suppliers and over a million invoices annually—covering ‘everything except the hydrocarbons.’ An ‘end to end’ review of maintenance management procedures is underway to address issues such as maintenance practices that challenge on-time delivery of materials, unrealistic required-on-site dates and multiple short cuts around official procurement best practices.

Andy Walker described BP’s progress in master data management (MDM). Mergers and acquisitions have left BP with a variety of inherited processes and systems, impacting master data quality. BP’s vision is to leverage MDM in ‘controlled processes’ for creating and maintaining master records, with a single, group-wide record of reference for consuming systems. All BP ERP projects must link to MDM to comply with the BP master data standards. ‘Live’ MDM processes have been deployed at several BP upstream and downstream units worldwide—feeding into BP’s SAP and materials systems. These integrate with Dun and Bradstreet’s (D&B) ‘DUNS’ company identifiers and help track corporate legal daisy chains.

BP makes extensive use of SAP’s NetWeaver development environment for interactive forms, reporting and one-time data entry and validation. Data views consolidate to an SAP Enterprise Portal. MDM workflows leverage SAP Process Integration to feed data-consuming SAP and non-SAP ERPs. Walker emphasized BP’s use of ‘standards’ although, apart from D&B, it would appear that BP, like ExxonMobil, largely rolls its own.

Daryl Fullerton asked for volunteers to participate in developing a PIDX standard for exchanging supplier KPI1 data between service companies and operators2. The initiative sets out to fix ‘scorecard overload and data inconsistency.’ Such a standard would reduce data input for suppliers, improve data quality and speed performance review. Fullerton calculates the saving to industry from such an effort at $90-135 million. More from www.oilit.com/links/1105_26. 

1 Key performance indicator.

2 Not clear if this includes providing operator KPIs to suppliers!


Fiatech in 2010

Standards body progresses on RFID for materials management, ISO 15926 and handover guide.

The Austin, Texas-based Fiatech organization has just released its 2010 annual report. Fiatech works to develop standards and best practices in the engineering and construction arena and its activity intersects with upstream in engineering, facilities, initiatives in RFID standardization and through the ISO 15926 standard for equipment data exchange.

2010 highlights include advancing interoperability in partnership with the Norwegian POSC Caesar Association with the ISO 15926 Joint Operations Reference Data (JORD) project—a primer for ISO 15926 deployment will be published in 2011. 2010 also saw the publication of the Fiatech manual on ‘RFID for Materials Management and Productivity Improvement,’ (www.oilit.com/links/1105_29) representing the culmination of a two-year collaboration between manufacturers, operators, RFID technologists and global standards organizations.

Work proceeds in collaboration with the USPI-NL standards body on a Capital Facilities Information Handover Guide which will include ‘common workflows and checklists’ and ‘minimum vendor data deliverables to support construction planning, work-packaging, storage and maintenance processes.’ Fiatech’s total income for 2010 amounted to approximately $1.7 million—most from dues and the annual conference. Oil company members include ExxonMobil and Petronas; just about all the major EPCs and several software suppliers are also members. The 2011 Fiatech Technology Conference and Showcase was held last month in Phoenix, Arizona. A report on the show will appear in next month’s Oil IT Journal. More from www.oilit.com/links/1105_27.


PODS 5.1 highlights

DCP Midstream demos new One Call/damage repair sub-model.

Release 5.1 of the Pipeline Open Data Standard (PODS) model is due out in July 2011. A summary of key features is available on the PODS website (www.oilit.com/links/1105_30). PODS has kept the model size down to around 670 tables while adding sub-models for ‘One Call’ and damage prevention. The One Call workgroup aims to mitigate the risk of third-party damage, extending the model with support for One Call ticket information as well as third-party damage/repair information. A comprehensive prototype of the solution was presented by PODS user DCP Midstream, illustrating PODS to ESRI linkage and geoprocessing of One Call data. More from www.oilit.com/links/1105_31.


Sales, contracts, partnerships

Amalto, Complete Production Services, Cognity, Ansys, Autonomy Corporation, Baker Hughes, ONGC, Trayport, ENGlobal, Caspian Pipeline, Entero, Expro, ITT VIS, e-GEOS, Chevron Australia, McLaren Software, Mobiform, MatrikonOPC, ConocoPhillips, PinnacleAIS, Quorum Business Solutions, Roxar, Statoil, Technip, Cortex.

Amalto has deployed its ‘b2box’ PIDX-compliant invoicing solution at Complete Production Services.

Aberdeen-based design consultancy Cognity used simulation software from Ansys to complete an offshore oil drilling project. The solution was said to be ‘75% faster and more cost effective than conventional methods.’

Baker Hughes has been awarded a five-year contract by India’s ONGC to provide drilling and evaluation services and to manage third-party services for the Platinum Explorer Drillship.

The trading subsidiary of Italy’s power company Enel, Enel Trade, has gone live using Trayport’s GlobalVision Internal Marketplace.

ENGlobal has been awarded an engineering, procurement, and commissioning services agreement from the Caspian Pipeline Consortium, with an expected total value of approximately $86 million over four years.

Entero has implemented its ‘Entero One’ trading and marketing solution with two unnamed oil and gas companies in North America.

Expro has opened a custom-built facility in Ghana, West Africa. The 8,000m2 facility allows all project activity, including deepwater subsea tools, well testing, clean-up, sampling and PVT services, to be co-ordinated and monitored from a ‘fully integrated’ facility.

ITT VIS and e-GEOS are in a strategic partnership for COSMO-SkyMed data and SARscape. COSMO-SkyMed SAR data is distributed exclusively by e-GEOS.

Chevron Australia has chosen McLaren Software’s Enterprise Engineer to support the Gorgon Project. Enterprise Engineer for Projects will be deployed initially to manage incoming vendor and contractor documentation.

Mobiform has joined MatrikonOPC’s Global Partner Network, and will be integrating Matrikon’s full range of OPC Servers into StatusVision.

ConocoPhillips has engaged PinnacleAIS to enhance its inspection and reliability program at a gas plant located near Lost Cabin, Wyoming.

Quorum Business Solutions has announced the first hosted implementation of its Quorum Upstream accounting suite for an unnamed coal bed methane operator. Quorum also led the data conversion efforts from the previous asset owner to the client.

The first deployment of Roxar’s Fieldwatch 2.3 is taking place on Statoil’s Sleipner oil field in the North Sea where a combined sand and erosion monitoring system will see the integration of 3 Roxar Sand Erosion probes and 94 acoustic Roxar Sand monitors within Fieldwatch. The system provides Statoil with an overview of asset sand production and erosion as well as providing ‘smart alarms.’

Technip has been awarded a frame agreement for engineering studies by Statoil Brasil Óleo & Gás. The 3-year contract covers feasibility, concept and front-end engineering design studies for Statoil’s offshore fields and future developments in Brazil. The work will be performed by Technip’s operating center in Rio de Janeiro, Brazil and will achieve a minimum of 60% local content.

Western Canadian gas company WestFire Energy has joined the Cortex Trading Partner Network.


Standards Stuff

PIDX International, Oasis geothermal data standard, API on hydraulic fracturing best practices.

The American Petroleum Institute (API) has spun out its e-commerce standards unit, the Petroleum Industry Data Exchange committee, PIDX. A new independent, industry-managed, non-profit standards body, PIDX, Inc., replaces the venerable committee, originally created in 1987. PIDX, Inc. will continue to develop and maintain global oil and gas eBusiness standards—www.oilit.com/links/1105_64.

The Oasis organization has floated a ‘potential’ geothermal energy data standards project that sets out to create common data structures for geothermal energy information. Geothermal resource exploitation, like petroleum exploration, requires ‘robust’ 3D location coordinates, lifecycle data management and ‘heterogeneous’ metadata spanning geology, geophysics, hydrology and more. There is a need for ‘widespread data sharing’ and calculations ‘based on interoperable compatible data’—www.oilit.com/links/1105_65. Once more into the breach!

API director Erik Milito has ‘urged’ the US Department of Energy to rely upon the ‘robust best practices already in place’ for hydraulic fracturing. API released a publicly available series of industry guidance documents on hydraulic fracturing in February—www.oilit.com/links/1105_49.


SAS Global Forum

SAIC leverages operations research toolset in gas lift optimization.

Speaking at the SAS Global Forum in Orlando, Florida last month, Bob Hatton and Ken Potter of SAIC showed how SAS’ operational research tools (SAS/OR) can be used to keep gas injection rates in the optimum ‘sweet spot’ between an unstable low-rate régime and wastefully high injection rates. Field-wide gas lift optimization is a complex problem due to interactions between wells and constraints on gas use and production flow. Modern instrumented wells allow gas lift optimization curves to be established for each well. These can be combined using numerical techniques to derive optimum injection/production rates that maximize profitability in the face of gas costs and the oil price.

SAIC used Proc OptModel, a component of SAS/OR, to derive the solutions. The application included ‘on/off’ switches to allow for real-world temporary well shut-ins. The authors concluded that ‘SAS/OR optimization using Proc OptModel provides a powerful and flexible approach to solve complex optimization problems. The ability to translate objective functions into SAS Code and the efficient algorithms in Proc OptModel make the choice of SAS appropriate for both small and large problems such as the optimization of a 100 well oil field.’ Download the complete paper from www.oilit.com/links/1105_3.
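
By way of illustration only—this is not SAIC’s SAS/OR code—the Python sketch below allocates a fixed budget of lift gas across a handful of wells to maximize field profit, assuming a simple quadratic gas-lift performance curve per well. All coefficients, prices and limits are invented; the shut-in ‘on/off’ idea is represented by per-well bounds.

```python
# Hypothetical gas lift allocation sketch (not the SAS Proc OptModel implementation).
import numpy as np
from scipy.optimize import minimize

# Assumed per-well performance curves: oil_rate = a*q - b*q**2 (diminishing returns).
a = np.array([4.0, 3.5, 5.0])        # bbl oil per Mscf of lift gas (initial slope)
b = np.array([0.002, 0.0015, 0.003])
oil_price, gas_cost = 100.0, 4.0     # $/bbl and $/Mscf (assumed)
total_gas = 1500.0                   # Mscf/d of lift gas available (assumed)

def neg_profit(q):
    oil = a * q - b * q**2
    return -(oil_price * oil.sum() - gas_cost * q.sum())

# Field-wide gas budget; a shut-in well would simply get bounds of (0, 0).
cons = ({'type': 'ineq', 'fun': lambda q: total_gas - q.sum()},)
bounds = [(0.0, 1000.0)] * 3

res = minimize(neg_profit, x0=np.full(3, 300.0), bounds=bounds, constraints=cons)
print('optimal injection per well (Mscf/d):', res.x.round(1))
print('field profit ($/d): %.0f' % -res.fun)
```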


Wireless World

Rajant floats ‘GulfMesh’ radio broadband network. FreeWave rolls out LRS Series radios.

At the Offshore Technology Conference in Houston this month, Rajant Corp. floated its proposed ‘GulfMesh’ (GM) broadband network for the Gulf of Mexico. GM will be based on Rajant’s ‘BreadCrumb’ nodes and ‘InstaMesh’ software and will cover the area from Brownsville, Texas to the Florida panhandle. Target users include oil platforms, marine vessels and helicopters, which would connect to the network for Wi-Fi access, ship-to-shore communications and other IP-based services. BreadCrumb clusters on wells, pipelines or platforms can be connected via satellite transponder to satisfy a full range of communications needs. Rajant also showcased its ‘RAPTR’ solution, a quickly-installed radio network offering 25 Mbps bandwidth over a six mile range and 5 Mbps at ten miles. More from www.oilit.com/links/1105_24.

Meanwhile at the Entelec tradeshow in Houston this month, FreeWave Technologies was demoing its new licensed LRS Series radios, which are claimed to make optimum use of available bandwidth in narrow-band UHF channels ‘without the protocol overhead of native-IP radios.’ Chief marketing officer Ashish Sharma claimed that FreeWave is ‘the leading provider of wireless data radios for oil and gas with more than 50 percent of all new installations and 250,000 radios deployed.’ The LRS unit has EU certification for use in more than 27 European countries, plus Canada, Mexico and New Zealand. More from www.oilit.com/links/1105_25.


Cyber security round-up

Obama’s 2009 warning on oil and gas cyber security revisited. NPRA says ‘CFATS ain’t broke.’

Back in May 2009, President Obama warned, ‘We count on computer networks to deliver our oil and gas, power and water. […] But as we failed in the past to invest in our roads, bridges and rails, we’ve failed to invest in the security of our digital infrastructure.’ That warning has resurfaced with the release this month of the administration’s cybersecurity legislative proposal.

The Department of Homeland Security (DHS) is offering to help private-sector companies analyze their logs to see when a hacker broke in. The proposed legislation sets out new protocols for sharing information on threats, advocating ‘transparency to help market forces ensure that critical-infrastructure operators are accountable for their cybersecurity.’ The legislation also addresses enhanced protection of computer systems, notably through the DHS’ ‘Einstein’ intrusion prevention system.

In a separate announcement, Charles Drevna, president of the National Petrochemical & Refiners Association, endorsed the current Chemical Facility Anti-Terrorism Standards (CFATS) program, warning that, ‘Significantly altering the program could greatly risk the high level of security that has been established at chemical facilities and should not be considered.’


Houston Serious Games Research Consortium

ExxonMobil, Hess back move to promote gaming technology take-up in oil and gas.

Speaking at the recent high performance computing workshop at Houston’s Rice University, engineering special projects director Tony Elam showed how gaming technology, as used by the military, is being applied in various roles in oil and gas. Elam contrasted today’s snazzy gaming technology with traditional classroom-based teaching, suggesting that youngsters now expect a more attractive and engaging learning experience. Enter the ‘serious game,’ or virtual learning environment.

Military examples of serious games include USC’s ‘BiLat’ training system (www.oilit.com/links/1105_1), a ‘social’ simulator for training soldiers in negotiating with local leaders. In oil and gas, VRContext’s SimuLynx provides rig skills training to drillers and roustabouts. Simprentis’ OilSim is a comprehensive training system for decision taking, risk analysis and economics across the E&P lifecycle. The UH ‘PetroChallenge’ offers students a similar environment to test their oil and gas business acumen.

Elam announced a new Houston Serious Games Research Consortium with backing from ExxonMobil, Hess, HP, IBM, Microsoft, SAIC and others. The consortium’s mission is to ‘explore, promote, share and develop the emerging field of serious games.’ More from www.oilit.com/links/1105_2.


Class action against BP for ‘defective’ point of sale application

SeegerWeiss and Lee Tran & Liang lay into BP and developer Retalix on behalf of retail franchisees.

A nationwide class action against two US units of BP plc has been filed by law firms SeegerWeiss LLP and Lee Tran & Liang, APLC. The lawsuit was lodged this month with the US District Court for the Northern District of California on behalf of franchise holders of BP and ARCO gas stations and AM/PM convenience stores.

The case centers on a requirement by BP that the franchisees install a new centralized point of sale computer system, developed by co-defendant Retalix. This, according to the claimants, was ‘defective’ and resulted in ‘substantial damages’ to the franchisees in terms of lost time, lost revenue and inaccurate inventory. The plaintiffs also claim that BP ‘illegally manipulated gas supply,’ exercised ‘improper control of pricing’ and operated a policy of ‘forcing sale of items and collection of fees for which Plaintiffs receive no compensation.’

The ‘proposed class’ comprises some 1,600 franchised BP and ARCO gas stations and AM/PM stores across the country. The Service Station Franchise Association has been assisting the plaintiffs. More from www.oilit.com/links/1105_4 (SeegerWeiss) and www.oilit.com/links/1105_5 (LT&L).


Recovery Analytics production data mining spin-out

Edinburgh University researchers moot commercial version of ‘statistical reservoir model.’

Researcher Laurence Ormerod has set up Recovery Analytics, a ‘pre-incorporation spin-out’ project to commercialize the technology developed by University of Edinburgh professor Ian Main, as described in our report from the 2010 PETEX conference (Oil IT Journal, January 2011).

Main’s thesis is that current reservoir evaluation is far too subjective and that reservoir simulations need to be blind tested against publicly documented field data. Main has proposed a ‘statistical reservoir model’ derived solely from production data (whitepaper on www.oilit.com/links/1105_6).
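
For illustration only—this is not Main’s statistical reservoir model—the Python sketch below shows the flavor of ‘blind testing’ a purely data-driven model: a simple exponential decline is fitted to the first five years of a (synthetic, invented) production history and its forecast is scored against the held-out later data.

```python
# Minimal 'blind test' sketch with a stand-in statistical model (exponential decline).
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(120)                                                    # months on production
rate = 1000.0 * np.exp(-0.02 * t) * (1 + 0.05 * np.random.randn(120))  # synthetic rates

decline = lambda t, qi, d: qi * np.exp(-d * t)   # the 'statistical' model: q(t) = qi*exp(-D*t)
train, test = slice(0, 60), slice(60, 120)       # calibrate on early data, hold out the rest

params, _ = curve_fit(decline, t[train], rate[train], p0=(1000.0, 0.01))
forecast = decline(t[test], *params)
rmse = np.sqrt(np.mean((forecast - rate[test]) ** 2))
print('fitted qi=%.0f, D=%.3f per month, blind-test RMSE=%.1f' % (*params, rmse))
```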

The researchers are now looking to engage industry professionals involved in enhanced oil recovery projects and are requesting interested parties to complete their survey (www.oilit.com/links/1105_7). The survey is designed to identify the best application domains for the SRA methodology. Recovery Analytics is also interested in hearing from potential investors. More from www.oilit.com/links/1105_8.


Identec announces ‘IndustrySmart’ oil and gas RFID framework

Certification program builds on ‘WatcherSeries’ personnel safety system.

Austrian RFID1 and personal safety solution provider Identec has announced ‘IndustrySmart’ (IS), an RFID/sensor network certification program for oil and gas assets.

The IS program ensures that product tags are interoperable and meet ATEX2 and other certification standards used in oil and gas and other verticals.

The IS program sets out to enhance the quality of applications and services provided under Identec’s ‘WatcherSeries’ (Oil IT Journal, July 2008), a personnel safety system offering mustering solutions, monitoring of lone workers, voice communication and access control. IS products feature long range reading stations leveraging an RSSI3 baseline concept for connection to real time field data.
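
The ‘RSSI baseline’ idea can be pictured with a short Python sketch—a hypothetical illustration, not Identec’s WatcherSeries logic, which is not public: each tag’s recent signal strength readings form a baseline, and a reading that drops well below it flags the tag, for instance a lone worker moving out of a muster zone. The threshold and window sizes are assumptions.

```python
# Hedged sketch of per-tag RSSI baselining for presence/mustering alarms.
from collections import defaultdict, deque

BASELINE_LEN = 20        # readings kept per tag (assumed)
ALARM_MARGIN_DB = 15.0   # allowed drop below baseline before alarming (assumed)

history = defaultdict(lambda: deque(maxlen=BASELINE_LEN))

def process_reading(tag_id, rssi_dbm):
    """Return True if this tag's latest reading has fallen well below its baseline."""
    readings = history[tag_id]
    baseline = sum(readings) / len(readings) if readings else rssi_dbm
    readings.append(rssi_dbm)
    return rssi_dbm < baseline - ALARM_MARGIN_DB

# Example: a tag fading from around -60 dBm to -88 dBm trips the alarm on the last reading.
for rssi in [-60, -61, -59, -62, -60, -88]:
    if process_reading('TAG-001', rssi):
        print('muster-zone alarm for TAG-001 at %.0f dBm' % rssi)
```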

Identec claims to be ‘number one’ in safety solution provision in the North Sea and Middle East. Customers include ConocoPhillips, Statoil and Swire Oilfield Services.

The company recently raised $7.5 million from private investors. The cash will be used to expand the group’s US operations where the company is a member of the Oil and Gas RFID Solutions Group (www.oilit.com/links/1105_21). More from www.oilit.com/links/1105_22.

1 Radio frequency identification tag.

2 Atmosphères Explosibles.

3 Received Signal Strength Indication.


Kuwait National Petroleum Co. settles on Intergraph

State oil refiner to deploy multiple SmartPlant components in mega-refinery revamp.

State-owned downstream operator Kuwait National Petroleum Company (KNPC) is to use Intergraph SmartPlant Enterprise (SPE) in a major revamp of its refineries. SPE engineering design and information management solutions will be used to retrofit and increase the capacity of KNPC’s 415,000 bpd Mina Al-Ahmadi refinery.

SPE was selected following ‘intensive benchmarking’ of the major plant engineering and operation solution providers. KNPC senior engineer for engineering and maintenance, Mohammad Asad Alawady said, ‘We are pleased to have Intergraph as our technology partner to support the growth of our refinery. SPE integrated solutions, including the SmartPlant Foundation, will enhance our design process through the effective management of high-quality engineering data for all of our projects.’

KNPC is also to deploy SmartPlant’s 3D, P&ID, Instrumentation, Electrical and Review components. Project management will be provided by CyberMAK Information Systems in partnership with Intergraph’s Middle East distributor, Atheeb. More from www.oilit.com/links/1105_23.

