Speaking at Emerson Process Management’s (EPM) GUEX industry forum late last year, Larry Irving observed that some 43% of its business comes from oil and gas. Contrary to a widely held belief, notably within the ‘digital energy’ community, Irving said that oil and gas often ‘leads the way’ in technology implementation and in delivering best practices for other industries to follow.
Irving opined that, in the face of larger and more complex projects, the trend across the industry is the use of main automation contractors (MACs). MAC engagement is a necessary but not sufficient condition for a successful project. In larger projects, extreme instrumentation counts stress both design and deployment. One Gulf of Mexico digital oilfield included 20,000 instruments and some 60,000 connections.
Enter EPM’s ‘characterization modules’ (Charms), electronic marshalling units that offer a wide range of wired and wireless connectivity. In the above deployment, Charms reduced connection requirements by over a third, to some 38,000. Electronic marshalling also leverages pre-designed, pre-tested, off-the-shelf kit instead of the traditional custom cabinets. I/O count can grow substantially during a project. Here the flexibility gained from electronic marshalling saved one global engineering, procurement and construction (EPC) service provider some $375,000 in re-engineering and scheduling delay costs, as well as $100,000 in cabinet modification.
At the GUEX, Emerson was also showing off its ‘iOps’ center, a dual-use acronym that can be read as either an ‘integrated’ or an ‘intelligent’ operations center. iOps offers multiple screens, video conferencing and collaboration in support of activities from predictive maintenance to process safety. Three iOps use cases were on show at the GUEX—remote condition monitoring, wireless augmented safety systems and electronic trading.
One enthusiastic deployer of wireless iOps is Yong Chin Hieng, head of control and automation at Brunei Shell Petroleum. Hieng said, ‘We are stepping up wireless adoption following successful trials and Shell’s global smart wireless program. The trials began five years ago at Seria and at the Rasau production station. Deployment is now being extended to the monitoring of onshore and offshore oil and gas wells.’ Emerson partner Aisha Automation undertook the work under a global Shell-Emerson enterprise framework agreement. Emerson reports that to date, over a billion total hours of smart wireless operations have been clocked across 10,000 systems in more than 120 countries. GUEX also heard from ExxonMobil’s Sandy Vasser.
A report from GE describes a new ‘industrial internet’ backbone of technologies and services. This is set to optimize networks, plants and facilities, leveraging ‘big data’ and software diagnostics, connecting machines to the business.
GE identifies three enablers for the industrial internet, technology (sensors, instrumentation and user endpoints), cyber security and new ‘cross-cutting’ roles such as digital/mechanical engineers, data scientists, software and security specialists. The report envisions a ‘second internet revolution’ as machinery gets increasingly connected. As CEO Jeff Immelt puts it, ‘The industrial internet is revolutionizing our services. We will leverage our $150 billion backlog to grow this technology and our revenues 4-5% annually.’
GE’s specialty, ‘things that spin,’ is deployed in drilling, pipelines and ‘big energy’ plants such as LNG trains. Today’s industrial internet offerings include optimized facilities and a new subsea integrity management (SIM) system. SIM software combines data from sensors for vibration, temperature and leak detection across well heads, manifolds and production stations. Shell is the launch customer for GE’s SIM.
When you produce oil (or gas) from a reservoir the pressure in the reservoir goes down. This phenomenon is called ‘drawdown.’ How much drawdown there is depends on lots of things, like the size and geometry of the reservoir and how much oil or gas is flowing into the well.
The way pressure goes down following a period of production is used to estimate how much oil is in the reservoir and how much the well is capable of delivering in the longer term. If you have a very productive well in a smallish reservoir then drawdown (pressure drop) will be immediate and large. A poor well in a large reservoir may not have any discernible effect on pressure at all.
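The back-of-envelope version of this can be sketched as a single-tank material balance—a deliberate oversimplification (real well-test analysis uses transient pressure solutions), with entirely invented numbers:

```python
# Toy single-tank material balance, for illustration only.
# Assumes a closed reservoir of pore volume V (bbl) and total
# compressibility c_t (1/psi): producing a cumulative volume Np
# draws pressure down by roughly dp = Np / (c_t * V).

def drawdown_psi(cum_production_bbl, pore_volume_bbl, c_t_per_psi=1e-5):
    """Pressure drop (psi) after producing cum_production_bbl from the tank."""
    return cum_production_bbl / (c_t_per_psi * pore_volume_bbl)

# A productive well in a smallish reservoir: large, immediate drawdown.
small_tank = drawdown_psi(1e6, pore_volume_bbl=50e6)   # 2000 psi
# The same offtake from a large reservoir: barely discernible.
large_tank = drawdown_psi(1e6, pore_volume_bbl=5e9)    # 20 psi
print(small_tank, large_tank)
```

The same million barrels produced draws a small tank down a hundred times more than a large one, which is the crux of the argument that follows.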
Often, the results of such tests are kept rather secret. In the event of a good test result, a company may not want to attract too much interest before it grabs available acreage nearby or before the insiders have bought lots of shares (only kidding). A poor result may not be communicated before shares in the venture are dumped (there you go again!).
But that is not all that is ‘drawn down’ when you produce oil from a well. When you start producing oil from a well, the oil (or gas, especially gas) price goes down too! How much does the commodity price go down? It is just like reservoir pressure—produce a lot of oil or gas from a restricted economic zone and the price will go down a lot. Produce a little from a large area and the price will be relatively unaffected.
What is an economic ‘reservoir’ in this context? The USA’s gas market is a good example of such a restricted economic zone where gas production has ‘drawn down’ the market price to just over $3/mmbtu. In contrast, Total CEO Christophe de Margerie revealed this month that the company’s evaluation of US shale gas was performed at $6.
In the US, oil production is experiencing the same drawdown—with a $15 gap between the US WTI price and the ‘international’ Brent benchmark. This is created by the restrictions of geography. Oil is in the middle of the country and the users are on the coasts. There was a great report in Bloomberg Business Week recently about how pipelines and storage are being re-jigged in Cushing, Oklahoma to adjust to the new situation.
Musing on drawdown made me think of my March 2008 editorial ‘Heat pumps, phlogiston and the world wide web’ which discussed heat pumps as an energy source. You may like to reread this if you want to understand what follows.
I was skeptical at the time as to the possibility of getting energy from ambient air, but stumped by the ‘heatpumpists’ argument that the cooling of the exterior unit demonstrated that calories were in fact being extracted for free. I described this at the time as a kind of Monty Hall problem—something of a puzzler. In fact it is a complete fallacy as I will now explain.
Think of the analogy of a regular pump being used to raise water from say, a lake. In the vicinity of the pump’s intake, the lake’s surface is drawn down. But rather than adding energy to the system, it actually costs more to raise the water the extra distance! The same reasoning explains the cooling of the outside unit of a heat pump where thermal ‘drawdown’ can be so great that it actually freezes up.
Notwithstanding the fallacy, heat pumps continue to be manufactured and sold as ‘green’ devices that will reduce your energy bill. Since I wrote my editorial I was surprised to find that these fanciful devices have backing from no less than the International Energy Agency (IEA), a unit of the OECD, which has sponsored the Heat Pump Centre portal. Here you can read that although ‘heat flows naturally from a higher to a lower temperature,’ heat pumps are able to ‘force heat to flow in the other direction.’ This is achieved by, inter alia, ‘reversible air-to-air heat pumps.’ The mind boggles at such nonsense. But after all, the OECD/IEA is run, not by physicists, but by economists. Perhaps encouraging deployment of these curious devices is doing the world economy some good through some sort of ‘green’ economic stimulus.
Thinking about the air-air heat pumps (some are available as bolt-ons to your central heating boiler—and ‘extract’ calories from your cellar’s air!) made me wonder about other ‘geothermal’ systems like the serpentines that you can install under your lawn. Such systems, unlike air-air, are at least theoretically sound in that there is a heat source and a heat sink. They rely on a temperature difference between the air outside and the serpentine in the subsoil. That is OK for the theory, but what installers may fail to explain, and what purchasers may fail to understand is that such systems rely on an ‘impedance’ match between the source and sink. If you live on Vesuvius and have a zillion calories per second coming through the lawn, then you are going to need a very big fan and an extremely strong cold wind blowing constantly to make the system work.
Most times I imagine that such systems just chunter along producing, if not heat, then at least a warm feeling. I speak with conviction as a couple of years ago we had the outside of the house insulated. This has transformed the ugly old rendering into a Versailles-like façade.
But I am incapable of telling you whether it has saved us any money. To do so would require a search back through our accounts to find out how much we were spending on fuel over the last few years. And back through weather records to see how cold the winters were. And then through the fuel price tables to figure how many liters of fuel our bills represented.
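For what it is worth, the bookkeeping amounts to this (with entirely invented figures): convert each year’s fuel bill to liters via the price table, then normalize by heating degree-days to correct for how cold the winter was:

```python
# Back-of-envelope fuel accounting, all numbers made up for illustration.
def liters_per_degree_day(spend, price_per_liter, heating_degree_days):
    """Liters of fuel burned, normalized by winter severity."""
    return (spend / price_per_liter) / heating_degree_days

years = {
    # year: (fuel spend, average price per liter, heating degree-days)
    2009: (2400.0, 0.60, 2400),   # before insulation
    2012: (2100.0, 0.90, 2600),   # after insulation: colder winter, dearer fuel
}

for year, (spend, price, hdd) in sorted(years.items()):
    print(year, round(liters_per_degree_day(spend, price, hdd), 2))
```

In this invented case the bill barely moved, yet consumption per degree-day roughly halved—exactly the sort of conclusion that is invisible without the normalization.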
I suspect that most folks can’t be bothered with such hard work and that, having spent a fortune on insulation, or, heaven forbid, a heat pump, are happy to tell themselves that it was a good investment and leave it at that. Perhaps that was the OECD’s idea.
O’Reilly’s formula for success is a) identify a good subject, b) find a subject matter expert, c) perform brain dump. The Bad Data Handbook* (BDH) succeeds with step a) but that’s about it. BDH is a hotchpotch of essays from nineteen authors. Some manage to stay on message. Chapters on the cloud, on social media and on caring for machine learning experts, less so.
Kevin Fink provides an interesting peek (with code) at processing web log data. Paul Murrell offers advice on getting data out of ‘awkward’ formats like Excel (use XLConnect) and processing it with ‘R’.
We enjoyed Josh Levy’s chapter on ‘bad data in plain text,’ with an authoritative account of character encodings and text processing in Python. Adam Laiacano’s chapter on scraping data from web pages does a good job of showing what an ugly task this can be. For one website using Flash, this meant running Matlab scripts to extract text from screen grabs! Jacob Perkins’ ‘detecting liars on the web’ describes how Python’s NLTK library for natural language processing is used to classify movie reviews. Interesting, but again somewhat off topic!
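The character-encoding pitfalls that the ‘bad data in plain text’ chapter addresses can be shown in a few lines: UTF-8 bytes read back as Latin-1 produce the classic mojibake, and reversing the mistake recovers the text (real-world data is rarely this tidy):

```python
# Illustrative only: the classic mojibake round trip.
raw = "café".encode("utf-8")        # the bytes b'caf\xc3\xa9' on the wire
garbled = raw.decode("latin-1")     # misread as Latin-1: 'cafÃ©'
repaired = garbled.encode("latin-1").decode("utf-8")   # back to 'café'
print(garbled, repaired)
```

The repair only works because Latin-1 maps every byte to a character, so the original byte stream survives the misreading intact.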
A problem with BDH is that the subject means different things to different people. Philipp Janert’s chapter covers defect reduction in manufacturing, analyzing call center data and making the most of data with statistics-based hypothesis testing. BDH is very much in the modern world of NoSQL, file databases and the web. The topics of database integrity and naming conventions are not covered—even though these are key routes to clean data.
Ethan McCallum makes a brave attempt to tie all this together but his is less an editor’s role, more one of applying lipstick to the pig. Again, the problem with BDH is the subject and the fact that the book is mostly about making sense of data as it is found on the web. The issue of how to avoid creating bad data in the first place is not covered. Which is a shame as this is arguably more important.
* by Ethan McCallum. O’Reilly 2013. ISBN 9781449321888.
Summing up the recent series of webinars on a proposed earth sciences ontology, Krishna Sinha (Virginia Tech) and Leo Obrst (Ontolog) described the event’s objectives as ‘exploring the current status and application of ontologies for a semantically enabled cyber infrastructure for earth sciences.’ The event was supported by members of the EarthCube, Ontolog, and IAOA communities. Speakers offered the usual ‘jam tomorrow’ promises of the semantic web, seamless data sharing, discovery and integration. Semantic technologies and ontologies were described as ‘key building blocks’ for next-generation scientific infrastructures.
Semantic web technologies ‘appear to be’ widely applicable to large scale earth science data management and applications. A ‘semantic broker’ is said to go beyond ‘typical geospatial discovery services,’ allowing for automatic discovery of relevant and interoperable resources. The authors observe that the ontology/semantic web community has undergone several paradigm shifts over the years, with a move from top-down engineering to bottom-up construction allowing the discovery of information in ‘community-generated’ data available on the web. Existing vocabularies, markup languages and ontologies ‘vary greatly in scope and design principles.’ Better definitions of terms from authoritative sources and richer formalized representations are required.
The miniseries also referred to the OGC’s GeoSparql that seeks to provide a formal representation of concepts within OGC standards. However this is a geographic, not a geological, standard. Despite a heads-up from Oil IT Journal, the geological equivalent GeoSciML appears to have been studiously ignored in the mini-series, as indeed has the ‘prior art’ of the OneGeology initiative. More from EarthCube, earth science ontology and IAOA.
Speaking at the recent ECIM Well data management formats and workflows workshop held in Sandnes, Norway, Kronen Gård, Statoil principal analyst for data and information management, reported on ‘GeoTracker,’ a new well reporting compliance system. Statoil wanted a system to assure compliance in reporting well data to NPD, the Norwegian regulator.
The system was designed to track all well reporting activity from initialization in Statoil’s corporate database through internal workflows, QC, storage in PetroBank and reporting. GeoTracker assigns roles for different tasks and allows users to monitor progress throughout the reporting workflow. The objective is to complete all reporting to Petrobank within six months of a well reaching total depth (a legal requirement in Norway). GeoTracker uses a shopping cart metaphor for data gathering. Wells are initialized from an internal database to ensure the correct total depth date is used. The system enumerates tasks and relevant databases and users can add comments on workflow status. The system notifies users what remains to be done at any point in workflow. Tasks may be flagged as ‘please complete’ or ‘blocked’ when they depend on others’ actions.
GeoTracker provides a heads-up when action is required as in, ‘the following well has reached TD but is not yet initialized in GeoTracker.’ Nagging reminders of deadlines and recently unblocked tasks are sent to the appropriate stakeholders. The system also provides summary wellbore status reports in HTML or Excel.
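As an illustration only (this is not Kadme’s implementation, and the task and well names are hypothetical), the task-state bookkeeping described above might be sketched like this:

```python
# A minimal sketch of workflow tracking with 'please complete' and
# 'blocked' states, in the spirit of the GeoTracker description above.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    role: str
    status: str = "please complete"   # or "blocked", "done"
    comment: str = ""

@dataclass
class Wellbore:
    name: str
    tasks: list = field(default_factory=list)

    def outstanding(self):
        """What remains to be done at this point in the workflow."""
        return [t for t in self.tasks if t.status != "done"]

wb = Wellbore("NO 34/10-X", [
    Task("Load logs to PetroBank", role="data manager", status="done"),
    Task("QC formation tops", role="geologist"),
    Task("Report to NPD", role="compliance", status="blocked",
         comment="waiting on tops QC"),
])
for t in wb.outstanding():
    print(f"{t.name} [{t.role}]: {t.status}")
```

The nag list falls straight out of the `outstanding` query; a real system would add deadlines, notifications and per-role views on top.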
GeoTracker went into production in October 2011 and as of the end of November 2012, 133 wellbores had been reported to the authorities. Gård observed that reporting is like a relay race, ‘it’s not particularly complicated, but it’s important not to drop the baton!’ GeoTracker was developed by Kadme on top of its Whereoil system. More from Kadme and ECIM.
The 2012 edition of Weatherford’s Field Office is the first major release since Weatherford acquired CygNet Software (OITJ May 2011). The new release sees the adoption of CygNet’s Enterprise Operations Platform (EOP) as the infrastructure to Weatherford’s application portfolio. This includes applications for production optimization from conventional and unconventional reservoirs. The integration of real time SCADA data into Field Office is said to provide a ‘unified real-time production optimization solution for all artificial-lift types.’
Weatherford’s Steve Robb told Oil IT Journal, ‘Earlier versions of Field Office were well-centric. The 2012 edition is enterprise-centric and has a renewed focus on real time data. Today, almost all wells require some form of artificial lift. Optimization requires real time communications from the well to the office. Major producers now have to handle in the order of 5 billion data points per day. Unfortunately this data is often going into systems that lack a common platform, causing organizational chaos.’
Robb sees a general move from targeted engineering apps to a platform. Field Office can be extended to include third party tools (e.g. the Lufkin pump-off controller) through a ‘very open’ API. Today there are some 350,000 wells on Field Office, representing 75% of US gas and 70% of US oil production. Robb believes that operations should have a sponsor at the company board level. Alongside the CFO and CIO, there should be a ‘C-OpsO’ for operations.
This month sees changes to the way several upstream learned societies deliver their information assets. The Society of Exploration Geophysicists (SEG) is transitioning its digital library, a ‘corpus’ of over 35,000 articles, abstracts and books, to a new system based on Atypon Systems’ Literatum platform. The old system used the American Institute of Physics’ Scitation platform. A new Literatum for Mobile provides a more readily usable format for accessing content via mobile devices and tablets. The platform is also said to enable faster online publishing.
The American Association of Petroleum Geologists has also revamped its web presence with a map based interface to its Datapages archive of around half a million maps, photos, Bulletins and other information. The new Datapages Exploration Objects (DEO) GIS sees a migration of an earlier GIS-UDRIL archive to more recent ESRI ArcGIS Server technology.
Finally, another SEG, the Society of Economic Geologists, has teamed with Elsevier to port its 15,000 map archive to the Geofacets geographical search tool.
Geoscientist and blogger Matt Hall thinks that most well tie software is ‘broken,’ particularly by the chaos caused by proliferating log curve names. Schlumberger alone has over 50,000 curve and tool names reflecting decades of development. For Hall, most attempts at standardization are doomed because of the need to compromise. They are not fit for purpose and are hard to maintain in the face of changing technology.
Hall says, ‘Instead of trying to fix the chaos, cope with it!’ What is needed is a search tool for log names that is ‘free, easy to use, and fast.’ It needs to be comprehensive, containing ‘every log and every tool from every formation evaluation company.’ The tool should provide human and machine readable output along with information on the curve or tool, and links to more details. Enter Agile’s fuzzyLAS, an embryonic curve library-cum-web service that currently contains 31,000 curve names along with a web API for fuzzy searches. Hall invites those struggling with rogue curve mnemonics to get in touch via Agile to ‘help build something useful for the community.’
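In the spirit of fuzzyLAS (whose actual implementation and API are not reproduced here), a toy fuzzy lookup over a handful of invented curve mnemonics can be built with Python’s standard library:

```python
# Illustrative sketch: fuzzy matching of log curve mnemonics against a
# tiny hypothetical catalog, using difflib from the standard library.
from difflib import get_close_matches

CURVES = {
    "GR": "Gamma ray",
    "GRD": "Gamma ray, deep",
    "RHOB": "Bulk density",
    "NPHI": "Neutron porosity",
    "DT": "Sonic travel time",
    "CALI": "Caliper",
}

def lookup(mnemonic, cutoff=0.6):
    """Return (mnemonic, description) pairs for the closest matches."""
    matches = get_close_matches(mnemonic.upper(), CURVES, n=3, cutoff=cutoff)
    return [(m, CURVES[m]) for m in matches]

print(lookup("GRX"))   # a rogue mnemonic; suggests the gamma ray curves
```

A real service would rank across tens of thousands of vendor mnemonics and return tool metadata and links, but the fuzzy-match core is no more complicated than this.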
The 8.8 release of (formerly SMT’s) Kingdom Suite from IHS shows tighter integration with a number of IHS’ data products and web services. Kingdom Direct Connect now provides download of IHS international well data including headers, formation tops, well tests and more. IHS reports and US scout tickets can be viewed from inside the Kingdom Basemap and VuPAK components.
Kingdom 8.8 also offers web services-based access to IHS ‘Spatial Energy’ services including global web map and well services. New functionality in Kingdom’s seismic interpretation package includes automatically generated velocity grids for depth conversion and model update in real time for geosteering operations.
IHS’ GeoSyn 1D synthetic seismogram package is now integrated with the Kingdom database for the exchange of logs, checkshot surveys and computed seismograms. Data management is enhanced with an administration console to associate users with projects and control access with Microsoft Windows authentication and groups.
Tecplot’s RS 2012 reservoir simulation post processor adds a plot gallery, arbitrary slices through simulation grids and SUSE Linux Enterprise Desktop and Windows 8 support.
ClearEdge3D’s EdgeWise Plant 4.0 offers improved pipe extraction algorithms, ‘billion-plus’ point cloud visualization and dimensionally accurate component placement from a new ‘specification-driven’ library.
Energy Graphics has added touch screen technology to its Intellex suite, bringing its maps and datasets to digital wall maps and iPad devices. Maps can also be published from the Intellex GIS Data Server.
Kappa Engineering has released Kappa Server V5.0, a redesign of Diamant with enhancements including a ‘second generation’ wavelet and an ‘open’ server interface. Kappa also announced Kappa Viz V5.0 a 3D ‘virtual meeting place’ that manages models, logs, seismic blocks, geomodels and simulations from Kappa and third party applications.
Mechdyne Corp. has announced Connection Portal, a web-based interface that allows everyone in an organization to access computer assets from anywhere within the enterprise. ‘Ubiquitous access’ enables pooled blades, desk-side work stations, meeting rooms and collaborative work sessions.
The 4.6 release of New Century Software’s Facility Manager ArcMap extension enables centerline maintenance tasks such as new line creation, pipe reroutes, linking key documents and merging pipelines. The new release targets PHMSA MAOP requirements to ensure that records are ‘reliable, verifiable and complete.’
Pegasus Vertex has announced CemLab, a cement lab data management system. CemLab covers the time-consuming and expensive processes involved in designing and testing cementing slurries. The tool calculates ingredients, stores test results and generates lab reports.
Quorum Business Solutions is now offering its pipeline transaction management system (QPTM) as a hosted service. Operators can access contracts, nomination, scheduling and billing software from the ‘QCloud.’ QPTM addresses FERC regulated pipelines and storage facilities subject to NAESB regulations.
TD Williamson has released Interactive Report 2013, software support for inline inspection and pigging operations.
SPT Group’s Drillbench V6.0 offers dynamic modeling of influx and blowout situations. Drillbench now supports dual gradient drilling, integrates well paths from Petrel and reads pore and fracture pressure data from Techlog.
Petrosys has announced a ‘next generation’ edition of dbMap—its solution for managing digital seismic and log records, scanned reports, maps and core photographs. The latest ‘international’ release is a customizable, web-based version of the core dbMap product. Petrosys dbMap started life as the ‘Prospects and leads database’ (PLDB), a bespoke development for Santos that has been in use for several years. Petrosys has negotiated the redevelopment and extension of PLDB using its dbMap/web architecture.
Petrosys CEO Volker Hirsinger told Oil IT Journal, ‘The area of prospects and leads is very broad and requires careful integration with client workflows and procedures. It is attractive for companies wanting this functionality to come on board early in the development cycle. Bringing a few such companies on board is our current aim. We have also developed FracDB, a dbMap extension for frac data and we are currently commercializing an implementation of the PPDM 3.8 records management (RM) module for unstructured data management.’
According to Hirsinger, the RM module gives clients ‘full ownership and control’ of their knowledge base and lets them leverage best of breed tools from other PPDM vendors. Petrosys’ dbMap is a data management offering that connects to PPDM data stores, providing applications and data management tools in the form of tailored DDLs and an extended data model. The latest 17.3 release of the Petrosys desktop includes improved mapping from Excel spreadsheets and layered/georeferenced PDF output. More from Petrosys.
Past NDR Chairman (Emeritus) Stewart Robinson has issued a call to arms to the national data repository community to encourage collaboration on production data reporting. A production data initiative was kicked off at the 2011 Brazil NDR meet. This saw the collaboration of NDRs, software developers and oil companies ‘seeking better ways of managing oil and gas production data.’
A position paper, ‘Data Structures for National Regulators’, authored by Eric Atherton of Data Horizon, has subsequently been reviewed and updated with input from several regulators. A demonstrator was shown at the Kuala Lumpur NDR12 meet involving web services-based data exchange between Data Horizon and Performix. The next step is to extend the ProdML schemas into a new Energistics data standard for regulatory reporting of production data. The work will be conducted by the Energistics ProdML special interest group. The group aims to develop a ‘simple, flexible standard’ for regulatory use. More from Energistics.
There was a record turnout of 3,000 delegates for the 2012 edition of Petex, held last month in London. Keynote speaker Tony Hayward, now CEO of Genel, looked at the future of the oil and gas industry. Hayward cited high performance computing in seismic imaging, horizontal drilling and hydraulic fracking as recent industry game changers. This is largely due to the absence of a US energy policy. The USA makes up for the fact that it is the largest unregulated energy market in the world by being also the largest capital market. Rapid capital allocation is driving the necessary infrastructure and funding a buoyant service sector. Natural gas prices are low because of the absence of a regulatory framework. Already shale gas makes up 30% of US supply.
Shell VP Glenn Cayley looked into the future of global gas development. Current natural gas reserves equate to some 250 years of supply. Moreover gas produces half the greenhouse gas emissions of coal. ‘Installation costs’ for natural gas are five times less than nuclear and fifteen times less than wind power. 3D seismic has been a ‘real enabler’ as has floating LNG. Shell is building the world’s largest LNG floater, Prelude. Gas to liquids is a key new technology. The largest GTL plant in the world is under construction in Qatar. And there are also GTL possibilities in Louisiana. In the Q&A, Cayley was asked, ‘Is shale gas economic?’ He replied, ‘I hope so.’
Schlumberger’s Carl Trowell agreed that non-conventionals represent a ‘true paradigm shift’ where the source is now also both reservoir and trap. While scientific understanding is in its infancy, there is the potential for global changes in the supply and demand balance. Today the big issue is water management. Fracking requires a lot of water and also produces a lot of ‘polluted’ water. Today’s approach is unsustainable and won’t happen in Europe. In the US, water usage amounts to 700-800 trucks per well—billions of gallons of water and millions of tons of proppant—which equates to ‘not sustainable!’ Trowell anticipates that technology will help with more efficient fracking using less water and less proppant. Better reservoir evaluation and completion will be achievable by blending petrophysics, reservoir engineering, geomechanics and completion engineering. ‘Digital geology’ will help too, with better reservoir simulation and earth model building, ‘geology and geophysics are coming together, you no longer need to upscale.’
In the Q&A Trowell was asked if the shale gale was coming to Europe. He thinks not, ‘The US is a special case. Realistically, European activity will never reach the same level because of service intensity and the question of consumables. China may be more like the US.’ Answering a question on recruitment, Trowell said, ‘the industry is obsessed with attracting people. In fact you can recruit easily in the far east, if not in the EU or US.’ One question though is, do we need geologically-minded, digitally-aware geoscientists? Universities need to offer more education on the use of digital technology. Geomechanics and completions specialists are also in short supply. But much will come from more automation in interpretation technology.
Mark Hempton (Shell) returned to the role of technology in enabling new discoveries. Exploration is harder than it used to be, and technology reduces risk and cost while increasing safety. Improved seismic imaging for sub salt targets is leveraging multi azimuth, ocean bottom surveying, broadband and ‘million channel’ systems. Hempton cited the Cardamom Deep Gulf of Mexico development where wide angle seismic, anisotropic velocity modeling and Shell proprietary technology are being applied. Elsewhere, 2D broadband is enabling imaging through gas clouds. Million channel systems with fiber optics and/or wireless sensor networks promise to lower cost and speed time to first oil. Novel techniques such as ‘fast scan’ seismic interpretation have cut turn-around times tenfold. Forward stratigraphic modeling is also key to understanding sedimentation and fluid flow, and has proved especially useful in frontier areas where there is no well data. On the drilling front, innovative rigs and extended reach wells are lowering costs, as is more automation. ‘Technology is a differentiator.’ Hempton was also quizzed on the shale gas boom—he responded that the ‘jury is still out on non conventionals.’
Chris Reddick described ‘at-scale’ deployment of enhanced oil recovery (EOR) in BP. Today a mere 3.5% of oil production comes from EOR, which historically has been the bailiwick of engineers and geologists. Now, chemists, geochemists and rock scientists are getting in on the EOR act.
EOR is not just about the subsurface but also about facility engineering solutions. Here there is another challenge—holding up a development while lab experiments (for instance in flooding) are carried out. BP is developing relationships with Universities and other scientific resources in the service industry.
One challenge is how to monitor a water flood—often working with other owners across a unitized field. This may involve relationships with governments and regulators. The real issue is to get the EOR technology out there and optimize BP’s portfolio with water or gas flooding where needed. EOR is also now being considered much earlier in a project’s lifecycle, especially offshore.
So far polymer injection has seen little take-up because of environmental concerns. BP is working on these issues and trying to get new technology out of the lab and into the field. Low salinity (LoSal) flooding is a twenty-year-old technique that now looks promising. LoSal costs around $3 per produced barrel and requires extra topside facilities. Overall the economics are favorable.
For the onshore, managing produced water is the big challenge. Here membrane technology and ‘Bright Water’ look promising for improving sweep efficiency. Currently the industry ‘produces’ around three million barrels per day from EOR (BP alone gets 100,000 barrels). LoSal, originally developed at the University of Wyoming, was championed by BP for a decade before core flood lab tests finally demonstrated its viability. More from Petex.
Chairman David Alexander of Regency IT Consulting, a Cassidian/EADS unit, set the scene for the 2012 edition of SMi’s Oil & Gas Cyber Security Conference, held late last year in London, with a quote from Eric Byres, who described control system software as ‘a bunch of vulnerabilities wrapped in some SCADA control code.’ Subsequent presentations from cyber security specialists from oil and gas companies and the vendor community described a variety of ‘advanced persistent threats’ (APT).
Boldizsar Bencsath from Hungary’s CrySyS lab described how APTs are identified. CrySyS discovered and analyzed the Duqu virus (delivered via a Microsoft Word document) and has developed a Duqu detector toolkit. CrySyS also took part in the initial analysis of Flame, the ‘most complex malware ever found,’ some 6MB of information stealing malware that can activate microphones and web cams, log key strokes and ‘call home.’ Perhaps most scarily, Flame infects computers by masquerading as a Windows update, complete with a fake Microsoft certificate.
Bencsath observed that malware ‘need not be technically perfect to be very effective.’ What can companies do? Extend protection beyond signature-based techniques with anomaly detection, heuristics, baits and ‘honeypots.’ Education is key, as is forensics. Check suspicious network traffic. You never know what you might find! And put an incident response plan in place.
Finmeccanica’s Simon O’Gorman asked whether cyber risks are hype or reality*. Most attacks target well known vulnerabilities and ‘97% are easily preventable.’ Security has to be done on a risk/appetite basis within an available budget. There is no easy answer to the hype or reality question. Defense from an ‘air-gap’ has proved to be a myth with modern communications and individual behavior. Much SCADA equipment is updated from a USB stick. Wireless communications and mobile devices mean that perimeters no longer exist. Control systems and SCADA networks are open to public networks, witness Night Dragon, Flame, Gauss and Stuxnet/DuQu. So what is to be done? Apply ICT security basics and best efforts? This may be too simplistic. O’Gorman advocates ‘defense in depth’ leveraging frameworks such as those from Tofino Security or the UK’s CESG. Penetration testing is useful but may be difficult in a working plant. The best network is an invisible network, ‘you can’t hack what you can’t see.’
Oskar Wols (Shell) and Marcel Grooten (Information Risk Management) described the changing threat landscape confronting business critical IT and information systems. This includes ‘increasingly sophisticated and professionalized attacks.’ But IT can’t fully address such issues. The business needs to be in the driving seat and accept responsibility for the potential consequences. While IT can help to minimize risks, zero risk is never possible. And in the end, it is the business that carries the can. The business needs to take ownership of cyber security, to be fully aware of the risks and be able to react quickly. Data flows need controlling and business processes should be documented along with roles and responsibilities. During execution, ‘each individual step needs to be approved.’ IT needs to be able to continuously monitor business processes and flag deviations or unusual behavior. The authors expect that the technology required to achieve this could be available in five years or so. In the meantime, companies should map out their information flows, decide what rules need to be implemented, and discuss a common approach with industry and suppliers.
Iain Brownlie and Olav Mo, both with ABB, reported from the cyber security front line. At Shell’s Ormen Lange and Draugen Norwegian North Sea developments ABB is the single point of contact for all automation-related issues including security. Part of ABB’s role is as enforcer of basic rules governing authorized access, managing and protecting passwords and not leaving computers unprotected. ABB’s secure client server management (SCSM) oversees Microsoft security updates, antivirus, patches and backups. An ‘advanced service appliance’ (from Industrial Defender) automates data collection and monitors operational events.
Waterfall Security’s Colin Blou observed that most attacks involve password theft or persuading an insider to ‘pull’ your attack through, by phishing or just calling the help desk. If you can trust your users’ workstations, what about their cell phones? Firewalls are software too, with their own vulnerabilities and ‘zero days.’ Complicated systems may be hard to configure and maintain. Mitigation is costly, involving training, management, log reviews, audits and more. The ‘alternative,’ if not the answer, is to deploy industrial security best practices like application control/whitelisting, security information and event management (SIEM)/intrusion detection and unidirectional gateways. Waterfall’s ‘industrial security reference architecture’ is used to protect offshore platforms, refineries and pipelines with secure replication of historian data to the corporate network and remote vendor and IT support. Technologies from multiple vendors can all work together in a secure manner. But operations defense-in-depth is ‘very different from IT defense-in-depth.’
In a similar vein, Christian Probst (Danish Technical University) said we need to ‘mind the (security) gap!’ between IT and SCADA. Probst is working on such issues via the EU ‘Technology-supported risk estimation by predictive assessment of socio-technical security’ (TREsPASS) project. The goal is to develop an ‘analytic approach’ to identify and rank attacks.
GDF Suez’ Phil Jones outlined the UK government’s Cyber security guidance for business. The guidance document was released in September 2012 by CESG, GCHQ’s information security arm. Jones advocates a holistic approach to security with defense in depth, ‘offensive’ action and involving other stakeholders from HR, HSE, Operations and ‘anyone else who might be useful!’ Marius Brekke described how IPnett has developed an automated, secure access control solution for GDF Suez’ offshore installations. This provides ‘dynamic’ access control in compliance with OLF104 via IPnett’s ‘Shield’ admission control system. Shield has been successfully deployed on the Gjoa project where it has been used to move five operators from the platform to the shore. Today, the Gjoa IT Manager connects from his cottage. The system is also used on ENI’s Goliat project. More from SMi.
* Gillian Tett writing in the Financial Times reports that they are real.
Archie Deskus has joined Baker Hughes as CIO replacing Clif Triplett. Deskus was previously CIO with Ingersoll-Rand.
Dan Piette is the new executive chairman of TerraSpark Geosciences. He hails from Object Reservoir.
Nick Kontonicolas is to join the board of Coates International.
Beth Rosenberg has been confirmed by the US Senate as a board member of the Chemical Safety Board. Rosenberg is a Tufts University faculty member.
Don Dudley has left Petrosys USA to work as sales manager for SeisWare.
Drilling Info has named Dave Piazza as CFO. He hails from QuadraMed Corp.
Expro International has appointed Jim Renfroe as a non-executive director. Renfroe was previously director of Wood Group where he led the $2.8bn sale of the group’s well support division to GE.
Anindya Ray has joined FairfieldNodal as regional sales manager for South and West Asia and the southern FSU. Ray was previously with Knowledge Reservoir.
Actuant Corp has appointed Brian Kobylinski as executive VP (industry and China) and Sheri Roberts-Updike as executive VP Energy. Roberts-Updike joins the company from Tyco International. Kobylinski is a 20 year Actuant veteran.
Bill Barrett Corp. (BBC) has appointed Scot Woodall, COO, as interim CEO following Fred Barrett’s decision to step down. Jim Mogg is now non-executive chairman. BBC is now looking for a permanent CEO.
GE Oil & Gas’s subsea systems business unit is to build a new subsea centre at Bristol, UK’s Aztec West business park. The development will create around 200 new jobs in 2013 with openings for engineering, project management and commercial professionals.
GSE Systems has appointed Steven Freel to COO, a previously unfilled position. Freel has been with GSE for 16 years, most recently as CTO.
Chris Carter is the new CEO of PIDX International. He was previously with RigNet.
Ugur Algan is now a director of BGS International, an independent spin-out of the British Geological Survey.
Alvaro Bueno Buoro has joined Geovariances as a trainer in Rio de Janeiro.
Hart Energy has named Tammy Klein as senior VP downstream research and Mike Warren as senior VP upstream research.
ICIS has named Karl Bartholomew VP Americas. James Ray and Ed Sporcic have also joined the company.
IHS has named Rich Walker executive VP global finance and Todd Hyatt senior VP, CFO and CIO.
Jae Keun Ha is now oil and gas advisor for Intsok in Korea. Ha was previously with HHI.
Jay Lapeyre has been appointed chairman of the ION board following Bob Peebler’s retirement. Peebler continues as a consultant.
Patrick O’Brien is the new CEO of ITF.
Egil Haugsdal has been named executive VP business development with Kongsberg.
Knowledge Reservoir has hired Mali Braun as sales and business development representative in Canada.
Tim Davis has joined Midland Valley as a geological application tester.
Libby Hanna has joined the PODS association as communications and conference coordinator. She hails from GITA.
Arkex has launched gradiometry.com, a website covering the theory, technology and application of gravity gradiometry.
Schlumberger has acquired Oslo, Norway-based GeoKnowledge, developer of GeoX, a prospect risk evaluation tool.
Aveva has acquired the assets relating to Global Majic’s 3D virtual reality simulation software.
Following its November 2012 acquisition of St John’s, Newfoundland-based asset integrity specialist Thrum Energy, Aker Solutions has taken full ownership of its Canadian associate AKCS Offshore Partner. Aker also has acquired Separation Specialists, a California-based provider of produced water de-oiling products and field services.
Barco has acquired Herkules Capital’s 61% stake in Norwegian ProjectionDesign.
Finland-headquartered Metso has acquired ExperTune which will be rolled into Metso’s automation services unit.
French engineering services provider SPIE has acquired Australian upstream engineer Plexal, whose LNG, CSG and other offerings will be integrated into SPIE’s oil and gas business.
Weir Group has acquired Mathena, a manufacturer of pressure control equipment, for a down payment of $240 million and a contingent $145 million maximum payable over two years.
Following X-Change Corp.’s ‘default on its payment obligations,’ 4C Tech Holdings has announced the termination of its sale of Guardian Telecom.
Oil Price Information Service has acquired El Paso, Texas-based LCI Energy Insight.
Det Norske Veritas (DNV) is merging with Germanischer Lloyd (GL Group). The new entity, DNV GL Group, will be a ‘major provider’ of risk expertise and certification to the oil and gas sector.
GSE Systems has engaged investment banker Valufinder Group to ‘help identify potential targets for mergers and acquisitions in the U.S. and abroad.’
IHS has acquired Exclusive Analysis (EA) and the business of Dodson Data Systems. EA provides intelligence on worldwide political and violence risks. Dodson provides advice on US oil and gas operations.
Proserv Group has acquired Houston-based process control equipment specialist Total Instrumentation & Controls. Proserv is owned by Intervale Capital of Cambridge, Massachusetts.
Maria Milo has acquired Variance Reduction International from Sally Ulman who is now a senior VRI associate through her new company, Ulman and Co. Consulting.
Statoil and Siemens have signed up with a new EU-Funded semantic research project, ‘Optique’, a four year, €14 million investigation into ‘scalable, end-user access to big data.’ Optique sets out to bring a ‘paradigm shift’ for data access with a ‘semantic, end-to-end connection between users and data sources.’ Users will be able to formulate intuitive queries using familiar vocabularies and concepts and seamlessly integrate data spread across multiple distributed databases and streaming sources. Optique will furthermore exploit ‘massive parallelism’ for scalability ‘far beyond traditional RDBMSs.’ Project lead is Oslo University.
Another EU-funded project, ‘Geo-Know’ also launched recently. The three-year project will investigate interlinking, information management, aggregation and visualization of spatial web data, mitigating the ‘time-consuming and complex’ use of traditional GIS. One use case to be studied is supply chain management. The project is led from the computer science department of Leipzig University.
On the ‘better late than never’ principle, we report that the much ballyhooed Norwegian ‘Integrated operations in the high north’ semantic R&D project produced its final report mid 2012. IOHN set out in 2008 to ‘design, implement and demonstrate a reliable and robust software architecture to be used in an Arctic setting.’ This of course begged the question, ‘what is special about software in an Arctic setting?’ The answer, IOHN’s rationale, was that operational models for such environments depend on ‘an extended support network that requires collaboration across disciplinary, geographical and organizational boundaries.’ Enter IOHN’s ‘open standards’ for interoperability, integration and data transfer.
These involved the development of a semantic web-based integration platform for sensor data and semantic models for the upstream. Along with the W3C’s semantic web technology, IOHN was to achieve its ‘flawless’ information exchange using POSC/Caesar’s ISO 15926 protocols. The integration platform was developed using Cambridge Semantics’ Anzo Enterprise. This provided ‘virtualized’ access to information in source data stores, including Microsoft Excel. Anzo leverages the Open Services Gateway initiative (OSGi), a Java framework used in the open source Eclipse IDE.
Anzo was successful at combining data from Excel and other sources but the researchers determined that the underlying semantic web technology is ‘better suited for meta-data rather than sensor data due to the high overhead of RDF.’ However the team managed to implement and query instrument data in Sparql and the returned RDF/XML data could be parsed by applications.
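The RDF/XML round trip described above can be pictured with a small stdlib-Python sketch. Note that the sensor vocabulary, element names and values below are invented for illustration; they are not IOHN’s actual schema.

```python
# Minimal sketch: parsing a hypothetical RDF/XML result such as an
# integration platform might return for an instrument query.
import xml.etree.ElementTree as ET

RDF = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}"
SNS = "{http://example.org/sensor#}"  # hypothetical sensor vocabulary

rdf_xml = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:s="http://example.org/sensor#">
  <rdf:Description rdf:about="http://example.org/sensor/PT-101">
    <s:value>184.2</s:value>
    <s:unit>bar</s:unit>
  </rdf:Description>
</rdf:RDF>"""

def readings(doc: str) -> dict:
    """Map sensor URI -> (value, unit) from an RDF/XML document."""
    root = ET.fromstring(doc)
    out = {}
    for desc in root.findall(f"{RDF}Description"):
        uri = desc.get(f"{RDF}about")
        value = float(desc.findtext(f"{SNS}value"))
        unit = desc.findtext(f"{SNS}unit")
        out[uri] = (value, unit)
    return out

print(readings(rdf_xml))
# → {'http://example.org/sensor/PT-101': (184.2, 'bar')}
```

The verbosity is also visible here: a single reading carries several times its own weight in markup, which is the ‘high overhead of RDF’ the researchers flagged for sensor data.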
Modeling on the Snorre field found that semantic software tools need improvement. ‘There are hundreds of competing solutions for mapping between relational/tabular sources and the RDF graphs.’ Moreover, modeling tools are ‘not mature and not usable by oil and gas engineering domain experts whose knowledge is required to build a semantic model.’
A sub-project involved developing a ‘Software-related technology qualification process’ for qualifying software systems and components for dependability. This included the qualification of ‘architectures, systems and components (ASC) used in complex software-intensive (CSI) and software-intensive (SI) components.’ One unqualified IOHN success seems to have been acronym development!
The team used the free software tool ProtoSeq to map the safety requirements of the IEC61508 standard into a collection of patterns using the goal structuring notation. Other software quality testing methods were trialed such as failure mode and effects analysis and a ‘cloned buggy code detector.’
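As a rough illustration of what a cloned-code detector does (a naive sketch, not the tool the IOHN team actually used), one can normalize source lines and hash fixed-size windows, so that a bug fixed in one copy of a clone can be flagged in the others:

```python
# Naive clone detector: hash sliding windows of normalized lines and
# report any window that appears more than once.
import hashlib
from collections import defaultdict

def clone_windows(source: str, window: int = 3):
    """Return {digest: [start offsets]} for windows seen more than once."""
    lines = [l.strip() for l in source.splitlines()]
    lines = [l for l in lines if l and not l.startswith("#")]  # normalize
    seen = defaultdict(list)
    for i in range(len(lines) - window + 1):
        digest = hashlib.sha1("\n".join(lines[i:i + window]).encode()).hexdigest()
        seen[digest].append(i)
    return {d: pos for d, pos in seen.items() if len(pos) > 1}

code = """
x = fetch()
if x is None:
    retry()
log(x)
x = fetch()
if x is None:
    retry()
save(x)
"""
print(clone_windows(code))  # one duplicated 3-line window, at offsets 0 and 4
```

Real tools add token-level normalization (renamed variables, reformatted code) on top of this basic scheme.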
One NTNU student developed ontology tools for Scada security and performed a literature survey to classify security attacks and incidents and to help developers protect the industry from cyber threats.
Another sub-program investigated ‘digital innovation dynamics’ i.e. the role played by digital technologies in ‘creating, not simply representing, the materiality of physical phenomena.’ Such insight is said to explain the ‘ongoing transformations of the offshore petroleum industry’ with the advent of the ‘fully digital oil field.’ Here a ‘conflation’ of the material and digital worlds is transforming the nature of work, technology and organization of the offshore petroleum industry.
Each of IOHN’s sub projects (and there are many more we have not covered) is summed up with ‘successes’ and ‘lessons learnt.’ The latter, while not always equating with failure, are in general more informative than the successes. If you view IOHN as a large scale R&D funding exercise this is OK. Unfortunately, over the years, IOHN has been hyped as a panacea for collaboration and integrated operations. In this context it is hard to identify a concrete outcome.
Whether it was wise to direct so much of Norway’s oil and gas R&D funding towards what remains unproven technology is another question. But Norway is in good company here. The EU has just launched ‘Optique’ and is to plough some €14 million into research on ‘semantic, scalable end-user access to big data.’ See above.
Accenture has teamed with SAP on the deployment of SAP upstream operations management and SAP production and revenue accounting solutions leveraging Accenture’s ‘advanced enterprise solution’ of customized SAP solutions.
Aspera and OvationData are to deliver high-speed data transfer and management solutions to the oil and gas industry leveraging Aspera’s patented ‘Fasp’ data movement technology.
eLynx Technologies has expanded its service offering in a deal with ViaWest, a data center located in downtown Las Vegas. eLynx’ web-based monitoring, alarming and oilfield automation services will be co-located at ViaWest’s secure 100,000 sq. ft. Lone Mountain facility.
Surgutneftegas reports successful trials of SAP’s Hana (a.k.a. ‘Hasso’s new architecture’) in-memory computing platform. CTO Rinat Gimranov said that following the trial on a 200GB dataset, Hana ‘will see widespread deployment’ in Neftegaz.
Atos has teamed with SAP on sustainability ‘leadership,’ providing customers with sustainability and IT expertise leveraging SAP solutions for energy management.
Bolivia’s YPFB Transporte reports use of the ESRI ArcGIS pipeline data model. The ESRI geodatabase schema is used to provide GIS support to YPFBT’s pipeline mapping and in-line inspection activities.
FMC Technologies has received a $114 million subsea equipment order for LLOG Exploration’s ‘Delta House’ project.
Gazprom’s Geologorazvedka unit has contracted with Schlumberger for the supply of technology solutions and software products and development of a personnel training program.
Brazilian engineering company IESA has selected Intergraph SmartMarine and SmartPlant enterprise solutions for a major Petrobras project covering six identical floating, production, storage and offloading (FPSO) vessels. Local partner Sisgraf is to provide training.
IPL is to deliver E-doc, an electronic duty of care solution for the UK Environment Agency. E-doc development has support from EU LIFE+, the EU’s funding instrument for the environment.
Mechdyne is installing a ‘Cave’ virtual reality environment at the Energy Innovation Center (EIC) of the University of Wyoming. The Cave will be used inter alia to study how oil, gas, and water move and interact in the subsurface and to ‘increase recovery from unconventional reservoirs’.
Grupa Lotos has selected Allegro Development Corp.’s Allegro 8 energy trading and risk management platform to manage its international crude oil and refined products trading operations.
Inpex has selected Intergraph SmartPlant Enterprise as the information management system for its Ichthys LNG Project in Australia.
Odfjell Drilling has selected IFS Applications for Offshore Service. The contract includes licenses and the first phase of the implementation project.
Santos has awarded Jacobs Engineering a contract for the concept development study on its Cooper Gas Growth satellite developments in South Australia.
Shell has renewed its five year contract with Deutsche Telekom’s T-Systems unit for the provision of hosting and data storage services. T-Systems provides Shell with data centre infrastructure and computing services around the world—notably for Shell’s cloud-based SAP deployment.
Shell Gas & Power Developments has signed with the Technip Samsung Consortium to enhance collaboration on the design, engineering, procurement, construction and installation of future FLNG facilities.
The Standards Leadership Council (Oil ITJ September 2012) is to hold its first international meeting next month. The meeting is scheduled for February 26th at London Heathrow. A full day session is planned to discuss the work that has been done so far, the impact the alliance is having on the industry and plans for the future.
At the time of writing (25th January), the program for the meeting is unclear. However, an early announcement suggested that mapping between WitsML and the PPDM relational data model was on the agenda. The SLC is backed by Energistics, PPDM, PIDX, Mimosa, OpenGeospatial, OPC Foundation, PODS, PCA and the SEG.
Kista, Sweden-based ProAct, through its Norwegian unit, is to provide cloud-based data storage services to independent oil and gas company Lundin. With operatorship of the Brynhild and Edvard Grieg fields as well as a share in the giant Johan Sverdrup discovery, Lundin is experiencing significant growth.
Lundin CIO Martin Leslie said, ‘In 2011 we had 90 terabytes of data. This is set to increase to 320 TB in 2013. As our current NetApp solution is running out of capacity, we went to tender and ProAct was selected. ProAct has served as a strategic advisor and has references with small and larger clients both nationally and internationally.’
ProAct will deploy a hybrid cloud model where its equipment will be located in Lundin’s environment and mirrored to a redundant location. Following a ‘thorough investigation’ Lundin opted to stay with a NetApp platform (ProAct is a NetApp partner).
Sandvika, Norway-based Kongsberg Oil & Gas Technologies has been selected by Total E&P and Dong UK to supply a production management system (PMS) for the UK North Sea Laggan-Tormore project. Laggan-Tormore is an ambitious subsea completion located 125km north-west of the Shetland Islands in a ‘uniquely challenging’ environment. The PMS will be installed onshore at the Shetland gas plant processing terminal along with an ‘overall flow metering system’ (OFMS), also from Kongsberg.
The PMS is a dynamic model of the 140 km production pipelines and gas processing facilities that allows operators to understand process behavior and ensure safe and efficient production. To manage the liquids inventory, particularly the MEG* closed loop system, a high fidelity model is required. This is achieved with a dynamic model based on LedaFlow, a new transient multiphase flow simulator that was developed jointly by Kongsberg and Total. The OFMS provides production reconciliation and acts as a control for drift in the system’s multi phase flow meters.
* mono-ethylene glycol, a hydrate inhibitor
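The reconciliation idea behind an overall flow metering system can be sketched as follows: sum the individual (multiphase) meter totals for a period, compare with a reference export measurement, and alarm when the imbalance exceeds tolerance. The tolerance, meter names and figures are illustrative, not Kongsberg’s.

```python
# Hedged sketch of production reconciliation as a drift control.
def reconcile(meter_totals: dict, reference_total: float, tolerance: float = 0.02):
    """Return (imbalance fraction, drift alarm?) for a reporting period."""
    measured = sum(meter_totals.values())
    imbalance = (measured - reference_total) / reference_total
    return imbalance, abs(imbalance) > tolerance

wells = {"W1": 5120.0, "W2": 3110.0, "W3": 2070.0}  # Sm3/day, per meter
imbalance, drifting = reconcile(wells, reference_total=10000.0)
print(f"imbalance {imbalance:+.2%}, drift alarm: {drifting}")
# → imbalance +3.00%, drift alarm: True
```

A persistent, growing imbalance attributed to one meter is the signature of drift; the PMS model helps separate this from genuine inventory changes in the MEG loop.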
An article in GE Magazine by specialists from patent lawyers Yetter Coleman LLP suggests ‘five ways to win patent litigation.’ The authors opine that patent litigation over oil and gas technology is ‘vibrant’ with a recent $106 million award to WesternGeco in its suit against ION and other ongoing suits over rig automation control systems.
If it is within your means, you might consider building your own patent portfolio to deter competitors from suing as Halliburton did after losing a $98 million hydrofrac suit to BJ Services in 2003. It is also a good idea to develop a procedure for archiving documents proving the timing of the launch. Doing this now can strengthen the defense in subsequent suits. In the digital era it may be hard to produce ‘clear and convincing’ evidence of your work—especially as document retention policies may be working against you.
Sometimes a vigorous defense may scare off a litigant if it looks as though the prospects of success are declining. In some cases an out-of-court challenge to a patent may put a case on hold pending a decision from the Patent Office. But watch out when the patent emerges from re-examination intact, giving your adversary ammo to use against you. Communication is key in patent litigation.
The authors observe that jury trials bring ‘frequent, sometimes staggering verdicts’ in patent suits. Paradoxically, although the burden is on the plaintiff to prove its case, often the odds may seem stacked against the defendant. ‘Feeling overwhelmed and gun-shy is exactly the wrong response.’ More from Yetter Coleman.
Blue Marble Geographics is claiming ‘gold-level’ compliance with the OGP’s GIGS* guidelines for the 6.6 release of its GeoCalc software development toolkit. The GIGS guidelines from the International Association of Oil & Gas Producers Geomatics Committee establish degrees of geodetic integrity in software tools according to a four-level scale. Gold compliance indicates an ‘extensive capability to perform coordinate operations, incorporating features that expand the range of applicability and/or reduce the probability of geospatial integrity violations.’
GeoCalc provides coordinate transformation capabilities in many interpretation packages used in oil and gas. The new release provides automatic downloading of horizontal and vertical grid shift files, a datum shift generation tool, direct coordinate transforms and support for the new INdicio 2.2.1 web-hosted version of the EPSG database.
Blue Marble has been a contributor to the GIGS evaluation process since 2006 and its Geographic Calculator and GeoCalc SDK were used to develop the GIGS coordinate transformation best practices guidelines.
* Geospatial Integrity of Geoscience Software.
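The kind of coordinate operation at stake here can be illustrated with a classic 7-parameter Helmert (position vector) transformation of geocentric XYZ coordinates, a staple of datum shifts in the EPSG dataset. This is a sketch only; the parameter values in the example are hypothetical, merely similar in magnitude to published European datum shifts.

```python
# Small-angle 7-parameter Helmert transformation, position vector
# convention. Rotations are given in arc-seconds, scale in parts per
# million, translations in metres.
import math

def helmert(x, y, z, tx, ty, tz, rx, ry, rz, ds_ppm):
    """Apply a small-angle 7-parameter datum shift to geocentric XYZ."""
    s = 1 + ds_ppm * 1e-6
    rx, ry, rz = (math.radians(a / 3600.0) for a in (rx, ry, rz))
    x2 = tx + s * (x - rz * y + ry * z)
    y2 = ty + s * (rz * x + y - rx * z)
    z2 = tz + s * (-ry * x + rx * y + z)
    return x2, y2, z2

# hypothetical shift of a geocentric point (parameters are illustrative)
pt = helmert(3657660.66, 255768.55, 5201382.11,
             tx=446.448, ty=-125.157, tz=542.060,
             rx=0.1502, ry=0.2470, rz=0.8421, ds_ppm=-20.4894)
print(pt)
```

Geodetic-integrity pitfalls such as sign-convention confusion (position vector vs. coordinate frame rotation) are exactly what the GIGS test suites probe for.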
An article in the current issue of Aker Solutions Magazine offers an interesting perspective on how a modern offshore facility or drilling rig is designed and built. While rigs and platforms are often one of a kind, they are built from common components with extensive design re-use, an approach called knowledge-based engineering design.
KBeDesign includes design rules, standards and estimation tools stored in a database such that installations can be tailored to region’s environmental conditions and local legislation.
KBeDesign was developed around TechnoSoft’s Adaptive Modeling Language which covers modeling and simulation throughout the development cycle. AML supports automatic product configuration, design, manufacturing and production planning. AML provides support for standards including IGES and STEP. An underlying object-oriented database offers open access and links to geometry engines such as Parasolid, ACIS, and Shapes.
Aker believes that KBE is a significant improvement on traditional ‘static’ computer aided design particularly when leveraged in ‘intelligent scaling’ of previous designs. Where traditional CAD may require weeks or months to implement changes, one recent semi-submersible was resized in a day.
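A toy example conveys the KBE idea: design rules live alongside the parametric model, so resizing an installation re-checks them automatically. The rule, numbers and class below are invented for illustration, not Aker’s or TechnoSoft’s.

```python
# Toy knowledge-based engineering sketch: a parametric component with
# an attached design rule that is re-evaluated after any resize.
from dataclasses import dataclass

@dataclass
class Deck:
    span_m: float
    beam_depth_m: float

    def scale(self, factor: float) -> "Deck":
        # rule-preserving parametric resize: depth follows span
        return Deck(self.span_m * factor, self.beam_depth_m * factor)

def check_rules(deck: Deck) -> list:
    """Return violated design rules (illustrative slenderness limit)."""
    violations = []
    if deck.span_m / deck.beam_depth_m > 20:  # invented limit
        violations.append("span/depth ratio exceeds 20")
    return violations

base = Deck(18.0, 1.0)
print(check_rules(base.scale(1.5)))  # rule-preserving resize → []
print(check_rules(Deck(27.0, 1.0)))  # naive stretch → ['span/depth ratio exceeds 20']
```

‘Intelligent scaling’ is the first case: the model knows which parameters must move together, which is why a resize that would take weeks in static CAD can be done in a day.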
Kongsberg Oil & Gas Technologies is to develop a suite of ‘decision-support consoles’ for BP’s well construction teams. The project is a component of BP’s multi-million dollar ‘Well advisor’ program that sets out to integrate real-time data with best practices and expertise in well construction. Well advisor is being developed by BP’s Global wells organization (GWO) which was set up in Q4 2010, inter alia, to implement the findings of the Bly Report into the Macondo blowout.
Well advisor was touched on by BP recently in a wide ranging top level presentation of the company’s new strategy. Building its new drilling knowledge base has involved the hire of a ‘renowned’ well control expert, six blow-out preventer specialists and a new ‘operated by others’ team to poach, sorry, ‘attract expertise’ from other operators. BP has also brought in quality assurance skills from aerospace and has ‘recruited from the armed forces.’ The new consoles will be based on Kongsberg’s Discovery Web platform, a component of its SiteCom suite. Kongsberg has strengthened its development team significantly in support of the project.
A new white paper from the ECCMA standards organization offers help to those seeking to optimize their SAP materials management systems. The 25 page document offers advice on fixing data issues and maintaining master data quality in an SAP database and introduces ECCMA’s own system for building and managing a standards-based corporate dictionary. Author Peter Benson observes that one of the greatest strengths and weaknesses of SAP is how easy it is to change material names and descriptions. SAP provides no guidance on how to create consistent high quality material nomenclature.
Enter the corporate dictionary—derived from an analysis of existing material master databases and purchase orders. These can then be mapped to the ECCMA Open Technical Dictionary (eOTD), a new ‘open’ public resource of terminology from standards bodies and industry associations.
Benson advises against acquiring product data from the internet. It is far better to ask suppliers for such information—a process described as ‘cataloging at source’ (C@S). C@S supports applications using the ISO 22745 suite of supply chain automation protocols. Benson’s work leverages the NATO stock number (NSN) registry, an inventory of over 20 million materials and their codes. More from eCDM and eOTD.
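The corporate-dictionary idea can be reduced to a sketch: normalize free-text material descriptions against a controlled vocabulary so that duplicate spellings map to a single standard name. The alias table and terms below are invented for illustration, not ECCMA eOTD content.

```python
# Sketch of material-master cleansing against a corporate dictionary.
ALIASES = {
    "ball vlv": "VALVE, BALL",
    "valve ball": "VALVE, BALL",
    "ball valve": "VALVE, BALL",
    "gate vlv": "VALVE, GATE",
}

def normalize(description: str) -> str:
    """Return the dictionary term for a raw description, or flag it."""
    key = " ".join(description.lower().split())  # case and whitespace fold
    return ALIASES.get(key, f"UNMAPPED: {description}")

raw = ["Ball Vlv", "VALVE  BALL", "ball valve", "check vlv"]
print([normalize(d) for d in raw])
# → ['VALVE, BALL', 'VALVE, BALL', 'VALVE, BALL', 'UNMAPPED: check vlv']
```

The unmapped residue is exactly where dictionary building starts: each flagged description either gets a new alias or exposes a genuinely new item, which is why Benson derives the dictionary from existing material masters and purchase orders.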
Boston Networks has deployed a real-time asset tracking system from AeroScout (a Stanley Black & Decker unit) at FMC Technologies’ subsea design and manufacturing facility in Dunfermline, Scotland. The system identifies and locates high value equipment inside the facility, leveraging FMC’s existing Boston Networks Wi-Fi infrastructure.
FMC manufactures subsea trees, wellhead systems and associated equipment for the oil and gas industry where assets are regularly on the move. Real-time visibility of components helps optimize operations, increase production and improve safety.
John Johnston, HSE advisor with FMC said, ‘The system tracks high value assets in real time and helps us comply with our specialist statutory licenses. We are now entering Phase II of deployment, extending the system to other items within the facility. The system is also being evaluated for use at other sites within our global organization.’
Boston Networks’ solution consists of AeroScout Wi-Fi RFID Tags and MobileView software which allows staff to quickly locate needed items from an intuitive graphical user interface. Tags are waterproof and ATEX certified. FMC also uses AeroScout to track regulated assets for compliance and audit. More from Boston and AeroScout.