There was some good feedback on our November 2013 lead, ‘Object Management Group floats data distribution service (DDS) as performant alternative to OPC-UA.’ First from Michel Condemine (president of the French chapter of the OPC Foundation), who denied that ‘visibility and internet connectivity were missing from OPC-UA,’ offering a detailed explanation of how OPC-UA is fully internet-capable and can leverage multiple HTTP encodings. In the expectation that this might kick off an interesting debate, we went back to our OMG/DDS sources, who have essentially backtracked their claim for the superiority of DDS over OPC and/or Witsml—claims which, by the way, we accurately reported.
To set the record straight, OMG CEO Richard Soley told us, ‘DDS has a decade of experience on OPC, and the DDS designers have constantly updated the standard based on our own and others’ experiences with various middleware technologies. But we strongly believe that the real world will always feature many different solutions to any given problem, including middleware.’
We expect to be able to bring you more on the OPC/OMG debate, and indeed their interactions with other standards bodies, following the 2014 gathering of the Standards Leadership Council in Paris next month.
I figured that I would editorialize this month by opening up my burgeoning ‘industry at large’ file. Mood music in the industry is changing rather fast with Shell’s backtracking from the Arctic and a less than stellar performance from other majors. But first, to cheer you up (?), a report from CareerBuilder on the ‘hot’ oil and gas jobs in the UK. There are some 450,000 oil and gas jobs in the UK North Sea and employment ‘is expected to rise over the next 20 years as the oil industry is overhauled.’ CareerBuilder’s ‘hot jobs’ reflect the sectors showing the greatest wage increase between 2010 and 2013. The N° 1 slot is held by ‘physicists, geologists and meteorologists,’ up 25% to £20.50 per hour. Next come finance managers, up 17% to £28.90. IT user support technicians saw a 17% hike to £13.40 and mechanical engineers a 6% increase to £20.65. These are the medians; your mileage may vary.
On the other side of the pond, the API reports that oil drilling rose 5% in 2013 but this was balanced by a 20% decline in natural gas completions. A survey by Urbach Hacker Young and the Oil & Gas Financial Journal has it that two-thirds of oil and gas companies plan higher capital expenditure in 2014, reflecting in part a need to ‘focus unconventional projects on improved production.’ I find this rationale intriguing. Perhaps the increased capex is required to fuel what IHS has described as ‘surprising resiliency’ in Bakken and Eagle Ford production despite ‘exceptionally high production decline rates, and the lack of a third major tight oil play.’
Production from unconventionals is at the heart of the latest report from the International Energy Agency which sees shale production as contributing to an energy price gap between Europe and the US which, according to IEA chief economist Fatih Birol, will ‘last at least twenty years.’
The January-March edition of the excellent Reservoir Solutions newsletter from Ryder Scott puts the IEA’s bullish and un-nuanced view into context. In his lead, John Lee of the University of Houston reports ‘serious unknowns’ in production forecasts of shale plays. Today, industry relies on empirical production decline models developed a century ago: ‘Industry has no models that totally account for the physical processes of [frac] flow regimes.’ The Society of Petroleum Engineers is planning a summit to look into such issues.
Recalling CEO Rex Tillerson’s comment on how ExxonMobil was ‘losing its shirt’ on shale gas back in 2012, I scanned through Exxon’s ‘Outlook For Energy’ (O4E) hoping to find some nuance. There was none. This year’s O4E toes the party line and sees shale gas transforming the US economy and having potential to do the same elsewhere. O4E has it that North America is expected to shift from being a net importer of natural gas to a net exporter by 2030, as production growth from shale and other unconventional sources ‘outpaces demand.’ Which could be seen as a Freudian slip since an outpaced demand will likely cause a price collapse, but I digress. One interesting facet of the O4E numbers is that they come, not from Exxon’s own research, but from, guess who, the IEA! Now who, I ask you, is best placed to evaluate US shale gas potential, ExxonMobil or the IEA’s functionaries?
The subtext in O4E is likely one of wishful thinking along the lines of: even if, as we think, there is probably less gas than the IEA has it, we will continue to ‘lose our shirts’ unless we are allowed to export it to the EU and Japanese markets. But why the IEA toes the Exxon party line is a puzzle. There must be some great lobbying going on behind the scenes. Similar wishful thinking is evidenced in the November 2013 report to the International Association of Oil and Gas Producers (OGP) on the effects of shale gas production on the EU economy. A ‘shale boom’ scenario sees some €540 billion of savings to the EU economy out to 2015 although, to be fair, the OGP study acknowledges many unknowns.
But politicians in the EU, notably David Cameron in the UK and Arnaud Montebourg in France (who by the way now seems to be convinced that propane is an environmentally attractive alternative to water for fracking) seem to have drunk the Kool-Aid. Perhaps one way to view all this is as an updated version of Keynes’ digging and filling in holes in the road as a way to kick start the economy. As Woody Allen says, whatever works!
A Redpaper authored by IBM’s Akram Bou-Ghannam, ‘Foundational ontologies for smarter industries’ (Fosi) shows how semantic web-based technology developed for oil and gas could help other verticals optimize their business. A companion volume, ‘Smarter environmental analytics,’ provides examples of the technology deployed at offshore oil and gas facilities. Fosi posits that semantic technology provides a ‘flexible integration approach that can accommodate change,’ one that promises ‘better interoperability of diverse information assets than traditional IT systems’ which are ‘inflexible and fragile’ when new information comes in.
IBM’s ontology is built from public-domain components including the W3C’s Semantic sensor network (SSN), NASA’s quantity-unit-dimension-type (Qudt), GeoSparql and BasicGeo. The technology that binds all of the above together is the W3C’s resource description framework (RDF) which should be familiar to Oil IT Journal readers. Bou-Ghannam repeats the usual, unproven assertion that RDF’s graph data model enables ‘artificial intelligence, machine learning and inference.’ IBM’s own contribution is the DB2 database which accommodates RDF. Semantic technology is also supported in IBM’s Rational data modeler and will be in ‘future versions’ of Watson.
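For readers unfamiliar with RDF, the ‘graph’ model the Redpaper builds on is simply a set of subject-predicate-object triples. The following toy Python sketch is our own illustration, not IBM’s code; the URIs are invented shorthand that merely echo the SSN and QUDT vocabularies. It shows how a sensor observation can be stored and pattern-matched as triples; a real deployment would use an RDF store such as DB2’s, queried with Sparql.

```python
# Toy triple store illustrating RDF's subject-predicate-object graph
# model. Prefixed names (ex:, ssn:, qudt:) are hypothetical stand-ins
# for full URIs from the SSN and QUDT vocabularies.
triples = {
    ("ex:tempSensor1", "rdf:type", "ssn:Sensor"),
    ("ex:obs42", "rdf:type", "ssn:Observation"),
    ("ex:obs42", "ssn:observedBy", "ex:tempSensor1"),
    ("ex:obs42", "qudt:numericValue", "87.5"),
    ("ex:obs42", "qudt:unit", "unit:DegC"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard,
    much like a variable in a Sparql basic graph pattern."""
    return [(ts, tp, to) for (ts, tp, to) in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Which sensor made observation obs42, and what was the value?
sensor = match("ex:obs42", "ssn:observedBy")[0][2]
value = match("ex:obs42", "qudt:numericValue")[0][2]
print(sensor, value)  # ex:tempSensor1 87.5
```

The point of the exercise: because everything is a triple, a new data source is integrated by adding more triples, not by altering a schema, which is the ‘flexibility’ claim the Redpaper makes for the approach.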
Some will recognize the initiating projects as belonging to Norway’s ‘Integrated operations in the high north’ and Statoil’s environmental emissions monitoring program. But the IBM Redpaper takes a new tack. There is no reference to the PCA ISO 15926 standard and its Fiatech derivatives. Fosi is also a departure from an earlier IBM Redbook on IBM’s chemicals and petroleum industry solution (2009/07/17). Leveraging third party ontologies is a great concept, but do they themselves have legs? Regarding the Qudt units of measure standard, NASA’s Paul Keller told us, ‘The Qudt handbook will be published this year as both a traditional document and as a web-based model. The latter refers to both an RDF/OWL/Sparql endpoint for semantic web usage along with web-resolvable identifiers to provide consistent access to information assets, semantic or otherwise.’
As to the SSN, the W3C’s final report, published in 2011, opened the door to deployment in contexts like the internet of things. GeoSparql too has some enthusiastic backers but take-up is likely under the radar of most upstream IT. This new effort to bring the semantic web to life suffers from the over-promising of ‘AI and inference’ and has a distinctive skunk works perfume. But using third party ontologies is smarter than rolling your own and Fosi should be worth a try, if you have access to DB2!
Following intense public comment, London-headquartered Amec has revealed that it has provisionally agreed to take over Foster Wheeler in a cash and paper deal worth approximately $3.2 billion. If the deal concludes, Foster Wheeler shareholders are to receive 0.9 new Amec shares and $16.00 in cash for each Foster Wheeler share. The deal would enhance Amec’s positioning in oil and gas, adding mid and downstream capabilities to its current upstream focus. An enlarged geographic footprint would ‘more than double’ Amec’s revenues in ‘growth regions’ and bring scale benefits.
On completion, Foster Wheeler shareholders will hold 23% of the enlarged Amec share capital. Amec expects to fund the $1,595 million cash component from its own reserves along with new debt financing. Concomitant with the transaction, Amec will seek a US listing. CEO Samir Brikho said, ‘Combining our businesses would be financially and strategically attractive. We expect double-digit earnings enhancement in the first twelve months. It would be a compelling proposition for our shareholders, customers and employees.’ More from Amec.
In the early 1980s, when I was working with a small E&P outfit, we made a substantial discovery and had oil to sell. My vague recollection is that a junior employee was sent off to ‘negotiate’ a deal with a nearby refiner that settled our selling price for the foreseeable future. As Liz Bossley (CEO of UK based Consilience Energy Advisory Group) observes in her attractively produced and entertaining Guide to Trading Crude Oil* (GTCO), a lot has changed since then. Markets are more sophisticated and there are many opportunities to optimize sales and mitigate price risk. Some companies however may be stuck with the 1980s mentality, ‘celebrating the success of a 50 cent saving in operating costs while leaving sales revenue to the mercy of an oil price that may fluctuate by more than a dollar per day.’ Thus Bossley’s mission in the GTCO is to explain how modern trading can help companies ‘understand what is happening to revenue and manage it responsibly.’
The GTCO approach is rooted, not in obscure options math, but in an understanding of the big picture. And a big picture it is indeed. Bossley has a lot to say about the industry at large, how oil is bought and sold and the plethora of contracts that govern such activity.
Like any specialist field, terminology is both key to understanding and a potential stumbling block for the newcomer. Terms such as ‘contango,’ ‘backwardation’ and ‘demurrage’ are clearly explained (although an index would have been nice), as is Consilience’s own approach to trading—based on the absolute price (A), time (T) and grade (G) differentials. GTCO is packed full of information on the array of benchmarks that determine the oil price. Virtually no aspect of the above goes unquestioned in this detailed, informative and, I have to admit, hard to summarize analysis. One message is that the oil market today is at risk from new regulation (Dodd-Frank and Volcker) whose good intentions may result in removing players from the market and limiting its efficacy.
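For the record, the first two terms describe the shape of the forward curve: a market is in contango when later deliveries trade above the prompt price, and in backwardation when they trade below it. A minimal sketch, with invented prices, of how one might classify a curve:

```python
# Classify a forward curve as contango, backwardation or mixed.
# Prices are purely illustrative, not market data.
def curve_shape(forward_curve):
    """forward_curve: list of (delivery_month, price) pairs,
    ordered by delivery date."""
    prices = [p for _, p in forward_curve]
    if all(b > a for a, b in zip(prices, prices[1:])):
        return "contango"       # curve rises with maturity
    if all(b < a for a, b in zip(prices, prices[1:])):
        return "backwardation"  # curve falls with maturity
    return "mixed"

print(curve_shape([("M1", 108.0), ("M2", 108.6), ("M3", 109.1)]))  # contango
print(curve_shape([("M1", 108.0), ("M2", 107.2), ("M3", 106.5)]))  # backwardation
```

The distinction matters to the trader because, as Bossley explains, it determines whether storing oil and selling it forward locks in a profit or a loss.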
There are some juicy titbits. Bossley describes the outcome of the 2011 report from IOSCO** into the activity of oil price benchmark reporting as ‘motherhood and apple pie.’ An appendix updates GTCO with breaking news of Shell’s re-jigging of its trading terms in the face of declining North Sea production and the ‘flawed benchmark’ that is Brent.
* ISBN 9780955083945. Available from Consilience.
** International organization of securities commissions.
According to BP’s Tarun Chandrasekhar, ‘Managing data is like losing weight. One-off projects are about as successful as crash diets. Both these goals need lifestyle change.’ This can be achieved with ‘sustainable, free-standing processes and systems that are embedded in the E&P lifecycle.’ Geospatial data management sits under BP’s overall data management discipline with a cross-functional governance board. Geospatial data is divvied up into various categories: ‘spatialized,’ ‘mastered’ and ‘derived,’ each with its own transformation and loading workflow. Special attention is given to coordinate reference system handling. BP keeps tabs on its complex array of GIS data stores by monitoring database and server performance with Vestra’s GeoSystems monitor while usage is monitored with Voyager GIS (2013/10/15). Pipeline centerlines, facilities, and equipment are housed in a PODS/Esri Spatial data store.
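Why does coordinate reference system handling merit ‘special attention’? Because the same point has very different coordinates in different systems, and mixing them up puts wells and pipelines in the wrong place. A minimal sketch (our own illustration, not BP’s workflow) converting a WGS84 latitude/longitude to Web Mercator metres; production GIS work would of course use a proper projection library rather than this bare formula.

```python
import math

R = 6378137.0  # WGS84 semi-major axis in metres, used by Web Mercator

def to_web_mercator(lat_deg, lon_deg):
    """Spherical Web Mercator (EPSG:3857) forward projection."""
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

# A hypothetical North Sea platform location: ~58°N, 1.5°E
x, y = to_web_mercator(58.0, 1.5)
print(f"{x:.0f} m E, {y:.0f} m N")
```

A degree and a half of longitude becomes some 167 km of easting; feed the same numbers through the wrong CRS and the error is of the same order, which is why BP tracks the CRS of every dataset.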
Chandrasekhar warns that a common problem in data management projects is underestimating the time and effort required. A typical ‘18 month’ program can easily stretch to over three years requiring extra resources or worse, a reduction of scope. GIS implementation affects multiple critical processes, making management of change hard. Sometimes it takes so long to implement the technology that by the time training starts, the technology is already outdated. BP gets around this by ‘putting people first,’ focusing on getting the right skills in the right roles and establishing geospatial governance across functions. In fact, ‘technology is the lowest priority.’
Mark Priest reported on RasGas’ ongoing effort to measure the value of subsurface data governance—initially by computing and allocating a monetary value to the time saved in a 2006 volume reporting and well back-allocation data project. This saw a move away from an ‘unsustainable and error prone’ legacy method based on spreadsheets and individuals. This had led to multiple versions of the truth and large amounts of engineers’ time spent searching for and ‘wrestling’ with data rather than analyzing it. The resulting solution (applications and a centralized database for measured and allocated volumes) cost some $1.4 million to implement but, over a 7 year period from 2006 to 2012, has ‘earned’ $1.9 million in time saved. Further leverage of the solution outside of the WBA project brought over $3 million of extra value for free. The subsurface team is now selling the value of the solution to other parts of the business and to RasGas’ shareholders and has embarked on a ‘Value Calculation 2.0’ project with more ambitious goals.
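A back-of-envelope check of the figures Priest reported, using only the numbers above:

```python
# RasGas WBA data governance project, as reported: implementation cost
# vs. time saved 2006-2012 plus extra value from reuse elsewhere.
cost = 1.4e6         # implementation cost, USD
time_saved = 1.9e6   # value of engineers' time saved over 7 years
extra_value = 3.0e6  # additional value from reuse outside the WBA project

net_benefit = time_saved + extra_value - cost
roi = net_benefit / cost
print(f"Net benefit: ${net_benefit/1e6:.1f}M, ROI: {roi:.0%}")
# Net benefit: $3.5M, ROI: 250%
```

On time saved alone the project barely washes its face over seven years; it is the reuse outside the original scope that turns it into a clear winner, which is presumably the argument behind ‘Value Calculation 2.0.’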
Trudy Curtis presented the latest PPDM 3.9 release, which sees a 58% hike in its table count (to 2,700) and a corresponding rise in the number of attributes (up 87% to 71,000). The new model introduces constraints (such as units of measure) that make the model more robust, and adds support for the concepts defined by the ‘What is a Well’ and ‘Well Status’ work groups. The shale boom has driven two new subject areas covering source rock geochemical analytics. Better support for raw directional survey data has also been included, along with master data functionality. Wes Baird’s talk highlighted the ‘sweeping’ changes in the 3.9 architecture along with advice on how and when to upgrade. Read the PPDM presentations here.
Tecplot has just released a new version of its oil and gas reservoir simulation post processor, data visualization and analysis tool. The 2013 R2 release brings improvements for loading, viewing, and exporting files and plots, including automating some functions while reducing the ‘number of clicks.’ To develop its latest release, Tecplot used some novel data access technology from PernixData to speed the compile and test process. Tecplot uses an automated code compiling infrastructure to support its development team. Usually compiling is a CPU bound process, but when last year Tecplot upgraded its virtualized build systems, the bottleneck shifted to its storage infrastructure.
Enter PernixData FVP, a host based, server side caching solution that provides both read and ‘true’ write-back caching. FVP is tightly coupled with the vSphere hypervisor, adding a pool of high availability cache transparently. Tecplot RS 2013 R2 supports Windows 8, 7, Vista, and XP (32- and 64-bit), and Linux (64-bit) platforms. US pricing starts at $6,600 for a single-user license. More from TecplotRS
Lloyd’s Register Energy and Senergy are participating in a joint industry project to ‘align existing reservoir simulation models with high-performance inflow and wellbore flow simulators.’ Other project participants are the Danish Technical University and Danish provider of robotic intervention and completion solutions, Welltec AS. The €4 million ‘Project Option,’ funded by the Danish National advanced technology foundation, is focused on enhancing production from horizontal wells. Lloyd’s and Senergy will provide improved reservoir modelling techniques, which will be critical to the development of the next generation of industry software technologies. The increased understanding of the interface between reservoir and well performance will improve well and completion design to enhance productivity and oil recovery.
Senergy CTO Iain Morrison said, ‘We will apply our production optimisation expertise to model the interface between the reservoir and the wellbore where today’s commercial simulators have significant shortcomings. Wellscope, our computational fluid dynamics well and near-wellbore modeling solution is integral to the Option project.’ More from Lloyd’s and Senergy.
Speaking at a recent seminar co-hosted by France’s Ineris* organization and CMR/GexCon**, Benjamin Truchot outlined the use of computational fluid dynamics-based modeling to solve safety challenges. At issue is the use of ‘classical’ 2D ‘phenomenological’ explosion models as opposed to a full 3D/CFD approach. Both methods have their advantages and limits. 2D models may often include more domain knowledge but the 3D approach can provide more insights into specific scenarios—particularly with respect to asset configurations. Both approaches are subject to issues such as the accuracy of the underlying physical models, users’ understanding of the models and code and computational limits.
Henri Tonda described how Total uses a variety of tools to assess and mitigate risks in both asset design and operations. Modeling has increased in sophistication over the years. Piper Alpha was modeled as a straightforward TNT-like explosion. Today various combinations of gas dispersion, with or without fires, are used to investigate interactions between potential leak points and ignition sources such as flares and vents. FPSOs and other assets are designed so that potential fires are controllable. Fire and explosion is the main hazard, causing 30% of all accidents.
Tools such as Phast, Firex and Fred are easy to use and good for multiple simulations but fail if used to model complex configurations and meteorological conditions. Gexcon’s Flacs takes the wind into account—and can give very different results. Tonda described use of 2D/3D modeling to explain why the Girassol FPSO was experiencing frequent gas detections due to tank venting. Gas was being detected on the bridge, alarms were sounding, production halted and crew mustered to safety boats. What was going on? It turned out that Girassol’s designers had used a ‘classical’ tanker design which did not anticipate very low wind speeds. Modeling with Phast concluded that gas from the vent could not reach the process deck. But 2D/3D modeling with Flacs showed the heavy gas pooling at sub 1m/s wind speeds. Various combinations of fans and ducts were modeled before Total settled on a redesigned annular, fan-driven vent which added turbulence and stopped the emergency shut downs.
Tonda concluded, ‘2D is not false but incomplete, 3D takes other factors into account like asset geometry, wind speed and deluge efficiency. 3D is key to accurate risk assessment.’ In the Q&A Tonda elaborated that in Norway, there is a push for probabilistic evaluation of explosion risk. But the problem is that risk is not generic but specific to an installation. Events such as an approaching supply boat or newly installed equipment may not be in the probability database. It is important not to extrapolate too far from generic models and to perform a reality check against what is on your particular asset—such as where a crane is located. Total currently analyses over 100 potential risks in depth which is ‘both a lot and not enough.’ These focus on risks that could destroy accommodation or other ‘super critical’ cases. More from Ineris.
* National industrial environment and risk body.
** Developer of the Flacs 3D explosion modeling package.
The 2013 release of CMG’s ‘Cmost’ history match and analytics package adds a proxy dashboard, a study manager for collaboration and new sensitivity analysis workflows.
Enertia Software’s data mining tool ‘eGO>’ combines functionality from its eCube and eNav applications.
Safe Software’s FME Desktop and Server 2014 introduces ‘big data in the cloud’ functionality with support for GIS data on Amazon DynamoDB, RDS, Redshift, and S3, Google BigQuery and Esri ArcGIS online.
Geoforce’s GT1 satellite GPS tracking device now has IECEx/ATEX Zone 0 certification for use in hazardous environments.
INT has just announced GeoToolkit.JS, a cross-platform library of tools to display seismic data, logs and scientific plots on anything from desktops to mobile devices.
Ogre Data Systems has announced OGR3.com—an online app for oil and gas reserves and economics bundled with Texas lease production data.
OpenIT’s LicenseAnalyzer, a new function in the 6.3 release of OpenIT, provides Windows and Android smartphone and tablet users with a dashboard to check and analyze license usage on the move.
One Virtual Source’s new model builder workflow automatically connects to upstream databases to build the well model. OVS also provides virtual metering to optimize flow rate estimation.
The 5000.8 release of Landmark’s Permedia petroleum systems modeller includes a new collaborative calibration system, custom source rocks and features for unconventional plays and more.
The 17.4 release of Petrosys’ eponymous mapping tool includes new depth conversion and 3D visualization functionality. The release is also the first ‘fully 64 bit’ Petrosys edition.
Red Hen Systems has simplified mobile data collection with MediaMapper Mobile 2.0. MMM lets smartphone users collect geotagged video, audio and high resolution imagery on the move.
SherWare has just released ‘Well Profits’ an application for investors and royalty owners to track revenue and expenses by operator and property, and integrate data with QuickBooks.
Schlumberger has announced ‘Vx Spectra,’ a ‘next generation’ gamma spectroscopy surface multiphase flowmeter for offshore and land applications.
SRI International has demonstrated an underwater chemical survey capability, deploying its in-situ membrane introduction mass spectrometry from a Bluefin-12 autonomous underwater vehicle from Bluefin Robotics.
Weatherford International has announced the OmniWell production and reservoir monitoring solution. Permanent downhole monitor data is captured to a scalable data management platform for relay to visualization and analytics tools.
There was a good turn-out (nearly 300) for the 2013 EU Esri petroleum user group (PUG) meet in London late last year. Esri’s Danny Spillman expanded on his compelling and ingenious case for GIS at the fictional ‘Clancy Energy’ company, showing how multiple data sources, both internal and external, can be put to use in pipeline routing. Who would have thought of routing that leveraged unemployment data? Chez Clancy, GIS is the enterprise data portal.
Calum Shand (Shell) and Stuart Thomas (Cyberhawk) wowed the assembled geographers with a presentation of innovative unmanned aerial vehicle (UAV, ‘don’t say drone’) mapping of Shell’s Scottish onshore terminals. The geo information team, in a ‘moment of serendipity,’ discovered that multi-rotor UAVs had already been used on Brent Delta for inspection and integrity monitoring prior to decommissioning. UAV service provider Cyberhawk was enlisted to acquire geo-referenced aerial orthophotos of the terminal sites using a fixed wing UAV. Data was rolled up with Shell’s other GIS data sets and served up as a general purpose/emergency response-style integrated web map. Project scope has now expanded to include Google street view-style 360° imagery and multi-rotor UAV acquired oblique photos for compliance, engineering and situational awareness usage.
Olivier Serrepuy presented Total’s ‘Geops,’ a.k.a. GIS for development and operations. Geops kicked off in 2009 with a vision of an enterprise database of asset location data. Geops comprises a web based visualisation tool, an asset data model of pipelines and platforms and a governance and management infrastructure for GIS data. Geops bridges the gap between detailed CAD/CAM drawings of infrastructure and multiple GIS-type data sets including aerial imagery, bathymetry and pipeline routing. The various datasets have been successfully re-purposed for subsea development, survey and inspection, vessel tracking and other use cases. The system is based on ArcGIS Desktop and Server 10.0 and interfaces to SAP and Documentum. Serrepuy concluded that ‘GIS is naturally cross discipline and can evolve as needs change.’ But GIS implies new ways of working, including new data management and workflows and GIS experts who are ‘close to the end-user, IT, data and software.’
Colin Grant (BP) presented on a joint industry project to coordinate oil spill response efforts. Worldwide, spills were on the decline from the 1970s to the 2000s, but then came Macondo. The JIP has been initiated under the auspices of the Oil and gas producers association (OGP) and IPIECA, a global association addressing social and environmental issues in oil and gas. The four year project is scheduled to conclude in 2014 and will deliver good practice guidance on the use of dispersants, on developing risk and hazard-based strategies for response preparedness and will promote research on response methods and assessment models. The group is also investigating the feasibility of an industry standard GIS data model for spill response. Read the EU PUG presentations here.
The second IQPC Digital Oilfields Summit held in London late last year provided a decent combination of oil and gas companies’ experiences and vendor innovation. Hatem Nasr provided an in-depth update on Kuwait Oil’s digital oilfield program (KwIDF). In the 2000s, KOC was preoccupied with real time data. As the last decade drew to a close, real time data was available but the question was, what to do with it? It had become ‘more of a problem than solution.’ What had been underestimated was the effort required to mine, clean and make sense of all this data. Today companies have the systems and technology to integrate and process data—and are achieving good results. Even so, it remains a struggle to assess the true value of digital, which impacts investment plans.
Many companies are not doing it right—there is too much focus on piecewise deployment or on a single technology like smart completions. Which, for Nasr, are not the digital oilfield. No more are smart downhole sensors, analytics, data mining or ESP optimization. All these are just pieces of the puzzle. The heart and soul of the digital oilfield is integration and collaboration across all of the above. The true digital oilfield delivers demonstrable value. Are you getting more oil and/or gas?
KOC has initiated multiple large scale DO projects, some across whole fields and representing investments of tens or hundreds of millions of dollars. These have been carried out with help from different suppliers and at several locations, to see what works where and identify best of breed solutions. The key is not to ‘just sit there and watch the service companies.’ Even the big ones like Schlumberger, Halliburton and Emerson do not have all the knowledge and skills to really understand the DO problem. It is the operators that have the requisite knowledge of their fields—hence the need for a true partnership.
Change management is crucial to the DO project which represents an upheaval in work processes. This is an ‘ongoing challenge.’ There is no such thing as a small DO project. It is not just a glorified Scada deployment! The outcome may be uncertain. Production may not ‘rise by 10%’ and mistakes are inevitable. KOC has four projects underway. These can be considered as pilots but are actually very large—Sabriyah covers 49 wells, Burgan GC1 60, Jurassic 30 and West Kuwait 90. The aim is for an environment that encourages collaboration along with improvements to the corporate knowledge base. The idea is to ‘make it smarter.’
One monster problem that has to be overcome is data management. A complete workflow includes real time data, models, ‘lots of AI’—statistics, intelligent agents, numerical simulation and forecasts. In the field there are major infrastructure changes with Wimax and an IT infrastructure. This has enabled continuous well testing. The Jurassic KwIDF project is entirely wireless. Wireless communications are now both a commodity and a game changer—you can just ‘pop a sim card on a well.’ KOC is getting value from its digital effort but Nasr believes that ‘the greatest value has yet to come.’ Projects start but they don’t end—this is a continuous improvement process. Technology helps but the DO is ‘mostly about change management.’ In the end, you ‘go digital or perish!’
The discussion on structured vs. unstructured data highlighted the difficulty of making a clear distinction between data, metadata and ‘unstructured’ data. Metadata is a dataset in its own right and one repository’s metadata is another’s data. Nasr asked, ‘How do you know how much a well produces?’ You may have a production test, a test at the gathering station or a reading from a $300k multi-phase meter, all giving very different results. Which is correct? The answer is, ‘it depends.’
Getting back to the unstructured data issues there was a plea for a solution that would bundle Petrel project files and Adobe PDFs into a ‘standard format for interpretation results.’ One possibility would be ResqML or perhaps the embryonic efforts of the SEG’s EarthIQ Forum.
Wendy Valot described how BP divvies up its knowledge management into project information management (PIM) and KM proper. PIM manages data, files and documents. KM manages know-how, experience and learning in the context of continuous improvement—or, as Abe Lincoln observed, ‘it is a poor man who is not wiser today than yesterday.’
Valot started in BP’s drilling unit in 2004, where there was a strong KM culture. Her role now is to understand what had been achieved in drilling and expand it to thousands of other users across BP. For drillers this involves an after action review following a well. But in other sectors, there may be a time lag of many years as projects complete and individuals move on. Folks will still use the old best practice in the interim and may fail to leverage the most up to date knowledge. It is therefore crucial to determine the value of a piece of knowledge and to position it in a quadrant defined by term of use and value. The best items are long term, high value. This is allowing BP to deploy a systematic approach, tied to best practices and making knowledge accessible, reliable and shared.
To achieve this the company has defined KM roles and implemented training to make KM ‘systemic and repeatable.’ Previously, knowledge was captured in long reports, books and notes, and there was a reliance on search. This led to ‘big data overload.’ The new combination of support roles and technology is helping to establish connections and consistency and to ‘avoid long reports that nobody reads.’ Roles are being built into the information flow—injecting action items, profiles, alerts and distribution lists. Where well-intentioned people once stored lots of information in long reports, this is evolving into more succinct information items sent out to distribution lists along with alerts. BP’s ‘cyber librarians’ canvass users to see what products are relevant to them and classify documents according to domain-specific taxonomies.
The idea is to create ‘purposeful social networks’ starting with project teams and working outwards. Valot’s team makes sure that knowledge gained in, say the North Sea, is immediately available to workers in Australia. BP’s knowledge management effort has now spread out to global wells and operations. Deployment is now in the works for downstream and refining.
Tim Fleet (McLaren Software) is an advocate of business process management. While generic BPM tools are fine for simple workflows, supply chain integration and complex engineering documents need more specialist tools. Fleet recommends that when launching a pilot project ‘don’t pick the low hanging fruit.’ This is because such projects are likely to be easy to implement and are unlikely to be representative of real world BPM.
Jill Lewis (Troika) continues with her crusade to educate industry on the problems it is facing with exploding seismic trace counts and their solution—standards-based data management. Her poster child use case is Apache’s ‘refresh’ of BP’s legacy seismic data over the UK North Sea Forties field which contributed to an 800 million barrel reserves hike. Apache is now implementing life of field (LoF) seabed seismics to further aid its production effort. It is an exciting time technology-wise for seismics with the latest 3592 barium ferrite tapes holding 4 terabytes each. Metadata is stored on a microchip on the cartridge and the new robot holds 2.5 exabytes. On the field recording front, Sercel has just announced a million channel system. All of which is mandating a new focus on data management. While SEGD 3.0 is ‘an absolute necessity,’ there is still no standard for tape on disk. Without proper organization, read times for a 600GB tape ‘go through the roof.’ Future LoF data volumes to be stored are huge compared to today’s. ‘Get a handle on it now.’ Troika, whose data loading package Magma is now embedded in Landmark’s SeisSpace, is training IT folks on seismics. Both IT and technical contract writers need to understand these issues.
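Lewis’ point about tape-on-disk organization can be illustrated with a back-of-envelope sketch. If a tape image is written to disk as fixed-length traces, a trace can be located by arithmetic rather than by scanning the whole file. The header sizes below are those of the SEG-Y standard; the sample count is an invented example.

```python
# Sketch: locating a single trace in a fixed-length SEG-Y file on disk.
# Header lengths follow the SEG-Y standard; N_SAMPLES is an assumed value.

TRACE_HEADER = 240   # SEG-Y trace header length, bytes
SAMPLE_SIZE = 4      # 4-byte samples
N_SAMPLES = 1500     # samples per trace (assumption for this example)

def trace_offset(data_start: int, trace_no: int) -> int:
    """Byte offset of trace `trace_no` (0-based), given the offset of the
    first trace (after the 3600-byte SEG-Y file header)."""
    trace_len = TRACE_HEADER + N_SAMPLES * SAMPLE_SIZE
    return data_start + trace_no * trace_len

# Trace 1000, with the standard 3600-byte file header before the data:
print(trace_offset(3600, 1000))
```

With such an index, reading one trace is a single seek; without it, the reader must parse every preceding trace—which is why unorganized tape images make read times ‘go through the roof.’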
The topic of document control on oil and gas projects was addressed by Matthieu Lamy (Talengi) who has been working with GDFSuez. Thousands of engineering documents undergo multiple reviews across the workflow—from design and purchasing on to construction. The whole process today focuses on engineering deliverables rather than coordination and administrative documents such as deviations, change orders and even meeting minutes and reports. The master deliverable register is the key for all stakeholders and, for a medium size project, might include 10,000 documents. Stakeholders need to know that they are working on the latest version of a document. It can be hard keeping track and may introduce financial or information security risks. In the event of a problem it may be a legal requirement to demonstrate who did what, when and where.
Martin Black wants oils to consider using the same technology as capital markets to ‘get decisions right.’ His company, Datawatch, which acquired Panopticon in 2012, competes with Tableau and Spotfire on analytics of historical and real time data. The technology accesses multiple data sources—Excel, logfiles, feeds and databases—and offers a configurable view across all data sources in what is described as a ‘tree map.’ These displays were previously referred to as heat maps (2012/03/26). Investigators can drill down through the map and investigate data as scatter plots etc. Black was asked how his software managed to connect with so many different sources. He answered that most data sources today expose an API delivering JSON, XML or OData. If a source is really unusual a bespoke solution can be built but that in reality, ‘Even if software is said to be radically different, it usually isn’t.’ For vendors who refuse direct data access, ‘Just give us a data file. We have years of experience handling this kind of behavior.’ More from IQPC.
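The ‘connect to anything’ claim boils down to normalizing whatever a source returns into uniform rows that a visual analytics tool can chart. A minimal sketch, using Python’s standard library—the payload and its field names are invented for illustration, not Datawatch’s actual API:

```python
import json

# A hypothetical JSON feed as a data source might expose it.
payload = """
{"wells": [
  {"name": "A-1", "rate_bopd": 1200, "status": "producing"},
  {"name": "A-2", "rate_bopd": 0,    "status": "shut-in"}
]}
"""

def to_rows(doc: str) -> list:
    """Flatten a JSON feed into uniform (name, rate, status) rows,
    ready for a tree map or scatter plot view."""
    data = json.loads(doc)
    return [(w["name"], w["rate_bopd"], w["status"]) for w in data["wells"]]

rows = to_rows(payload)
print(rows)
```

An XML or OData source would need a different parser on the front end, but the output—uniform rows—is the same, which is what makes a single configurable view across sources feasible.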
The US National energy technology laboratory (NETL) has just launched the Energy data exchange (EDX), a knowledge-sharing network built to provide a single source for fossil energy-related datasets and the tools to use them. EDX is a common portal and toolset for accessing data generated by NETL researchers, other EDX users and outside agencies. EDX datasets include hundreds of fossil energy research projects and are designed to provide transparency to NETL programs and to inform federal, state, and local energy policy.
EDX also provides researchers with a workspace for collaboration. Public features, such as EDX tools and EDX groups, promote information sharing, technology development, and knowledge and technology transfer. EDX provides secure, tiered access to data sets ensuring that information is only shared with the correct partners. Open-access data is available to the public for download, fulfilling the EDX’s knowledge-transfer role.
One such research data set which is now in the public domain is the post-Macondo NETL study of the state of the art of well cementation. The six month literature search with input from a ‘bevy of industry experts’ culminated in a report concluding, in true researcher’s style, that ‘more research is needed.’
Pointers for such future work include cement placement and the use of tracers to monitor cement movement. NETL also found that ‘the conditions the cement encounters in the well often do not match the conditions under which it is tested in the lab,’ and suggest that thermal modeling of the wellbore may bring the lab closer to reality. The December 2012 paper on NETL’s cementation research can be downloaded here.
Aker Solutions has hired David Currie as head of its UK business. Currie comes from FMC Technologies.
The American Petroleum Institute has promoted John Modine to VP global industry services. He previously directed the GIS department.
Arkex has appointed Chris Anderson as executive VP of its multi-client division. Anderson was previously with PGS.
Engineer Arup has hired Larry Wise as associate in its Houston office. Wise hails from Moffatt & Nichol.
Pat Mullen has been named executive VP and operating group president of CB&I. Jim Sabin is now executive VP global systems.
The UK’s Common data access (CDA) oil and gas information portal has launched UKOilandGasData, a gateway to information previously in the Digital energy atlas and library (Deal) and CDA’s own well and seismic data stores.
ConocoPhillips CTO Ram Shenoy has been appointed to the US Secretary of Energy Advisory Board (SEAB). ConocoPhillips ranked N° 2 in the 2013 500 list of top users of information technology.
Curtiss-Wright flow control has named Mark Rowitz aftermarket manager of its Farris Engineering business.
The US Bureau of Safety and Environmental Enforcement is soliciting proposals for oil spill response research projects and will spend ‘up to’ $7 million on such projects in 2014.
Encap Flatrock Midstream has promoted Morriss Hurt and Karl Pfluger to MD.
Jana Schey has been appointed VP operations with Energistics. Chandra Yeleshwarapu (Landmark) and Matt Vanderfeen (Weatherford) have been appointed to the Energistics board.
Ensco has promoted David Hensel to senior VP marketing.
Exprosoft has named Christopher McPherson director of sales, Americas. He was previously with Roxar.
Harry Elsinga is now VP HR with GE Oil & Gas. Rami Qasem is president and CEO of the company’s MENA oil and gas unit.
Karim Debache is now operations director of French service company Georex. Roselyne Friedenberg is commercial director and Roberto Magionacalda is head of data management.
Geotrace has promoted Jaime Stein to chief geoscience officer and Greg McCracken to CFO.
Michael Bittar heads-up Halliburton’s new unconventional and reservoir productivity technology center at King Fahd university of petroleum and minerals located in Dhahran, Saudi Arabia.
Amanda Turner is VP global multi-client solutions with Ikon Science. She hails from Esri and BP.
Paul Downe is now CTO of ISN. He hails from Dell services & solutions.
Bill Utt is to retire as chairman, president and CEO of KBR. The company is looking for his successor.
P2 Energy Solutions has named Amy Zupon COO. She was previously CTO.
Alkesta Maili has joined Petrosys’ Houston team as support specialist. She was previously with Intertek.
Brian Everitt has joined Ryder Scott as petroleum engineer. He was previously with J-W Midstream.
Ewan Whyte heads-up Senergy’s new London office and Nicolas Bianchi the new Jakarta location.
Mike Lewis heads-up Space-Time Insight’s new London office. He hails from SAP.
Speaking at a Cowen & Co. event last month Schlumberger’s Pat Schorn revealed that the company has a total of 27 petaflops of compute capacity, the ‘4th largest private installed base in the world.’
The Israeli Natural Resources Administration has issued an RFI in regard of a future oil and gas national data repository.
Merlin Energy Resources founder and MD Chris Pritchard died this month following a short illness.
Mike Economides, professor at the Cullen college of engineering of the University of Houston, died this month of a heart attack aged 64. Read the Houston Chronicle obituary.
Following the spinoff of its enhanced drilling unit AGR is reviewing strategy for its petroleum services branch. Alpha Corporate Finance is advising.
Applied Industrial Technologies has acquired Texas Oilpatch Services of Houston.
Inspection certification specialist Applus has acquired TesTex Inspection. Industrial Capital Strategies advised on the deal.
Badger Explorer has benefitted from the Research Council of Norway’s largesse following a successful application to Norway’s PetroMaks-2 program. RCN has awarded Badger 13.2 million NOK to develop its high pressure high temperature ultrasonic technology.
Berkshire Hathaway is acquiring pipeline drag reduction specialist Phillips Specialty Products in an exchange of stock. Warren Buffett said the deal ‘focuses growth on our midstream and chemicals businesses.’
Katalyst has acquired Iron Mountain’s UK tape transcription business. IM is to keep its storage and tape management services and gets a 25% share in Katalyst. IM clients will gain access to Katalyst’s iGlass and SeismicZone solutions.
Petroskills has acquired oil country e-learning specialist Resource Development Company. RDC provides competency testing services to ‘tens of thousands’ of operators, technicians and professionals each year. PetroSkills was created in 2001 by BP, Shell and OGCI.
Rebellion Photonics has closed a $10.4 million financing round with Tinicum L.P. and other private investment partnerships. The monies will be used to promote Rebellion’s gas leak detection cameras.
A new whitepaper from Thinklogical provides food for thought for those designing a real-time operating center (RTOC) for oil and gas. The publication provides guidance and best practices for the control and distribution of video, data and other information to and from the RTOC.
Thinklogical CEO Joe Pajer observed ‘Oil and gas faces complex business and operational challenges. The RTOC provides the visual data and situational insight needed for managers to make better-informed decisions. Deploying a secure, high-performance video and data distribution system in the RTOC ensures that organizations can navigate these challenges.’
The whitepaper’s pitch centers on the use of fiber optic connectivity for keyboard, mouse video (KVM) remoting and data access to separate users from hazardous environments. High bandwidth, low latency, full resolution graphics are important for remote simulation, modeling and video surveillance. Fiber, as opposed to copper, also protects installations against certain types of cyber attack.
Writing in the Q4 2013 issue of ConocoPhillips’ ‘Spirit’ in-house publication, Sabrina Martinez reports on recent upgrades to CP’s IT focusing on increased collaboration, leveraging tools such as audio and video conferencing, networks of excellence and ‘The Mark,’ a new intranet and communications infrastructure.
CIO Mike Pfister is quoted saying, ‘By piloting and implementing new tools, systems and software, the team is helping to foster a culture that shares knowledge, skills and expertise worldwide.’
Key technology components of The Mark include Microsoft Lync and Adobe Connect. Polycom videoconferencing and Smart Boards enable team members to host global online meetings. CP now hosts an average of over 20,000 Adobe Connect, Polycom video and Lync meetings per month. Lync also lets CP communicate with its partners and suppliers, viewing and sharing written material and other documents.
The system allows storage and indexing of documents in Livelink, SharePoint and file shares. Users can easily search and retrieve relevant material. CP also plans to introduce more social media-type tools, using functionality similar to Facebook ‘likes’ or Twitter ‘follows’ to keep track of information use. Already, 80% of CP employees participate in a knowledge sharing network. Lots more on The Mark and CP’s communities in the 2000 word article.
Stefan Hoppe, chair of a joint PLCopen/OPC UA workgroup, reports on how cooperation between the two organizations is to facilitate the adoption of ‘Industry 4.0’ a.k.a. the ‘smart factory.’ The fast-approaching ‘4th industrial revolution’ is driven by the integration of IT and communication technologies with industrial automation. Autonomous, distributed, intelligent ‘cyber physical systems’ (CPS) will ‘reconfigure, optimize and extend themselves without engineering or manual intervention.’
Data exchange over OPC UA will play a role in CPS, as will connectivity with IEC 61131-3-controlled devices via PLCopen. This will require mapping the IEC 61131-3 software model to the OPC UA server address space. The collaboration promises that a program running on different controllers from different manufacturers can access data in the same way and that ‘data flows will be faster, secure and more precise.’
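To give a flavor of what the mapping involves: once a PLC program variable is exposed in the server address space, any client can address it by an OPC UA NodeId. The `ns=<namespace>;s=<string>` form is standard OPC UA notation; the program and variable names below are invented for illustration.

```python
# Sketch (hypothetical names): addressing an IEC 61131-3 program
# variable through an OPC UA string NodeId of the standard form
# "ns=<namespace index>;s=<string identifier>".

def node_id(namespace: int, plc_program: str, variable: str) -> str:
    """Build a string NodeId for a variable in a named PLC program."""
    return f"ns={namespace};s={plc_program}.{variable}"

# A pump controller's outlet temperature, exposed in namespace 2:
nid = node_id(2, "PumpControl", "OutletTemp")
print(nid)  # ns=2;s=PumpControl.OutletTemp
```

Because the NodeId is vendor-neutral, the same client code can read the same variable whether the underlying controller comes from one manufacturer or another—which is the point of the PLCopen/OPC UA mapping.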
Note that the buzzword-laden release originally included the word ‘semantic’ but does not seem to read any differently now that we have removed it.
Temporary power generation specialist Aggreko is rolling out Aggreko remote monitoring (ARM) across its N. American fleet of generators and compressors. ARM provides real-time asset monitoring and diagnostics from Aggreko’s experts.
ARM works from a remote operations center in New Iberia, LA. The center monitors equipment data and alarms, initiating the appropriate response.
Aggreko’s units are deployed inter alia on non conventional drilling sites. The company reported ‘rapid expansion’ in Texas’ Eagle Ford shale play last year. More from Aggreko.
IHI E&C International is to standardize its data and material management on Intergraph SmartPlant. IHI E&C will use the tools in the execution and delivery of front end engineering design and EPC projects—more.
Interica is to add Paradigm Epos support to its E&P project archiving and retrieval solution, Pars. Pars’ ‘project aware’ archiving and backup provides long-term knowledge retention, compliance and storage management—more.
Oil and gas information management solution provider QA Software has saved ‘at least 30,000’ by migrating its software to Iomart’s cloud—more.
Statoil has awarded Aker Solutions the Johan Sverdrup framework contract for the provision of up to 10 years of engineering, procurement and management assistance. The deal includes a 650 million NOK Feed study—more.
Applus has teamed with South Korean OMS on the provision of offshore safety training in the oil and gas sector—more.
Blue Marble Geographics has signed with local distributors Duta Astakona Girinda (Indonesia), Insaat Ticaret (Turkey) and SMC Synergy (South Africa)—more.
BP has awarded KBR a contract for detailed engineering and procurement support services on the Shah Deniz stage 2 project, Azerbaijan—more.
Chevron Technology Venture company Cubility has signed a letter of intent with Samsung Heavy Industries to install up to 24 MudCubes on Statoil drilling rigs. The company has also entered the onshore drilling market via a contract with TWI Oilfield Fabrication—more.
YPF has used Schlumberger’s Studio E&P knowledge environment to integrate and preserve data and information acquired across the Neuquén unconventional play, Argentina—more.
Petrobras has awarded Endeeper a three year contract for the provision of geological knowledge management services. The deal allows Petrobras to request customizations of Endeeper’s Petroledge, Hardledge and RockViewer systems—more.
Units of Foster Wheeler’s Global Engineering and Construction group have been awarded contracts by Saudi Aramco for engineering and project management services on the Fadhili gas development in Saudi Arabia’s eastern province—more.
Statoil has awarded GE Oil & Gas a subsea control systems upgrade contract for its Troll B platform. The deal includes GE’s SemStar5 subsea electronics module which provides support for both new and legacy electronics. GE also reports that Total is to deploy its Proficy SmartSignal predictive analytics and remote monitoring solution—more.
Harris Caprock has signed with Daewoo Shipbuilding for the provision of an integrated telecommunications solution for Chevron’s Mafumeira Sul project off the coast of Angola—more.
Kinesix and Invensys have signed a five year deal for the provision of Kinesix products such as Sammi to world-wide customers in the petrochemical and other market sectors—more.
Kuwait Oil Company has awarded a $410 million contract for front-end design and project and construction management in Kuwait—more.
KOC has also awarded Technip a $400 million contract for consultancy services for project management and engineering—more.
BP has awarded Wood Group ODL a five year, ‘multi-million’ pound contract for document control and information management services to its North Sea business—more.
Petrofac and independent oil and gas company Taleveras have signed a five year memorandum of understanding with the Nigerian Petroleum Development Company to ‘explore options’ including funding, technical support, training services and asset development support in support of NPDC’s aims of furthering its indigenous technical capacity—more.
Petrogal Brasil has adopted Paradigm’s interpretation and modeling suite including SeisEarth, VoxelGeo and SKUA for its exploration and development projects—more.
BG Group has signed with Rock Flow Dynamics for the use of tNavigator, its fluid flow modeller, on its operated fields around the world. RFD also reports a sale to Canadian Penn West Petroleum—more.
The Technip/JGC Corp. joint venture Yamgaz has awarded Yokogawa a contract for the provision of an integrated control and safety systems on the Yamal LNG project—more.
PIDX reports progress on its RNIF2 refresh project, with a complementary training initiative and the commencement of a PIDX XML price sheet standard for the exchange and automatic validation of price information between suppliers and operators.
The W3C has published new recommendations for RDF vocabularies. A data catalog (DCAT) vocabulary provides information on available data sources. According to W3C DCAT is already used by national data portals. The data cube vocabulary brings the ISO standard for statistical data and metadata exchange to linked data. The organization ontology provides a mechanism for expressing roles and relationships within an organization enabling ‘interoperation of HR tools and emerging socially-aware software.’
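To make DCAT concrete, a catalog entry is just a handful of RDF triples describing a dataset. Below, a minimal hand-built Turtle snippet—the dataset URI and title are invented; the prefixes and terms are standard DCAT and Dublin Core.

```python
# Sketch: emit a minimal DCAT dataset description in Turtle.
# The example URI and title are hypothetical; dcat:/dct: terms are
# the standard W3C DCAT and Dublin Core vocabularies.

def dcat_entry(uri: str, title: str, fmt: str) -> str:
    """Return a Turtle fragment describing one dcat:Dataset."""
    return (
        "@prefix dcat: <http://www.w3.org/ns/dcat#> .\n"
        "@prefix dct:  <http://purl.org/dc/terms/> .\n\n"
        f"<{uri}> a dcat:Dataset ;\n"
        f'    dct:title "{title}" ;\n'
        f'    dct:format "{fmt}" .\n'
    )

print(dcat_entry("http://example.org/data/uk-wells", "UK wells", "CSV"))
```

A national data portal aggregating many such entries can then answer ‘what datasets exist and in what format’ with a single SPARQL query, which is what the W3C means by a machine-readable data catalog.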
The Fieldbus Foundation has finalized its specification for integrating ISA100.11a wireless field devices into its foundation for remote operations management.
The EU CEN organization has released a draft standard for electronic reporting in the engineering materials sector.
The EU’s cyber security agency ENISA has released a good practice guide for computer emergency response teams (Certs) working with industrial control systems (ICS). The manual provides advice on mitigating attacks on critical infrastructure such as energy and pipeline industries, where cyber-security knowledge ‘is often lacking.’ ICS are increasingly connected to the internet, streamlining process automation but exposing infrastructure to cyber-attack.
Control systems are ‘lucrative targets’ for criminal groups, foreign intelligence, phishers and terrorists. Examples include catastrophes such as oil spills, floods, leakages of dangerous chemicals, major rail incidents, or power outages (although none of these are actually referenced). The ability to respond to and mitigate the impact of ICS incidents is crucial for protecting critical information infrastructure and enhancing cyber-security on a national, European and global level.
The poorly-written, repetitive, 43 page guide is replete with jargon and acronyms and unsupported scaremongering. It is to be hoped that the online training material that ENISA provides is more to the point.
IBM and SAP report the successful upgrade of Shell’s downstream SAP system covering operations in 35 countries, four global businesses and 29,000 users. The upgrade was carried out as a component of Shell’s ‘Downstream One’ program that sets out to ‘simplify and standardize’ its business processes from manufacturing to marketing. Downstream One initially implemented simplified business models, standardised processes, roles and responsibilities prior to the global roll-out of SAP. Shell Downstream users can now access the same system, with common data, common processes and a uniform basis for reporting.
IBM provided business consulting services and analyzed Shell’s existing processes. SAP’s business transformation services group added an ‘upgrade value assessment’ to identify areas of improvement based on their knowledge of new functionality and SAP solution roadmaps. IBM also led an IT asset refresh, which saw Shell switch to the latest IBM Power 7 Series (P780) servers in order to deliver greater computing power and improve IT efficiency. With the delivery of the upgraded system, IBM will continue to provide ongoing maintenance support. More from IBM and SAP.
Energy Solutions International has just released PipelineTrainer to help pipeline operators satisfy their HSE and performance obligations with a ‘rigorous, well-defined and ongoing training program’ that is claimed to reduce human error and incident rates across operations. Pipeline simulation systems built with PipelineTrainer accurately map the real-world operations of that pipeline down to the smallest details, leveraging ESI’s tools for pipeline design and modeling. Simulations can model the real pipeline network under virtually any operational scenario—what happens if a specific valve opens or closes or if a pump goes out? PipelineTrainer also realistically models leak or theft detection scenarios, pigging operations and emergency shutdown.
ESI also recently released a case history of its software use at Spanish utility, Enagas. Enagas operates around 10,000 km of gas pipelines throughout Spain along with underground storage facilities and regasification plants. PipelineTrainer was deployed to provide a realistic scenario for routine training and performance evaluation metrics for pipeline controllers and new apprentices. The simulation system needed to allow Enagas to simulate normal and abnormal operation of the pipeline based on a detailed model of the actual pipeline network. Scenarios appear to operators as high fidelity images of the actual SCADA system and its environment via the same graphical user interface.
The Mimosa and POSC/Caesar Associations (PCA) have just published version 1.0 of their joint Reference architecture framework (RAF) for engineering and operations. The RAF is claimed to benefit designers of new applications and IT systems and to serve as a classification system for existing oil and gas IT infrastructure. RAF takes earlier standards and models like PISTEP, OpenO&M, ISO 15926 and MIMOSA, adds in a goodly dose of high level concepts—TOGAF, and the Purdue enterprise reference architecture—to generate a set of models (actually PowerPoint slides) describing service agreements, system engineering, software interoperability, semantic ontology and standards utility.
Appendices cover various ‘instantiations’ of the different models—in fact sketches of previous joint industry projects covering engineering data handover, production optimization and logistics. The authors conclude that the RAF now needs to be extended with a ‘reference architecture methodology’ (RAM) to become a complete ‘generalized enterprise reference architecture methodology and framework.’ We wish them luck.
A recent issue of SAP Insider magazine described Marathon Oil’s deployment of Hana, SAP’s in-memory compute appliance. With help from consultants Deloitte, Marathon has implemented SAP NetWeaver business warehouse on the in-memory Hana hardware. Marathon is a long-time user of NetWeaver (2004/09/23).
Marathon’s legacy system, a combination of an Oracle database of SAP ERP financial data and an ETL data pipe to the SAP business warehouse, was suffering from slow overnight loads and inflexible queries. Marathon’s Marty Henderson commented, ‘Our picture was quite convoluted, and the long, complex process didn’t allow us to be as responsive to the business as we would have liked.’ The Hana appliance has displaced the Oracle database and now Marathon’s transactions follow a streamlined path to the data warehouse. Batch data transfers now run four times daily.
The Hana deployment was a part of Project Mustang, an upgrade of other SAP components including SAP ERP, BusinessObjects and Business Planning. The migration started in 2011 and went live last year. Marathon is now monitoring its Hana performance to determine whether to put the rest of its SAP Business Suite onto the appliance.
An article in the December 2013 issue of Total’s Techno Hub publication describes how the sustainable development team of Total’s upstream organization has worked with Toulouse, France based AidImpact to develop ‘Most,’ a management operational societal tool. Most manages social commitments undertaken in line with Total’s societal policy and provides a framework to manage stakeholder relationships, the impact of oil and gas activities on local communities and Total’s contributions to social and economic development.
Most has been in use since January 2011 at Total’s upstream affiliates and has been localized and adapted to each affiliate’s needs, terminology and procedures. Available in English, French, Spanish and Portuguese, Most is a modular solution that integrates with other Total IT systems including GIS and SAP and its stakeholder relationship management tool ‘SRM+.’ SRM+ was developed with support from Paris-based Altermondo Consulting.
BP North America unit Olympic Pipeline has commissioned a major upgrade to its Scada control system from Berkana Resources Corp. The new automated tank ticketing and virtual metering system provides accurate measurement of product in and out of the tank farm without the expense of turbine meters.
A tank farm just midway along the pipeline’s length is used as a holding space for all three product types (diesel, jet and gasoline) until they are needed for delivery. This site previously relied on sonic meters and hand measurement for product accounting. Berkana has augmented this with a virtual metering system that leverages data from flow computers to provide accurate product accounting.
Berkana’s Paul Zimmerman told Oil IT Journal, ‘We developed the system with Industrial Defender’s Rtap platform running on RedHat Linux. Rtap’s calculation engine and API library allowed us to customize the system without affecting the Rtap kernel. This means that the system can be easily and safely upgraded. Rtap also uses the flexible data storage paradigm of a ‘point type’ rather than preconfigured database records allowing for ‘fine-tuned, case by case design.’ More from Berkana.
At the 2013 gathering of the American geophysical union held last month in San Francisco, Elsevier and Ieda (the Integrated earth data applications unit of the Lamont-Doherty earth observatory) awarded the first prize in the earth sciences data rescue competition to the Nimbus team of the National Snow and Ice Data Center in Boulder, Colorado.
The Nimbus data rescue project managed the recovery, reprocessing, digitization, navigation and formatting of the infrared and visible observations collected by the Nimbus I, II and III satellites from 1964 to 1970. Over 4,000 7-track tapes of global infrared satellite data were read and reprocessed. Nearly 200,000 visible light images were scanned, rectified and navigated. Data was converted to HDF-5 (NetCDF) format and freely distributed to users from NASA and NSIDC servers.
IEDA director and chair of the judging panel Kerstin Lehnert said, ‘The Nimbus project rescued data of high relevance to climate research, extending the climatic record in the polar regions back for at least 16 years. It involved the development of hardware and software to recover data from decaying media.’ Runners up were OldWeather a volunteer effort to transcribe and curate weather observations from old ships’ logs, a program to remaster nuclear explosion records on 8,000 Soviet-era magnetic tapes and Lockheed Martin Australia’s refresh of 40 year old Landsat imagery. More on the award here.