As we announced last month, the big event at the Madrid EAGE was the joint Shell/Baker Hughes announcement of a revamped Jewel Suite. Speaking at the Baker Hughes booth, Bettina Bachmann, Shell VP subsurface software, explained how, a couple of years ago, Shell decided to refresh its geological and modeling platform, adding user-friendliness and a solid platform for algorithm development.
Shell, which uses a 50/50 buy-and-build approach, came to the conclusion that Jewel Suite had the early stage tools it needed and decided to move fast. This was not so easy in view of the market penetration of incumbent software (read Schlumberger’s Petrel) and Shell’s assets’ requirements. But the development benefitted from a good ‘cultural fit’ between the partners and now, 18 months in, the project has delivered and surpassed expectations, which is ‘not something you can say about many software projects.’
After the show we chatted with Martin Brudy, VP reservoir technology at Baker Hughes who described Jewel Suite as a platform-cum-ecosystem built around the Jewel Earth geomodel and software development kit (SDK). Applications share data objects in what Brudy describes as ‘lossless’ integration.
The core app is the 3D subsurface model which can be populated with stochastically generated properties and input to third party flow simulators (in particular CMG’s IMEX/GEM).
Another Jewel strength is its geomechanical modeling which now blends Jewel’s original tri-mesh modeler with ‘16 years of geomechanical R&D’ from Geomechanics International, acquired by BHI in 2008. Connectivity with Dassault Systèmes’ Abaqus finite element modeler enables high-end well design.
Also on show at the EAGE was Jewel’s reservoir stimulation preview for unconventional development, which brings together disparate capabilities into an easy-to-use workflow connected to MFrac and microseismic data visualization. Jewel Suite is now seeing take-up from within Baker Hughes’ geosteering and wireline teams for in-house/service use, leveraging Witsml real time data feeds.
Speculation was rife as to Shell’s motives for the joint development and its potential to displace Petrel. Some saw the move as a way of containing Schlumberger’s hegemony, others as a way of plugging gaps in Petrel and/or Shell’s own GeoSigns toolset. We asked Bachmann to set the record straight and she was kind enough to give Oil IT Journal an exclusive interview which you will be able to read in our next issue.
Finally one ‘externality’ to the deal is, of course, Halliburton’s ongoing acquisition of Baker Hughes which could take the shine off Jewel’s future.
Katalyst Data Management (formerly Kelman) has bought the oil and gas data management division of Perth, Australia-based SpectrumData, provider of data management, tape transcription and scanning services to the geoscience data industry in the Asia-Pacific region. The unit will be rebranded KDM SpectrumData.
Katalyst president and CEO Steve Darnell said, ‘the acquisition is a milestone in our journey to a global presence in the major oil and gas data markets. With significant investment in North America and Europe, we will now be deploying our iGlass and SeismicZone end-to-end data management solution in Perth and Wellington.’
SpectrumData was formed in 2003 in the management buy-out of an Australian software boutique led by Guy Holmes. Acquired in 2010 by Ovation Data Services, another MBO (again led by Holmes) returned the company to independence. Following the acquisition, Holmes is to stay on to manage Katalyst’s Asia-Pacific operations. Katalyst now has offices in Australia, New Zealand and India and operates datacenters in Houston, Denver, Oklahoma City, Calgary and London.
Listen to the great and good go on about global warming and then look down the road. Cognitive dissonance (cogdiss) is everywhere!
A couple of years ago, when France was at the height of its anti-shale frenzy, meetings held in remote rural communities effectively shut the door on shale for the foreseeable future. But how did the greens-cum-nimbies travel to these gatherings? Mostly in diesel-powered motor vehicles, many of the SUV/off-road kind, with a special mention for the totemic vehicle of the anti-globalization movement, the Lada Niva, whose gas-guzzling propensity rivals the Humvee’s. Similar cogdiss will be on display as folks in tinted-windowed limos and private jets congregate at COP21, the UN climate change conference, to be held down the road from us in what has recently been a swelteringly hot Paris.
At the EAGE last month, we heard how population growth and the need for more cars will ultimately save the oil and gas industry. This may be true but it is not much of a position in the context of COP21. Contrary views were expressed by the IEA and others who see the oil industry as a twilight concern which, although it will be ‘needed’ out to 2040, will not be growing. We also heard that the EAGE was getting ‘sustainable,’ but I’ll pass on that silliness.
Since then a group of EU oils wrote a ‘Dear Excellencies’ letter to the UN panel on climate advocating a carbon tax. ExxonMobil and Chevron quickly distanced themselves, stating that ‘carbon taxing without tax breaks will raise the cost of energy.’ Duh! Behind the EU oils’ self-flagellating entreaty hid a baser motive: the tax on carbon should be set so that it favors natural gas over ‘high carbon options,’ i.e. coal. I think that the industry is on a slippery slope in dissing coal. After all, our main moneymaker, oil, sits somewhere between gas and coal in ‘badness.’ All of us purveyors of fossil fuel and biomass (a non-fossil but carbon-rich fuel that is ‘green’ just because it’s green!) are in this together.
It is an uncomfortable fact that if policymakers and others are convinced that putting CO2 into the atmosphere is causing irreparable damage, then the answer is indeed a carbon tax. This would attack the problem at its root and all other efforts (fuel economy, building insulation, CO2 mitigation) would follow. This will undoubtedly be the direction taken by COP21, along with bags of cogdiss.
Not all are convinced by the need for regulation. David Porter, the new chairman of the Texas Railroad Commission, railed against Washington’s recent ‘attack’ on Texas’ regulatory framework, the threat from the EPA and Obama’s ‘war’ on fossil fuels. Ségolène Royal, France’s minister, inter alia, of the environment, likewise argues against ‘punitive ecology.’
But oil is facing more adversity from short sellers, with a push, of uncertain effect, to encourage investors to divest oil company shares. Oils are also pressed to include future carbon liabilities in their accounts.
You would think that in view of the time that global warming has been an issue we would already have taken great strides towards using less carbon, in all its manifestations. But although the world has been greatly exercised about CO2, not much is being done about it. Look around. What cars do people buy these days? Lower gas prices in the US are encouraging folks to trade up to gas-guzzling SUVs. In the EU, popular cars include BMWs and Audis. Low interest rates and cash-poor consumers lead folks to buy on the never-never, following the advice of their friendly car salesperson to go for the model with the extra horsepower.
All of which leads to the curious situation whereby Europeans and Americans, by choice, drive around in oversized, overpowered vehicles. By choice! Others eschew hydrocarbons and opt for electric vehicles, which add low mileage between recharging stops to the equation. Again by choice! Hold that thought…
To stay alive, the oil industry needs a cogdiss refresh. Instead of proudly displaying Formula 1 racing cars (Shell at the SPE) or MotoGP superbikes (Repsol at the EAGE), industry needs to look more seriously at carbon capture and sequestration. So far the largely underfunded and unsuccessful efforts at CCS have focused on power stations that produce lots of CO2 on the spot. But what I am talking about is sequestration of vehicular CO2. This is a much harder problem that has been exhaustively analyzed by researchers at the University of Michigan, who point out a downside to in-vehicle carbon capture. As fuel burns, it grabs bags of oxygen from the atmosphere, so a kilo of gasoline produces around three kilos of CO2. This then needs compressing for on-board storage, which uses more power and adds weight.
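The ‘three kilos’ figure is easy to check with back-of-envelope stoichiometry. A quick sketch, assuming gasoline is roughly 87% carbon by mass (a typical value, not a figure from the Michigan study):

```python
# Back-of-envelope check: burning 1 kg of gasoline yields ~3 kg of CO2,
# the extra mass coming from atmospheric oxygen.
C_MASS_FRACTION = 0.87   # carbon share of gasoline, by mass (assumption)
CO2_PER_C = 44.0 / 12.0  # molar mass ratio, CO2 to carbon

def co2_from_fuel(fuel_kg: float) -> float:
    """CO2 mass (kg) produced by burning fuel_kg of gasoline."""
    return fuel_kg * C_MASS_FRACTION * CO2_PER_C

print(round(co2_from_fuel(1.0), 2))  # ~3.19 kg CO2 per kg of gasoline
```

That captured mass is what would have to be compressed and hauled around until the next offloading stop.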
More weight? More power? Where did we just hear that? Of course this is exactly what the status-hungry consumer of today wants. But this time, the power is being put to an ‘eco-friendly’ use, rather than just running the red lights on Westheimer.
There’s another problem though: offloading all the captured CO2. This would involve two extra stops per refuel. But then again, we are only asking folks to align their stops with what a Tesla owner is already doing.
All this may sound rather improbable. But credits and tax rebates would encourage such behavior. Half price gas for every kg of CO2 offloaded? There will no doubt be an app for that.
We asked U Michigan’s Michael Sivak for an update on the program and he pointed us to Saudi Aramco’s Esam Hamad who in turn, produced a swath of papers and Aramco patents on intra-vehicular capture. There seems to have been a hiatus in the research since 2012. But if and when the C tax comes, Aramco will be in the driving seat!
How did your big data platform originate?
Mtell has oil and gas roots. We were doing predictive maintenance before ‘big data’ was invented. Things changed with the Deepwater Horizon wake-up call and the realization that not all data was available or visible from shore side. One driller, National Oilwell Varco (NOV), decided to beef up its predictive analytics capability and selected Mtell. At that time, we used data streams on a rig to trigger notifications. But NOV wanted a solution that spanned multiple rigs with simultaneous operations and lots of historical data. This was not practical with the existing technology, so we contracted with MapR.
Is drilling data that ‘big?’ It’s a bit of a leap of faith to go to Hadoop!
We look at the problem as partly ‘tier 1,’ the algorithms that work on data from a single rig, and partly ‘tier 2,’ applications that work across multiple rigs. At the T2/datacenter level we do need a big data solution so that all rigs can share data and equipment vendors can learn from multiple facilities.
So Hadoop is really just a big file system to support your algorithms?
No it’s more than just the file system. We leverage Apache Spark in particular. This shifts focus from storage to big memory and lets us use machine learning ‘agents’ operating at the T2 level and deploy the findings on rigs for local processing.
What machine learning do you use?
We use a ‘deep learning’ ensemble approach plus signal processing and feature extraction. Deep learning scales well. Hadoop is not just for data but also table storage, historical data and learning algorithms. These include traditional word counts but extend to human/machine-defined algorithms indicative of failure.
What are ensemble models?
Small models that are trained and tested on subsets of the data set or on different inputs. These are compared and ranked before being synthesized into a combined prediction. They used to be called ‘random forests.’
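For readers unfamiliar with the technique, here is a minimal, stdlib-only sketch of the idea (a bagged ensemble of tiny regression models, averaged rather than ranked; the data and model choice are illustrative, not Mtell’s):

```python
import random

random.seed(0)
# Toy data: y = 2x plus noise.
data = [(x, 2 * x + random.gauss(0, 0.5)) for x in range(1, 21)]

def fit_slope(sample):
    """Least-squares slope through the origin: a 'small model'."""
    return sum(x * y for x, y in sample) / sum(x * x for x, _ in sample)

# Ensemble: many small models, each fit on a bootstrap subset of the data,
# then combined -- here by simple averaging.
models = [fit_slope(random.choices(data, k=10)) for _ in range(50)]
ensemble_slope = sum(models) / len(models)
print(round(ensemble_slope, 2))  # close to the true slope of 2
```

Ranking and pruning the weaker members before combining, as described above, is a refinement of the same pattern.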
But you don’t get big data over a satellite link!
Oils have made a big investment in data historians which can store up to around 1TB. But we are talking about 5-10 years of historical data plus real time sensor data at one second resolution. This is ‘big!’ Traditional historians may even have to resort to data compression, a major problem for analytics as data then has to be decompressed on the fly.
What systems are we talking about?
On the rig this will likely be a Windows machine running PI. In the data center, a large cluster. With 10k tags on a modern rig, sampled every second at 100 bytes/sample, that’s around 32 TB/year/rig. We can also bring in weather data to, say, study the impact of heavy waves on the loading of equipment, looking across many rigs for signatures that would be overlooked with a single-rig (tier 1) analysis. There is no way we could answer such questions without state of the art hardware and breakthrough machine learning. We are aggregating sensor data, historical data and repair records to create a new kind of data.
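The 32 TB figure is straightforward arithmetic:

```python
# Sanity check of the '10k tags at 1 Hz, 100 bytes/sample -> ~32 TB/year/rig'
# figure quoted above.
tags = 10_000
bytes_per_sample = 100
seconds_per_year = 365 * 24 * 3600

tb_per_year = tags * bytes_per_sample * seconds_per_year / 1e12
print(round(tb_per_year, 1))  # ~31.5 TB per rig per year
```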
Do you use for instance NoSQL?
Yes, NoSQL is a critical part of our solution. It allows us to ingest time series data at high data rates. The downside is that there are fewer constraints than in a relational database; we need to handle these issues ourselves. It’s surprising that OSIsoft and Wonderware are now selling ‘enterprise’ historians without Hadoop-style scalability*. MapR takes seconds to perform the kind of queries that take hours on such systems.
We have heard some of this before. Equipment manufacturers and oils lay different claims on data and have access to different data sets. OEMs may have more data on one kind of machine. Oils, less data on more heterogeneous equipment. Who calls the shots?
For sure GE wants to own the monitoring. In rail, companies may want to compare performance across motors from different vendors. But owner operators are really taking this in hand now.
What are the chances for MapR in geophysics?
MapR does a lot more than vanilla Hadoop! Our CTO and co-founder M.C. Srivas was with Spinnaker/NTAP and a contributor to Google’s BigTable. MapR is five years ahead of the industry in big data/HPC. Watch this space!
* Comment: OSIsoft has been researching Hadoop as a back-end for its PI System historian.
Speaking at a recent meeting of French oil and gas technologists (Aftp), Philippe Charlez* (Total) and Pierre Delfiner (Pétrodécisions) presented the outcome of a joint software project to evaluate nonconventional opportunities. The talk set out to demonstrate the economic viability of shale with a comparison of a North American shale opportunity with a hypothetical European play. One problem with shale is the extremely high decline rate observed in a single well. This was addressed with an ingenious strategy of ramping up drilling operations, adapting the rig count to maintain a two-year plateau of completions, at the end of which some 1,000 wells had been drilled and completed from 100 surface locations.
The neat thing is that the build-up and plateau strategy turns the very high (up to 60%/year) decline rates of individual wells into a more manageable 10%/year for the whole project. There remains scope for improvement (and an increase in IRR and NPV) from better completions. Tax of course complicates things, here the EU situation is worse. Sweet spotting (concentrating drilling on more promising terrain) is advised against. Sweet spots are ‘like a very bad conventional reservoir.’ Conventional development is a phased process whereas nonconventional is a closed loop. You need to develop the whole area and find the sweet spots during development.
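The portfolio effect is easy to illustrate. In the sketch below, wells with a steep hyperbolic decline are spudded at a steady pace over a two-year plateau; the mix of old and young wells gives the field a much gentler decline than any single well. The parameters (Arps b = 1, Di set for a 60% first-year decline, monthly drilling) are illustrative assumptions, not Charlez and Delfiner’s actual figures:

```python
def well_rate(age_years, di=1.5, b=1.0):
    """Arps hyperbolic decline rate, normalized to an initial rate of 1."""
    return 1.0 / (1.0 + b * di * age_years) ** (1.0 / b)

def field_rate(t, drilling_months=24):
    """Total field rate at time t (years), one well spudded per month."""
    return sum(well_rate(t - m / 12.0)
               for m in range(drilling_months) if m / 12.0 <= t)

single_decline = 1 - well_rate(1.0)                    # 60% in year one
field_decline = 1 - field_rate(3.0) / field_rate(2.0)  # year after drilling stops
print(round(single_decline, 2), round(field_decline, 2))
```

With these toy numbers the field-level decline in the year after drilling stops is well under the single-well figure; steeper hyperbolic exponents and continued infill drilling push it lower still, toward the 10%/year the authors report.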
* Author of ‘Gaz et pétrole de schiste en questions,’ Editions Technip.
FFA/GeoTeric has announced Cognitive Interpretation (CI), said to be the answer to a seismic interpreter’s ‘cognitive overload,’ a situation where large data volumes lead to ‘confusion rather than understanding.’ Interpreting modern multi attribute 3D data volumes involves finding a match between what is on the screen and a known geological feature. According to GeoTeric, ‘no computer can achieve such a match as effectively as the human brain.’
Enter GeoTeric’s ‘cognitively intuitive’ display techniques. These include an RGB blended attribute volume that allows interpreters to scan through a 3D space and identify features characterized by varying seismic responses. Prior knowledge is used to identify, say, a channel system, and make inferences about its genesis.
CI uses ‘example-driven’ frameworks to allow for the simultaneous use of different data comparison techniques. ‘Adaptive interpretation’ helps select colors that best ‘reveal the geology.’ The tool also helps interpreters identify the best attributes to use in a particular context. More from GeoTeric.
Exprodat CTO Chris Jepps blogged recently on the maturing ArcGIS platform. Esri made the ‘platform’ announcement some three years back, causing some confusion in the community. Today, the platform is truly capable of delivering maps and location analytics to ‘people who don’t know or care what GIS stands for.’
For those who do care, notably GIS-savvy explorationists, Jepps sees ‘clear benefits’ in moving to a platform. ArcGIS facilitates GIS data access through ArcGIS Online or Portal for ArcGIS. Users can easily find and run the apps they need to do their job. ArcGIS also decouples data and analytics from the desktop, making them accessible across the organization and to non-specialists.
The ability to access maps from mobile endpoints is also a plus – especially for working in third party data rooms on farm-in opportunities. The platform delivers integrated views of key business data, apps for web-based maps and ‘Story Maps,’ even on an iPad. ArcGIS is free to users of Esri’s desktop and server products.
Big data search specialist Maana has ‘emerged from stealth’ to reveal over $14 million in funding from Chevron and ConocoPhillips’ technology venture arms and other VCs. Palo Alto-based Maana was founded in 2012 and describes its technology as ‘the first and only’ big data search engine powered by the open source Apache Spark, a general purpose engine for large-scale data processing.
In its stealth mode Maana has been under test at several unnamed Fortune 500 companies for machine learning and data mining. Founder and CTO Donald Thompson said, ‘Working with large technical datasets from different sources requires speed and sophisticated analysis. Previously we were running on Hadoop/MapReduce but this was not producing the performance we required. So we migrated to Spark which is easier to use and better. Spark gives our data scientists and customers familiar languages and tools. Switching to Spark was a business decision we didn’t take lightly. But we never looked back.’
Maana runs natively on Spark. In-memory caching allows for re-use of elements of an analysis, speeding performance by keeping frequently requested data in memory and reducing the need for database queries.
Co-founder and CEO Babur Ozden added, ‘Operations and maintenance involves connecting technical datasets from a large number of data sources. Search is now the easiest way to make such connections scalably. Maana is now expanding search to embrace core enterprise data assets.’ Other Maana backers include Frost Data Capital, GE Ventures and Intel Capital.
The US Chemical Safety Board’s (CSB) draft report into the 2009 explosion and fire at the Caribbean Petroleum (Capeco) terminal facility in Puerto Rico describes an incident that bears a striking resemblance to the 2005 Buncefield, UK oil terminal fire. The incident occurred when gasoline overflowed and sprayed out from a large aboveground storage tank, forming a 107-acre vapor cloud that ignited. While there were no fatalities, the explosion damaged approximately 300 nearby homes and businesses, and petroleum leaked into the surrounding soil, waterways and wetlands. Flames from the explosion could be seen from as far as eight miles away.
The CSB found that the float and tape level measuring devices were poorly maintained and frequently were not working. An electronic transmitter card that was supposed to send the liquid level measurements to the control room was out of service. Investigator Vidisha Parasram said, ‘When that system failed, the facility did not have additional layers of protection in place to prevent an incident. The investigation concluded that if multiple layers of protection such as an independent high level alarm or an automatic overfill prevention system had been present this massive release most likely would have been prevented.’ The CSB has proposed regulatory changes to OSHA, the American Petroleum Institute, and two key fire code organizations. For more, watch the CSB’s ‘Filling blind’ video.
Actenum DSO/Upstream 5.0 and a new DSO/CX 1.0 collaboration tool are set to digitize the well delivery and operational scheduling processes.
A new version (2013.1) of Emerson/Roxar’s RMS adds new tools for structural modeling, uncertainty management and well targeting. Fault uncertainty modeling now includes sealing effects and fractures. Model-driven interpretation is enhanced with new tools for velocity model QC. Performance has been boosted with a multi-threading/parallel processing capability.
CMG has announced iSegWell, an advanced analytical wellbore modeling addition to its Imex black oil simulator.
GE Oil & Gas has released InspectionWorks Connect, a remote collaboration platform for nondestructive testing and inspection.
MicroSeismic’s AlertArray provides a ‘single solution’ to monitoring seismicity during hydraulic fracturing and fluid injection operations.
Peloton’s WellView 10.0 adds a well barrier template, data-drawn schematics for wellhead and more. A new water management solution is available in WellView and SiteView.
Trimble’s expanded geospatial portfolio for oil and gas includes new laser scanning instrument features, automated storage tank inspection and advanced pipeline data collection and analysis software.
A new ‘from the ground up’ rebuild of the Headwave (V3.0) interactive seismic data visualization toolkit adds support for wide-azimuth gathers, interpretation, and well data. The new version supports stratigraphic and quantitative analysis, adding a data link with Petrel.
The 4.2 release of Interica’s Project resource manager/PARS adds Kerberos single sign-on, sparse file support and performance improvements for seismic processing clients.
Kepware’s Industrial data forwarder for Splunk pipes industrial data into the big data platform. Custom tag metadata can be added for data enrichment, correlation, and aggregation in Splunk Enterprise or in the cloud.
Norsar has announced a data link/plug-in for Petrel. Now available from the Ocean Store.
Schlumberger’s Ocean Framework 2015 adds new APIs and a new perspective for data management plug-ins. Reservoir engineering, wells, drilling and other domains also see new added functions.
Opgal’s new EyeCGas FX optical gas imaging camera is designed for 24/7 monitoring of fugitive emissions in industrial plants and offshore platforms. The Atex-certified unit detects a variety of hydrocarbon gas emissions such as ethylene, methane, butane, propane and other volatile organic compounds.
Things have come a long way since Microsoft’s former CEO Steve Ballmer described open source software as a ‘cancer.’ Today, open source Hadoop has become a poster child for Microsoft’s big data offering as witnessed by a joint Microsoft/Noble Energy presentation at the 2015 Microsoft Global Energy Forum held earlier this year in Houston. Frank Besch (Noble) and Kelly Kohlleffel (Hortonworks) showed how aggregating data across Scada, subsurface and other systems has helped identify, ahead of time, events that cause lost production.
More prosaically, customers including ConocoPhillips and Wood Group Mustang presented on ‘transformational’ use of Microsoft Skype for Business with oil and gas-specific use cases and a roadmap for the continued integration of Skype into their organizations. Microsoft’s Mike Thompson explained that ‘user love’ of Skype is now combined with the security, compliance, and control that Microsoft Lync offers, creating the ‘most loved and trusted platform for doing things together.’ (help!)
Microsoft’s positioning in high performance computing is now firmly in the cloud as witnessed by Schlumberger’s ‘democratizing access to science’ presentation by Arun Narayanan. Reservoir simulation is now available ‘without the expense of a heavyweight infrastructure.’ You’ll still have to factor in the expense of a license to Schlumberger’s Intersect fluid flow simulator, said to be ‘built on the Azure cloud*.’
Parminder Sandhu showed how Marathon has used SharePoint smarts from Gimmal to build MPCConnect, its business-wide solution for content and knowledge management. The bespoke records and information management solution spans customer service, project management, training and more. The solution embeds elements of Gimmal’s new ‘Structure’ software to address unified policies for retention and classification and to standardize processes for information lifecycle management. MPCConnect adds role-based processes that target operational efficiency and informed decision making.
Mark Snyder described how, with help from Access Sciences, ExxonMobil has deployed a new enterprise-wide information management framework to facilitate the flow of business-critical information ‘in every direction and at all levels throughout the upstream asset lifecycle.’ The project stemmed from the observation that oil and gas asset information was not flowing smoothly across the exploration to production transition. The lack of timely business-critical KPIs and best practices was seen as a ‘serious impediment’ to operational excellence. More from the GEF in our next issue. If you are in a hurry, visit the GEF homepage.
* Intersect was originally developed on a high performance Linux cluster from Sun Microsystems (Oil IT Journal Sept 2005).
The 19th PNEC petroleum data integration conference, held earlier this year in Houston, largely kept up its tradition of informational value, with a majority of presentations providing insightful disclosure of upstream data management practices. Our reading of the proceedings also suggests a shift in focus from data management per se to a broader, holistic rendering of the business-data-IT triangle, with even a sortie or two into the digital oilfield space.
Jim Seccombe showed how BHP Billiton, with help from Halliburton and Infosys, ‘reverse engineered’ a data management solution around its existing decentralized wellbore data infrastructure that spans Petrel, Petra and OpenWorks. Project ‘O’Weld’ centralized well data using scripts to crawl servers to gather target files to a central Recall database. The project included much data de-dupe, clean-up and promotion of key data and is now kept up to date with nightly cron jobs. Other functionality includes the generation of workstation-ready curves leveraging Recall’s Raven data validation engine. Landmark/Petris Winds Enterprise adds a comprehensive search functionality.
Petroweb’s Brandon Schroeder introduced a ‘bigger, better, faster’ implementation of the PPDM data model that leverages a modern ‘big data’ back end. The problem with current PPDM deployments is that data volumes are expanding beyond the capacity of ‘traditional’ approaches such as denormalization and database tuning. Petroweb has been investigating a NoSQL solution using Hadoop, MongoDB and Solr. The tools were easy to set up, fast, scalable and, as open source software, free. Petroweb has added spatial indexing and search to Solr to expose PPDM content via a GIS. Schroeder observed that the findings are not specific to Petroweb. Others who have implemented home grown PPDM solutions could leverage the technology.
Mike Slee (Addax Petroleum) and Peter Black (EnergySys) presented an innovative, cloud-based data management solution. Sinopec-owned Addax has interests in Europe, the Middle East and Africa, presenting particular data and reporting requirements. Following recent mergers, Addax has consolidated its disparate spreadsheet-based production reporting to a hosted solution from EnergySys. The authors are scathing with regard to current oil and gas industry standards and standards bodies, whose production often ‘languishes unread.’ Even ‘successful’ standards such as Witsml are ‘notable for their rarity rather than their impact.’ Addax is however a Petroweb, and therefore a PPDM user. The EnergySys approach is to leverage the generic, Microsoft/SAP-backed OData protocol that provides ‘straightforward integration’ of cloud and on-premises systems, bringing together information from a variety of data sources. While the cloud itself does little to fix the problem of data silos, OData is ‘revolutionary’ in that it provides ‘cloudbusting’ technology that offers the data manager with some coding skills the ability to connect multiple desktop applications. Excel PowerQuery got a plug as did Petroweb’s GIS data viewer which also uses OData.
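Part of OData’s appeal to the ‘data manager with some coding skills’ is that a query is just a URL that any HTTP-capable tool (Excel PowerQuery, a GIS viewer, a script) can issue. A minimal sketch of composing one; the endpoint and entity names are hypothetical, not EnergySys’s actual schema:

```python
from urllib.parse import urlencode

# Hypothetical hosted-production endpoint -- illustrative only.
BASE = "https://example-tenant.example.com/odata/DailyProduction"

def odata_query(base, filter_expr, select, top=None):
    """Compose an OData query URL using the standard system query options
    $filter (row selection), $select (projection) and $top (paging)."""
    params = {"$filter": filter_expr, "$select": ",".join(select)}
    if top is not None:
        params["$top"] = str(top)
    return base + "?" + urlencode(params)

url = odata_query(BASE,
                  "Field eq 'OML123' and Date ge 2015-01-01",
                  ["Date", "OilRate", "GasRate"], top=31)
print(url)
```

The same URL grammar works against any OData producer, which is what makes cross-application ‘cloudbusting’ straightforward.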
Petrobras’ Renan de Jesus Melo and Halliburton’s Ricardo Álvares dos Santos presented another use of OData, here to link Petrobras’ proprietary seismic trace database with Landmark’s OpenWorks. Landmark DecisionSpace Integration Server is used to consolidate and match data from the two environments and an OData link feeds up to a web-based client.
Notwithstanding the standards skeptics, Noble Energy’s Vijay Chitiyala and Oracle’s Carl Schuckenbrock teamed on a PIDX standard-based solution to manage Noble’s growing global inventory management challenge. The aim was to create a trusted and consistent foundation for engineering procurement and spend analysis. The adoption of HTS/ECCN product codes also facilitates international trade. The system comprises Oracle E-business suite and product hub and Pilog’s master data management. To date Noble has cleansed and enriched over 30,000 items to PIDX and UNSPSC standards.
Scott Sitzman (ConocoPhillips) and Ryan Hamilton (SpatialEnergy) presented another cloud-based solution for the storage and management of spatial imagery. To achieve this, a lot needs to be done up front to ensure that data is correctly georeferenced and usable by applications. But the benefit of global access to secure, hosted imagery has meant a 40% hike in delivery speed and over 3.5 million maps drawn every month from the 100 terabyte hosted set. Workflow integration feeds imagery into ConocoPhillips’ desktop applications that include Geographix, Petrel, Esri, Kingdom and Petra.
Alberto Dellabianca and Elio Rancati presented Eni’s OSIsoft PI-centric interpretation of the digital oilfield theme. The authors observed that many digital oilfield initiatives have failed due to unrealistic goals and overarching master solutions to complex processes. Eni initially deployed a minimal solution built with standard tools from OSIsoft and OVS. This was then augmented using PI AF to create a set of templates for wells and other assets and to implement exception based surveillance of ESPs and rotating equipment. The OVS workflows now include interaction with GAP and Eclipse. GUI development leverages PI Web parts and Coresight in Sharepoint. Eni is now working to apply the framework at the enterprise level with the deployment of PI Asset Analytics to embed standardized real time KPIs in the global templates.
Marcos Pérez (Petrolink) presented work done with Pemex on the use of the (not-so-rare) Witsml standard to consolidate data formats from multiple service providers in non-conventional frac operations. In the frac van, data access was limited to a single serial data port. Petrolink built a splitter box that replicated the RS232 data to TCP/IP ports. The ASCII data was then reformatted to vanilla Wits prior to ingestion by a Petrolink aggregator. Fully qualified Witsml data was then shared with HQ via a satellite link. The solution consolidated well head data from Schlumberger, Halliburton, Weatherford and CalFrac. Closed-loop real-time surveillance of fracking operations has optimized proppant use, reduced water use and helped manage fluid disposal.
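The ‘vanilla Wits’ in question is the ASCII level 0 format, in which a record block is bracketed by ‘&&’ and ‘!!’ and each line carries a two-digit record id, a two-digit item id and a value. A toy parser, to give a flavor of what the aggregator ingests (not Petrolink’s actual code):

```python
def parse_wits0(block: str) -> dict:
    """Parse a WITS level-0 record block into {(record_id, item_id): value}.

    Blocks are bracketed by '&&' and '!!'; each data line starts with a
    two-digit record id and a two-digit item id, followed by the ASCII value.
    """
    out = {}
    for line in block.strip().splitlines():
        line = line.strip()
        if line in ("&&", "!!") or len(line) < 5:
            continue  # skip delimiters and malformed lines
        rec, item, value = line[:2], line[2:4], line[4:]
        out[(rec, item)] = value
    return out

# e.g. record 01 (general time-based), items 08 and 12 (ids illustrative).
sample = "&&\n01083650.23\n011210.5\n!!"
print(parse_wits0(sample))
```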
Chris Josefy and Omar Khan described EP Energy’s oilfield of the future which seeks to break from current ‘less than perfect’ engagement strategies where the business seeks a solution to a pressing issue and at the same time supplies a ‘fully formed solution,’ engaging IT as a mere ‘order taker and lagging partner.’ EP Energy set up a team of engineers, managers and IT specialists to figure out ‘how we should be doing this.’ The result is Well 360, an OVS*-based portal that offers a single view of well data from disparate systems, and partially automates well surveillance and other workflows such as artificial lift optimization. A quick win from the project was the elimination of ‘wasteful’ morning meetings that were eating up an hour per day of lease operators’ time.
Mara Abel of the Brazilian INF/UFRGS research establishment, with help from software boutique Endeeper, has performed an ontological analysis of PPDM’s lithology/core data model. The objective is software interoperability, achieved by ‘making apparent the meaning of geological objects represented in the model.’ This quasi-philosophical approach involves asking deep questions as to the meaning of ‘reservoir,’ ‘what is a rock?’ and so on. Ontological analysis is claimed to clarify the modeler’s intent and highlight conflicts of interpretation. The idea is to produce small models with a limited number of formally defined entities and attributes that facilitate data integration. Abel examined some 49 PPDM lithology tables using the OntoClean process. The results, as far as we can tell, suggest that some geological concepts defy ontologically rigorous classification. Other ‘successful’ classifications may be rather hard for database traditionalists (or geologists) to follow!
While ‘what is a rock?’ may be a hard one, PPDM’s ‘what is a well’ (Wiaw) has gained traction in ‘information-driven’ data management, according to James Pipe (PPS) and Gary Winningham (Chevron). The authors eschew current siloed modeling and advocate a holistic, top-down approach leveraging a robust information model and data dictionary to align source systems with the business. A three-step method starts with a logical model of the business, then maps this to applications and workflows, before finally plugging in the data sources. For Chevron, this meant refactoring existing systems around a modified and extended implementation of Wiaw.
Sean Sanders, with help from Halliburton, outlined Energy XXI’s seismic data management solution which includes a NetApp FAS 6210 Tier 1 storage unit atop an IBM/Tivoli storage manager. Troika’s newly launched data management suite is used to perform data clean-up and metadata collection prior to loading to Landmark’s PetroBank.
More Landmark technology was cited in Hussein Al-Ajmi’s presentation where he showed how KOC uses PowerExplorer as an entry point to an extensive back end including OpenWorks, Finder, LogDB, eSearch and ArcSDE spatial data stores.
As a parting shot we have to give a black mark to the presenter who chopped the product name off his slides. Not really in the PNEC spirit! More from PNEC Conferences. Also, mark your diaries for the 20th edition, to be held in Houston, 17-19 May 2016.
* One Virtual Source.
The OVS* Optimization matters conference (OMC15), held in Austin, Texas earlier this year, heard from a who’s-who of US and international opcos. Mark Crawford and Brandon Sokol discussed the role that OVS plays in ExxonMobil’s ‘EM2010’ technology integration, in particular in its worldwide production surveillance and optimization effort. ExxonMobil contracted with OVS to build a toolset to manage and store production forecasts in a database. At the same time, a legacy surveillance tool was replaced with an OVS-based solution co-developed by OVS and ExxonMobil engineers. Use cases include well test validation, PI tag management, volume reporting and downtime tracking. The talk ended with a list of current enhancement requests, including better documentation, improved speed, and Excel-style macro recording and pivot table functionality.
Optimization of mature fields was a common theme at OMC2015 and they don’t get much more mature than Zone Energy’s leases on the East Texas super giant, discovered in 1930. Robert Archer described how data from Zone’s water flood was initially handled in Excel. As work progressed, an OVS-supplied system was deployed along with a database with wizards for allocation and performance analysis. The system is linked to third party data stores and provides a drilling and well schematic repository, lease and well allocation workflows and more. Zone reported a ‘very affordable and quick implementation’ that has seen take-up across the company.
Ahmed Swedan (Wintershall Libya) has likewise replaced ‘complex nested’ Excel spreadsheets with an OVS-based workflow. This has automated surveillance and provides well test validation, gas lift optimization and enhanced back allocation. The system has been in operation for two years and has improved data configuration and quality and has enhanced inter-departmental collaboration. The system has helped engineering and asset management teams ‘realize the industry-wide push toward integrated operations with a single solution providing data integration, visualization, surveillance, and workflow automation.’
Mike Osborne showed how OVS has helped Pacific Coast Energy ‘change the culture’ of its operations on its super mature (discovered in 1901!) Orcutt diatomite cyclic steam flood operations. OVS has been used since 2011 to simplify digital choke management, smooth out steam generation, and optimize well injection decisions. More from OMC2015 in our next issue.
* One Virtual Source.
Peter Jenkins is now CEO at 4Subsea.
Curtis Samford is president and CEO of AFGlobal, succeeding Gean Stalcup who is now chairman.
Alan Henson is VP solutions architecture at EnergyIQ. He hails from Perigon.
Ian Evans heads-up Allegro’s new office in Dubai’s Emirates Financial Towers.
Mark Groves is now MD with Warren Business Consulting.
Martin Brown heads-up Aqualis Offshore’s new Aberdeen base.
Robert Baird is now CEO at Arma International.
Ian Little is the new CEO of Petrosys, succeeding Volker Hirsinger who remains on the board.
Chem Rock Technologies has named Barry Ekstrand as president.
Allen Ferguson leads Clariant Oil Services’ Canadian operations.
Remi Eriksen has been appointed president and CEO of DNV GL succeeding retiree Henrik Madsen.
Bob Schmitz is executive VP and CFO of Flotek Industries.
John Russell Baird is now global strategic advisor with Hatch.
Brian Harrell, Chris Luras, Ed Dobrowolski and Matt Blizard have joined Navigant’s newly formed risk management, compliance and security team.
David Donatelli is executive VP of Oracle’s converged infrastructure business. He hails from Hewlett-Packard.
Alejandra Veltmann has joined Paragon as VP and chief accounting officer. She hails from Geokinetics.
Penspen has made several appointments to its new office in Basra, Iraq. Ahmed Al-Dadah is business development director for E&PM. He hails from Amec Foster Wheeler. Na’el Barghouthi is director of asset integrity. He was recently with Wood Group Kenny. Michael Simm has been appointed regional director for Penspen’s Middle East business. Penspen Americas has appointed Carl Mook as EVP asset management, integrity and regional director.
Tom Faulhaber is CTO of cloud-based sensor network specialist Planet OS.
Suzie Turner is VP in charge of Rand Group’s Dallas operation.
Lee Hayford has taken the responsibilities of CEO and president of Real Time Measurements following Terry Matthews’ resignation. Matthews stays on as an employee.
RigNet has promoted Keith Stewart to VP global operations at its RigNet TSI unit in Aberdeen.
Glen Irving is now general manager for Siemens’ analytical products and solutions business.
Chevron’s Janeen Judah has been elected as the 2017 president of the Society of Petroleum Engineers.
TAG Oil has named Toby Pierce as CEO and Director.
Chris Blackmon heads-up Tendeka’s new global completions, manufacturing and operations facility in Fort Worth, Texas.
David Porter is the new chairman of the Texas Railroad Commission.
Total has named Laurent Vivier president of its natural gas unit.
Kathleen Shanahan has been appointed to the TRC Companies’ board.
In our Interview (Vol. 20 n° 3) with IT Vizion we promoted Darren Steel to CEO. He is director of sales and marketing.
In Vol. 20 n° 4 we described David Johnson as Petrolink’s CTO. He is VP of research and innovation.
Our apologies to all.
Core Lab has acquired Viarmes, France-based Sanchez Technologies, manufacturer of cutting edge reservoir conditions pressure-volume-temperature (PVT) equipment and instrumentation. Sanchez is a leader in automated high-pressure, high-temperature mercury-free PVT instrumentation.
Mass spectrometry specialist 908 Devices has closed $11.6 million in series C funding to expand into the oil and gas market. Saudi Aramco Energy Ventures led the round. The monies will be used to expand commercial channels and develop additional high-pressure mass spectrometry products. 908 is also working with Schlumberger on a portable mass spectrometer.
Emerson has acquired Norwegian flow assurance and production optimization software boutique Yggdrasil. Emerson will incorporate Yggdrasil’s METTE production optimization solution into its Roxar reservoir management software portfolio, providing operators with an integrated workflow from seismic interpretation and reservoir modeling to reservoir simulation and production optimization.
After 14 years of association with Geo2X, Switzerland-based W-GeoSoft has regained its independence.
Willbros has finalized the sale of its downstream engineering organization to a group led by Bernhard Capital Partners. The unit now operates as Wink Engineering.
IFS is to acquire VisionWaves B.V., a provider of operational intelligence software, in an all cash deal.
Following last year’s merger, Blueback Reservoir is to rebrand as ‘Cegal.’
The Esri Petroleum User Group (PUG) held its US event earlier this year. Space precludes giving the show the coverage it merits, but here are a few presentations that caught our attention. The ‘Pipeline 20/20’ session kicked off with Tom Coolidge showing how Esri’s tools deliver mobile apps that let field workers view and update corporate data. Other tools let non-specialists generate change requests and create quality map books for presentation and (even) printout. A demo featured an enterprise portal showing flow rates, tag data from PI Coresight, real-time truckers’ behavior and weather maps.
Ayan Palit showed the beta edition of Esri’s utility and pipeline data model (UPDM), a ‘moderately normalized,’ componentized model that means that ‘you don’t need to deploy the whole model.’ The UPDM is also said to be ‘time aware’ for real time event display and management. V 1.0 is scheduled for 2016. (See Oil IT Journal Vol 20 N° 3 for more on the UPDM and PODS.)
Time-aware data was also the subject of Michael Graves’ presentation of OSIsoft’s PI Integrator for ArcGIS which uses the Esri GeoEvent extension/services to feed wellhead and pipeline data to ArcGIS operations dashboard.
Patrick Kennelly (LIU Post) reviewed advanced mapping techniques for 3D surfaces in an elegant presentation of contours and other displays. New Esri tools allow for hachures, illuminated contours and sky models. The latter are especially impactful ways of viewing densely sampled data with cliff-type features. Kennelly cited a classic 1950 paper from Tanaka on the relief contour method of representing topography.
Zheng Huang (Statoil) and Amar Garlapati (Progressive Global) described Statoil’s ‘What’s New’ data notification system, which pings users with an email and map as new data is loaded to their particular area of interest. The system crawls various databases for new material and generates a Microsoft Silverlight web map.
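At its core, such a notification system is a spatial match of newly loaded items against each user’s area of interest. A minimal sketch using simple bounding boxes (a stand-in for Statoil’s actual crawler and map generation; none of the names below are theirs):

```python
def notify_new_data(new_items, areas_of_interest):
    """Match newly loaded, geolocated items against users' AOI bounding boxes.

    new_items:         [(name, lon, lat)]
    areas_of_interest: {user: (min_lon, min_lat, max_lon, max_lat)}
    Returns {user: [names]} -- email and web-map generation would follow.
    """
    hits = {}
    for user, (x0, y0, x1, y1) in areas_of_interest.items():
        matched = [name for name, lon, lat in new_items
                   if x0 <= lon <= x1 and y0 <= lat <= y1]
        if matched:
            hits[user] = matched
    return hits


if __name__ == "__main__":
    aois = {"geologist_1": (1.0, 56.0, 3.0, 58.0)}
    print(notify_new_data([("New well log: 2/4-X", 2.2, 56.5)], aois))
```

A production system would of course use proper polygons and a spatial index, but the filter-then-notify pattern is the same.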
Statoil’s technologists may need a technology refresh soon, as Denbury reports that its ArcGIS Server/Silverlight web maps are now ‘unsustainable.’ Eric Sheehan described a migration to Portal for ArcGIS (with help from Logic Solutions Group).
Williams’ Wetherbee Dorshow showed a comprehensive use of mapping and geoprocessing to optimize pipeline route planning. Composite heat maps and cost surfaces are used to generate a ‘smart footprint’ minimizing risk and cost. Iterating through various options and data QC allows for optimal routing.
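Cost-surface routing of this kind is typically a least-cost path search over a raster whose cell values encode composite risk and cost. A minimal sketch using Dijkstra’s algorithm on a small grid (a toy stand-in for the GIS geoprocessing tools Williams actually used):

```python
import heapq

def least_cost_path(cost, start, goal):
    """Least-cost route across a grid cost surface (4-neighbour Dijkstra).

    cost: 2D list of per-cell traversal costs; start/goal: (row, col).
    Entering a cell incurs its cost. Returns (total_cost, path).
    """
    rows, cols = len(cost), len(cost[0])
    best = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(best[start], start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:                      # reconstruct route
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return d, path[::-1]
        if d > best[node]:
            continue
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf"), []


if __name__ == "__main__":
    # High-cost cell in the middle (e.g. wetland); route goes around it.
    surface = [[1, 1, 1], [1, 9, 1], [1, 1, 1]]
    print(least_cost_path(surface, (0, 0), (2, 2)))
```

Iterating the same search over alternative cost weightings is one way to explore the routing options the presentation described.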
Bernard South (Spatial Integration) described geoscience databases as ‘the 800 pound gorilla’ in digital exploration. Data issues are one of the main stumbling blocks to using ArcGIS in geology. Apart from bad/wrong data, the whole issue of geological nomenclature and meaning is fraught with the complexity of different methods of dating and describing strata. Geoscientists tend to squirrel away their own geological ‘nut piles’ of idiosyncratic usage, mixing chrono-stratigraphic and lithological terminology. On top of all that the geology standards from international authorities are a ‘moving target.’ South has developed a mapping table (in ArcGIS format) to sort out some of the difficulties. View these and other presentations online.
The US Department of Energy’s Lawrence Berkeley National Laboratory has published the world’s largest set of elastic property data for inorganic compounds. The data set targets materials scientists but may be of interest to earth science researchers. Researchers used the compute infrastructure of NERSC’s Materials Project to calculate elastic tensors for 1,100 compounds, with dozens being added every week. The computed values are said to show an excellent correlation with experimental values.
The Materials Project uses supercomputers to calculate properties from quantum-mechanical first principles, avoiding ‘difficult and tedious’ experiments. Although the method was invented 20 years ago, it is only now computationally feasible. We checked out the database by looking up the density of calcium carbonate and silica, with surprising results. CaCO3 has 10 entries with densities from 2.1 to 3.3 g/cm³, while SiO2 density ranges from a superlight 0.5 to 1.7 g/cm³! The work has been published in Nature’s Scientific Data open access journal.
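The published elastic tensors collapse to familiar engineering moduli via the standard Voigt averages, K_V = (C11+C22+C33+2(C12+C13+C23))/9 and G_V = (C11+C22+C33−C12−C13−C23+3(C44+C55+C66))/15. A sketch (the isotropic test values below are illustrative, not database entries):

```python
def voigt_moduli(C):
    """Voigt-average bulk and shear moduli from a 6x6 elastic tensor (GPa)."""
    K = (C[0][0] + C[1][1] + C[2][2]
         + 2 * (C[0][1] + C[0][2] + C[1][2])) / 9.0
    G = (C[0][0] + C[1][1] + C[2][2] - C[0][1] - C[0][2] - C[1][2]
         + 3 * (C[3][3] + C[4][4] + C[5][5])) / 15.0
    return K, G


if __name__ == "__main__":
    # Sanity check: for an isotropic solid, C11 = K + 4G/3,
    # C12 = K - 2G/3, C44 = G, so the Voigt averages recover K and G.
    K, G = 37.0, 45.0
    C = [[0.0] * 6 for _ in range(6)]
    for i in range(3):
        C[i][i] = K + 4 * G / 3
        C[i + 3][i + 3] = G
        for j in range(3):
            if i != j:
                C[i][j] = K - 2 * G / 3
    print(voigt_moduli(C))
```

The same tensor also yields Reuss and Hill averages; Voigt is simply the upper bound.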
Yokogawa Corp. has announced an enterprise pipeline management solution (Epms), the product of over 20 years of experience. Epms sits on top of multiple Scada platforms, providing monitoring, alarms, trends and reporting. The system is based on Yokogawa’s ‘IT friendly and secure’ enterprise operations platform (EOP). To cater for different pipeline systems and operational philosophies, Epms exposes templates and functions that can be modified without having to call in the specialists.
The Epms embeds multiple standards from the API and ISA for compliance with the US PHMSA and other regulators. Leak detection, volume balancing and hydraulic profiling techniques are used to detect possible integrity violations. More from Yokogawa.
KBC and Kongsberg are to co-develop integrated simulation software for engineering and operations workflows.
FEI is teaming with Weatherford Labs to offer advanced reservoir characterization services.
Endeavour Energy has awarded BMT a contract for the development of Africa’s first LNG import terminal.
Aker Solutions and Baker Hughes are to provide joint development concept studies, from reservoir understanding and well design to subsea and topsides facilities, including flow assurance and risk management.
McDermott Asia Pacific is to deploy Asset Guardian’s process control software management system on its pipelay vessels.
Atos and Digital Guardian are to offer cloud-based data loss prevention services in Europe.
Wuhuan Engineering has deployed Aveva’s integrated engineering and design software on its refining and petrochemical projects.
Statoil is to sponsor the latest phase of the Badger Explorer development program.
CriticalControl has signed a five year extension with a ‘senior’ Canadian oil and gas producer covering ongoing use of ProChart, ProTrend and ProMonitor Schematics. Recurring revenue from the extension is expected to exceed $CAD 1.4 million per year.
Wintershall has awarded DNV GL a $1.2 million frame agreement for inspection services on its offshore Norway projects.
BHP Billiton has selected Galenfeha to supply stored energy solutions, satellite asset location and tracking technology.
Harris CapRock has been selected by Hess to provide integrated telecommunications solutions on the Gulf of Mexico Stampede field.
Freeport LNG Terminal is to use Honeywell Process Solutions technologies to expand its southeast Texas export operation.
Interwell is to use IFS Applications to streamline its operations. Maersk Drilling has extended its contract for the ongoing roll-out of IFS and the development of new maintenance planning functionality. Capgemini is to train and certify consultants at the IFS Academy.
Schneider Electric has selected OpenText for its global supply chain.
Devon Energy has signed a three year deal for PetroDE’s cloud based upstream data browser.
Maersk Training has signed an agreement with Seadrill to deliver well control training under accreditation from the International Well Control Forum.
Infosys has completed implementation of its Smart oilfield services solution for SAP ERP at FTS International.
JGC America has signed a seven-year agreement to use a hosted edition of Intergraph SmartPlant. Pertamina is also to deploy SmartPlant at its Matindok gas development. ST Engineering has implemented various Intergraph 3D design solutions.
Intertek has expanded its oil condition monitoring capability to Melbourne, Australia, providing clients with loss prevention technology and condition-based predictive maintenance. The company also has a tank calibration service in the UK for its European customers.
Petroleum Oil & Gas Services has signed an exclusive agreement for representation and support of Geophysical Insights’ Paradise in Mexico.
Pemex has awarded a deepwater Gulf of Mexico reservoir characterization project to Rock Solid Images.
Rock Flow Dynamics has announced a software link between tNavigator and Petroleum Experts’ IPM.
Ziebel has completed a distributed fiber optic sensing campaign for Statoil using its new gravity-deployed carbon composite cable, Z-Line.
Energistics’ Resqml V2.0.1 release candidate is now available for review and comment. The release adds a new activity model for capturing model workflows, a property series object for time variant data and representation of streamlines for fluid flow modeling. The new data objects are delivered in self-contained packages of the UML data model. Download the standard from the Energistics web site.
The Open geospatial consortium is seeking public comment on its SensorThings API standard for the Internet of things. The API is a free and non-proprietary lightweight specification that enables developers to connect to devices without having to use the heterogeneous protocols of different devices, gateways and services. The candidate standard follows REST principles and uses an efficient JSON encoding and the OASIS OData URL conventions. Download the API RFC from OGC.
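The spec’s REST-plus-OData flavor is easy to illustrate: entities such as Things, Datastreams and Observations are addressed by URL path, with $filter, $top and $orderby query options. A sketch of a request-URL builder (the server address is a placeholder, and values are not URL-encoded here for readability):

```python
def sensorthings_url(base, path, **odata):
    """Build a SensorThings API request URL using OData query conventions.

    base:  service root, e.g. a (hypothetical) "http://example.org/sta"
    path:  entity path, e.g. "Datastreams(1)/Observations"
    odata: keyword args become $-prefixed query options.
    Note: real parameter values should be URL-encoded.
    """
    params = "&".join(f"${k}={v}" for k, v in sorted(odata.items()))
    return f"{base}/v1.0/{path}" + (f"?{params}" if params else "")


if __name__ == "__main__":
    # Latest five observations from one datastream
    print(sensorthings_url("http://example.org/sta",
                           "Datastreams(1)/Observations",
                           top=5, orderby="phenomenonTime desc"))
```

The appeal for IoT developers is exactly this uniformity: one URL grammar instead of one protocol per device vendor.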
OGC is also seeking comment on V2.0 of its web coverage service. The open standard supports retrieval of geospatial coverage data. A new transaction extension provides for transactions such as create, update, and delete, offering ‘versatile’ access to geospatial information from a wide variety of sources, notably ‘big’ geospatial applications.
XBRL has announced the XBRL US center for data quality to address concerns as to the utility of XBRL financial data filed with the SEC. The Center will provide guidance on consistent tagging of financial data and when custom tags are appropriate. A set of automated validation and error detection rules is to be developed and contributed to the Arelle open source XBRL platform.
The PIDX US Spring Conference, with 162 participants from 66 companies, was the best attended ever. Nils Svanberg presented a compelling use case for ConocoPhillips’ push for supply chain/e-business standards. Oil company debt is rising as low oil prices and high costs squeeze margins. Oils are forced to ‘confront cost and productivity challenges that others have not had the luxury to ignore.’ Outsourcing, at 70% of spend, has put the focus on supply chain efficiencies, learning from the ‘factory’ model of shale operations. Standards (PIDX/UNSPSC/OCI) are key, but not a silver bullet.
Roger Bhalla, also with ConocoPhillips sees an opportunity for a new PIDX standard for supplier profiling. Bhalla asks, ‘how well do you know your supplier base?’ Operators need to know where a supplier’s legal HQ is located, its credit rating, ISO certification and safety record. He is proposing a PIDX work group for a ‘supplier pre-qualification and lifecycle management data standard’ to plug the gap.
Kristian Kalsing from SAP master data specialist Winshuttle introduced the data management ‘tax’ on businesses which struggle with poor ERP data quality. Kalsing showed how poster child ConocoPhillips (again!) has moved from 100% manual SAP data entry to an automation solution from Winshuttle (with PIDX standards), saving 1.5 minutes per record and allowing a constant headcount to handle a seven fold growth in data.
Some fifty participants took part in a round table discussion on future PIDX standards. This identified ‘hot-button’ categories including compliance, sourcing and procure-to-pay. Participation is invited for these and for ongoing initiatives on supplier KPIs, field tickets, terminal master data and business process documentation. Sign up for the next US/EU PIDX gatherings in October.
A white paper from ABB/Kepware outlines the process control technology that ABB has deployed on the ‘world’s first’ Queensland Curtis* LNG project. The first train of the coal seam gas development is in production and the project is now ramping up to a peak of 8 million tonnes of LNG per year from some 6,000 wells. Each well has around 50 tags captured every 30 seconds. Automating data collection was described as a ‘huge undertaking’ as the project infrastructure was constantly in flux.
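The figures quoted make the scale of the data-collection problem easy to quantify:

```python
# Back-of-envelope throughput from the white paper's figures:
# 6,000 wells, ~50 tags per well, sampled every 30 seconds.
wells = 6000
tags_per_well = 50
interval_s = 30

samples_per_second = wells * tags_per_well / interval_s   # 10,000 samples/s
samples_per_day = samples_per_second * 86400              # 864 million/day

print(samples_per_second, samples_per_day)
```

Sustaining ten thousand samples per second across a constantly changing field infrastructure is what makes the automation of data collection a ‘huge undertaking.’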
ABB turned to Kepware’s KEPServerEX to handle the project’s data management. KepServer provides connectivity tools, notably a Modbus TCP/IP Ethernet driver, which provides data exchange between OPC clients and Modbus protocol compliant controllers. This has enabled ABB to deploy its flagship 800xA integrated control and safety system across multiple OPC devices within a single communications platform, leveraging Kepware’s library of over 150 PLC, RTU, and other drivers, along with a database of configuration information. To accommodate the expanding infrastructure, ABB uses Kepware’s ability to import and export configuration data in XML, using a bespoke tool to keep the configuration current. An ‘advanced tags’ plug-in centralizes data processing at the communications server.
Kepware recently announced V5.18 of its server with high-reliability OPC UA-based tunneling, real-time changes to tag definitions and support for electronic flow measurement data from ABB Totalflow. ABB’s Scott Peterson observed, ‘these enhancements enable customers to leverage and integrate the full feature set of Totalflow systems with diverse applications and architectures, increasing the measurement and analytical capabilities of our solutions.’
* A BG Group/CNOOC unit.
Speaking at the 2015 Offshore Technology Conference in Houston earlier this year, Petrotechnics’ North American president, Mike Neill, enumerated some process safety challenges facing offshore operators. Some risk control systems suffer from the ‘silo’ phenomenon, where information sharing is limited. Others are limited to broad-brush KPIs that may not be aligned with the system’s current state.
To address potential risks such as loss of primary containment, fire, explosion and blowout, operators erect multiple barrier systems. These are often described with James Reason’s ‘Swiss cheese’ model, which postulates that although each barrier may have deficiencies (‘holes’), an incident can only occur when the holes in successive barriers line up, so multiple overlapping barriers greatly reduce overall risk.
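Under the (optimistic) assumption that barriers fail independently, the Swiss cheese arithmetic is simple: an incident requires every barrier to fail at once, so the individual failure probabilities multiply. A sketch with illustrative numbers:

```python
from math import prod

def aligned_failure_probability(barrier_failure_probs):
    """Probability that every barrier fails at once, assuming independence.

    This is the 'holes line up' case in Reason's model. Real barriers
    can share common-cause failures, so independence is optimistic --
    which is exactly why tracking barrier health matters.
    """
    return prod(barrier_failure_probs)


if __name__ == "__main__":
    # Three barriers with 10%, 5% and 2% failure probability
    print(aligned_failure_probability([0.10, 0.05, 0.02]))  # ~1 in 10,000
```

Impairments raise individual probabilities, and the product degrades multiplicatively, which is why the cumulative effect of several ‘minor’ deviations can be anything but minor.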
Neill recommends focusing on the health of the barriers themselves. Barriers are subject to many forms of deviations or impairments which impact performance. Understanding the cumulative effect of these impairments is necessary and some regulatory authorities, particularly when dealing with aging assets, have been asking operators how they account for this cumulative risk.
Assessing the risk impact of such issues can be a daunting exercise given the wealth of data. Moreover, to be relevant, assessment needs to done on the true configuration of barriers in time and space. This is where Petrotechnics’ Proscient technology comes in, providing real-time management of barrier health including the impact of ongoing impairments and frontline activities. Proscient produces cumulative risk indicators showing the impact of daily activity schedules on top of barrier impairments and allows work plans to be optimized for risk reduction. Output is compliant with API RP754 and other industry norms.
Fluid flow reservoir studies may involve giga-cell models, but these are not fine-grained enough for an in-depth understanding of chemical enhanced oil recovery (EOR). A new publication* includes a chapter on the application of computer simulations to surfactant chemical EOR by researchers from Shell Global Solutions and Culgi.
Surfactant EOR is used to recover residual oil left in the reservoir after flooding. Finding the right surfactant formula is a complex and time consuming process that needs to be tailored to the oil composition. Previously, many surfactant-solvent combinations had to be tested in the laboratory.
Now Culgi proposes computer modeling as a more elegant and rapid approach to surfactant formulation. Starting from the molecular structure of the surfactant and the oil, Culgi and Shell have developed a novel ‘method of moments’ approach to calculate microemulsion properties, such as optimum salinity, from dissipative particle dynamics (DPD) simulations of mixtures of surfactant, solvent and complex oils. Culgi’s software uses a multi-scale approach to combine molecular structure with thermodynamics. The simulations model surfactants at the molecular level. A typical simulation is performed in a rectangular box with sides a few nanometers in length, containing around 100 surfactant molecules and 1,000 oil molecules. These run on desktop computers. More rigorous models with millions of particles are said to be within the reach of current hardware.
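What makes such particle counts tractable on a desktop is DPD’s soft interaction: in the standard Groot-Warren formulation the conservative force between beads is F = a(1 − r/r_c) for separations r below a cut-off r_c, and zero beyond. A sketch of that textbook form (not Culgi’s proprietary implementation):

```python
def dpd_conservative_force(a, r, r_c=1.0):
    """Magnitude of the DPD soft-repulsive conservative force.

    F = a * (1 - r/r_c) for r < r_c, zero beyond the cut-off.
    a is the repulsion parameter between two bead types; beads are
    coarse-grained groups of atoms, not individual molecules.
    """
    return a * (1.0 - r / r_c) if r < r_c else 0.0


if __name__ == "__main__":
    # Force falls off linearly from a at contact to zero at the cut-off
    for r in (0.0, 0.5, 1.0):
        print(r, dpd_conservative_force(25.0, r))
```

Because the potential is soft and short-ranged, large time steps are possible, which is why million-particle runs are plausible on ordinary hardware.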
* Analytical Methods in Petroleum Upstream Applications, Jan-Willem Handgraaf et al. CRC Press 2015.
Woodside has followed in Repsol’s footsteps with the deployment of IBM Watson as part of its ‘next steps in data science.’ Woodside also announced ‘WITH,’ an innovation and technology hub. The hub will leverage Watson and other big data techniques to combine sensor and control system data, automating high-cost, high-risk or error-prone tasks. WITH will build on Woodside’s real-world LNG plant expertise, including new technology platforms such as NearShore LNG and ‘ultra-modular’ NextGen LNG.
Watson will be trained by Woodside engineers, gleaning insights from unstructured historical data in project reports. Senior VP Shaun Gregory said ‘the new toolkit brings evidence-based predictive data science that will bring down costs and increase efficiencies across our organization. We plan to use this predictive tool whereby every part of our organization will be able to make decisions based on informed and accurate insights.’
Watson’s cloud-based ‘lessons learned’ cognitive advisory service is said to ‘scale’ engineers’ knowledge and make insights and information accessible. A lessons learned function lets engineers ask complex questions in natural language.
GeoDigital, FLoT Systems, and Environmental Consultants are to team on an unmanned aerial vehicle (UAV)-based utility asset inspection service offering. The program combines high-resolution imagery collected by FLoT Systems’ beyond line-of-sight capable UAV, analytics and location-based work management software from GeoDigital, and virtual inspection of those images with utility, vegetation and asset management expertise from ECI. The solution will help operators of critical infrastructure, including pipelines, use UAVs (a.k.a. drones) to locate potential risks.
FLoT Systems’ CEO Chris Vallier said, ‘We have been developing the ability to utilize long-range UAVs to collect high value facility information. We understand how to operate within the current regulatory framework while helping clients prepare for full-scale commercial activity in the near future.’
A new Markets & Markets report puts the value of the oil and gas data management market at ‘$21.22 billion’ by 2020, up from a $6.08 billion base in 2015, at a remarkable 28.4% annual growth rate. The report sees a shift from traditional processes to smarter operations and a shrinking workforce at remote and offshore exploration sites. Key players in the global ‘software defined’ data center market include SAP, IBM, Wipro, NetApp, Oracle and VMware.
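For the record, the quoted growth rate does check out against the standard compound annual growth rate formula:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by start and end values."""
    return (end / start) ** (1.0 / years) - 1.0


if __name__ == "__main__":
    # The report's figures: $6.08bn (2015) to $21.22bn (2020)
    print(f"{cagr(6.08, 21.22, 5):.1%}")
```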
Technavio/Infiniti Research has added three new reports on next-gen oilfield technology to its library. The ‘global’ market is to grow at 5% out to 2019, driven by growing demand for oil and gas. But ‘getting employees to embrace the new method of operation is a hurdle in the implementation of this technology.’ The automation/DCS/Scada side is set to grow at 7-9% through 2019. The smart oilfield IT services market is set to grow at only 5.93% to 2019. Corporate data security is ‘one of the strongest barriers to the adoption of IT services in the oil and gas sector.’
Technavio also released a report on the state of play of carbon capture and storage.