September 2014


CGG ‘Akon’ for Diskos

Norway’s national data bank migrates from PetroBank to CGG’s Trango-based system. Moving the petabyte-scale seismic dataset to IBM ‘elastic storage’ disk and tape robot proves ‘challenging.’

Speaking at the ECIM E&P Data Management conference in Haugesund, Norway this month, CGG’s Kerry Blinston unveiled ‘Akon,’ a new data management offering from CGG and Kadme that is being rolled out to Norway’s Diskos upstream data community. Diskos was initiated in 1996 by Norway’s NPD, the regulator, to rationalize upstream data storage and offer an online, entitlements-based service for well, production and seismic data.

To maintain competition in the upstream data services arena, the Diskos contract is the subject of regular tenders. It was first awarded to an IBM-led unit, then Landmark and most recently, Schlumberger. Until now the seismic data management software has been Landmark’s PetroBank, originally developed by IBM.

All is about to change as CGG deploys ‘Akon’ (Greek for ‘javelin’—cf. Diskos, geddit?), a seismic data management solution built around the Trango product, acquired when CGG bought Fugro’s geoscience division in 2012.

Trango will store Norway’s seismic data on an IBM ‘Elastic Storage’ (General Parallel File System, GPFS) unit that uses a combination of disk and tape robot to provide a unified view of data irrespective of storage location. Trango embeds an Oracle-hosted data model based on PPDM, a ‘standard, completely open, fully published model.’
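By way of illustration, here is a minimal Python sketch of what a query against a PPDM-style store on Oracle might look like. The table and column names (WELL, ENTITLEMENT, UWI) and the connection details are illustrative assumptions, not Trango’s actual schema or API.

```python
# Hedged sketch: querying a PPDM-style WELL table in Oracle from Python.
# Connection details, table and column names are illustrative only.
import cx_Oracle

conn = cx_Oracle.connect('diskos_user', 'secret', 'dbhost/ORCL')
cur = conn.cursor()

# Fetch the wells a given company is entitled to see (hypothetical entitlement join).
cur.execute("""
    SELECT w.uwi, w.well_name
    FROM   well w
    JOIN   entitlement e ON e.uwi = w.uwi
    WHERE  e.company_id = :company
""", company='ACME')

for uwi, name in cur:
    print(uwi, name)

cur.close()
conn.close()
```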

The NPD’s Eric Toogood provided more details on the transfer of operations, a delicate undertaking as it is the first time that Diskos has been run sans PetroBank. Both the Trango back end and Kadme’s Whereoil client needed customization for the new task and, behind the scenes, a major data migration project is underway. Despite (or perhaps because of) earlier claims that PetroBank leveraged another ‘standard’ data model (POSC Epicentre), getting data out of the old system requires an understanding of the data model that is only ‘in the hands of a few key people.’ This is proving ‘quite a challenge’ and, with the new system scheduled to go live on 1 January 2015, ‘time is running out.’

Around one petabyte of data is being moved from PetroBank to Akon and a data improvement/clean-up project is to run in parallel with the migration. The new PPDM data model represents an opportunity to receive and store data more efficiently.

While the Akon name has class, it will likely be replaced by the more prosaic ‘Diskos Seismic DB’ when the system goes live. More from ECIM in next month’s Oil IT Journal.


Cegal, Blueback merge

Norwegian IT services provider, backed by private equity group, acquires Blueback Reservoir share capital in paper transaction.

With backing from private equity group Norvestor, Cegal, a Norwegian provider of IT services to the oil and gas industry, has acquired the entire share capital of Blueback Reservoir in a (mostly) paper transaction. Blueback Reservoir was founded in 2005 by ex-Technoguide (original developer of Schlumberger’s Petrel) personnel and provides, inter alia, plug-ins to Petrel. Cegal is a Stavanger-based IT services provider specialized in tailor-made cloud technology, including managing and delivering large data volumes and software solutions to companies in the oil and gas industry.

Blueback will merge into the Cegal unit, which now has over 300 employees in Norway, London, Houston, Calgary and Dubai. The merged company is owned by Norvestor (52%) and by employees and board members (48%). Forecast revenues for 2014 are 550 million NOK with a target of 1 billion NOK for 2016, driven by organic growth and further add-on acquisitions.

Norvestor’s specialization is Scandinavian midmarket companies operating in ‘fragmented’ markets with the potential to achieve ‘a leading international position.’ More from Cegal.


Persisting Norway’s data. Communicating geosciences.

Neil McNaughton is back from Norway’s ECIM data management conference where he had a spot of déjà vu regarding the merits of a ‘standard’ data model. His eye was also caught by a brave attempt from the London GeolSoc to address how best to communicate awkward geo-facts to a skeptical public.

You have to hand it to the Norwegian Petroleum Directorate. Just when everything is running smoothly, every few years, in the interest of fairness and encouraging cooperation, it re-tenders the management contract for Diskos, the national geoscience data bank. To date, while operations have changed hands, from IBM, through Landmark to Schlumberger, Diskos has stayed with the same technology, PetroBank. This time around, everything changes. The contract has been awarded to a new CGG-backed group using new software. CGG is busy scaling up its Trango seismic data management system to handle the new job (see this month’s lead).

Back in the day, when PetroBank was IBM’s, Oil IT Journal, or actually its predecessor, Petroleum Data Manager, followed such developments assiduously, in particular the ‘standards compliant’ aspect of such tools. In 1997 we wrote about the subtle differences between early versions of the POSC (now Energistics) Epicentre data model, as used by IBM, and the ‘Discovery’ project subset of the same model used in Landmark’s OpenWorks. At least that was what people said at the time.

There was also the imagining that using a ‘standard’ data model would be a sure-fire route to interoperability and, nota bene, to preserving the data investment in the event of a subsequent change of ownership and/or technology. Well, here we are with a change of ownership and apparently no easy route to data migration. It seems to be rather a matter of finding the right specialists with knowledge of all the tweaks and triggers that have been built into the database over the years to keep things up and running. What does appear to be working still is the idea that a ‘standard’ data model (this time it is PPDM; Trango is seemingly ‘100% compliant’) will offer all the same perceived benefits that the old ‘POSC compliant’ PetroBank was supposed to. There must be a moral here somewhere. Technology changes, data models die, but a good marketing spiel lives forever!

~

I was rather taken by a recent blog posting from the GeolSoc’s Nic Bilham discussing how to communicate ‘contested’ geoscience. No, we are not talking creationism here, rather how to persuade local populations and politicians that activities like carbon dioxide capture and storage (CCS) are, or can be, safe. In the eyes of the public it appears that CCS is considered to be just a tad less evil than fracking in its capacity to generate earthquakes and poison local water supplies.

Before looking at Bilham’s arguments, an observation. CCS is not really like the short-term pumping activity of a frac job, but it does share a potential for earthquake generation with a disposal well. Fracking and CCS thus share a common ‘issue.’ So how do we argue, from a geological standpoint, that injecting CO2 or disposing of frac fluid can be safe?

Bilham reported on a London Geological Society conference held earlier this year on public concerns around radioactive waste disposal, shale gas, fracking and CCS. Geologists have a ‘privileged understanding’ of our planet and the processes that have shaped it. They are also comfortable dealing with uncertainty. It can be hard to communicate probabilistic assessments of resources and risk without these being perceived as an expression of ignorance, undermining public confidence in the expert. Worse, this gives ammunition to adversaries of, say, fracking to ‘play fast and loose with the evidence.’ If nothing is certain, evidence is cherry-picked and unsubstantiated claims gain traction. Bilham also observes that ‘professional scientists are not above such guerrilla tactics’ in what he describes as the asymmetric warfare of science communication. Simple, but false, ‘certainties’ can have an appeal that more complex and nuanced explanations and assessments lack.

So what to do? Well, this is of course where the going gets hard. Speakers at the GeolSoc conference suggested practical ways of improving geoscience communication: images that show clearly what is going on under the ground, finding the right narrative, using social media and making data ‘open and discoverable.’ There is also a problem in how researchers are trained. There is an impression that science is made up of three branches, physics, chemistry and biology, that have little to do with each other. It would be better to highlight the overlaps between specialisms, raise awareness of other disciplines and stimulate interdisciplinary thinking. This requires reforming how we teach science in schools to ‘enable the public to be discerning in their approach to scientific claims about politically contested matters.’

All this is very well but it is not going to tip the balance in Europe from skepticism and mistrust to acceptance. Decisions like these need to go beyond geological feasibility and embrace politics and economics. What really drives acceptance, at least for fracking, is the existence of an oil and gas province that has resigned the population to the gains and losses that such activity brings.

If I might add an observation, areas with established petroleum systems appear so far to be the best candidates for successful unconventional production. So how about a different approach? Go into a new area and propose your fracking program. If the public objects vehemently, then you are probably looking in the wrong place. Anyone for ‘crowdsourcing’ exploration?

Follow @neilmcn


Book review—Integrated operations in the oil and gas industry

Norway’s ‘world leading’ digital oilfield initiatives described in a 400 page volume discussing the pros and cons of remote operations centers and the implications for situational awareness and safety.

Integrated operations in the oil and gas industry (IOOG*), subtitled ‘sustainability and capability development,’ is a 400-plus page, large format, multi-authored publication, with contributions from (mostly) academia and industry. In their introduction, editors Tom Rosendahl of the BI Norwegian Business School and Vidar Hepso (NTNU) relate integrated operations (IO) to projects like Chevron’s i-Field, BP’s Field of the Future and others. While most oil and gas companies have some such initiative ongoing, according to Rosendahl, Norway’s IO initiative is ‘regarded by many as the world’s most advanced.’ The essence of IO is the migration of operations and personnel from the offshore to the onshore, thanks to the deployment of information and communications technology (ICT) and remote operations/collaboration centers.

In his introduction, Rosendahl summarizes some of the ‘issues’ around IO. At the turn of the millennium there was an ‘overoptimistic belief’ in IO and its potential benefits. Much of the early work was technology-based and remote control was ‘heralded with enthusiasm.’ At the same time, others argued that IO is all about ‘people and processes’ and not about technology. For Rosendahl, both views have proved wrong. Successful IO deployment requires restructuring of work processes, and management, particularly management of change, is the key.

This is where the ‘capability development’ terminology of the book’s subtitle comes in. This Rosendahl defines as a holistic approach involving human skills, work processes, governance and technology to effect change. On which topic, IOOG acknowledges the role played by Tony Edwards and his Step Change Global consultancy. Edwards is mentioned 48 times in IOOG and is said to have introduced the capability framework into Norway.

Despite the ‘holistic’ claims, IOOG is really about the soft side of managing technology. It is also weighted to an academic viewpoint. An early chapter introduces the ‘capability platform,’ an information ‘ecology’ and more key buzzwords. Both people and technology are required for success. Network centric design ‘moves the center of gravity of the organization to the edge’ and enables ‘generativity’ of new ideas. Statoil’s IT is discussed but there is curiously no Statoil authorship in IOOG.

The ‘pinnacle’ of IO is the collaboration room, an onshore facility where operators and experts direct operations. But Norway’s large offshore facilities are still manned—requiring the collaboration room to be replicated offshore in what has come to be known as the ‘glass cage.’ An anonymized (but you know who it is) discussion of IO at a Norwegian oil company found a dilemma in that while the onshore/offshore teams have good shared situational awareness, sometimes offshore leaders ‘spend too much of their time in the glass cage.’ Email is reported as a ‘time thief’ and is showing no signs of abating. Overall, the impact on safety remains moot. Moving workers onshore improves their personal safety but lessens their ‘situational awareness.’

Remote operations’ impact on safety is complex. Removing operators from an offshore site moves them away from danger, but even with a sophisticated remote operations center, it is hard to achieve the ‘situational awareness’ of being on the spot. In fact, we misread the title of the penultimate chapter, ‘IO as a contributing factor to major accidents.’ Surely there should have been an ‘avoiding’ in there? But no, the authors take a hard-nosed look at the risks induced by moving command away from operations.

Oil IT Journal has been following developments in Norway’s Integrated Operations initiative since 2004 and in particular, the Integrated Information Platform (IIP) which inspired a POSC (now Energistics) IO special interest group. We were therefore surprised that IOOG, despite having over 20 index references to ‘information and communications technology,’ makes no mention at all of the IIP. What we have always understood to be the IIP’s key technologies, ISO 15926 and the semantic web, are mentioned (with no development) only once each. This omission likely reflects what IOOG describes as a ‘tribal war’ between those who focus on the ‘data and technology’ dimension and those who focus on the ‘human and social.’ It is clear which side of the tribal war IOOG is on! The self-imposed requirement to ‘balance’ people and technology prevents any serious discussion of automation, an awkward subject which the authors have largely avoided. But whether you are ‘techno’ or ‘people,’ IOOG provides many worthwhile insights—too many to present in this short review.

* IOOG, IGI Global 2013. ISBN 9781466620025.


Apache Spark for ‘big data,’ in-memory seismic processing

Texas A&M professor reports on tests of novel ‘big data’ infrastructure.

At a recent meeting of the Society of HPC Professionals, Lei Huang (Prairie View A&M University, Texas) presented the results of his research on the use of Apache Spark for cloud-based seismic data analytics. Spark, the latest member of the Apache Software Foundation’s open source ‘big data’ portfolio, is claimed to better MapReduce and provide a unified, scalable, parallel processing engine for big data. Working at Texas A&M’s Cloud Computing lab, Huang has leveraged Spark in a ‘platform as a service’ for seismic data processing and analytics.

Spark was developed to overcome some issues with Hadoop and MapReduce, which require tuning to a particular task. Spark’s developers’ goal was to design a big data system that is ‘as powerful and seamless as those used for small data.’ Spark offers a unified, generalized engine with standard libraries for machine learning and ‘graph-parallel’ computation on ‘resilient distributed datasets.’ Spark’s directed acyclic graph engine is said to support fast, in-memory computing. Spark-based jobs embed ‘sophisticated’ data analytics algorithms for image and seismic processing. Huang was previously a parallel programming consultant for Seismic Micro-Technology (now IHS).
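For readers curious what this looks like in practice, here is a minimal PySpark sketch, not Huang’s code, of a trace-level computation distributed across a cluster. The toy in-memory traces stand in for a real SEG-Y ingest step and cluster settings are left to defaults.

```python
# Minimal PySpark sketch (illustrative only): per-trace RMS amplitude on an
# in-memory 'resilient distributed dataset' (RDD) of synthetic seismic traces.
import numpy as np
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName('seismic-trace-rms')
sc = SparkContext(conf=conf)

# A toy dataset of (trace_id, samples) pairs stands in for a real SEG-Y ingest.
traces = sc.parallelize(
    [(i, np.random.randn(1000).astype('float32')) for i in range(10000)])

# Keep the RDD in memory so repeated analytics passes avoid re-computation.
traces.cache()

# Map each trace to its RMS amplitude, then pull back the largest value.
rms = traces.mapValues(lambda s: float(np.sqrt(np.mean(s ** 2))))
print('max per-trace RMS:', rms.values().max())

sc.stop()
```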


Madagascar - seismics in the cloud

Open source seismic imaging system now available in SageMathCloud.

Sergey Fomel (University of Texas at Austin) reports that the open source Madagascar seismic processing toolset has been ported to the cloud. This has been achieved by leveraging the SageMathCloud, a platform for computational mathematics. SageMathCloud is a component of the Sage project, an open-source mathematics package, the brainchild of William Stein (University of Washington). Sage builds on other open-source libraries such as NumPy, SciPy, MatPlotLib, R and more.

Access to the SMC is through a Python-based programming environment which can be used to install Madagascar. Madagascar itself is now accessible interactively through the Python interface. Fomel’s blog shows how Madagascar can be run interactively, controlled through an IPython notebook hosted on the SMC and using IPython’s new ‘interactive widgets’ feature.
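The sketch below gives a flavor of the approach; it is not Fomel’s notebook. It assumes Madagascar’s command-line programs (sfspike, sfbandpass) are installed and on the path, and drives a tiny processing flow from an IPython ‘interact’ widget.

```python
# Hedged sketch: driving a Madagascar pipeline from an IPython notebook widget.
# Assumes Madagascar (sfspike, sfbandpass) is installed and on the PATH.
import subprocess
from ipywidgets import interact  # on 2014-era IPython: from IPython.html.widgets import interact

def filtered_spike(fhi=20.0):
    """Build a spike trace and band-pass it with a user-chosen high-cut frequency."""
    cmd = 'sfspike n1=1000 k1=300 | sfbandpass fhi={0} > spike_{0}.rsf'.format(fhi)
    subprocess.check_call(cmd, shell=True)
    return 'wrote spike_{0}.rsf'.format(fhi)

# Dragging the slider re-runs the flow with the new parameter.
interact(filtered_spike, fhi=(5.0, 60.0))
```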


Eclipse parallelized on Univa Grid Engine

Grid computing boutique reports ‘amazing’ reservoir simulation performance on clusters.

Univa Corp. has announced that its Univa Grid Engine (UGE) has been certified by Schlumberger to run its flagship Eclipse reservoir simulator. UGE, a cluster management system, was developed by Univa CTO Fritz Ferstl, who ran Sun Microsystems’ and later Oracle’s grid engine business. Oracle sold its grid IP to Univa in 2013 (an open source flavour, the Open Grid Scheduler, is now a SourceForge project).

UGE is now integrated with the 2014.1 edition of Eclipse. Ferstl commented, ‘Our teams have developed the scripts and completed testing. Running Eclipse using UGE on high performance clusters is seamless and offers reservoir simulation at an amazing rate.’ In its latest manifestation (Version 8.2, released this month), UGE has improved scalability, performance and resource control for small and large clusters. The latest version also adds native support for Windows machines, both as clients and servers. Univa has also implemented V2.0 of the Open Grid Forum’s distributed resource management application API (DRMAA) for improved application integration and control.
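As an illustration of what DRMAA-based integration looks like, here is a hedged Python sketch using the open source drmaa binding to submit a simulator run to a Grid Engine-style scheduler. The wrapper command, data deck and parallel environment name are assumptions, not Univa’s or Schlumberger’s actual integration scripts.

```python
# Hedged sketch: submitting a reservoir simulation job via DRMAA (drmaa-python).
# Command, arguments and parallel environment name are illustrative only.
import drmaa

s = drmaa.Session()
s.initialize()

jt = s.createJobTemplate()
jt.remoteCommand = '/opt/sim/bin/run_simulator'   # hypothetical wrapper script
jt.args = ['CASE1.DATA']                          # hypothetical data deck
jt.nativeSpecification = '-pe mpi 16'             # ask the scheduler for 16 slots

job_id = s.runJob(jt)
print('submitted job', job_id)

# Block until the job finishes and report its exit status.
info = s.wait(job_id, drmaa.Session.TIMEOUT_WAIT_FOREVER)
print('exit status:', info.exitStatus)

s.deleteJobTemplate(jt)
s.exit()
```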


Terra3E’s Petrel plug-in promises ‘rigorous’ shale volumetrics

Full physics computation of adsorbed gases for different shale types.

French startup Terra3E (for energy, environment and expertise) has announced a shale volumetrics plug-in for Schlumberger’s Petrel geoscience interpretation platform. The plug-in provides ‘rigorous’ volume calculations for Petrel geological models, using full physics ‘Langmuir isotherms’ to compute adsorbed gas volumes. Terra3E’s tool computes initial gas in place and volumes at surface conditions. Calculations for both adsorbed gas and liquid-rich gas are available for a range of shale types.
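For readers unfamiliar with the underlying physics, the sketch below shows the textbook Langmuir isotherm that such adsorbed gas calculations build on. This is not Terra3E’s implementation; the Langmuir volume, Langmuir pressure, density and cell dimensions are illustrative numbers.

```python
# Hedged sketch of the textbook Langmuir isotherm for adsorbed gas content.
# All parameter values are illustrative, not Terra3E's defaults.
def langmuir_gas_content(pressure_psia, v_l=100.0, p_l=650.0):
    """Adsorbed gas content (scf/ton) at reservoir pressure.

    v_l: Langmuir volume (scf/ton), the content approached at high pressure.
    p_l: Langmuir pressure (psia), the pressure at which content is v_l/2.
    """
    return v_l * pressure_psia / (p_l + pressure_psia)

# Adsorbed gas in place for one grid cell = content * rock mass (tons).
rho_rock_tons_per_ft3 = 0.078              # ~2.5 g/cc bulk density
cell_volume_ft3 = 250.0 * 250.0 * 50.0     # illustrative cell dimensions
gip_scf = langmuir_gas_content(3000.0) * rho_rock_tons_per_ft3 * cell_volume_ft3
print('adsorbed gas in place: {:.3e} scf'.format(gip_scf))
```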

The tool provides 3D spatial distribution (maps) of shale facies and porosity. Other features include 3D total organic carbon, overpressure computation and a black oil representation of fluids in liquid-rich shales. Calculations can be compared with log derived values for cross-check and update of the geological model.

Uncertainty ranges for calculated values are provided as volume distribution histograms, P90-P50-P10 values and pie charts. The plug-in also produces a Word document summarizing calculation results. Terra3E’s approach was described in a paper presented by Jeremie Bruyelle (Terra 3E) and Dominique Guerillot (now with Qatar Petroleum) at the 2014 International Petroleum Technology Conference held earlier this year in Doha, Qatar.


de Groot-Brill to release Open dTect V5.0 at Denver SEG

'Freemium’ based seismic interpretation system adds anisotropy, Matlab connectivity and more.

Netherlands-headquartered de Groot-Brill Earth Sciences has announced a beta version of Open dTect 5.0; the ‘official’ version will be presented at the SEG exposition in Denver next month. Open dTect is an open source seismic volume interpretation system marketed on a ‘freemium’ model. The open source edition is a free download while enhancements, plug-ins and services are available commercially.

V5.0 sees the replacement of the Coin graphics library with the OpenGL-based Open SceneGraph, along with improvements to 2D data handling. A directional texture attribute plug-in from Austria’s Joanneum Institute provides multi-azimuth texture analysis for seismic anisotropy investigations. A Matlab connector, developed with funding from BG Group, enables Matlab programs to be run inside Open dTect.

A new interactive 2D horizon cube tracking workflow, developed with help from Open dTect’s SSIS consortium, provides GUI enhancements, systems tracts interpretation support and automated fault extraction. dGB is in the process of converting its help files and documentation to HTML5 using MadCap Flare, which provides context-sensitive help, a searchable index and table of contents.


Software, hardware short takes

CGG, GexCon, Ecom, Paradigm, Schlumberger, Amalto, Blueback, CartoPac, EIA, Blue Marble.

CGG’s Hampson-Russell unit has released HRS-9/R2.1 with a new lithology log creation tool, 4D seismic modeling functionality and a link to CGG’s ‘Geovation’ seismic processing package. A seismic processing plug-in enables the development of custom processing algorithms.

Christian Michelsen Research unit GexCon has announced Flacs-Fire for computational fluid dynamics modeling of jet and pool fires. The Windows/Linux-based tool targets offshore oil and gas where these are major potential hazards. The tool is available either as an add-in to GexCon Flacs or as a standalone product.

Ecom is claiming a ‘world’s first’ for ‘Tab-Ex,’ a family of ruggedized tablets certified for Zone 1/Div. 1 hazardous areas. The units are built on the Samsung Galaxy Tab Active platform. Users can view and interact with Scada/DCS systems, SAP, Maximo and CAD models. Secure connectivity is provided via Samsung Knox.

The latest release of Paradigm Geolog includes a plug-in for data exchange with Schlumberger Petrel and a new module for log-based pore and fracture pressure prediction for safer drilling. Geolog also includes tools for shale gas analysis and geomechanical workflows developed by Saudi Aramco. Paradigm has also announced Sysdrill 10, an update to its well planning and drilling engineering applications. The upgrade integrates with Peloton’s WellView and MasterView databases and adds a jar placement module based on technology acquired from Cougar Drilling Solutions.

Schlumberger’s Petrel Shale provides a custom user interface and toolset for unconventional resource development. Geoscientists can integrate geological, geophysical, and production data to define sweet spots, plan well locations and analyze production trends. The Studio E&P knowledge environment provides access to data and results across assets.

Amalto’s ‘Field-to-Finance’ is a field ticket to invoice solution for oil and gas that leverages the Salesforce platform and Amalto’s e-business cloud. F2F is accessible from multiple endpoints including iOS and Android.

The V15 release of Blueback Reservoir’s Blueback Toolbox plug-in to Schlumberger’s Petrel includes over 100 time saving tools for Petrel users and supports the new ‘Ribbon’ interface of Petrel 2014. Highlights of the new edition include a seismic wavelet editor and waveform classifier for identification of facies types.

CartoPac 5.0 extends the mobile platform with support for Windows 8 and introduces a new manager for enterprise geospatial data workflows. The CartoPac Workflow Manager, developed for regulated assets including gas pipelines, creates an audit trail of edits that enables organizations ‘to show regulators they have a structured process for tracking and responding to the conditions of their critical assets.’

The U.S. Energy Information Administration (EIA) has launched STEO, a short-term energy outlook data browser providing analysis and visualization of historical and forecast data from its short-term energy outlook service. STEO covers US energy production, consumption, inventories, imports, exports and prices.

Blue Marble Geographics’ GeoCalc 7.0 software development kit is now compatible with Microsoft’s .NET 4.0 architecture and includes a new ‘area of use’ polygon extension, an updated tidal datum model and a new ‘concatenated’ coordinate transform class.


FME conference presentation highlights open source

Devon Energy uses Safe Software’s FME, OpenGeo and Apache Solr to GIS-enable SCADA.

Speaking at the 2014 Safe Software user conference in Vancouver earlier this year, Jerrod Stutzman showed how Safe’s flagship FME is used to manage GIS data synchronization at Devon Energy. The goal was a centralized spatial system for storing and displaying Scada data along with desktop and mobile mapping and ‘Google-like’ search and performance. Devon’s ‘spatial reasoning system’ joins Scada data to its ArcSDE spatial database of record, leveraging the OpenGeo Suite. Apache Solr provided the search engine.

Devon developed its mobile apps using FME for data creation and synchronization. The open source software stack works across Devon’s four million feature well dataset, which renders ‘quite fast.’ PostGIS can store all three well geometries in one row, reducing database complexity and cost. FME Server handles scheduling and failure reporting via email.
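To give an idea of what ‘Google-like’ search over a well dataset involves, here is a hedged Python sketch against Solr’s standard HTTP select API. The host, core name and field names are assumptions for illustration, not Devon’s schema.

```python
# Hedged sketch: free-text well search via Apache Solr's HTTP select API.
# Host, core ('wells') and field names are illustrative assumptions.
import requests

SOLR_SELECT = 'http://solr.example.com:8983/solr/wells/select'

def search_wells(term, rows=10):
    params = {'q': 'well_name:{}*'.format(term), 'wt': 'json', 'rows': rows}
    resp = requests.get(SOLR_SELECT, params=params)
    resp.raise_for_status()
    return resp.json()['response']['docs']

for doc in search_wells('Eagle'):
    print(doc.get('well_name'), doc.get('uwi'))
```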


Consortium corner

SEG SEAM’s life-of field model. Aptomar spill detection, Force open projects.

SEAM, the Society of Exploration Geophysicists (SEG) Advanced Modeling Corporation is looking for partners in a new Life-of-field project spanning geology, engineering and geophysics. The project is to develop an ‘industrial scale’ synthetic 4D data set for researchers. The 3½ year project is set to kick off early in 2015 with a life-of-project price tag of $210,000 per member.

Eni Norge, Statoil, GDF Suez E&P Norge, OMV Norge and the Norwegian Coastal Administration have teamed with Aptomar to develop technology and a communication infrastructure for offshore oil spill detection and management. The ‘multi-million kroner’ project will allow Aptomar to develop its ‘Securus’ system and facilitate integration of multiple surveillance sources into Aptomar’s tactical collaboration and management system (TCMS). The JIP is expected to end this year.

Norway’s Force technology accelerator has announced the following ‘open’ projects: Safari 3 (virtual outcrop geology), GPlates (improving exploration with plate tectonics) and RM3D (geologically focused reservoir modelling).


Inaugural SMi Big Data and Analytics in E&P, London

New conference addresses the big questions of big data. For some it is a buzzword-laden rehash of what we already know, for others, a powerful addition to the business intelligence canon. Is ‘data scientist’ really the ‘sexiest job of the 21st century?’ Is domain knowledge still required?

It would be great to be able to report that SMi’s Big Data & Analytics for E&P conference held earlier this year in London offered in-depth presentations of the application of big data/analytics (BDA) in oil and gas but this was not really the case. The conference included some familiar faces and presentations that were somewhat shoehorned into the big data theme. This makes us ask whether a) big data is already part of the upstream’s way of doing business or b) the big data movement has little to offer the sector or c) the technology is so promising that nobody wants to talk about it in a public forum.

John Greenhough (University of Edinburgh) set the scene with a presentation on the potential of big data in E&P. Big data is classified as structured (i.e. in a database), unstructured (documents), real-time (sensor/log data) and ‘open,’ i.e. freely available data from sites such as the US’ data.gov and Statoil’s publicly released data from the North Sea Gullfaks field. Making sense of these large data sets has been enabled by developments in multi-core and cloud computing and new technologies such as Hadoop (a framework for distributed processing of large data sets on clusters), Cassandra (a distributed data store à la BigTable) and MongoDB (a ‘NoSQL’ document database).

For Greenhough, E&P data opportunities abound and span the field lifecycle from exploration through asset integrity. One concrete example of ‘big data’ is the shift from daily or monthly production reporting to high frequency real time data from smart fields. The university has been using such data sources to improve recovery from oil and gas fields and now holds several patents for its oilfield analytics. These provide ‘data-driven insights’ into reservoir connectivity, water-flood management and production forecasting. One application, the statistical reservoir model, uses Bayesian statistics to forecast production. Another uses ‘pairwise correlations’ of production between wells to estimate connectivity and map pressure changes in the reservoir. Following field trials, the university is now planning a spinout, ‘Recovery Analytics,’ to deliver software and analytics solutions.
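The ‘pairwise correlation’ idea can be illustrated in a few lines of Python; this is a rough sketch on synthetic data, not the university’s patented method: correlate production rate histories between well pairs and read high correlations as a crude proxy for connectivity.

```python
# Hedged sketch: pairwise correlation of well production rates as a crude
# connectivity proxy. Data are synthetic; this is not the patented method.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
t = np.arange(36)  # 36 months of production
rates = pd.DataFrame({
    'well_A': 1000 * np.exp(-0.03 * t) + rng.normal(0, 20, t.size),
    'well_B': 950 * np.exp(-0.03 * t) + rng.normal(0, 20, t.size),  # similar decline
    'well_C': rng.normal(500, 50, t.size),                          # unrelated behavior
})

# Pearson correlation matrix between all well pairs.
connectivity_proxy = rates.corr()
print(connectivity_proxy.round(2))
```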

John Edwards (Aston Business School) doubts that BDA is very new. Business intelligence was first used in the early 1970s, the term data mining was first used in 1983 and the data warehouse arrived in the 1990s. All of the above are really just the current buzzwords for management science. However, the big data movement is new in that it brings less structured data into the analytics orbit. This includes messy stuff like social media, text, speech, images and sensor data, all of which may be more or less amenable to physics-based modeling, statistics, machine learning and data mining.

Published work on big data in oil and gas is limited to the downstream, where it has been used in smart metering. Video analytics has been applied to flare monitoring and oil spill/pollution monitoring. Clementine (now IBM SPSS Modeler) has been used to help control refinery waste products. Analytics has been used in E&P for decades. Edwards offers the following advice to would-be BDA practitioners, whose focus should be on ‘What questions would you like answered better that you can’t answer well enough now?’ Next you need to decide whether your own domain specialists can hack BDA or if you need external data scientists. The jury is out on this. On the one hand, it has been found that ‘data scientists need domain knowledge’ and on the other, ‘often someone coming from outside an industry can spot a better way to use big data than an insider, because so many new, unexpected sources of data are available.’

Nick Dyer (Front End Data) and David Bond (Panoramic Data) offered some lessons learned using real-time analytics to track the action in the UK’s Premiership football (soccer). First, some pragmatic but insightful definitions: ‘real-time’ translates as ‘faster than we’re comfortable with,’ ‘big data’ as ‘bigger than we’re comfortable with.’ So real-time BDA is about ‘making informed decisions quickly with vast amounts of input data.’ Traditional big data solutions store everything forever and are oriented to batch processing; analysis is done days or weeks after the event. Real-time systems can’t fail, must have the required performance and embed pre-defined queries. In Premiership football, real-time data provides tactical analytics for team managers, performance analytics for journalists and trend analysis for research. This is enabled by video tracking of players on the field and ad-hoc analytics providing heat maps of where and how the action is progressing. A sophisticated IT architecture blends metadata on players and teams with 3D models of stadia. 24 HD cameras record some 27 terabytes of data per match, which is analyzed by the real-time object tracking system to provide 4.7 million data records. Data is analyzed by ‘eventers,’ which can be human or machine. A combination of a SQL database and OLAP cube is used along with physics and math-based algorithms to generate KPIs for player speed, event successes (passes, shots) and so on. The IT involves various loops, with the auto eventers working fast and the slower human trackers providing validation and data checking. One application that runs off the system is Venatrack, used by managers to review fitness, passing and shooting success and make substitutions. Others use the longer term data generated by such systems to evaluate players for transfer and team building.

Another telling use case is big data in mobile telephony. Here operators are engaged in a constant battle with bandwidth and leverage strategies such as data compression and data parsimony (don’t use 8 bytes when one will do). In both systems, data significance determines the information path. A fast track discards and categorizes data at source while a parallel, slow data pipeline keeps everything for later ‘traditional’ analytics. Timeliness vs. completeness is another consideration, as when delayed data changes the information. Here, tactical decisions will use what’s available while strategic decisions will be made when the full picture is clear. Bond also says that you should ‘take data modeling seriously’ and use multiple storage forms (memory, file, SQL, OLAP, custom) and technologies (memory, SSD, HDD and SAN). While this presentation was off-topic, it was very apropos for potential deployers of similar technology in a digital oilfield context.

The inimitable Jess Kozman (Mubadala Petroleum) asked ‘should E&P companies have a chief data scientist?’ A straw poll held during a recent data management conference revealed that the title of ‘data manager’ was perceived as conveying experience and knowledge (no surprises there considering the audience) but that data ‘engineer’ came out ahead, with ‘data scientist’ a close third. According to the Harvard Business Review, data scientist is the ‘sexiest job of the 21st century!’ What distinguishes data ‘science’ is the ability to go beyond the reporting function and provide ‘predictive insights and innovations’ from the data.

According to Kozman, a single oilfield in Qatar generates three times the data used in finding the Higgs boson (around 30 petabytes). With a little shoehorning we can recast the whole of seismic imaging (very big data there) into the new paradigm. Digital rocks likewise provide very big data volumes from very small samples. Pretty well all oilfield data now ticks one or more of the big data boxes of ‘variety, velocity and volume,’ especially the newly popular fiber optic distributed sensors which generate ‘tens of terabytes per day.’ Current data loading and preparation is often so slow that time and money are lost before actionable information can be extracted.

Compounding this problem is the fact that oil and gas data is unique in its scope and variety. Typical workflows blend data from a large number of sources. Kozman sees potential work for the data scientist in leveraging Apache Pig data pipelines, using the Hadoop/Hive combination to ‘tame’ sensor data and Apache Storm to process ‘billions of events.’

Matthew Harrison gave a wide-ranging presentation on big data at the British Geological Survey (BGS), which actively manages a growing 250 terabyte dataset along with several hundred kilometers of core data and 17 shelf kilometers of paper records, maps and reports. BGS is a contributor to the open data movement, notably via OpenGeoscience, where users can view and download geology data as web map services, access over a million borehole scans and search and download photos from the GeoScenic geological photo archive. BGS also offers more specialist data such as temperature and strain gauges for monitoring earth movements and geophones for earthquakes. BGS manages its data leveraging standards for geographic data discovery and is a big-time data modeler. The OpenGeoscience portal is underpinned by the BGS Rock Classification Scheme, a ‘practical, logical and robust system’ for classifying and naming geological materials as they appear at the scale of an exposure, hand specimen, or thin section. The BGS standards map shows around 20 different standards of varying scope, with the venerable Codata as a global scientific standard. Harrison wound up observing that geoscience data is truly ‘big’ and growing rapidly, especially at the complex end of the spectrum. Ways need to be found to capture the data and geological knowledge held in thousands of scanned images programmatically. Perhaps linked data approaches and ontologies are the way forward—although their creation can be problematic. He also sees potential in using social media and system logs to derive knowledge and business intelligence.
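As a pointer for readers who want to try the web map services route, here is a hedged Python sketch using the OWSLib library to talk to a WMS endpoint. The service URL, layer name and bounding box are placeholders to be replaced with values from the OpenGeoscience documentation.

```python
# Hedged sketch: fetching a geology map image from a WMS endpoint with OWSLib.
# The URL, layer name and bounding box are placeholders, not BGS's actual values.
from owslib.wms import WebMapService

WMS_URL = 'http://example.org/geology/wms'   # replace with the published OpenGeoscience WMS URL

wms = WebMapService(WMS_URL, version='1.3.0')
print(list(wms.contents))                    # list the layers the service advertises

img = wms.getmap(layers=['bedrock_geology'], # hypothetical layer name
                 styles=[''],
                 srs='EPSG:4326',
                 bbox=(-6.0, 49.5, 2.0, 56.0),
                 size=(800, 600),
                 format='image/png')
with open('geology.png', 'wb') as f:
    f.write(img.read())
```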

Another retrofit came from Jill Lewis (Troika) who argued plausibly that for seismic, big data is a done deal and that what is required is a continued focus on standards and data quality. On data acquisition, ‘Do it once and do it right,’ manage data pro-actively, use the right formats, populate mandatory fields and get ready for BDA.

Aapo Markkanen (ABI Research) described the meeting of big data and the internet of things (IoT) as the ‘great crossover.’ For ABI, the leading use cases include predictive maintenance, operational analysis and contextual awareness.

Dan Cornford (Aston University and Integrated Geochemical Interpretation) believes that quality, trust and quantified uncertainty are key to data-driven decisions. Big data sets can’t be used without standards compliance and copious metadata on quality and provenance, preferably with user feedback, expert review and citation information. All this is rolled up into the concept of ‘meta-quality,’ i.e. metadata of quality information. This can be used to validate judgements made on third party data. Unfortunately, it is very rarely provided, ‘even statisticians often don’t validate things properly!’ Cornford used the Global Earth Observation System of Systems (Geoss) as an example big data set. Less than 20% of Geoss records contain quality information and only 1.3% track provenance completely. One issue is that the ISO quality standards (ISO 19157) are too complex for real use. Also, end-user tools can’t in general use quality information. Cornford asked if big data was a challenge in the upstream. While some data, like seismic, is ‘big,’ ‘you don’t want to mine this, you want to manage it.’ Cornford does not believe that today’s generic BDA has much to offer. Its focus is largely on business intelligence and data mining. In the upstream, the focus is models, insight and judgement. Here, the key to decision making is access to the right data, quality information and the right tools. ‘Make sure users can then find and access their data easily from their applications whether it is ‘big’ or not, otherwise, don’t expect good decisions!’

Duncan Shaw (Nottingham University Business School) brought some BDA experience from the commercial world to the debate. Shaw has advised companies like Ikea and Marks & Spencer on BDA. In general, corporate data is under-utilized and companies are confronted with ‘confusing new toys and techniques.’ But the potential is definitely there, as ‘all sectors are just scratching the surface of what can be done.’

Summing up, we suggest that the Scottish courts’ verdict of ‘not proven’ could be the judgment of SMi’s inaugural big data in E&P conference. The killer use case or start-up has yet to manifest itself. But that will not stop folks (including Oil IT Journal) from tracking the trend and unpicking the buzzwords. In fact you will be reading a lot more about big data in E&P in next month’s Oil IT Journal when we report back from sunny Haugesund, Norway and the ECIM E&P data management conference.

More from SMi Conferences.


Folks, facts, orgs ...

AAPL, Audubon, BP, Chilworth, Circulation Solutions, CyrusOne, Headwave, Detechtion Technologies, EnerSys, Halliburton, Integrated Drilling Equipment, Intertek, Merrick Systems, Geoforce, Open Geospatial Consortium, PBF Energy, Petrofac, PetroSkills, Petrotechnics, PIDX, PODS, Shell, Siemens, US Seismic Systems, WellDog, Wood Group, ION Geophysical, geoLOGIC Systems.

Roger Soape has been elected president of the American Association of Professional Landmen.

Terry Mieni has joined engineer Audubon as VP Offshore Business. He hails from Universal Pegasus.

BP has appointed Spencer Dale to Group Chief Economist. Dale was previously with the Bank of England.

Lisa Hutto is now Senior Process Safety Specialist with Chilworth Technology. She hails from Phillips 66.

Zach Grichor is now VP business development and Mark Laurent VP operations at Circulation Solutions.

CyrusOne is offering a ‘virtual tour’ of its Houston West II Data Center a.k.a. its geophysical ‘center of excellence.’

Industry ‘stalwart’ Dan Piette has joined the Headwave board of directors as strategic advisor. He was previously with Terraspark Geosciences.

Jorge Ordonez has joined Detechtion Technologies as CFO. He was previously with Energy Solutions International.

John Craig is stepping down as president of EnerSys but remains as chairman and CEO. Dave Shaffer has been appointed to the new role of president and COO.

Jeff Miller has been promoted to president and appointed to the Halliburton board. Miller will now ‘complement’ chairman and CEO Dave Lesar’s leadership role.

Integrated Drilling Equipment has appointed Jim Terry as CEO replacing Stephen Cope who has resigned. The company has also named Marty Paulk as senior VP sales and marketing. Terry was with Particle Drilling Technologies, Paulk joins from Key Energy Services.

Abderrahim El Azzouzy is to head up Intertek’s new ‘state-of-the-art’ petroleum laboratory in Tangier, Morocco.

Frank Lusk has joined Merrick Systems as chairman of the board from IBM.

Geoforce has hired Richard Coffman as senior VP sales. Coffman was previously with AccessData.

The Open Geospatial Consortium has appointed Terry Idol as director of its interoperability program. Idol was previously with the US National Geospatial-Intelligence Agency.

PBF Energy/Logistics has hired Thomas O’Connor as senior VP, business development. He was previously with Morgan Stanley.

Petrofac chairman Norman Murray has stepped down for compassionate reasons. Rijnhard van Tets has taken his place and Thomas Thune Andersen has been named senior independent director.

Kevin Lacy has joined PetroSkills as executive VP, technical staff and disciplines. He hails from Talisman.

Petrotechnics has appointed Blake Herman, Sheila Berru, Keith Richardson, Courtney Brewer and Kimberly Caulfield to various business development roles.

Christian Garcia (Halliburton) and Nils Røynstrand (Statoil) have joined the PIDX International board.

Kathy Mayo is the new executive director of PODS, the Pipeline Open Data Standard association.

Harry Brekelmans is the new projects and technology director for Shell.

Lisa Davis is the new CEO of Siemens energy sector. She hails from Shell.

Mark Bashforth is now president and CEO of US Seismic Systems. He was previously with CGG.

Sandy Hunter is now a non-executive director of WellDog. He comes from Ambrose Resources.

Nina Schofield is now head of HSSE at Wood Group. She comes from Amec.

ION Geophysical has filed a lawsuit against OJSC MAGE and AARI in Russia alleging infringement of ION’s patent for marine seismic surveying in icy waters. ION filed the lawsuit after licensing efforts were unsuccessful.

Correction

In our report from the ESRI PUG last month we wrongly reported Tim Downing’s company affiliation. He is with geoLOGIC Systems. Our apologies to all.


Done deals

Accenture, Hytracc, Geokinetics, Geospatial Corp., Silver Lake Partners, Quorum Business Solutions, SNC-Lavalin, Kentz, TUV SUD, RCI Consultants, Veeder-Root, FuelQuest.

Accenture has acquired Hytracc Consulting, a provider of services around Tieto’s Energy Components hydrocarbon accounting solution. The 100-strong team will be integrated into Accenture’s upstream production management solutions unit.

Geokinetics has acquired CGG’s North American land seismic contract acquisition business (excluding its land multi-client and monitoring businesses) in exchange for a minority equity stake in Geokinetics.

Geospatial Corporation has acquired ShaleNavigator, a provider of shale oil and gas information on the Marcellus and Utica shale plays. ShaleNavigator founder Edward Camp is to join Geospatial as team lead of the new ShaleNavigator unit.

Silver Lake Partners and Silver Lake Kraftwerk are to acquire Quorum Business Solutions from funds managed by The Carlyle Group’s buyout team and Riverstone Holdings. Key Quorum management will also participate in the new ownership.

SNC-Lavalin Group has completed its acquisition of oil and gas engineering and construction specialist Kentz Corp. The acquisition creates a 20,000 strong group of project experts in the upstream, LNG, shale and SAGD sectors.

TÜV SÜD has acquired Houston-based RCI Consultants, expanding its services to the oil and gas vertical.

Veeder-Root has acquired fueling software as a service provider FuelQuest. The unit will be combined with Veeder-Root’s Insite360 Fuel business to provide an end-to-end solution ‘from rack to nozzle.’


Cyber security round-up

Norwegian oil hack. Bechtel’s cyber R&D. Recommended reading from McAfee. NIST workshop.

As reported by The Register, some 50 Norwegian oil and energy companies were hacked earlier this year, with Statoil the main target of the nation’s biggest ever attack. Norway’s Nasjonal Sikkerhetsmyndighet passed on the warnings following a tip-off.

Bechtel is to fund research into cybersecurity at the US Los Alamos and Lawrence Livermore National Laboratories. Bechtel’s Tom Gioconda, deputy director at Lawrence Livermore, said, ‘The initiative will recruit and cultivate cybersecurity experts who can strengthen networks by applying experience from national security environments and from industry.’

The April 2014 edition of McAfee Labs’ Threat Report contains an in-depth analysis of the ‘Heartbleed’ OpenSSL vulnerability that ‘affected every IT organization—knowingly or unknowingly’ and is ‘likely to cost hundreds of millions of dollars to repair.’ The free report also provides a straightforward explanation of phishing, the pernicious ‘spear phishing’ and how to avoid them. A must-read for managers and employees.

The US National Institute of Standards and Technology (NIST) is to hold a workshop on the Framework for Improving Critical Infrastructure Cybersecurity, Oct. 29 and 30, 2014 at the Florida Center for Cybersecurity, in Tampa. The workshop is to get feedback from initial experiences with the framework from critical infrastructure owners and operators.


Megaproject cost overruns

Ernst and Young investigation finds oil and gas projects overrun and overspend.

Ernst and Young (which now likes to be known as ‘EY’) has released an analysis of upstream megaprojects. The 16 page document, ‘Spotlight on oil and gas megaprojects,’ finds that as the ‘easy’ oil and gas is fast disappearing, megaprojects are the new normal. These include unconventionals, deepwater and the Arctic. EY’s survey of 365 megaprojects found that despite their impact on enterprise value and share price, many fail to deliver on time or within budget. Project delivery success is actually decreasing, especially in deepwater projects where 64% experience cost overruns and 74% are delivered behind schedule. On average, project costs are 59% higher than initial cost estimates.

EY questions whether such excesses are ‘sustainable.’ Other research suggests that ‘non-technical’ issues are responsible for most overruns. A 2013 Credit Suisse study found that 65% of project failures were due to ‘softer’ aspects such as people, organization and governance. EY recommends project management tools and best practices using real time data throughout a project.


An update from POSC/Caesar semantic guru

‘Prototype’ semantic endpoint is ready for industrial prime-time use.

POSC/Caesar’s (PCA) source of data for oil and gas construction and engineering is deployed as an ‘endpoint’ for queries using Sparql, the semantic web’s preferred query language. But how near is this ‘prototype’ to real industrial use? PCA’s semantic technology specialist Håvard Ottestad told us the following.

The service is intended for industrial use. We are calling it a prototype because we still feel we can make substantial improvements on this latest iteration, but the service has existed for 10 years. The previous iteration was a Java-based client-server solution without machine-to-machine support. The current edition is based on semantic web technology.

The word ‘prototype’ is however limited to the service. The content has been production-ready for many years. Some of it is based on ISO 15926 Part 4, the rest has been created to work together with the EPIM Reporting Hub solution used by all Norwegian operators. The endpoint is backed by our reference data library, which contains an improved version of ISO 15926-4. For this, the best entry point is the ‘class of class’ structure. I also like to point first-time users to the Pump class to let them browse the super and subclass hierarchy.
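For readers who want to try the endpoint, here is a hedged Python sketch using the SPARQLWrapper library to list subclasses of a ‘Pump’ class. The endpoint URL and the exact label used in the reference data library are assumptions; check PCA’s published endpoint details before use.

```python
# Hedged sketch: browsing the subclass hierarchy below a 'Pump' class via SPARQL.
# The endpoint URL and label are illustrative; consult PCA for the real values.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = 'http://example.org/pca/sparql'   # placeholder for PCA's published endpoint

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?cls ?label WHERE {
        ?pump rdfs:label "PUMP" .          # assumed label of the Pump class
        ?cls  rdfs:subClassOf ?pump .
        ?cls  rdfs:label ?label .
    } LIMIT 25
""")
sparql.setReturnFormat(JSON)

for row in sparql.query().convert()['results']['bindings']:
    print(row['label']['value'])
```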


'Geomancy’ for shale exploration

Integrated Informatics upgrades decision support/planning tool.

Calgary-based Integrated Informatics has released V2.4 of its superbly named ‘Geomancy’ decision engine. The decision support system is tuned to unconventional needs and is used for planning and siting of well pads, laterals, access roads and gathering systems.

V2.4 offers enhancements including automatic determination of connections between pipelines and delivery points, the number of wells per pad and their orientation. Geomancy factors in external constraints on surface rights and ‘constructability’ challenges. The tool provides an integrated 3D view of well pads, well laterals, pipelines and access roads.


Sales, deployments, partnerships ...

Asset Guardian, Foro Energy, Amalto, Athens Group, Audubon, Badger Explorer, Endeeper, Inductive Automation, Glass Technology Services, eCorpStim, Geocap, Earth Analytics, IFS, Deloitte, Infosys, Merrick, Orange Business Services, ReadSoft, Wex, OneSubsea, Helix, Rolta.

BP has selected Asset Guardian (AG) Solutions’ configurable process software management platform for its West of Shetland Clair Ridge project. The AG toolset will manage BP’s process control software through a global frame agreement. AG also reports use by GDF Suez on its North Sea Cygnus gas field.

Petrobras has signed a technology cooperation agreement with Foro Energy for high power laser drilling R&D. The goal is to achieve a ‘step change’ in subsalt drilling performance.

Wellsite accommodation specialist Stallion Oilfield Holdings has signed-up to Amalto’s automated electronic invoice solution, the e-Business Cloud.

Athens Group has announced the Well Control Package (WCP), a technology assurance service for oil and gas drilling and production systems. The WCP helps assure that well control and subsea equipment on offshore drilling assets is ‘fit for purpose.’ The service leverages Athens Group’s ‘proven practices’ and API-quality assured management system.

Audubon has launched a pipeline solutions business unit to ‘better serve the entire oil and gas market.’

Badger Explorer has signed a cooperation agreement with China National Petroleum Corporation Drilling Research Institute which will join as a sponsoring partner of the Badger ‘autonomous’ drilling tool.

Endeeper has signed a new contract with Petrobras for development of new modules for its geosciences packages, Petroledge, Hardledge and RockViewer. The deal includes a training element.

Enerchem has deployed Inductive Automation’s ‘cross platform’ OPC-UA-based Scada solution at its Slave Lake refinery in Alberta, Canada.

Glass Technology Services has joined eCorpStim’s ‘clean’ shale stimulation R&D consortium. Research will focus on the use of dual component fracking using heptafluoropropane and a silica proppant.

Norwegian Geocap is raising its US profile in a joint venture with Earth Analytics Inc. The company reports that Cavea, the Center for advanced visualization and experiential analysis at Metropolitan State University of Denver is ‘looking to use’ Geocap for improved decision making.

IFS has signed a partnership agreement that makes Deloitte its strategic partner for the deployment of IFS Applications in the offshore, oil and gas industries in the Benelux.

BP has signed with Infosys for the provision of IT services including application support and development. The deal covers corporate, upstream and downstream segments, energy trading and marketing. Training will be delivered from the Infosys Technology University.

Merrick Systems has been busy of late. The Houston based company is expanding in Canada, working with Renslip Consulting and Lumina Consulting to offer implementation services around Merrick Production Manager. Merrick has also signed with Petropartner to provide production operations support in Vietnam. Yet another deal was struck with Singapore-based CSE-EIS. On the sales front, Merrick reports that Anterra Energy is to deploy Production Manager at its Canadian operations.

BW Offshore has awarded Orange Business Services a $12 million contract for satellite communications between its Oslo headquarters and some 14 FPSOs. Orange has dubbed the service ‘office at sea.’ No double entendre there!

An unnamed, but we are assured very large US independent has selected ReadSoft’s SAP-certified invoice automation application to streamline accounts payable. The deal is worth $480,000.

Wex has signed an agreement with Shell to process its new prepaid fuel card transactions within its commercial fleet business in Europe and Asia.

Cameron/Schlumberger joint venture OneSubsea and Helix Energy Solutions have signed a letter of intent to form an alliance to develop technologies and deliver services to ‘optimize’ the cost and efficiency of subsea well intervention systems.

Rolta is to implement a ‘comprehensive’ engineering information system at the Sadara Jubail chemical complex, Saudi Arabia, in what is described as a ‘multi-million’ dollar extension to an earlier contract for engineering and information technology.


Standards stuff

PIDX RNIF 2.1. Saskatchewan adopts PPDM well ID. API RP17W. New kg from NIST. DBPedia 2014.

The oil and gas e-commerce standards body PIDX has delivered its RosettaNet Implementation Framework 2.0. PIDX also reports progress on XML Price Sheet, a schema for exchanging price information between buyers and suppliers.

Saskatchewan is the first jurisdiction to implement the new PPDM Canadian Well Identification System (CWIS). Released in 2013, CWIS provides identifiers for a well, a well bore and a well reporting information stream. PPDM also reports that BP has contributed ‘hundreds’ of rules to the Professional Petroleum Data Management (PPDM) association’s online repository of business rules and is encouraging others to contribute to the knowledge base.

The American Petroleum Institute’s new Recommended Practice for Subsea Capping Stacks, RP 17W, includes design, manufacture and usage guidelines for equipment used in the event of a spill. RP 17W resulted from the post-Macondo joint industry task force recommendations for subsea well containment.

After a couple of centuries of relying on a physical standard housed in Sèvres, France, the kilogram’s definition is to change. The US NIST is to link its definition to other constants using a ‘fourth-generation watt balance’ with an ‘absolute uncertainty’ of +/-20 μg.

DBpedia, a structured mirror of the information resident in Wikipedia, has a new 2014 edition and now describes some 4.6 million ‘things.’ The dataset now has links to Freebase, Wikidata, Geonames and GADM.


Aveva announces Information Standards Manager

Aveva’s Nils Petter Ottesen provides the background to Aveva’s new ISM tool.

Aveva Information Standards Manager (ISM) is a software tool for EPC contractors and owner operators that improves project engineering information flows by rationalizing existing code and imposing consistent and compliant standards to improve information quality and reduce project and operational risk. The tool allows owner operators to accurately communicate information requirements to contractors. Contractors working with multiple clients can more easily manage projects that require different standards.

Aveva’s Nils Petter Ottesen explained, ‘There are no generally accepted information standards in oil and gas, if there were, you wouldn’t need ISM. Out of the box, ISM does not contain any standards. These are up to the client. But we are now working to provide ‘best practice’ information standards that reflect typical industry and asset requirements. To my knowledge no oil and gas EPCs and operators are anywhere near 100% compliant with industry standards. But most require some compliance. Maybe not with ‘pure’ industry standards, but they might contain aspects of these such as the ISO 15926 Part 4 dictionary, to ease data exchange between applications and 3rd parties. But although Part 4 brings some value, the reality is that it does not cover everything that is needed. Maybe JORD will be an improvement in this area, but that remains to be seen. ISO 14224 for maintenance and reliability data and the Norsok engineering numbering system are also potential candidates. Note that our best practice standards won’t depend on ISM, or vice versa. But together they will add value, allowing clients to tune them to their own business requirements.’


Dxstro’s data replication helps with PTTEP’s data compliance

DXRE replication technology keeps remote asset management systems synchronized.

PTTEP Australasia has leveraged data replication technology from Dxstro to sync data across its South Timor Sea operations, including a floating production, storage and offloading (FPSO) vessel that supports its Montara, Swift and Skua oilfields. To meet regulatory requirements PTTEP needed a computerized maintenance management system (MMS) for the FPSO and selected IBM Maximo as the core of its asset management/MMS system. Then came the challenge of keeping maintenance and procurement in sync between its onshore offices in Perth and Darwin and the FPSO, across a low-bandwidth, occasionally intermittent satellite link.

The remote Maximo instances are kept in step with Dxstro’s flagship DataXtend RE2, an asynchronous data replication solution that compresses traffic to minimize bandwidth requirements. Even when connectivity is lost, the FPSO’s instance can still provide maintenance and procurement support, ensuring that PTTEP complies with its regulatory obligations.
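Dxstro does not publish its internals, but the store-and-forward pattern at work here is easy to illustrate. The sketch below is a hypothetical illustration, not DataXtend RE code: change records are compressed and queued locally, and the queue drains whenever the satellite link happens to be up.

```python
# Hypothetical store-and-forward replication over an intermittent link; not
# Dxstro/DataXtend RE code. Changes are compressed to save bandwidth, queued
# locally during outages and shipped when the link returns.
import json
import zlib
from collections import deque

class ReplicationQueue:
    def __init__(self, send, link_up):
        self.pending = deque()   # a real system would persist this queue
        self.send = send         # ships one compressed change to the remote instance
        self.link_up = link_up   # returns True when the satellite link is usable

    def record_change(self, change: dict) -> None:
        # Compress each change record to minimize satellite bandwidth
        self.pending.append(zlib.compress(json.dumps(change).encode("utf-8")))

    def drain(self) -> None:
        # Ship queued changes only while the link is up; work continues locally otherwise
        while self.pending and self.link_up():
            self.send(self.pending.popleft())

# The FPSO instance keeps accepting work orders even when offline
sent = []
queue = ReplicationQueue(send=sent.append, link_up=lambda: True)
queue.record_change({"workorder": "WO-1234", "status": "closed"})
queue.drain()
print(len(sent), "change(s) replicated")
```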


Dell One ID manager for Williams Energy

Identity management system streamlines employee on- and off-boarding!

Williams Energy has deployed an identity management system from Dell Software to streamline identity and access management, replacing a legacy access request application that struggled to handle requests from 6,500 employees and contractors. The old system was ‘cumbersome and time-consuming’ and ran on hardware that went down ‘almost daily.’ Even with replacement hardware, the application was difficult to configure.

Williams decided against a custom identity management development from IBM and opted instead for Dell’s One Identity Manager. Dell OIM streamlines the provisioning and management of user identities, privileges and security enterprise-wide, removing dependence on IT for user management and access control and placing it in the hands of the business. The system offers automatic provisioning for new users, as well as expedited ‘off-boarding’ of terminated users, mitigating the business risk associated with unauthorized access to corporate data. The company’s IT staff previously administered 3,000 to 4,000 provisioning tasks per month by hand; Identity Manager has cut that workload by 50%.
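The joiner/leaver logic such tools automate is simple to state, even if it is tedious at 3,000 to 4,000 tasks a month. A hypothetical, rule-driven sketch (not Dell’s API; the roles and entitlement names are invented) in which entitlements derive from business attributes rather than IT tickets:

```python
# Hypothetical sketch of rule-driven provisioning and de-provisioning; not
# the Dell One Identity Manager API. Entitlements derive from a user's role,
# so the business rather than IT drives access.
ROLE_ENTITLEMENTS = {
    "pipeline-operations": {"scada-readonly", "email", "intranet"},
    "accounting": {"erp-finance", "email", "intranet"},
}

def provision(user: dict, directory: dict) -> None:
    # Joiner: grant exactly the entitlements implied by the role
    directory[user["id"]] = set(ROLE_ENTITLEMENTS.get(user["role"], {"email"}))

def deprovision(user_id: str, directory: dict) -> None:
    # Leaver: expedited off-boarding removes all access in one step
    directory.pop(user_id, None)

directory = {}
provision({"id": "jdoe", "role": "accounting"}, directory)
print(directory)   # jdoe holds erp-finance, email and intranet
deprovision("jdoe", directory)
print(directory)   # {}
```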


Shell backs Fiatech materials management push

Fiatech announces Integrated materials management project for owner operators and suppliers.

Speaking at a recent Fiatech webinar, Martin Swaine (Shell) introduced Fiatech’s new Integrated Materials Management project. Trillions of dollars of investment are planned for energy development this decade, but the difficulty of managing materials, equipment and skilled resources across complex projects remains a primary contributor to project cost overruns and delays. New methods are needed to improve the visibility, predictability and control of construction and operations.

Fiatech’s Integrated Materials Management (IMM) sets out to facilitate industry consensus, establish objectives and requirements and cultivate deployable solutions. One key objective is to make it easier for owners to plan, coordinate and track material acquisitions by multiple contractors, improving the predictability of project success. Some may see this as too big and complex an issue, but Swaine suggested looking at manufacturing, automotive and aviation, where inefficiencies have already been substantially reduced.


Forecasts and fantasies

ABI Research on energy big data spending. Markets and Markets sees growth in the digital oilfield. Transparency Market Research forecasts a $35 billion wireline market for 2020.

ABI Research forecasts that in 2014, the energy industry will spend $7 billion on big data, of which over 60% is for upstream operations. This will rise to $22 billion in 2019 ($14 billion in upstream).

A new report from Markets and Markets, ‘Digital oil field market by services,’ has it that the digital oil field market will be worth $38.49 billion by 2024 (dontcha love the two decimals!), reflecting annual growth of 4.6% from $24.60 billion in 2014. The report covers automation, DCS, SCADA, PLC, smart wells, IT services and more. Another M&M report, on the oil and gas analytics software market, predicts 35.5% annual growth from $4.29 billion in 2014 to $19.65 billion in 2019, with the Middle East and Africa set to be the biggest market.

And finally, Transparency Market Research values the global wireline services market at $19.10 billion in 2014 and predicts 10.36% growth to $34.96 billion by 2020.
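For what it is worth, the quoted percentages read like compound annual growth rates; on that assumption (ours, not necessarily the reports’), the digital oilfield numbers roughly reconcile:

```python
# Sanity check, assuming the quoted 4.6% is a compound annual growth rate:
# projecting the 2014 base forward ten years roughly reproduces the 2024 headline.
base_2014, cagr, years = 24.60, 0.046, 10
projected_2024 = base_2014 * (1 + cagr) ** years
print(f"${projected_2024:.2f} billion")  # ~$38.6 billion against the quoted $38.49 billion
```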

Our new, occasional Forecasts and fantasies feature was inspired by the Financial Times’ Tim Harford, who observed, ‘Our predictions are about the future in only the most superficial way. They are really advertisements...’


Google Earth Outreach uses Street View cars to map leaks

Environmental Defense Fund maps natural gas leaks in Boston, Indianapolis and Staten Island.

Google and the Environmental Defense Fund have published online interactive maps showing natural gas leaks beneath the streets of Boston, Indianapolis and New York City’s Staten Island. The maps are the first phase of a pilot project that uses specially equipped Google Street View mapping cars, under a partnership between the Fund and Google Earth Outreach. The partnership is exploring the potential of new sensing and analytical technologies to measure environmental indicators and make the information accessible to everybody. The Fund observes that although methane leaks rarely pose an immediate safety threat, natural gas is mostly methane, a powerful greenhouse gas with some 120 times the warming effect of carbon dioxide.

Google’s Karin Tuxen-Bettman said, ‘This pilot project is meant to explore and understand the potential for the Fund and others to map and visualize important environmental information in ways that help people understand both problems and solutions.’ The Fund has worked with several leading utilities to validate the findings, which offer a new way for operators and regulators to focus and accelerate upgrades. The project’s algorithms will be published in a peer-reviewed scientific paper later this year, and made available on an open-source basis.


Seven Lakes battles ‘archaic’ oil and gas ERP software

Devon Energy endorses Seven Lakes Technologies’ professional assistance.

Seven Lakes Technologies of Westlake Village, CA, has just published a white paper vaunting the merits of data as the ‘new game changer’ in oil and gas, with an emphasis on the onshore upstream. Fracking in North America has opened wide the data pipeline, but today’s enterprise resource planning (ERP) software is ‘archaic,’ duplicating information across a distributed architecture.

Often data is captured by hand and may be re-keyed into multiple spreadsheets before it gets to a system of record, if ever! Lack of integration and old-style graphical user interfaces make for sub-optimal analytics.

Seven Lakes’ fix revolves around a master data repository development service that includes an analysis of client data to organize it into an efficient database structure. Next, Seven Lakes’ iPad-based field data gathering app is deployed, letting pumpers capture data directly into the network. With QC’d data accessible, clients can deploy a variety of data visualization and predictive analytics applications for operations and forecasting.

One happy client is Devon Energy, whose data analyst Ken Dalton has endorsed Seven Lakes for its ‘expert professional assistance.’


Canadian academics launch WellWiki

New data resource scrapes public domain data from shale exploration to ‘foster transparency.'

A new data resource, WellWiki.org, has just been announced; by now it should hold data, news and history on oil and gas wells in the United States and Canada. The site provides data on some 420,000 wells in Pennsylvania and West Virginia, with an emphasis on Marcellus shale wells, along with around 600,000 wells across the wider Appalachian Basin (Pennsylvania, West Virginia, Ohio and New York).

Eventually, WellWiki is expected to cover the four million wells drilled in North America since Colonel Drake’s first effort in 1859. WellWiki was designed by Joel Gehman (University of Alberta) and Dror Etzion (McGill) to ‘foster transparency and public dialog’ by providing user-friendly, searchable data.

The site scrapes public data from state oil and gas agencies and published sources. This is compiled in a GIS system and linked to data on operators, waste facilities, well pads, impoundments, violations and more. Diego Mastroianni (McGill) built the infrastructure on a LAMP stack of Ubuntu, Apache, MySQL and PHP; Michael Hall developed the MediaWiki front end. The (rather grand) ultimate aim is for WellWiki to become the ‘Wikipedia of oil and gas wells worldwide.’
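The scrape-and-load pattern behind such a site is easy to picture. A minimal sketch with a placeholder URL and column names (hypothetical, not WellWiki’s own pipeline), assuming an agency publishes well headers as CSV:

```python
# Hypothetical scrape-and-load sketch: pull a public CSV of well headers and
# stage it for loading into a database behind a wiki front end. The URL and
# column names are placeholders, not a real agency feed.
import csv
import io
import urllib.request

SOURCE = "https://example.org/state-agency/wells.csv"  # placeholder URL

def fetch_wells(url: str = SOURCE) -> list:
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8")
    # Column names are illustrative; every agency publishes its own layout
    return [
        {"api_number": row["API"], "operator": row["Operator"], "county": row["County"]}
        for row in csv.DictReader(io.StringIO(text))
    ]

if __name__ == "__main__":
    for well in fetch_wells():
        print(well["api_number"], well["operator"])
```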

