Oil IT Journal: Volume 20 Number 8


Shell’s Project Vantage

Intergraph SmartPlant cloud key to Shell’s ambitious lifecycle engineering data management solution. Vantage spans design, construction, handover and operations. ISO 15926 downplayed.

First unveiled at the Hexagon user conference earlier this year, Shell’s Project Vantage appears to be a holy grail for engineering and construction data management. Vantage covers front end engineering design, through construction, handover and on into operations and maintenance. Shell’s Global engineering and operations unit has standardized on Intergraph as its primary engineering design tool across its worldwide operations. Shell is also to go ‘all in’ on Intergraph’s SmartPlant cloud platform.

Traditional document-based handover leads to ‘broken’ project data flows. Project Vantage’s data-centric approach fixes this with ‘safer, faster and better’ capital project execution. The platform provides Shell, partners and contractors with a ‘single and continuously updated version of data’ throughout the project and asset lifecycle.

Project Vantage evolved out of the ‘Data move’ and ‘integrated engineering environment’ projects that we reported on last year (OITJ June 2014). The cloud-based solution provides engineering tools for EPCs, plant owners and vendors and enables Shell’s replication philosophy of ‘design once and build many.’ Vantage provides collaborative, standardized and integrated information management, leveraging the ‘leading products’ in the marketplace in a single data platform.

Vantage includes ‘tag and trace’ functions to track people and equipment. ‘4D/5D’ technology provides visualization of planned and actual construction sequences, including materials availability. This allows for optimization of construction planning, avoiding incompatible concurrent activities while safely maximizing work fronts. An online portal provides registered users with access to Shell design and engineering practices. A vendor catalog exposes a single, quality-controlled source of equipment model information from approved vendors, ready for use in detailed engineering.

Shell has been a longtime advocate of streamlining engineering processes and Shell personnel were instrumental in the development of the ISO 15926 suite of standards for lifecycle engineering data management. Vantage represents a shift from a pure-play standards-based solution, which would entail all stakeholders adhering to the protocol. Instead, standardization is taking place at the product level, a more pragmatic approach.

Despite recognition from vendors and a couple of flagship projects, ISO 15926 has struggled to gain widespread acceptance. A recent trial of the protocol by UK’s Crossrail concluded that ‘despite being available for some time, business solutions based upon the ISO 15926 standard are still at an apparently low level of maturity.’


Infosys bags Noah

$70 million transaction sees Houston-headquartered upstream information management boutique bought by Indian management consulting behemoth.

Bangalore, India-headquartered Infosys Consulting is to acquire Houston-based Noah Consulting in an all-cash transaction valued at $70 million. Infosys claims ‘one of the largest oil and gas practices in the world’ with a reported 3,500 consultants. Noah was founded in 2008 by John Ruddy, Shannon Tassin and Stuart Nelson, a diaspora of consultants from HP’s Knightsbridge Solutions business. Today, Noah’s 120-plus team of consultants operate out of offices in Houston and Calgary, providing information management consulting services to the oil and gas industry.

Commenting on the addition of Noah’s upstream data management expertise to Infosys’s service offering, Infosys EVP Rajesh Murthy said, ‘The oil and gas industry is facing unprecedented challenges that require a well-defined and executed information and data strategy. This acquisition positions us to offer end-to-end data management services to oil and gas companies globally.’ Current plans are for Noah to operate as a wholly-owned subsidiary of Infosys Consulting through to the end of next year. More from Noah and Infosys.


Lies, damn lies and … big data!

Editor Neil McNaughton argues that ‘big data’ is old hat! In the fifty or so years since numerical classification methods were trialed by geologists, some learnings appear to have been forgotten. A timely issue of Nature addresses the dangers of ‘point and click’ statistical software.

Speaking at ECIM (more on page 6), Teradata’s Niall O’Doherty opined that one of the myths of big data is that it is something new. His words came back to me as I sat in on the analytics/big data track at the Society of Petroleum Engineers ATCE in Houston a couple of weeks later (more next month). For yes, big data and its accompanying statistical toolset is almost as old as computing itself, as any user of the venerable SPSS* package will tell you.

My own first exposure to what was then ‘numerical taxonomy’ dates back almost 50 years. At the time I was studying geology, not computer science. Moreover I was not a very good student. This was the late 1960s and there was other stuff to do.

Having said that, this bad student, half-heartedly studying a different subject a long time ago, did grasp a couple of things that seem to have subsequently been forgotten. First, numerical classification does not always work very well. Fiddling with input parameters makes it possible to classify everything as either ‘all the same’ or ‘all different.’ This is due in part to the fact that it is hard to find a sufficient number of independent, non-correlated input measurements (dimensions). The other problem is that the technique often failed to produce statistically significant results because there were not enough numbers to crunch. Or, in modern parlance, the input matrices were too ‘sparse.’

To be charitable, data today is ‘bigger’ so hopefully, sparseness should be less of an issue. But the risk of correlated inputs appears to have escaped some of today’s big data protagonists. In the dash to apply analytics to shale prospectivity, one SPE paper proposes using no less than 39 attributes (dimensions) derived from a post stack seismic dataset. Now some of these might be just a little correlated, no?

A query of a philosophical nature which I like to ask seismologists is, given a 3D post stack volume of 4 pretty independent dimensions (x, y, z/t, amplitude), what is the maximum number of truly independent ‘attributes’ that can be computed? I am sure that it is less than 39. I suspect that you cannot add any truly non-correlated inputs to the mix at all but such reasoning is above my pay grade.
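To make the correlation point concrete, here is a minimal sketch in Python, on a synthetic trace rather than real seismic, of how ‘different’ attributes derived from one signal largely restate the same information (the attribute choices are illustrative, not those of the SPE paper):

```python
# Minimal, illustrative sketch: three common trace 'attributes' computed
# from one synthetic trace turn out to be strongly correlated.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 1001)
trace = np.exp(-t) * np.sin(2 * np.pi * 30 * t) + 0.05 * rng.standard_normal(t.size)

envelope = np.abs(hilbert(trace))                            # reflection strength
energy = trace ** 2                                          # instantaneous energy
smooth = np.convolve(energy, np.ones(25) / 25, mode="same")  # averaged energy

print(np.round(np.corrcoef(np.vstack([envelope, energy, smooth])), 2))
# Large off-diagonal terms: the three attributes are far from independent.
```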

I am not sure that I am a better student today than I was then, but over the years I have developed a sneaky way of learning. I avoid stuff that seems too hard on the (probably unjustified) assumption that the author doesn’t really know what he or she is talking about. And I home in on what seems like insightful comment and summary. Thus I learned on page 110 of David Hand’s excellent ‘Very short introduction to statistics’ something of how ‘statistical computing’ really works. An insight, by the way, that no end of presentations on the business benefits of ‘big data’ have failed to provide. Hand explains how data is split into a training data set from which some statistical relationship is derived. This is then tested on the remainder of the data that was not used for training.

Of course you have heard this before but, as Hand points out, it is only a part of the story. The real power of the approach is that the process is repeated multiple times inside the data set. Randomly selected subsets are used for computation and compared with the remaining data to provide ‘an overall measure of likely future performance.’

Curiously, oil and gas practitioners prefer to do this manually, using half the data for stats and testing, in a suck-it-and-see fashion, on the other half. I suspect that this falls short of statistical best practice even if it gives a warm feeling when it comes up trumps. It would be better to plug all the data in at the start, to have as little sparseness as possible, and let the machine tell you how good the forecast is with some hands-off measure.
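As a minimal sketch of the difference, assuming a scikit-learn environment and toy data (nothing here comes from an oil and gas dataset), a single 50/50 split versus Hand’s repeated random subsampling looks like this:

```python
# Illustrative only: one 50/50 split gives a single, luck-dependent score;
# many random splits give 'an overall measure of likely future performance.'
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import ShuffleSplit, cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))                                 # three inputs
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.5, size=200)

one_split = ShuffleSplit(n_splits=1, test_size=0.5, random_state=0)
print(cross_val_score(LinearRegression(), X, y, cv=one_split))

many_splits = ShuffleSplit(n_splits=20, test_size=0.5, random_state=0)
scores = cross_val_score(LinearRegression(), X, y, cv=many_splits)
print(scores.mean(), scores.std())                            # hands-off measure
```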

Speaking of warm feelings, statistics and experiments, there is a great section in a recent edition of Nature which deals exactly with such questions and what to do about them. Nature’s feature, titled ‘Fooling ourselves,’ by Regina Nuzzo, discusses how cognitive bias, our desire to find results that confirm our preconceptions, plagues scientific research and publishing. This usually comes to light when other teams of researchers have access to the same data set and fail to reproduce the findings. Nature reports on a study by Stanford’s ‘Meta-Research’ Institute which found that only around one-third of 100 psychological studies proved reproducible. Elsewhere, a measly 6 out of 53 ‘landmark’ studies in cancer research proved good. One problem is the ‘widespread use of point and click data analysis software that has made it easy for researchers to sift through massive data sets without understanding the methods.’

Reading the article made me think back to my seismic interpreting days. Seeking confirmation for a preconceived geological paradigm was exactly what we did all day long! For a refreshing insight into such thinking, check out our article on page 4 for an example of cognitive bias in interpretation from GeoTeric.

Nature suggests that one way of avoiding bias is for publishers to publish packages of both the analysis and the original data such that reproducibility can be tested. Seemingly, in particle physics, ‘blind’ data analysis is de rigueur, with researchers doing their stats on jumbled data that is only unscrambled after the calculations.

A special issue of the Journal of Librarianship and Scholarly Communication addresses these very issues in an attempt to ‘map the landscape of research data.’ A new role is proposed for activist librarians who, ‘as experienced knowledge workers [are] vital players in the research data management enterprise.’

* Statistical package for the social sciences – first released in 1968. Now owned by IBM.

@neilmcn

ECIM keynote and interview - Patrick Reidy, CSC

Computer Sciences Corp. chief information security officer warns ECIM conference, ‘Cyber risk is a chronic disease that needs to be managed.’ In a follow-up interview with Oil IT Journal he discusses the risks of the cloud, of ‘free’ search, cyber standards and the risk from scada control systems.

In his keynote address to the 2015 ECIM data conference in Haugesund last month, Patrick Reidy, VP global cybersecurity with CSC, traced the evolution of cybersec thinking in oil and gas. Companies can no longer rely on just deploying a security patch and getting on with life: ‘the threats are way better than our defenses.’

Today’s CIOs are ‘disruptive innovators’ deploying mobile, ‘third platform’ apps to ‘millions of users and billions of things.’ ‘Smart’ pipelines may offer great leak detection but their scada systems were not designed for security. Industrial control systems in Iran and Germany have been breached, causing physical damage. Drones too now represent a ‘huge new attack surface.’ The techniques once only available to government agencies (Reidy used to work for the FBI) are ‘moving down the criminal food chain.’ ‘Black hat’ sites such as Antidetect provide low cost identity masking and IP spoofing.

The average time from a threat’s arrival to its detection is a whopping 229 days, the longest 2,000 days. Reidy stated that the oil and gas sector is ‘47% penetrated and getting worse.’ Yet energy spends a measly 2% of its IT budget on cybersec (finance spends 17%). Energy cybersec maturity is low and falling behind general industry.

For Reidy, the solution is to ‘integrate intelligence’ into your cybersec stance and to stop treating it as an IT problem. This involves asking ‘who are your enemies? What do they do and how?’ It is no longer possible to ‘stop the bad guys’. Cyber risk should be seen as a chronic disease that needs to be managed with a focus on assets and potential threats. In this context, useful resources include the Cyber intelligence tradecraft project at Carnegie Mellon and the Ponemon Institute’s incident response service. The days of a ‘moat and castle’ defense are over, with some 5 billion devices connected to the internet of things.

Interview

What risks does the cloud pose?

One risk is an individual who’s just been fired uploading files to the cloud. This may be infrequent but can be extremely damaging.

What about information leaking out of a company, say by employees searching for sensitive information with Google?

If a service is free then you are the product! Search term analytics could be used to see where a company’s next hot prospects are located.

We have a hard time tracking the many US government initiatives in cybersec...

In the US, NIST and the Department of Energy are having an existential crisis. Folks are wondering who regulates what. To my way of thinking, it is the government’s job to regulate with high level standards, although some are concerned by this approach. Cybersec is expensive, so why not go elsewhere.

We have reported on process control cybersec as it has swung from ‘deperimeterization’ back to ‘reperimeterization.’ Where are we now?

Everything is connected. Pipelines are connected to control systems which are connected to the internet so you can send people to the right place. But these systems are not ready for the challenges and threats. Some of the things people do are dangerous.

What do you think of data diodes?

They are OK for read-only applications. But the real value is in communications with the control system so that, for instance, you can shut down a gas leak. Usability will always win over security even though I’m not sure this is a good thing! Companies want automation and machine to machine interaction, taking people out of the loop, lowering costs and raising efficiency.

This is something like a counsel of despair. There seem to be more problems than solutions.

Actually no. In the internet of things there is a finite number of things that, say, a pipeline or plant should do. So it is actually easier to protect than some consumer environments. A pipeline will never visit Yahoo or browse the internet. So you can check the outgoing traffic or, for a control system, see if the commands are coming from Russia!

But these can be spoofed à la Antidetect above…

There are ways of spoofing IPv4 packets. But at a higher level, attackers are unlikely to have enough context to spoof convincingly. You can also add protection with, e.g., biometric technology as used in healthcare. In the end though, given enough resources, anything can be spoofed. More from CSC.


Secure NOK’s Guidebook to drilling cybersecurity standards

A ‘myriad’ of ICS/scada standards cover drilling’s unique cybersecurity problem set.

One problem that confronts a would-be cybersecurity practitioner is the plethora of standards and initiatives that have been proposed by various stakeholders. Secure NOK, a Norwegian cybersecurity consultancy, has done a great job of bringing all these together in context in its ‘Guidebook to current drilling control systems cybersecurity,’ a 67 page free download from Secure NOK.

The Guidebook covers relevant standards for industrial control systems and scada systems, from top level generic ISO 27001/27002 standards through US NIST SP 800 standards, Norway’s Norsok/OLF derivatives and ISA/IEC 62443 standards for industrial automation control systems.

The ‘myriad’ standards can seem hard to understand and implement and are too often an exercise in ‘compliance’ rather than a contribution to cybersecurity. The Guidebook addresses technical cybersecurity, industry best practices and legal and compliance issues.

Drilling systems present a particular attack surface to hackers and the Guidebook does a good job of mapping the standards maze. To navigate your way through it, you may find you need help from Secure NOK’s consultants.


LMKR updates Geographix, rolls out new Gverse suite

New digital oilfield toolset embeds quantitative interpretation by Lumina Geophysical.

LMKR has launched ‘Gverse,’ a suite of geoscience and engineering applications that target workflow optimization and productivity. Gverse is interoperable with third party geoscience software and is claimed to speed interpretation. LMKR has partnered with quantitative interpretation boutique Lumina to improve geoscience data interpretation and data integration.

LMKR CEO Atif Rais Khan told Oil IT Journal, ‘Gverse is a completely new modern suite of tools for the digital oilfield. They are tightly integrated with third party tools and run on Windows 10.’

Gverse enables rapid database to database exchange, visualization and streamlined workflows that ‘maximize data value while reducing costs.’

LMKR also recently released the 2015 edition of the Geographix interpretation suite. Halliburton has extended the distribution rights it granted to LMKR in 2010 for another two years, until mid-2017, at which point the agreement will terminate. LMKR will retain the rights to service current licensees at that time.


ConocoPhillips’ geoscience software, instrumented wells

Major reveals software choices. Legacy Ventures uses fiber to 'unravel’ shale mysteries.

The Q3 2015 issue of ConocoPhillips Spirit magazine includes an article describing the company’s geoscience and engineering software revamp. This began in 2010 with a global upgrade to Landmark’s R5000 interpretation software and database. The program ran for two years, involving IT and petro-technical professionals in ten countries.

The next step was the deployment of Ikon Science’s RokDoc rock physics and pore pressure tools for deepwater, Perigon’s iPoint for digital core analysis and management (especially for unconventionals) and Schlumberger’s Techlog.

In 2013, Landmark’s DecisionSpace was deployed along with an infrastructure refresh to boost power and bandwidth. A global licensing deal for Schlumberger’s Petrel and Eclipse reservoir simulators was signed in 2014 and OpenIT’s technology was implemented to reduce licensing costs.

Another article in the same issue of Spirit describes how Conoco, with help from consultant Bruce Smith of Legacy Energy Ventures, is ‘unraveling’ unconventional reservoir mysteries with various instrumented well projects. Conoco’s Technology & Projects unit has been using fiber optic, real time monitoring of fracture performance to optimize completion design and well spacing.


Meera launches digital energy cloud

Ex-Fuse technology revamped, positioned as cloud-based data solution.

Since it acquired Fuse IM early this year, Meera has rebranded and consolidated Fuse’s technology into the ‘Digital Energy Cloud’ (DEC). Central to DEC is Centrum, a PPDM-based ‘meta model’ and database.

Centrum connects to third party data stores using an integrated data virtualization engine. This allows structured, unstructured and spatial data to be combined, leveraging custom business logic, scripts and workflows. A built-in JBoss-based business process management engine automates and monitors complex workflows. Centrum can be hosted in the cloud (Fuse was an early adopter and tester of Amazon’s web services) or run on premises or by a preferred service provider. An enterprise edition is optimized for high availability, clustered installations.

An OpenLayers client interface provides map-based access leveraging the OGC web feature service, federating Bing/Google and OpenStreetMap. Various GIS technologies can be deployed, including OpenGeo Suite, GeoServer and PostGIS. As Meera CTO Jamie Cruise told Oil IT Journal, ‘unlike some, GIS standards are actually standard!’ as he demoed visualization of Petrel ZGY time slices on a Mac. More from Meera.


GeoTeric on interpretation pitfalls

David Roberts takes issue with seismic envelope’s use in stratigraphic determination.

GeoTeric guest blogger David Roberts (3-DMR) questions the pertinence of the seismic envelope in understanding the subsurface. GeoTeric’s image processing technology is used to enhance geological features that may be hidden in conventionally displayed seismic data. An earlier post showed how the ‘cognitive interpretation’ technology (Oil ITJ Vol 20 N° 6) applied to a New Zealand 3D dataset highlighted some progradational features in the data that could be indicative of reservoir-grade sandstone.

Roberts was doubtful, ‘The dramatic change in reflection geometry between the migrated data and the envelope attribute alarmed me so I had a good look at the data with GeoTeric and Landmark’s GeoProbe and DecisionSpace. The new data link makes it easy to work with all three applications.’


Roberts’ in-depth analysis suggests that the steep dips in the envelope are probably artefacts due to steeply dipping seismic noise. He encourages GeoTeric to check out his findings and make interpreters aware of such interpretation pitfalls. GeoTeric is investigating and expects to provide a blog update soon. More from the GeoTeric blog.


Software, hardware short takes

Tecplot, Landmark, SeisWare, Technical Toolboxes, Spectra Logic, O'Meara Consulting, IDS, Ikon Science, Yokogawa, Halliburton, Epsis, DGB, APS Technology, AIMS Global Consulting, Assai, Pegasus Vertex, Johnson Battery, Faro, IHRDC, Entero.

The 2015 release of Tecplot RS, a post-processor and visualization tool for reservoir engineering offers bubble and pie chart enhancements, an interactive equation editor and support for multi-reservoir Nexus models.

At the Life user meet earlier this year, Landmark rolled out its ‘Earth’ interpretation solution, a DecisionSpace bundle delivered either from the cloud or from an on-site appliance.

Following its acquisition of Recon earlier this year, SeisWare 9.1 is now integrated with Recon 4.4 through dynamic linking with the new SeisWare API Server, which automatically propagates changes made in either application.

Technical Toolboxes has introduced SkyBoxes, cloud-based versions of its upstream and midstream tools that can be run from the desktop or mobile devices.

Spectra Logic reports on the imminent arrival of LTO-7 tape technology with 15 terabyte (compressed) capacity and compatibility with Spectra TSeries libraries.

O’Meara Consulting is now a certified partner for Schlumberger’s Ocean/Petrel ecosystem. The 5.6 release of O’Meara’s patented Geo2Flow technology helps identify reservoir compartments.

DataNet 2.5 from IDS includes a port of the software from its Flex/Flash to a modern HTML 5 GUI offering access to the software from multiple endpoints without the need for client software. HTML 5 offers a ‘rich and robust’ experience across different browsers.

The 6.3 release of Ikon’s RokDoc includes 3D enhancements for multi-component pre-stack inversion, 3D model building and ‘supra-horizontal’ well handling. RokDoc can now also import ChronoSeis structural models.

Yokogawa’s Exaquantum R3.01 plant information management system delivers a six-fold increase in bandwidth and can handle up to half a million tags. Exaquantum is now positioned as Yokogawa’s data historian.

Halliburton reports successful deployment of its SmartPlex downhole control system in a six-zone multilateral well in the Middle East. SmartPlex enables remote actuation of downhole control devices using electro-hydraulic control lines from the surface. The system provides reliable control of up to 12 interval control valves in a single wellbore.

Epsis TeamBox 6.0 sees a new architecture and data service for larger deployments. The collaboration solution also now offers workflow libraries, allowing for fine-grain control over what users can see and edit.

dGB Earth Sciences’ OpendTect Pro (a.k.a. V6.0) offers ‘movie-style’ inspection of 3D data, a new 3D auto-tracker, a ‘thalweg tracker,’ a ray-tracer and PDF-3D output. Watch the video presentation.

APS Technology’s ‘SureShot’ EM MWD System has successfully transmitted from a depth of 3,080 meters in an oil-based mud, achieving a 6 bit per second data rate.

AIMS Global Consulting’s ZynQ 360 offers high definition, 360° spherical photography and video for real-time collaboration and visual review of an asset. The tool provides a bridge between CMMS and ERP systems and users.

A new ‘Brava’ HTML 5 client for AssaiDCMS offers ‘zero footprint’ collaboration for users of engineering documents and images.

Pegasus Vertex has announced the imminent release of ‘CleanMax,’ its new wellbore cleanup software. CleanMax optimizes completion displacement calculation, minimizing spacer interfacing and reducing rig time and costs. A CleanMax+ edition targets deep water operations involving displacements using the choke, kill and boost lines.

Johnson Battery Technologies and BP have developed a solid-state battery suited to wireless and remote applications in high temperature environments.

The 16.5 version of Faro’s PointSense and VirtuSurv laser scanning software complements native Autodesk programs with point cloud functionality.

IHRDC’s new IPIMS 3.0 e-learning software includes updates to the user interface and library. The IPIMS e-learning series comprises 136 topics covering upstream technology disciplines, now with a tablet-friendly user interface.

Entero’s Mosaic 2015 includes improvements to capital planning and performance analysis, new type curve process, ‘SMOG’ reporting and a new 64-bit edition.


PGS, Kaust, Aramco and Bull/Atos enter new Top500 list

Cray, Dell and Bull power oil country petaflop-scale high performance computers.

The latest TOP500 list of the world’s fastest supercomputers includes a few new oil and gas entries. Kaust*’s Shaheen II, a Cray XC40, is the fastest in the category with a 7 petaflop peak. Next is Abel, PGS’s Cray XC30, at 5 pf. Saudi Aramco’s Makman-2, a Dell PowerEdge R630-based machine, follows at 3.0 pf. All these run Intel Haswell-class processors sans accelerators.

Three machines from Atos-owned Bull Computer come next, all installed at Brazil’s Laboratório Nacional de Computação Científica. Together they comprise the biggest supercomputer in Latin America, with a peak capacity of 1.4 petaflops. The supercomputer will offer HPC services to the Brazilian government and, inter alia, Petrobras researchers.

The Santos Dumont range comprises different configurations of the Bullx B700 direct liquid cooled series.

Atos CTO Philippe Vannier said, ‘We are also opening an R&D center in Petrópolis, just north of Rio de Janeiro, that is fully integrated with our global R&D.’

* King Abdullah university of science and technology.


ECIM 2015, Haugesund

NPD, ‘data management key value.’ RocQC, ‘data management in bad shape.’ Statoil’s model store. Shell’s data health checks. Schlumberger’s data/IM as a service. Halliburton, CGG, Teradata on upstream analytics. CDA on data in the downturn. OMV and ArcticWeb. Hampton’s remote data QC.

The NPD’s Maria Juul celebrated 20 years of ECIM and 50 years of activity on the Norwegian shelf, stating that data is a keystone of the Norwegian exploration model. NPD’s role is to advise government and to govern data publishing, release and use. Industry and government’s data objectives may differ, as companies seek exclusivity and confidentiality while the government strives for sharing and cooperation. Tax breaks mean that 78% of data costs are refunded, so government ‘has some say in the matter.’ Competition for new licenses should be based on creativity and new concepts rather than on access to data. Diskos has been a major contribution to the Norwegian economy and data management has been key to value creation and growth.

Ian Barron (RocQC) observed that while there is a general perception that oil and gas data management is ‘healthy and doing well,’ he sees it as in ‘bad shape and getting worse.’ Presentations emphasize data management successes and the same issues are ‘solved’ again and again. The reality is that the last few years have seen more catastrophic failures and users are striving for better data. Barron enumerated a few data management busts: wells without depth reference or with bad geodetics. Elsewhere, different departments in the same NOC use different coordinate reference systems. Data is being trashed during loading into interpretation systems, producing an ‘inextricable mess.’ There remains a universal misconception that a lat/long pair specifies a unique location! Budget cuts mean that often there is nobody left to fix such issues. Barron concluded by asking how many companies were prepared to certify their data to an ISO quality standard. There were no takers.

Maria Lehmann presented a data ‘success’ in the form of Statoil’s ‘corporate model store’ (CMS) for subsurface geological models. The CMS is designed to provide life-of-field support, with audit trails of who did what, what software versions and parameters were used. The solution was developed to fix rapidly growing storage requirements and unsustainable archiving and management costs. The CMS has cut the number of models from 67,000 to 17,000 with good metadata. 110 terabytes were deleted in the ‘massive’ cleanup of (mostly) Roxar RMS projects.

Shell’s Andries Helmholt presented on the value of technical data ‘health checks.’ These are conducted at Shell’s business units in a combination of self-assessment and peer review twice yearly. A week is spent checking data metrics against business rules and compliance. A ‘checklist-guided conversation’ leads in to a ‘no surprises’ meeting with the business and final report. ‘It may sound dry but it is real fun and puts a happy face on the business.’ Technical data management is intense but rewarding and essential to ensure that data remains ‘alive and kicking.’

Teradata’s Niall O’Doherty proposed to bust some big data myths as follows. Myth n° 1, big data is something new. O’Doherty cited Oil IT Journal’s account of the seminal PNEC presentation of Wal-Mart’s Teradata-based data environment. In 2006 Wal-Mart had a 900 terabyte data environment and could ‘answer any business question at any time.’ Today this has expanded to a multi petabyte store in a single repository. Myth n° 2, big data is ‘big.’ It is actually more about flexibility in data exploration. Myth n° 3, big data is about Hadoop. ‘This one drives me crazy.’ Few companies have achieved value from Hadoop. Yahoo and Google don’t use it. They use Spark, Ceph, Presto, Apache Mesos. The key is to use technology that relates to your problem and to remember that ‘explicit or implicit, there is always at least one schema!’ Myth n° 4, the age of science is over, the data will tell us everything. No! Simple models and lots of data always trump elaborate models based on less data. Your experts are still needed. Myth n° 5, it is for big companies. Big data is for all. It is a component of ‘data management 2.0.’ What is needed is some self-education on what is possible. You should be able to answer your management re Presto, Ceph and so on, and be ready to bring a competitive advantage to your data strategy.

Anne-Sophie Beck Sylvesteren described how Schlumberger is performing remote data and IM as a service. The client, a UK oil major (who could that be?), was struggling to get buy-in for a multi-year, multi-million dollar data project whose value was appreciated neither by the business nor by data managers. Moreover, data management did not fit into the ITIL framework used by IT. Objectives were finally met with a shift from a project to a service focus. Data management is achieved remotely using Schlumberger’s secure VPN link to its service hub in Pune, India. Data managers and petrotech support work on data that stays in the US and UK. A web-based system allows end users to load data into project workspaces.

Halliburton’s Lapi Dixit opined that E&P has a legitimate claim to being one of the top industries dealing with big, complex data, but that realizing the value is hard. Part of the problem is poor definitions and ‘old business intelligence solutions repurposed as big data.’ Geology doesn’t fit the paradigm and predictive analytics don’t work on seismic data. The big data toolbox may not be so applicable to E&P. Having said that, acquisition-through-processing workflows are inefficient and an obstacle to holistic seismic analysis. Machine/deep learning could be used to speed seismic processing and to analyze stuck pipe and other causes of non-productive time. Data should be kept alive in a virtual pool leveraging a big data compute architecture (Apache Spark, SparkSQL, Streaming, MLlib and GraphX) and Landmark Earth, the ‘industry’s first complete E&P cloud offering,’ a.k.a. a ‘converged infrastructure appliance.’

Henri Blondelle (CGG) provided a more concrete take on how to wield the data science Swiss army knife in the service of upstream data management. CGG’s iQC prototype big data application replaces manual data load and QC with machine learning-based database population. CGG’s new tools of the trade include MapReduce, decision trees, latent semantic analysis and clustering. iQC establishes a link between unstructured (text) information and the database. A hybrid approach involves a first-pass automated classification which is checked by a human expert for errors. The errors are then fed back in to improve the classifier. The prototype was demoed on 4,000 documents from the UK’s CDA database to extract well names and document types. CGG plans to embed iQC in ‘Plexus,’ its ‘next generation’ data management platform, and is inviting interested parties to join its big data learning effort.
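By way of illustration only (this is not CGG’s iQC code, and the documents and labels are hypothetical stand-ins), the classify-review-retrain loop could be sketched as follows:

```python
# Hybrid human/machine loop: automated first pass, expert correction,
# corrections fed back to retrain the classifier. Purely illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_docs = ["final well report for well A", "composite log for well A",
              "well report summary for well B", "mud log for well B"]
train_labels = ["report", "log", "report", "log"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(train_docs, train_labels)

new_docs = ["end of well report for well C", "gamma ray log for well C"]
predicted = clf.predict(new_docs)      # first-pass automated classification

corrected = ["report", "log"]          # hypothetical expert-validated labels

# Feed the validated labels back in to improve the classifier.
clf.fit(train_docs + new_docs, train_labels + corrected)
```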

Duncan Irving (Teradata) presented a machine learning approach to basin analysis on a dataset from New Zealand’s Taranaki basin. Well logs were processed using ‘SAX,’ Teradata Aster’s ‘symbolic aggregate approximation,’ a.k.a. Sax-ification. This represents log curve values as discretized alphanumeric buckets – AABBCBAA. Facies are identified with ‘dynamic time warping’ using nPath, Teradata/Aster’s regular pattern matching toolset. The approach identified previously unnoticed features like hot shales and other facies. Pierre Marchand showed how KMeans clustering and nPath can be used to analyze production volumes, well status and stratigraphy. KMeans and nPath are just two of the 100 or so big data techniques available in Teradata’s Aster 6 toolset.
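A minimal sketch of SAX-style discretization, on a hypothetical gamma ray log (Aster’s implementation details are not shown in the talk): z-normalize the curve, average it over windows, then bucket the averages into letters using equiprobable Gaussian breakpoints.

```python
# Illustrative SAX-style 'Sax-ification' of a log curve into letter buckets.
import numpy as np
from scipy.stats import norm

def sax(curve, window=4, alphabet="ABCD"):
    z = (curve - curve.mean()) / curve.std()                  # z-normalize
    n = z.size // window
    paa = z[: n * window].reshape(n, window).mean(axis=1)     # piecewise averages
    cuts = norm.ppf(np.linspace(0, 1, len(alphabet) + 1)[1:-1])  # equiprobable cuts
    return "".join(alphabet[i] for i in np.searchsorted(cuts, paa))

# Hypothetical gamma ray log stepping from clean sand into hot shale:
gr = np.r_[np.full(16, 40.0), np.full(16, 120.0)] + np.random.default_rng(1).normal(0, 5, 32)
print(sax(gr))   # e.g. 'AAAADDDD' - the facies change reads as a letter shift
```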

Terry Alexander provided a summary of the ECIM/CDA workshop held earlier this year on the subject of staying ‘sustainable’ at $60 oil. Speakers called for less tax and more (even free!) seismics and better IT and R&D collaboration. Big data/analytics is seen as an antidote to the loss of corporate memory as folks are laid off. Decommissioning is seen as a growth area for the data managers, once the associated regulations have been established. Geotechnical data has bypassed everybody, let’s get on the bandwagon!

Jens Jacobsen showed how OMV is using the ArcticWeb data portal to curate large volumes of public data and to plan HSE/medevac for its exploration wells. The app captures medevac regulations, distance to shore, weather windows (icebergs) and other parameters that allow OMV to judge whether a more detailed analysis is required. ArcticWeb embeds Kadme’s Whereoil system.

Arun Narayanan (Schlumberger) and Lapi Dixit (Landmark) both set out their big data/analytics stalls. For Narayanan, analytics need to embed end-user workflows and must be ‘more than dashboards.’ Analytics is a parallel track to the classical ‘engineered’ approach of well and seismic interpretation. Showing a slide with a hundred or so buzzwords, Narayanan intimated that we need to ‘understand all of these.’ A study on public domain data from the Eagle Ford shale enabled an ‘84% accurate’ forecast of production.

Landmark advocates a ‘different path’ to data management where ‘open source’ is no longer a dirty word. Although the oil industry is in decline, tech is booming. Open source presents an opportunity to leverage new technology even as E&P’s IT spend is down. Dixit’s buzzword pastiche included Spark, MLlib and GraphX, ‘all in memory,’ arguing for a vendor operated solution for all of the above.

Wally Jakubowicz (Hampton Data Services) commented on the dichotomy between the data manager’s desire for order and the reality of multiple file systems and duplicated data. Attempting to replace such chaos with ordered, cleansed, databased information is ‘difficult and mostly impossible.’ A preferred route is to use continual, domain-specific data mining to monitor changes. This is done in a cloud-based service that provides the E&P data manager with a dashboard and GIS window showing what objects have been added or edited, what is duplicated and where it is located. Hampton Data’s GeoScope offers remote analysis of company metadata in a ‘cloudy’ data management solution.

Tim Hollis (Schlumberger) presented a structured approach to migrating OpenWorks and GeoFrame data into Studio for Wintershall. The approach used methodology from the Technology Services Industry Association. One key enabler was the fact that Wintershall gave Schlumberger access to its IT environment. More from ECIM.


Slow start for CGG Akon chez Diskos

Replacing Petrobank has proved a tough challenge for new operator.

As we reported earlier this year, Akon, CGG’s seismic data management solution, selected by the Norwegian Petroleum Directorate to replace the venerable Petrobank, has had a troubled start-up. Speaking at this year’s ECIM conference, the NPD’s Eric Toogood provided an update on the data transfer and operator changeover. The whole process has taken some three years and has been costly, but necessary to comply with stringent Norwegian government procurement rules. Previously, changeovers did not involve major technology shifts, as Petrobank has powered Diskos from the outset.

This time, CGG’s Akon, derived from its Trango seismic data management solution, needed further development to fulfill the Diskos requirements. Data migration from Petrobank to Akon also proved problematic, requiring an understanding of both data models and the duplication of complex built-in database triggers. Toogood also offered a mea culpa on behalf of the NPD, acknowledging that the tender could have been clearer.

Data migration has revealed data issues requiring clean-up of company names and data ownership. CGG is now visiting end users to collect requirements and fine tune the solution. A web service API is under development for the WhereOil front end. The new Diskos website is up and running and offers open access to Norwegian production data.


Schlumberger flaunts new data solution at ECIM

Last year’s loser in the Diskos tender shows off fancy new 'WorldMap’ data management system.

As if to say ‘see what you are missing,’ Tim Hollis unveiled Schlumberger’s new WorldMap system at ECIM. WorldMap, rather like Diskos, offers access to seismic, well and production data. Schlumberger was one of the big losers in the latest Diskos tender.

Unlike Diskos, WorldMap does not house data; rather, it is a window into third party data providers such as DrillingInfo, Geofacets, DigitalGlobe and more. WorldMap makes particular use of the SpatialOnDemand window into the DigitalGlobe/SpatialEnergy mapping resource.

WorldMap provides a GIS interface to original format data, along with tools for QC and load into applications. Geoprocessing functionality allows for spatial filtering such that well data can be constrained to a particular area of investigation. Other queries can restrict data to wells with specific log types or depths. Right-click pulls up metadata and hyperlinks give access to additional web-based documentation. WorldMap’s tag line is ‘Make existing subscriptions work for you.’


Folks, facts, orgs ...

Eccma, ABB Canada, Aqualis Offshore, AIMS Consulting, Aker Solutions, Allegro, Ancre, Aptomar, Arria NLG, Aspen Technology, Aegex Technologies, CGG, Dassault Systèmes, Divestco, EMC, Fiatech, Geological Society, GSE Systems, Holland Services, NERC, NetApp, National Oilwell Varco, IO Oil and Gas, Oildex, Parker Drilling, Schram, Smith Flow Control, Statoil, Uptake, Wood Group.

Peter Eales and Clay Pereson have joined the Eccma standards body.

Nathalie Pilon is now MD of ABB Canada. She hails from Thomas & Betts.

Michael Rennie heads up Aqualis Offshore’s new Doha office.

AIMS Consulting has named Robert Smith as head of its new Americas HQ in Houston.

Ex Schlumberger VP David Clark is now regional president of UK and Africa for Aker Solutions.

Allegro has appointed Jon McCabe as CFO, Mark Crosno as VP Americas sales and Joachim Koch as VP sales, operations and alliances. Bob Clancy, Ed Fleming and Michael Wolk have joined Allegro’s Americas consulting services.

Didier Houssin is the new president of Ancre, a joint R&D venture between France’s CEA, CNRS, CPU and IFPen.

Anders Meldal heads up Aptomar’s new ‘Aptomarin’ marine control center in Trondheim, Norway.

Sharon Daniels has rejoined the Arria NLG board as a non-executive Director.

Karl Johnsen is now senior VP and CFO at Aspen Technology following Mark Sullivan’s retirement.

Timothy Cox heads up Aegex Technologies’ newly opened office in Dubai.

Michael Day has joined the CGG board replacing Terry Young who has stepped down. Day was also appointed to the strategic and technology committee.

Raoul Jacquand is now CEO at Geovia, a Dassault Systèmes brand. He replaces retiree Rick Moignard.

Bill Tobman has resigned from the Divestco board of directors.

Laura Sen has joined the EMC board.

WorleyParsons’ Jim Purvis is the new Fiatech chairman. Dow Chemical’s Deborah McNeil has joined the board of advisors.

Sarah Fray is executive secretary of the UK Geological Society. She hails from the Institution of Structural Engineers.

Chris Sorrells is GSE Systems’ interim COO. The appointment is part of a restructuring that sees the cessation of one R&D project, office closings and company-wide staff reductions.

Brian Poe has been appointed Holland Services’ regional manager for Houston.

As part of its oil and gas innovation programme the UK’s Natural environment research council is to fund £1 million worth of projects. Closing date for applications is 3rd December 2015.

Technology veteran Mark Bregman is now NetApp CTO. He hails from SkywriterRX.

Jose Bayardo is now CFO of National Oilwell Varco. He was previously with Continental Resources.

Chris Freeman is now director of field development at IO Oil and Gas Consulting.

Michael Weiss is now Oildex CTO.

Former Weatherford COO Peter Fontana has been elected to Parker Drilling’s board.

Sean Roach is to lead Schram’s expanded office in Houston.

Peter Wall is the new EU business area manager of Smith Flow Control.

Bård Glad Pedersen is now Statoil VP media relations. He was previously with the Norwegian Ministry of Foreign Affairs.

Hector Acevedo is to lead Uptake’s Oil & Gas Unit. He hails from Oracle.

Robin Watson has been appointed CEO of Wood Group following Bob Keiller’s retirement.


Done deals

geoLOGIC Systems, BV Investment Partners, Intergraph, EcoSys, Divestco, Invico, Weatherford, Technical Toolboxes, EICE, Wood Group, Automated Technology Group, Emerson, Spectrex.

geoLOGIC Systems of Calgary has received a ‘major investment’ from BV Investment Partners, a middle-market private equity firm. BV has acquired the shares of founders Joe and Denise Harris. geoLOGIC president and CEO David Hood keeps his entire ownership interest in the company. The deal is the fifth transaction in BV’s Fund VIII vehicle.

Intergraph is to acquire enterprise project controls software specialist EcoSys. The deal broadens Intergraph Process, Power & Marine’s portfolio with the addition of EcoSys’ Enterprise Planning & Controls suite, available for on-premise use or as a hosted solution. EcoSys was founded in 2000 by the original developers of what is now Oracle Primavera.

Divestco has received a $3.2 million secured loan from Invico Diversified Income, repayable in 2017. Proceeds of the loan were used to repay $0.6 million in shareholder loans and as working capital.

Weatherford has elected not to pursue its previously announced public share offerings. The company was unwilling to sell securities at prices ‘that do not reflect the value we have created.’

Technical Toolboxes has ‘merged’ with Energy Industry Consulting & Engineers, adding engineering and software expertise and a scalable outsourcing capability to its SkyBox ‘next generation’ computing platform.

Wood Group has acquired Automated Technology Group, a UK-based supplier of control and power solutions for industrial automation. The deal expands Wood Group’s Mustang business unit into the manufacturing automation market.

Emerson has acquired Spectrex, a manufacturer of flame and open path gas detectors. Spectrex will join Emerson’s Rosemount portfolio.


Safety first ...

IOGP 2014 accident reports. ConocoPhillips’ PIER emergency response. DNV certifies GE’s overpressure detection solver. Nexen self-reports maintenance documentation failings. Helm on crew safety.

The UK’s IOGP has issued its reports for 2014 covering fatalities and high potential events in the oil and gas industry, enumerating causes of accidents and recommending preventative measures.

An article in ConocoPhillips’ Magazine (Issue N° 30) describes the company’s Public information for emergency response (PIER) system. The website was developed by Witt-O’Brien’s.

GE Oil & Gas has commissioned DNV GL to carry out inspection and certification of the logic solver element of its overpressure protection system for subsea pipelines. The project involves verification of equipment design, configuration and architecture to ensure it meets the IEC 61511 standard.

Under its process safety management program, Nexen Energy ULC initiated an internal audit of its corporate pipeline integrity management system and identified a number of non-compliances related to documentation of maintenance activities. Following self-disclosure, Nexen is working with the regulator to fix the issues.


DNV GL is working with ExproSoft on a risk-based subsea barrier valve test procedure combining ExproSoft’s WellMaster repository of reliability data for wells with DNV GL’s risk management expertise.

A study by Kate Pike and Emma Broadhurst of Southampton Solent University for Helm Operations finds that 50% of offshore support vessel crews are willing to compromise safety rather than say ‘no’ to clients or senior management.


Kappa releases new edition of Dynamic data analysis

700 page free handbook includes new chapter on unconventional evaluation.

The latest edition of Kappa Engineering’s handbook of Dynamic data analysis (DDA), a 700 page free resource that can be downloaded from the Kappa website, includes a new chapter on unconventional resources. The handbook is not specific to Kappa’s tools and should be useful to well-test operators, students and researchers alike.

The 70 page chapter on unconventionals provides an authoritative analysis of the evolving state of the art in evaluating these extremely low permeability reservoirs. DDA’s authors emphasize the relatively short time frame of current shale production, which contrasts with the very long transients required for modeling. DDA provides an in-depth discussion and analysis of current research on the physics and evaluation techniques of shale.

A discussion of commonly used decline curve analysis techniques (Arps, power law, Duong…) includes a warning that none of them corresponds to reservoir engineering theory. Simple and advanced modeling approaches are presented along with Kappa’s own workflow.
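For reference, the hyperbolic Arps relation and its limiting cases in their standard reservoir engineering form (the handbook’s own notation may differ), with q_i the initial rate, D_i the initial decline rate and b the empirical decline exponent:

```latex
\[ q(t) = \frac{q_i}{(1 + b\,D_i\,t)^{1/b}} \quad (0 < b < 1,\ \text{hyperbolic}) \]
\[ q(t) = q_i\,e^{-D_i t} \quad (b \to 0,\ \text{exponential}) \qquad
   q(t) = \frac{q_i}{1 + D_i t} \quad (b = 1,\ \text{harmonic}) \]
```

Part of the warning’s force is that b is a curve-fitting parameter: for b ≥ 1, as often fitted on shale wells, cumulative production diverges with time, so naive extrapolation overstates reserves.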


PIDX Projects update

Standards body reports progress on industry standard data sheets project.

Speaking at the 2015 PIDX Fall Conference in Houston this month, Roger Bhalla, who chairs the strategic initiatives committee, reported progress on the Industry standard datasheets and definitions (ISSD) program. This sets out to provide a practical, standardized way to build, maintain, exchange and use datasheet-related information, based on existing datasheets. The program will validate the standardized set of machine readable records so that it will support full lifecycle information migration from design, through construction and operational handover, and on to decommissioning.

PIDX is working to define what ‘full lifecycle’ datasheets are needed for each industry-specific document set, along with which data elements are required for each sheet. PIDX is also to facilitate mapping between datasheet elements and the internal formats used by owner/operators, EPCs and equipment manufacturers. More from PIDX.


ISO design standard behind newbuild Houston control room

Keil Centre ergonomics leveraged in major’s monitoring facility.

Architects Brad Adams Walker (BAW) reports on a new-build control center that it has delivered to a Houston-based major.

Following the Deepwater Horizon disaster, the company set out to ‘reinvent’ real-time safety monitoring of its own offshore drilling rigs and contracted BAW to create a control room focused on safety, collaboration and real-time monitoring.

BAW leveraged the ISO 11064 standard for control room best practices in collaboration with the UK’s Keil Centre, a world authority on human factors and ergonomics. ISO 11064 provides principles, advice and best practices for control center design and layout, displays and environmental requirements. BAW’s clients include Chevron, ExxonMobil, Shell, Fluor and Honeywell.


Sales, deployments, partnerships ...

ABB, Kepware, Aecom, AGU, CMMI, Aker, Aveva, Calgary Scientific, P2 Energy, Landmark, EcoSys, SAP, Emerson, OSIsoft, Tintri, Eurotech, FMC, Gilbarco, Harc, Honeywell, IDS, SK Oilfield, Infor, Intertek, IT Vizion, Palantir, LMKR, Cleantech, Peloton, Oracle, Rock Flow Dynamics, Yokogawa.

ABB has embedded Kepware’s KEPServerEX in its data center infrastructure management solution, Decathlon for Data Centers.

Aecom is to provide construction services to Dakota Gasification’s Great Plains Synfuels plant in Beulah, North Dakota.

The American Geophysical Union is to adapt the CMMI Institute’s data management maturity framework to the needs of earth and space sciences.

Murphy has selected Aker Solutions to deliver the subsea production system for the Rotan Deepwater natural gas development offshore in Malaysia. Aker has also delivered the world’s first subsea compression system to the Statoil-operated Åsgard field in the Norwegian Sea.

Kværner has selected Aveva Bocad Steel and Offshore for use at its design and fabrication yard in Verdal, Norway.

Calgary Scientific’s PureWeb Design is available from the Amazon web services marketplace. Local, secure models are rendered remotely on Amazon’s GPUs.

P2 Energy’s Bolo accounting software has been selected by CrownQuest Operating.

Eni has signed a multi-year contract for Landmark’s suite of exploration and production applications.

V 7.0 of EcoSys’s Enterprise Planning & Controls has been certified to integrate with SAP applications.

Emerson has partnered with OSIsoft to develop a PI Connector for HART-IP. Stork has installed the first Roxar SandLog wireless sand erosion monitoring solution on a major North Sea asset.

Tintri has appointed Eurotech as ‘key seller’ of its storage products and solutions across EMEA.

FMC Technologies is to provide hardware and control systems to Shell’s Appomattox deepwater Gulf of Mexico development. FMC has also been awarded a $172 million EPC contract for work on Statoil’s Johan Sverdrup field.

G&M has partnered with Gilbarco Veeder-Root for an extensive EMV-certified technology upgrade at 120 locations across Southern California.

The US National academy of sciences has awarded a $125,000 grant to Harc to develop a ‘virtual offshore disaster training site.’

Honeywell has signed a 4-year agreement with Lundin Norway to provide services and support on the Edvard Grieg field.

IDS has appointed SK Oilfield Equipment as its exclusive representative for the Indian market.

Strategic Maintenance Solutions is to deploy Infor’s enterprise asset management solution.

South Oil Company has awarded Intertek a multi-year agreement to assure the quality and safety of its oil and gas projects.

IT Vizion has partnered with OSIsoft to deliver operational intelligence and decision support solutions to oil and gas customers in the Americas and Europe.

Landmark and Palantir have partnered to develop a fully integrated economics, planning and decision support framework.

LMKR has signed a reseller agreement with Cleantech to supply its geoscience solutions to oil and gas companies in Europe.

Lockheed Martin has invested in Peloton Technology’s truck-platooning technology.

Engie has selected Oracle HCM Cloud to support its worldwide HR operations.

Marathon Oil and Petronas have acquired Rock Flow Dynamics’ tNavigator advanced simulation technology.

Yokogawa has been awarded an $11 million contract for terminal automation and safety instrumented systems at 11 Bharat Petroleum Co. terminals in northern and eastern India. BP has awarded Yokogawa a 10 year renewal for the provision of main automation contractor services to its upstream entities.


Standards stuff

CEN updates quality management standards. API’s new frac protocol. PPDM - ‘compliance with teeth.’ ISO updates greenhouse gas reporting.

The EU standards body CEN has adopted revised editions of its international standards for management systems. Two (EN ISO 9000 and EN ISO 9001) cover quality management systems while a third (EN ISO 14001) addresses environmental management. The standards are claimed to improve company performance, productivity and efficiency and to save costs, energy and reduce waste.

The American Petroleum Institute has published new versions of its standards for hydraulic fracturing. The two new standards cover pressure containment and well integrity, as well as environmental safeguards, including groundwater protection, waste management, emissions reduction and worker training. The standards should be used in conjunction with the ANSI/API Bulletin 100-3 that provides guidelines for community engagement to help operators communicate effectively with local residents and ‘pursue mutual goals for community growth.’

PPDM has announced a charter for a new ‘Compliance with teeth’ work group. The work group sets out to codify the core best practices that are used in deployment of the data model. Consensus-based best practices should help accelerate data model uptake and help different implementations communicate with each other. Success will be measured based on the ability of participating implementations to use the data in a compliant PPDM implementation ‘out of the box.’

ISO is updating its range of standards that are claimed to help organizations mitigate their impact on the environment. Amongst these is the ISO 14064 family for greenhouse gas emissions. Review of ISO 14064-1 and ISO 14064-2 has reached Committee Draft stage, leaving two months for vote and comments. The revision should be complete by April 2016.


Cybersec round-up

Cimation for Shell. Aramco ‘foils’ hackers. Atos/Radware DDoS mitigation. Forrester on OPM hack.

Shell has selected Cimation to provide industrial cybersecurity consulting, network engineering, endpoint remediation and system hardening for its process control IT in a global program to standardize cybersec technology and processes. Cimation’s consultants will work on the multi-year program in collaboration with other Shell suppliers. Cimation previously worked with Shell to develop an industrial cybersecurity professional certification.

A report in the Houston Chronicle’s Fuelfix minisite has it that Aramco foiled a fraud attempt involving scammers pretending to work for India’s state-owned Oil and Natural Gas Corp. Aramco’s statement followed an Indian Express newspaper report that claimed that the company had paid $30 million to fraudsters who spoofed an ONGC email address.

Atos has partnered with Radware to launch a cloud-based denial of service mitigation solution, adding Radware’s DefensePipe to Atos’ cybersecurity portfolio. DDoS attacks are cheap and simple to launch and require little expertise. They may be used to distract security resources during more sophisticated attempts to breach critical systems. The service deploys local appliances for rapid reaction to intrusions and cloud technologies to provide scalable defense against volumetric attacks.

Forrester Research’s report on the hack of the US Office of Personnel Management (OPM) concludes that compliance with regulations such as the Federal Information Security Management Act (FISMA) is not enough to protect confidential data. While OPM adhered to the guidelines, an audit noted security concerns including a lack of valid authorizations. Projects lacked a consistent security approach as well as adequate staff or funding.


Det norske, BP use Aveva’s engineering IM tools

Digital information hub helps newbuild data handover. Asset lifecycle IM for brownfield data revamp.

Writing in Aveva World Magazine, editor Camille Nédélec-Lucas reports on a major deployment of Aveva’s Digital Information Hub concept on Det norske oljeselskap’s Ivar Aasen field. Consultants VisioNova selected Aveva’s progressive handover solution to ‘accelerate and derisk’ the transition of the platform from its EPC-managed capex project phase through to commissioning and operations. VisioNova’s Asbjørn Mangerud said, ‘Managing handover needs to become part of the operator’s core business.’

BP Norway has likewise implemented technology from Aveva whose Asset Life Cycle Information Management solution underpins BP Norway’s Operations Information Hub (OIH). The OIH provides BP Norway’s employees and contractors with transparent access to the latest revisions of technical information on production assets, while ensuring that this information adheres to BP’s ‘upstream master class library,’ a global standard for equipment information. BP is now working on automatic exchange of data with suppliers’ systems.


AssaiDCMS reports with open source plug-in

Engineering document management solution embeds Actuate’s Birt open source reporting engine.

Engineering document management specialist AssaiDCMS now offers reporting and data visualization leveraging the Birt open source reporting toolset.

Assai claims that Birt provides a flexible environment that allows users to meet company-specific reporting requirements without the need for ‘expensive tools’ such as BusinessObjects or Crystal Reports.

Assai’s reports are now all available in Birt. With some training, end users can customize these or build their own user-defined reports that can be added to the Assai portfolio.

Birt technology is used to create data visualizations and reports that can be embedded into rich client and web applications. The open source code is a top-level Eclipse Foundation project, with a freemium business model supported by Actuate. Key users and contributors include Cisco, OpenText and IBM. The project claims over 12 million downloads.
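
Embedding follows the standard Birt report engine pattern, sketched below; the report design file name and output path are placeholders.

    import org.eclipse.birt.core.framework.Platform;
    import org.eclipse.birt.report.engine.api.*;

    public class RunReport {
        public static void main(String[] args) throws Exception {
            // Boot the Birt platform and obtain a report engine.
            EngineConfig config = new EngineConfig();
            Platform.startup(config);
            IReportEngineFactory factory = (IReportEngineFactory) Platform
                .createFactoryObject(
                    IReportEngineFactory.EXTENSION_REPORT_ENGINE_FACTORY);
            IReportEngine engine = factory.createReportEngine(config);

            // Open a report design (placeholder name) and render it to HTML.
            IReportRunnable design =
                engine.openReportDesign("document_status.rptdesign");
            IRunAndRenderTask task = engine.createRunAndRenderTask(design);
            HTMLRenderOption options = new HTMLRenderOption();
            options.setOutputFileName("document_status.html");
            options.setOutputFormat("html");
            task.setRenderOption(options);
            task.run();
            task.close();

            engine.destroy();
            Platform.shutdown();
        }
    }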


ECCMA’s Peter Benson on material safety datasheets

Director argues for regulatory backing for e-commerce standards.

Writing in the September 2015 issue of the ECCMA newsletter, executive director Peter Benson argues for regulatory pressure to promote industry e-commerce standards. The ISO 22745 (eOTD-r-XML) standard for material safety data sheets (MSDS) is used by large buyers of chemical products to streamline the supply chain and enhance safety.

While it is possible to convert manufacturers’ formats to the XML standard, it would be better if suppliers provided their own specifications in the format. Making this happen, and ensuring the safe and secure movement of product, requires regulation.
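
As a rough illustration of the kind of conversion involved, the sketch below emits a supplier specification as XML using Java’s standard DOM API. The element names and product data are hypothetical placeholders; the normative ISO 22745/eOTD vocabulary differs from this.

    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.*;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.*;

    public class SpecToXml {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();
            // Hypothetical element names, fictitious product data.
            Element sheet = doc.createElement("safetyDataSheet");
            doc.appendChild(sheet);
            Element product = doc.createElement("productIdentifier");
            product.setTextContent("Example Solvent 42");
            sheet.appendChild(product);
            Element hazard = doc.createElement("hazardStatement");
            hazard.setTextContent("Flammable liquid and vapour");
            sheet.appendChild(hazard);
            // Serialize the DOM tree with indentation.
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.INDENT, "yes");
            t.transform(new DOMSource(doc), new StreamResult(System.out));
        }
    }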

One ECCMA member has already succeeded in persuading its government to adopt the standard and others are developing mandatory national technical specification templates which will be published in the ECCMA catalog template library (eDRR). ECCMA will also be working with local agencies to assist national manufacturers in the creation and registration of their standardized technical specifications.


BP’s production data streams into Amazon cloud

Hosted edition of GE Predix a.k.a. the ‘industrial internet’ replaces legacy ‘Isis’ system.

Last month we wondered (aloud) if GE was hosting BP’s production data in the cloud. We now have it on good authority that this is indeed the case. Production data from BP’s North Sea fields is streamed into an instance of GE’s Predix ‘industrial internet’ application running on Amazon’s ‘elastic compute’ cloud in Dublin. Connectivity and privacy are assured by the use of Amazon Web Services’ (AWS) ‘Direct Connect,’ a 10 Gbps private network link to the cloud.
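
At its simplest, such streaming amounts to pushing time-stamped sensor readings at a cloud ingestion endpoint over the secure link. The sketch below uses Java’s built-in HTTP client against a hypothetical endpoint and payload; it is not GE’s actual Predix ingestion API.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class PushReading {
        public static void main(String[] args) throws Exception {
            // Hypothetical endpoint, tag name and payload shape.
            String json = "{\"tag\":\"NS-PLATFORM-01.WHP\","
                + "\"ts\":1443657600000,\"value\":182.4,\"unit\":\"bar\"}";
            HttpRequest req = HttpRequest.newBuilder()
                .uri(URI.create("https://ingest.example.com/v1/timeseries"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();
            HttpResponse<String> resp = HttpClient.newHttpClient()
                .send(req, HttpResponse.BodyHandlers.ofString());
            System.out.println("ingest returned " + resp.statusCode());
        }
    }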

The cloud solution is the culmination of several years of BP’s Field of the Future program, in particular its ‘Isis’ integrated surveillance information system (Oil ITJ Jan 2007). For GE, the deal is groundbreaking as one of the first deployments of its software in the cloud.

Isis and Predix have now fused into BP’s Production Management Advisor, a ‘cross functional’ real time system that ‘ensures the best operational decisions are made.’ For GE, the PMA represents some $4.75 million of software and services in its initial North Sea deployment. The contract was won following a proof of concept demo. Along with its monitoring function, PMA serves as the system of record for well tests, pressure transient analysis, completions and alerts. The deal is also a big win for GE in that its Proficy historian has displaced BP’s OSIsoft PI System. Total, ConocoPhillips and Statoil are also Predix users.


Oil country big data wheeling and dealing

Noah, Seven Lakes, Maana, NavPort, Halliburton, Accenture step up with analytics offerings.

This month sees new teamings of service providers seeking to apply artificial intelligence and machine learning to various oil and gas data challenges.

Noah Consulting recently announced a big data hook-up with Seven Lakes Technologies, whose Well lifecycle manager and analytics solutions help companies prioritize wells for intervention.

Industrial big data specialist Maana has entered the fray with the onboarding of former Chevron executive Peter Breunig. Maana’s ‘big-data-fueled solution’ is to be applied to oil’s massive data sets in the search for improved operational efficiencies. Maana investors include Chevron, ConocoPhillips, GE and Intel.

Houston-based NavPort is leading a ‘data revolution’ in unconventional oil and gas with the launch of NavPort Analytics, a suite of data, analytics and business intelligence solutions.

Speaking at the recent fall PIDX meeting, Robello Samuel unveiled Halliburton’s internet of things for the oil rig. This is delivered as a special edition of DecisionSpace adapted to Cisco’s ‘FOG’ nodes for ‘at-the-edge’ computing. A downhole ‘X-FOG’ framework is poetically described as ‘deep edge’ computing.
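
The point of pushing compute ‘to the edge’ is to shrink what must travel up a constrained link. A generic illustration of the idea, not Halliburton’s code: summarize a window of high-frequency sensor samples locally and transmit only the statistics.

    public class EdgeAggregator {
        // Collapse one window of raw samples into min/max/mean before uplink,
        // trading raw fidelity for a large reduction in bandwidth.
        public static double[] summarize(double[] samples) {
            if (samples.length == 0)
                throw new IllegalArgumentException("empty window");
            double min = Double.POSITIVE_INFINITY;
            double max = Double.NEGATIVE_INFINITY;
            double sum = 0.0;
            for (double s : samples) {
                if (s < min) min = s;
                if (s > max) max = s;
                sum += s;
            }
            return new double[] { min, max, sum / samples.length };
        }
    }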

Finally, having deployed IBM Watson (OITJ Vol 20 N°6), Woodside is going for broke on big data, with help from Accenture, implementing predictive analytics for maintenance and process control across its LNG assets.


Technical Toolboxes/EICE for Marathon’s digital oilfield

Parties win bid for ‘definition phase’ of Eagle Ford document and data project.

Technical Toolboxes, Energy Industry Consultants & Engineers (Eice) and EMC have jointly won a competitive bid for the definition phase of Marathon Oil’s Eagle Ford digital oilfield project. The engagement involves a review of Marathon’s current electronic production data management process, from field data collection through to the end user. The partners are to identify opportunities for cost savings and production efficiencies in the near and longer term, investigate the viability of implementing these with Marathon’s existing tools, processes and staff, and help develop Marathon’s digital oilfield vision for the Eagle Ford.

Eice is a specialist in ‘transforming silo-based work processes into a collaborative environment for decision making.’ Technical Toolboxes is a provider of software solutions, consulting services and training for engineering and technical professionals. EMC, well, you know them, they are now part of Dell!


DepthInsight’s ‘enormous’ modeling capability

16 trillion cell geomodeling capability, database and API from Chinese developer GridWorld.

Beijing GridWorld Software Technology has challenged the geo-modeling world with a 16 trillion cell structural model on display at the SPE ATCE. GridWorld’s model covers 1,900 km² and includes over 100,000 wells.
The ‘DepthInsight Enormous Modeling Platform’ is claimed to have ‘no scale or resolution limitations.’ The ‘consistent, coherent and evolutionary’ model is formed by a ‘seamless merging of sub models,’ any part of which can be extended, updated or exported to other applications.

Model ‘fragments’ can be built piecemeal and stored in a database for subsequent update. Fragments can be created at different times and scales and assembled as required. DepthInsight supports both orthogonal and Pebi* gridding. Arbitrary cross sections and contour surfaces can be extracted from the database. GridWorld’s modeling technology has been embedded in BGP’s GeoEast interpretation system. An application programming interface is available for developers. Check out the DepthInsight video and visit GridWorld.
* Perpendicular bisector.
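
GridWorld’s actual API is not documented here, so the following is a purely hypothetical sketch of what piecemeal fragment assembly could look like; all type and method names are invented for illustration.

    import java.util.List;

    // Hypothetical types, invented for illustration; GridWorld's
    // real DepthInsight interfaces may differ entirely.
    interface ModelFragment { String name(); }

    interface FragmentMerger {
        // Seamlessly merge two sub-models built at different times/scales.
        ModelFragment merge(ModelFragment a, ModelFragment b);
    }

    class Assembler {
        // Fold a list of stored fragments into one consistent model;
        // assumes at least one fragment has been loaded from the database.
        static ModelFragment assemble(List<ModelFragment> parts,
                                      FragmentMerger merger) {
            ModelFragment model = parts.get(0);
            for (int i = 1; i < parts.size(); i++) {
                model = merger.merge(model, parts.get(i));
            }
            return model; // any part can later be extended, updated or exported
        }
    }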

