April 2013


Shared earth modeling

‘Semantic’ approach to oil and gas field modeling proposed in new publication from IFP Energies Nouvelles. ‘Knowledge-driven’ solution to inform next-generation Energistics Resqml protocol.

A new book by a team of researchers led by Michel Perrin and Jean-François Rainaud at IFP Energies Nouvelles, the French petroleum institute, breaks new ground in geological modeling, proposing a ‘semantic’ framework embracing model topology, stratigraphic relationships and grids.

Shared Earth Modeling* (SEM), a hefty 350 page work, is subtitled ‘knowledge-driven solutions for building and managing subsurface 3D geological models.’ For knowledge-driven, read the ubiquitous application of semantic-web ontology-based technology.

In his introduction, Total’s Dominique Lefebvre compares the thrust of SEM to the actions of Aureliano Buendia in Gabriel García Márquez’ ‘One hundred years of solitude’ who, in an effort to combat memory loss, labels every object in his village. Likewise, ‘ontologies will be the labels of our geological models.’ The image is particularly apt in the context of the ‘graying workforce!’

SEM’s early chapters provide an accessible introduction to earth modeling while introducing the ‘knowledge framework’ that describes stratigraphic relationships, faults and other objects. Various gridding techniques are discussed in the context of commercial tools and the current state of the art. Particular attention is given to Energistics’ Resqml initiative for reservoir model data exchange. Here SEM distances itself from the current Resqml approach (and indeed most current XML-based exchange formats), contrasting their ‘data driven’ modeling with the ‘knowledge-driven’ approach of the SEM. This difference is illustrated in a chapter on seismic interpretation where faults and reflectors are viewed as ontology ‘instances’ and manipulated with the Stanford ‘Protégé’ editor. Several SEM authors were involved with Resqml although the current standard eschews the semantic vision.
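
To give a flavor of the approach, here is our own minimal sketch (not code from the book, and with hypothetical class and property names) of how a fault and a horizon interpretation might be declared as ontology ‘instances’ using Python’s rdflib, much as they would appear in the Protégé editor.

```python
# Minimal sketch (ours, not SEM's): faults and reflectors as ontology instances.
# Class and property names are hypothetical. Requires the rdflib package.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

GEO = Namespace("http://example.org/geo-onto#")  # hypothetical ontology namespace
g = Graph()
g.bind("geo", GEO)

# Declare two classes, much as one would in Protégé
g.add((GEO.Fault, RDF.type, RDFS.Class))
g.add((GEO.HorizonInterpretation, RDF.type, RDFS.Class))

# Interpreted objects become typed, labeled instances
g.add((GEO.F2, RDF.type, GEO.Fault))
g.add((GEO.F2, RDFS.label, Literal("Fault F2")))
g.add((GEO.H1, RDF.type, GEO.HorizonInterpretation))
g.add((GEO.H1, RDFS.label, Literal("Horizon interpretation H1")))

print(g.serialize(format="turtle"))
```

The point of the exercise is that each model object carries a machine-readable type and label, rather than being an anonymous geometry in a proprietary file.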

The team is now working on a Resqml 2 proposal which will introduce simple ontological concepts and act as a bridge to a future semantic world. SEM winds up presenting several semantic applications that run atop the framework, including E-Wok (Oil ITJ Jan 09) and Petroledge and Strataledge, developed at the Federal University of Rio Grande do Sul, Brazil.

Ontology management is illustrated with a workflow editor running atop an OntoDB database of data from Total’s Alwyn field. A ‘WebLab’ platform is proposed to pull all the semantic strands together. All in all, SEM is a fascinating read. No oil and gas research department should be without a copy. More on SEM in this month’s editorial.

* Editions Technip 2013. ISBN 9782710810025.


IHS bags Fekete

Reservoir engineer added to growing portfolio of upstream software. IHS to provide global marketing reach to Harmony suite.

IHS continues its E&P software buying spree with the acquisition of well test software boutique Fekete Associates. Calgary-based Fekete’s flagship ‘Harmony’ suite of reservoir and production engineering tools is used to optimize production from new and existing assets.

IHS Chairman and CEO Jerre Stead said, ‘Fekete plays an important role in our strategy to combine our information and expertise with the best tools and technologies used in key energy workflows. Fekete’s products build on existing IHS Energy solutions. The acquisition will provide new opportunities to expand Fekete’s offerings in the global energy marketplace.’

Fekete president David Dunn added, ‘The acquisition offers Fekete’s clients seamless connections to data that is integral to a proper well or asset performance analysis. The IHS global reach will expand the delivery of cutting-edge engineering services and products worldwide and help position key support personnel across critical growth markets such as the new frontier of unconventional resources.’ More from Fekete.


Reflections on Shared Earth Modeling and the semantic web

Neil McNaughton delves deeper into the subject of this month’s lead. Can a book be news? It can, if it proposes a novel approach to intractable problems like interoperability and data management.

Our reports in this month’s issue on the 2013 Microsoft global energy forum and the Houston PPDM data management symposium are at the same time interesting and disappointing. They are interesting because it is always good to hear how folks are using technology to actually achieve stuff. They are disappointing in that, despite IT’s constant re-invention of itself, there is not much that is really new.

It was therefore interesting to receive a book for review that really does offer ‘something new.’ So much new stuff, in fact, that having devoted this month’s lead to a brief presentation of Shared Earth Modeling* (SEM), I propose to continue in the same vein.

But first, can a book be news? You bet it can! If the concepts developed in SEM pan out, the whole industry will be revolutionized. Not just modeling applications and workflow, but many data management issues will be fixed too. But semantic technology, despite a decade of effort and backing from the great and good, especially Tim Berners-Lee (TBL), has singularly failed to set the world alight.

When I picked up SEM, I first thought, oh dear, here we go again. 20 or so authors—this is just going to be another collection of papers which some editor has thrown together. Wrong. SEM is a well-written, coherent whole which does a great job of introducing a tough subject and of tying a lot of disparate themes together. My only serious criticism of SEM is that it follows the inexplicable French publishing tradition of not having an index.

My next question for SEM (the concept rather than the book) is, will it work? The ideal means of introducing a new technology to a vertical such as oil and gas is just to steal it from another vertical. Thus was signal processing lifted from the telecom business and repurposed in geophysics back in the mid 20th century. Likewise, the digital oilfield works because it uses horizontal technology from the process control business.

But while the intellectual property ‘theft’ paradigm has worked in the past, it is not the only game in town. The oil and gas industry has developed bags of its own IP—in specialist fields like seismic and deepwater. Why shouldn’t the geo-modeling community lead the way in industrial application of semantics?

SEM is about more than geometry. A chapter on the use of ontologies for analyzing data in natural language is particularly interesting. If you think that this sort of stuff is pie in the sky, consider the case of IBM’s Watson. The Jeopardy-winning machine runs on a stack of open source software that includes ‘natural language processing, semantic analysis, information retrieval, automated reasoning and machine learning.’

Now Watson does not, as far as I can tell, use exactly the same ‘semantic web’ technology as SEM but it comes close. Watson has been reported as using various public ‘sem web’ data sources including Yago and DBpedia. The latter is a semantic interface to Wikipedia. Yago is another huge knowledge base that is generated automatically from Wikipedia, GeoNames and WordNet. Yago claims to hold 447 million facts about 9.8 million ‘entities.’
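
For the curious, here is what interrogating such a knowledge base looks like in practice. This is our own minimal example, using the Python SPARQLWrapper package against DBpedia’s public endpoint (which may of course change or throttle requests), and nothing from Watson’s stack.

```python
# Minimal sketch: query DBpedia, the semantic interface to Wikipedia.
# Requires the SPARQLWrapper package; public endpoint availability may vary.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://dbpedia.org/sparql")
sparql.setQuery("""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    SELECT ?abstract WHERE {
        <http://dbpedia.org/resource/Petroleum> dbo:abstract ?abstract .
        FILTER (lang(?abstract) = "en")
    }
""")
sparql.setReturnFormat(JSON)
results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["abstract"]["value"][:200])  # first 200 chars of the abstract
```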

The great thing about Watson though is that it is such a compelling use case. Answer a Jeopardy question, everyone can relate to that! Unfortunately, the same cannot be said for many of the component technologies of the semantic web. TBL’s promise of a web of data and ‘machine-to-machine’ interaction was followed by a long and laborious period during which semantic standards have slowly evolved. Today, when you check out some of the technologies that underpin the SEM, they appear lacking in purpose.

But this is where SEM, the book, is so good—it confronts the obscure R&D stuff with the real world problem set of earth modeling and shows what might be a compelling application of the new technology.

What’s next? Clearly there is a lot of mileage in SEM for the standards bodies. Energistics, PPDM for a start, but also pretty well all involved in the Standards Leadership Council. Someone needs to take a good look at the SEM approach and compare it with the only other major semantic standard in oil and gas, ISO 15926. And of course there is Resqml—pretty much the focus of SEM.

SEM lead author Jean-François Rainaud told us the following: ‘Resqml was partly inspired by the work done by the team behind SEM. But the Resqml group had to be realistic and stick with the technologies deployed in today’s geomodeling application software. If we got too far ahead of ourselves, we would have a hard time getting take-up. We are currently working on Resqml V2 and will be introducing semantic relationships between geological objects. This will allow us to codify relationships such as ‘Fault F2, on its hanging wall side, interrupts horizon interpretation H1 on both sides.’ We are also working to ensure that the Resqml V2 data model is structured so it can later incorporate more of the semantic concepts described in SEM. We have a paper in preparation** for the EAGE in London where we will be presenting Resqml V2.’
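
How might such a relationship be codified? Below is our own guess at a possible RDF encoding (the vocabulary is entirely hypothetical, not the actual Resqml V2 schema), using a reified relation object so that qualifiers like ‘hanging wall side’ can be attached, plus a Sparql query to get the relationship back.

```python
# Sketch only: one possible RDF encoding of 'Fault F2, on its hanging wall
# side, interrupts horizon interpretation H1.' All names are hypothetical,
# not the Resqml V2 schema. Requires the rdflib package.
from rdflib import Graph, Namespace, RDF

RQ = Namespace("http://example.org/resqml-v2-sketch#")
g = Graph()
g.bind("rq", RQ)

g.add((RQ.F2, RDF.type, RQ.FaultInterpretation))
g.add((RQ.H1, RDF.type, RQ.HorizonInterpretation))

# Reify the relationship so qualifiers (which side, extent) can be attached
g.add((RQ.rel1, RDF.type, RQ.InterruptionRelation))
g.add((RQ.rel1, RQ.interruptingFault, RQ.F2))
g.add((RQ.rel1, RQ.interruptedHorizon, RQ.H1))
g.add((RQ.rel1, RQ.onSide, RQ.HangingWall))

# Query it back: which horizons does F2 interrupt?
q = """SELECT ?h WHERE {
    ?r rq:interruptingFault rq:F2 ;
       rq:interruptedHorizon ?h .
}"""
for row in g.query(q, initNs={"rq": RQ}):
    print(row.h)
```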

Perhaps the proof of the pudding will come when the geometry component of say, ISO 15926, can be picked up in a semantic editor and reused alongside an ontology from the SEM. Even if such interoperability doesn’t materialize, the idea of labeling objects with a tag that means something is probably better than using a Windows GUID as has been suggested elsewhere. Connecting said objects with domain-specific concepts would seem to make more sense than shoehorning everything into a database.

There’s more on semantics in this issue—from OneGeology (page 4) and from PCA/Mimosa (page 5). Enjoy!

* Editions Technip 2013. ISBN 9782710810025.

** SPE 164794-MS (authors from Total, Texas A&M, IFP, Paradigm, Schlumberger and Energistics).

@neilmcn


Book Review—Oil industry E&P data management

Steve Hawtin’s self-published oeuvre advocates a ‘DAMA’ approach. But does it work?

Steve Hawtin’s new book ‘The Management of Oil Industry Exploration and Production Data*’ (MEPD) is a 150 page introduction to what might be qualified as the Dama** approach. An introductory chapter laboriously makes the economic case for managing data. Newcomers to the subject may wonder what the case against could be. Hawtin is a consultant with Schlumberger although this is barely evident in MEPD. There is nothing on interesting topics such as Petrel data management. Applications are pretty well absent from the debate as is any evaluation of commercial tools for data management and quality assurance—not even Finder!

Hawtin writes well and editorializes constantly. Some readers may be nonplussed by lines like ‘most geoscientists do not have training in semiotics.’ But rest assured, MEPD offers not semiotics, but a structured approach to data management leveraging the DAMA ‘body of knowledge’ and other ‘standards’ from worthy organizations like the Project Management and Software Engineering Institutes.

The DAMA approach assumes a well-staffed data organization and offers more roles than most E&P shops would tolerate. Hawtin enumerates ‘the most obvious’ ones in a list of 18 roles that include ‘availability manager,’ ‘change manager,’ and so on.

The application of project management and workflow tools such as ITIL, PRINCE and RACI begets more acronyms, as in SWARM, ‘stakeholders wakeup, ability, review, measure.’ But as MEPD fails to produce any evidence to the contrary, one has to ask if such approaches tend to create a state of ‘paralysis by analysis.’

All in all MEPD makes for an entertaining read for those familiar with the topic and interested in Hawtin’s views. But MEPD fails as a practical manual by covering too much, too thinly and too idiosyncratically.

* 2013—self-published via Amazon CreateSpace. ISBN 9781481904643.
** Data management association.


IBM PureSystems for Chinese contractor

‘Smarter computing’ solution provides seven-fold processing speed-up for Sichuan Geophysical Company.

Chinese geophysical contractor Sichuan Geophysical Company of Petroleum Administration Bureau (SCGC), a China National Petroleum Corporation unit, has ‘adopted’ IBM PureSystems. IBM’s PureFlex ‘smarter computing solution’ is claimed to speed up its computing seven-fold over SCGC’s previous system. SCGC operates in China, Turkmenistan, Pakistan and Ecuador, providing engineering and geological research, seismic acquisition and downhole services.

SCGC’s Deng Yali said, ‘Exploration can generate a terabyte of data per day and a petroleum engineer may devote up to 70% of his or her time to mining this data.’ IBM and local partner Sichuan Zhonglu T&T deployed the new solution which (according to the release at least) runs ‘CAD/CAE’ professional software on Flex System x240, PureFlex p750 and IBM’s general parallel file system.

Reading between the lines of a rather obscure release (and obfuscating marketing material), it appears that the system principally targets seismic processing. The PureSystems ‘family’ is claimed to be an alternative to current enterprise computing models, where multiple and disparate systems require ‘significant resources to set up and maintain.’ The ‘Pure’ ecosystem covers application deployment and data management ‘tuned for cloud computing,’ and is capable of consolidating ‘over 100 databases on a single system.’ More from IBM.


ARMA announces information governance certification

Records management association and Pearson Vue team on professional test.

Arma, the American Records Management Association, has announced an information governance professional (IGP) certification program. Information governance focuses on the creation, organization, management, protection, use and disposal of information. The Arma IGP certificate is intended to demonstrate competency in the above, along with an ability to work closely with legal/compliance departments, information technology and lines of business to implement an effective IG program.

Candidates are required to pass a timed, 140-question multiple-choice exam. The exam, which was developed by subject-matter experts under the guidance of psychometricians (experts in measurement and test development), is designed to measure the knowledge, skills, and abilities required to perform competently as an IG professional.

The test assesses competency in the following domains: managing information risk and compliance, developing the IG strategic plan, developing the IG framework, establishing the IG program, establishing IG business integration and oversight, aligning technology with the IG framework.

The computer-based examination is administered at a network of secure test sites owned and operated by Pearson Vue Worldwide through its partner, Professional Testing, which also helped develop the test. The plan is to offer the exam at specific test windows throughout the year.

Certification is incorporated under an independent governing board for the purpose of maintaining autonomy in all of its certification practices and decisions and is not linked to or restricted by any Arma International product or service. The IGP certification is awarded solely on an individual’s ability to meet the certification requirements. Certification is awarded for three years and must be renewed every three years thereafter with a new test!

Applicants pay $599 upfront, of which $499 is refunded to those who the Certification Board determines do not meet the eligibility requirements. A tutorial is available and there is more from Arma.


OneGeology vs. EarthCube

Specialist defends ontological credentials of OneGeology and Geosciml.

Following what looked like a potential ‘fork’ in online geoscience taxonomy between the EU OneGeology project and a new US-backed ‘EarthCube’ project (OITJ Jan 2013), we received a clarification on the ontological credentials of OneGeology from John Laxton of the British Geological Survey (BGS).

Laxton notes that underpinning OneGeology is Geosciml, a geological interchange standard based on the Open Geospatial Consortium’s Geography Markup Language (GML) and developed by the Commission for Geoscience Information.

Geosciml now offers a degree of semantic interoperability with simple vocabularies for some of the Geosciml concepts which have been implemented in SKOS RDF. These are also available as a ‘vocabulary service’ that illustrates the structure encoded in RDF*. BGS has also implemented the ICS stratigraphic chart in RDF.
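
To make the ‘vocabulary service’ idea concrete, here is a minimal sketch (ours, with a hypothetical file location standing in for a real Geosciml vocabulary) of reading a SKOS-encoded vocabulary with Python’s rdflib and walking the concept hierarchy.

```python
# Sketch: walk a SKOS vocabulary of geoscience concepts with rdflib.
# The vocabulary URL is hypothetical; requires the rdflib package.
from rdflib import Graph
from rdflib.namespace import SKOS

g = Graph()
g.parse("http://example.org/geosciml/lithology.rdf")  # hypothetical location

# Print each concept's preferred label and its broader (parent) concept
for concept, label in g.subject_objects(SKOS.prefLabel):
    broader = g.value(concept, SKOS.broader)
    parent = g.value(broader, SKOS.prefLabel) if broader else "(top concept)"
    print(f"{label} -> {parent}")
```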

Alongside the global OneGeology web map service, the EU-funded OneGeology-Europe project investigated extending OneGeology-Global to achieve a level of semantic harmonisation between the services of the participating European geological surveys. Geosciml is now a component of the Inspire data specification, which leverages both the Geosciml data model and vocabularies.

* May require Internet Explorer in ‘compatibility mode’ to work.


Weglein’s ‘antidote’ to full waveform inversion

M-OSRP head rails against ‘full waveform inversion’ announcement from SEG.

In a preamble to his presentation of a ‘timely and necessary antidote to indirect methods and so-called full waveform inversion,’ Art Weglein, who heads up the University of Houston’s ‘Mission-oriented seismic research program’ (M-OSRP), railed against the ‘latest geophysical stampede, technical bubble and self-proclaimed seismic cure-all’ that is full wave-form inversion (FWI).

A recent announcement for the upcoming 2013 SEG Workshop on FWI opened with the line, ‘full waveform inversion has emerged as the final and ultimate solution to the earth resolution and imaging objective.’ For Weglein, such language ‘has no place anywhere in science, especially in exploration seismology.’

The method, from both a fundamental viewpoint and from practical considerations, ‘hardly deserves the label inversion,’ let alone such ‘extreme and unjustified claims.’ Weglein believes that the ‘FWI’ issue is not just a matter of semantics. It is a ‘substantive issue of what data, and what algorithms are called for by direct inversion to achieve certain seismic processing objectives.’
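
For reference, and this is our textbook summary rather than Weglein’s formulation, FWI in its usual least-squares form seeks a model m minimizing the misfit between observed and simulated data, updated by local gradient steps:

```latex
J(m) = \tfrac{1}{2}\sum_{s,r} \left\| d_{\mathrm{obs}}(s,r) - d_{\mathrm{pred}}(m;\, s,r) \right\|^{2},
\qquad
m_{k+1} = m_k - \alpha_k \,\nabla_m J(m_k)
```

The iteration adjusts the model until simulated data match the measurements over sources s and receivers r, a local, model-matching procedure rather than a direct construction of the answer from the data, which is the nub of Weglein’s ‘indirect’ charge.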


‘NAS for Dummies’—guidance or puffery?

Wiley Books and Avere Systems’ free booklet is more marketing spin than science.

A new publication from Wiley Books, Network Attached Storage for Dummies, authored by Allen Taylor as an ‘Avere Systems’ special edition, purports to offer a guide to network storage in the age of the cloud. The 40 page booklet (a free download) describes various levels of storage and discusses their merits and drawbacks. The focus is latency—of disk drives, storage filers and CPUs—which leads naturally into Avere’s key offering, the core/edge filer. This divvies up the storage system into a relatively slow, conventional ‘core’ filer and a fast ‘edge’ filer architected to optimize the use of expensive and limited quantities of solid state memory and high end disk. The latter handles read/write requests from clients at the network edge. Such systems are required to counter what has become an established trend: although disks are getting much bigger, they are not getting any faster. Gone are the days when it made sense to throw a large number of disks at a performance problem.
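
The core/edge split is easy to caricature in code. Here is our own toy sketch, nothing to do with Avere’s actual implementation: a small, fast ‘edge’ cache absorbing hot reads in front of a slow ‘core’ store.

```python
# Toy sketch of the edge/core split (ours, not Avere's implementation):
# a small, fast LRU cache serves hot reads in front of a slow 'core' filer.
from collections import OrderedDict

class EdgeFiler:
    def __init__(self, core_read, capacity=1024):
        self.core_read = core_read   # slow backend read function
        self.cache = OrderedDict()   # LRU cache held on fast media
        self.capacity = capacity

    def read(self, path):
        if path in self.cache:                 # hit: SSD/RAM latency
            self.cache.move_to_end(path)
            return self.cache[path]
        data = self.core_read(path)            # miss: one slow core round trip
        self.cache[path] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)     # evict least-recently-used
        return data

# Usage: wrap any slow read function
edge = EdgeFiler(lambda path: open(path, "rb").read(), capacity=256)
```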

So does NAS for dummies provide any useful guidance beyond its marketing role for Avere? We would argue not. Its discussion of RAID is minimal and it sidesteps the use of RAID arrays and high-end network cards to boost performance. One has to question Wiley as to the wisdom of letting such blatant marketing material masquerade as a ‘book.’ But as we have previously observed, there are no ‘free’ lunches or books.


CCSI releases first set of modeling tools

Carbon capture software toolset announced to accelerate deployment at ‘hundreds of plants.’

The US Carbon Capture Simulation Initiative (CCSI) released its first set of computational tools and models late last year. These, the initial components of the CCSI toolset, provide models and computational capabilities to accelerate the commercial development of carbon capture technologies. The CCSI toolset has application in power generation, refining and natural gas production. The new tools address process synthesis and optimization with physics-based models of proposed capture equipment and processes along with a framework for the ‘quantification of the uncertainty of model predictions.’

The toolset includes the creation of ‘reduced order models’ of reacting multiphase flow simulations and utilities for running thousands of process simulations concurrently for optimization and uncertainty quantification. The pre-release of the CCSI toolset comes ahead of schedule reflecting ‘intense industry interest’ in getting early access to the tools and the ‘phenomenal’ progress of the team!
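
A flavor of the ‘thousands of concurrent simulations’ workflow, in deliberately toy form (ours, not CCSI code): sample uncertain inputs, fan the model runs out over worker processes, and summarize the spread of predictions.

```python
# Sketch (not CCSI code): Monte Carlo uncertainty quantification by running
# many instances of a toy capture-process model concurrently.
import random
from concurrent.futures import ProcessPoolExecutor
from statistics import mean, stdev

def capture_efficiency(params):
    """Toy stand-in for a physics-based capture simulation."""
    solvent_rate, flue_gas_co2 = params
    return min(1.0, 0.9 * solvent_rate / (solvent_rate + flue_gas_co2))

if __name__ == "__main__":
    # Sample uncertain inputs, then fan the runs out over worker processes
    samples = [(random.gauss(2.0, 0.2), random.gauss(1.0, 0.1))
               for _ in range(10_000)]
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(capture_efficiency, samples, chunksize=500))
    print(f"efficiency: {mean(results):.3f} +/- {stdev(results):.3f}")
```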

The CCSI, a partnership among national laboratories, industry and academic institutions, is developing computational models and simulation tools to accelerate the commercialization of CCS, which is envisaged to be deployed at ‘hundreds of power plants.’ The CCSI did not respond to our request for a link to their software. Perhaps they were too busy gassing.


Software, hardware short takes

RFdyn, SeisWare, Senergy, Terraspark, Earthrisk, ESI, ETL, GIS Media, INT, Knowledge Reservoir.

Rock Flow Dynamics’ tNavigator 4.0 supports compositional and thermal simulations, assisted history matching and uncertainty assessment. A new GUI runs simulations on remote clusters.

SeisWare 8.0 is out and includes an attribute library from Rock Solid Images, a 3D co-visualization window for RGBA attribute blending, SEG-Y data management and coordinate transformation.

The 3.8 release of Senergy Software’s Oilfield Data Manager includes a new GUI, heatmap and analysis stick plots and enhancements to the reservoir performance module 3D viewer.

TerraSpark Geosciences reports a 360x speedup with the 1.9 release of its Insight Earth flagship. Interpretation workflows have been streamlined for work in complex 3D systems such as turbidites and carbonates. Support for 16 bit data and Nvidia’s latest Kepler graphics cards has been added.

Energy Solutions International has released GasStream, a gas management system spanning gathering, processing and transportation.

The 5.2 release of ETL Solutions’ Transformation Manager includes a debugger.

GIS Media has released a beta version of GeoFold, an ArcGIS 10 add-in for calculating seismic coverage from shot and receiver geometry.

INT has announced GeoToolkit.NET 3.5 for C# developers with support for deviated well track headers in log and well schematics and a multi-well capability. All components now run on the .NET Framework 4.

Knowledge Reservoir has announced a new hosted decision support tool for assessing non conventional oil and gas plays. CompassKB offers a best-practice roadmap and wiki written by KR’s subject matter experts.

The 1.3 release of Kongsberg’s LedaFlow transient multiphase simulator for wells and pipelines includes a new separator model; new parallel code gives an average 70% speedup on multi-CPU hardware.

New Petrel plugins this month include Blueback Reservoir’s project tracker utilities, WesternGeco’s low frequency property estimator and L&T Infotech’s auto tracker and Q calculator.

Dolphin Group unit Open Geophysical has released the OpenCPS 3.0 seismic processing package with new support for crooked lines, FXY deconvolution, 3D SRME and Fourier-domain regularization.


Posc/Caesar—Mimosa joint owner-operators’ forum

Mimosa president on oil and gas interop pilot. Operations and maintenance special interest group to bridge gap between ISO 15926 and ISO 18435/13374. Jord/iRing project update.

The Posc/Caesar Association (PCA) and Mimosa held a joint owner operators forum hosted by BP in Houston last month. Both organizations work towards the standardization of data exchange in capital intensive assets including offshore platforms, FPSOs and refineries. Norwegian PCA approaches the problem from the standpoint of construction and handover, Mimosa from the standpoint of ongoing maintenance, repair and operations (MRO).

Mimosa president Alan Johnston presented the results of the oil and gas interoperability (OGI) pilot. OGI is an ISO Technical Committee 184 project. TC 184 addresses automation systems and integration. The OGI technical section is working on ‘OpenO&M,’ a foundation architecture and ‘system of systems.’ Suppliers will develop and maintain OGI compliant adaptors that will enable ‘repeatable, scalable industry-driven solutions for oil and gas and other critical infrastructure.’ All of which is encapsulated in the ISO 18435 standard. OpenO&M use case #1, scenario #1 addresses ‘greenfield handover for the oil and gas industry.’ All this is explained in slide N° 12 of Johnston’s 47-slide deck, a PowerPoint masterpiece with more acronyms, standards and taxonomies than you could shake a stick at!

Of course, greenfield handover is pretty much the N° 1 use case for PCA’s ISO 15926 standard too. Indeed the OGI pilot itself leverages the PCA ISO 15926 reference data library. The pilot is led by Worley Parsons and involves interoperability and handover of a debutanizer piping and instrumentation diagram (P&ID), blending data from Aveva P&ID, Bentley OpenPlant and Intergraph P&ID. Various transformation engines running in IBM’s IIC environment blend the data together for capture and analysis in OSIsoft’s PI historian. All this runs inside IBM’s ‘ISBM,’ which we suppose is WebSphere.

The OGI pilot’s success has led to Mimosa and PCA setting up a special interest group (SIG) to further cooperation across ISO 15926 and 18435 and to make O&M data ‘completely compliant’ with the ISO 15926 data model and reference data library.

The results of the Mimosa/PCA joint operations and maintenance (O&M) special interest group were presented by Markus Stumptner (University of South Australia). PCA has defined and lives by the ISO 15926 engineering and construction standard, Mimosa by the ISO 18435/13374 stack for O&M activity. The SIG plans to bridge the gap between these standards with an ‘Open O&M’ transformation engine and service bus. The plan is to build on the composite architecture used in the OGI Pilot, using the ISO 15926 reference data library. Phase II of the project sees Microsoft contributing ‘prior art’ in the form of Chemra, the chemical industry reference architecture (no, not Mura apparently…).

The Fiatech/PCA joint operational reference data project Jord continues apace, as Ian Glendinning (GlencoIS and PCA) revealed. The Fiatech iRing project has tested PCA’s reference data service and found it wanting. Jord has fixed the problem with all reference data now exposed as ‘resolvable, queryable’ endpoints. The new iRing/Jord ensemble promises a peer-to-peer architecture supporting validated, certified data exchange between all stakeholders.

A scalable operational Jord platform is scheduled for 2014. Jord is an unashamedly semantic web-centric project. Current work includes a new RDF endpoint along with OWL/RDF triple store representations. For those unfamiliar with these still relatively exotic technologies, Jord includes technical training in the ISO 15926 data model and library, and in RDF/OWL and the Sparql query language. More from Posc/Caesar.


SPE Digital Energy, The Woodlands, Texas part II

The second part of our report from Digital Energy hears from IBM on big data. Multiple presentations from Kuwait Oil and partner Halliburton on the ‘intelligent digital field.’ Schlumberger on running compute intensive Petrel and Matlab jobs on the cluster. And again, from KOC, IDF ‘smart flows.’

IBM’s approach to big data was sketched out by Mike Brulé. Oils are currently unable to accommodate exponentially exploding data. The first-principle, physics-based approach to analysis has been a ‘stumbling block’ in the partnership with IT, which wants to do everything with data mining. But big data represents an opportunity to combine both. Brulé proposed a complex architecture operating at ‘mega to exa scales’ and at speeds from slow loop to real time. Real time streaming data ‘tuples’ feed a complete modeling and simulation environment. This spans acoustics, microseismic, ‘OpenCV’ for visualization and, of course, Hadoop. IBM already has three Hadoop implementations in oil and gas and is showing analytical speed-up from ‘days to seconds.’ The approach combines full physics models where appropriate, e.g. for pipeline optimization, with empirical/statistical models where the physics is less well understood. The techniques have been successful in slugging avoidance and corrosion mitigation. IBM’s flagship project is Statoil’s environmental monitoring program. Another use is in exploration bidding where data can never be analyzed fast enough. Here a data cockpit ‘brings data together’ for the decision makers. In the Q&A doubts were expressed on two counts—regarding the difficulty of maintaining models and of propagating and comprehending uncertainty across multiple models. One observer asked, ‘Is this why the upstream is reticent to apply these techniques?’
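
The physics-plus-empirics combination Brulé describes can be sketched very simply. This is our illustration, not IBM’s architecture: keep the physics model where it is trusted and fit a statistical model to the residual the physics cannot explain.

```python
# Sketch of the hybrid idea (ours, not IBM's): use a physics model where it
# is trusted, and fit an empirical model to what the physics misses.
import numpy as np

def physics_pressure_drop(flow):
    """Simplified physics: pressure drop proportional to flow squared."""
    return 0.05 * flow**2

# Field measurements (synthetic here) deviate from the idealized physics
rng = np.random.default_rng(0)
flow = rng.uniform(10, 100, 200)
measured = physics_pressure_drop(flow) + 0.8 * flow + rng.normal(0, 5, 200)

# Empirical part: fit a low-order polynomial to the residual only
residual = measured - physics_pressure_drop(flow)
coeffs = np.polyfit(flow, residual, deg=1)

def hybrid_model(f):
    return physics_pressure_drop(f) + np.polyval(coeffs, f)

print(hybrid_model(np.array([50.0])))  # physics plus learned correction
```

Fitting only the residual keeps the trusted physics intact, while the data-driven part stays small and easy to retrain, one answer to the model-maintenance worry raised in the Q&A.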

Kuwait Oil (KOC) and supplier Halliburton pulled off something of a coup by dominating the last day of the conference with multiple presentations on KOC’s intelligent digital field (IDF) project. Halliburton’s Doug Johnson gave an overview of the Sabriyah IDF infrastructure that was designed to ‘turn data to actionable information.’ Data from instrumented wells and mobile sensors such as multiphase meters is consolidated at Emerson RTUs at the well site. A WiMax canopy connects field units to the corporate network and on to the collaboration center. Here, pre-defined workflows cover use cases from geology through to operations. KOC’s tool set includes integrated production models, AI/neural networks and expert rules. There are multiple ways of getting at the data—the idea is to present them all and let the ‘smart user’ decide. One smart workflow covers production ‘gains and losses.’ Operators are shown what a well’s true potential is and have tools to tell them how it should be operating. These include real time pattern analysis and alerts for particular well events. Neural net model prediction has been shown to give an accurate 30 day forecast of oil rate and water cut. Users can also perform ‘what if’ scenarios to check out what happens if pump or choke settings are changed. The system has been in use for two years, turning data into something that can be intuitively understood. In the Q&A, Johnson was asked about the data infrastructure and underlying ‘plumbing.’ He acknowledged that this is an entirely ‘data driven’ activity and workflows are highly dependent on the historian. But the data infrastructure was all in place before Halliburton came to the party. A KOC rep added that Finder and other repositories were used to capture multiple data types but that there was ‘no common integration platform.’ This just ‘does not exist.’

Harish Goel (KOC) delved deeper into the IDF with a talk on diagnostics and optimization. A complex injection program is run from the North Kuwait collaboration center. The key issue is more about managing water than producing oil. KOC uses Halliburton’s DecisionSpace for Production to build its production workflows. The front end user interface is said to be key to take-up. Lots of AI is embedded in the system, leaving engineers to do their job. Statistical and analytical tools embed the smart workflows to filter, correlate and perform Monte Carlo analysis and Pareto plots. One case study of artificial lift optimization resulted in a 420 bopd hike from one well and there are ‘many similar stories.’ Goel wound up contrasting traditional linear workflows that take days and produce ‘fragmented’ information. In collaboration mode, simultaneous action is possible by all stakeholders on the same workflow and data. Lessons learned included the need for ‘multi disciplinary people.’ Change management and an active partnership with IT are also required. In the Q&A, Goel added that the current program is a pilot on two smart wells. The plan is to convert all wells to smart completions over the next decade. Multiple interacting AI-derived models make for lots of ‘moving parts’ and complexity. KOC has handled this with a ‘top down,’ iterative approach and with ample resources—up to 250 people were working on the project at its peak. Goel was asked if the IDF closed the automation loop. The answer is no, not for now. KOC is waiting until the system and its people mature.

Dzevat Omeragic described an interesting Petrel workflow to automate geomodel update. Schlumberger’s researchers are performing geomodeling at borehole scale, inter alia to simulate log responses. This activity leverages a library of physics models running as a service on a high performance compute cluster. The library is shared with Petrel and other applications—notably Matlab. Current well placement workflows are limited to point by point inversion. Using the HPC compute service it is possible to perform full 3D inversion. Changes in picks propagate to the model geometry and a solver computes new locations of pillar nodes. The HPC facility is currently a research project for internal use. But ‘there is no technical reason why the system could not be used by third party applications.’

In a second presentation, Harish Goel outlined the KOC IDF’s ‘report and distribution’ smart flow for assigning roles in real time. This smart flow ensures that users in the different teams that impact the workflow record their actions. The system generates alarms and keeps track of where everyone is in the workflow. GUI widgets (tickets) show job number, status and days overdue. Supervisors can see quickly if there are gas lift or ESP tickets outstanding. An optimization ticket is generated when an engineer chooses an optimization point. This is routed to stakeholders and the system tracks what actions are taken. The approach is said to minimize the risk of wrong or dangerous requests. Going forward, continuing enhancements to the IDF program are bootstrapped with the report and distribution workflow. Like any major project, many technologies are relatively untested. This system lets KOC track who did what and what worked. ‘The big picture is that we want to build a knowledge base of what works in what circumstances.’ More from Digital Energy.
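
As a rough illustration of the ticketing described above (our sketch, not KOC or Halliburton code), the essentials are a ticket record, a status, an action trail and overdue detection:

```python
# Sketch only (not KOC/Halliburton code): the kind of ticket record a
# 'report and distribution' smart flow tracks, with overdue detection.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Ticket:
    job_number: str
    kind: str                      # e.g. 'gas lift', 'ESP', 'optimization'
    opened: date
    status: str = "open"
    actions: list = field(default_factory=list)  # who did what, and when

    def days_overdue(self, due_days=7, today=None):
        today = today or date.today()
        age = (today - self.opened).days
        return max(0, age - due_days) if self.status == "open" else 0

tickets = [Ticket("GL-1041", "gas lift", date(2013, 3, 1)),
           Ticket("OPT-2207", "optimization", date(2013, 4, 10))]
asof = date(2013, 4, 20)
for t in tickets:
    if t.days_overdue(today=asof):
        print(t.job_number, t.kind, f"{t.days_overdue(today=asof)} days overdue")
```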


Microsoft 2013 GEF

Microsoft Global Energy Forum hears from Afren on trials of Petrel Studio. Chevron’s global upstream petrotechnical portal. Hess’ ‘subsurface milestone and deliverables’ project. A theme of the GEF was mobile computing with presentations from Chevron, Devon and Halliburton. Shell/Covisint ‘be careful about who can search what!’ But … still nothing on MURA!

Schlumberger’s Petrel is a poster child for Microsoft as it spearheaded the move from Unix-based interpretation systems to Windows a decade ago. But the ‘single user’ PC paradigm still presents integration problems and ‘Petrel data management’ is something of an oxymoron. Enter ‘Studio,’ a new E&P ‘knowledge environment’ for Petrel. Studio is said to offer a knowledge management and data collaboration environment along with ‘tight integration’ with the Microsoft business platform.

Ricardo Ramirez presented Afren’s experience running Petrel and Studio on SQL Server. Afren wanted to improve collaboration between geosciences, engineering and data management across its different geographies. Studio was evaluated as a potential enhancement to Afren’s current Petrel reference project workflows.

The test involved three petrotechnical users and three data managers. Various workflows were evaluated, such as the exchange of surfaces, faults and tops and the sharing of user-edited data and quality flags. Studio’s access rights management and database performance were also tested. Exchange of surfaces, faults and other data types proved ‘significantly faster.’ Sharing well sections still required a separate template-based export/import. Multi-user access worked well and seismic data links could be transferred and shared in a seismic data ‘pool’ similar to the reference project.

Data management functionality allowed for multiple versions of data with access control. A ‘significant’ productivity improvement was noted—reference project data transfer with Studio was around 10 times faster. Data transfer workflows were ‘simple and intuitive’ with fewer mouse clicks. The pre-packaged hardware enabled rapid deployment. Afren is a smallish UK-based E&P shop with under 300 employees. This fact and the small team involved in the pilot mean your mileage may vary.

Bill Gilmore presented Chevron’s global upstream petrotechnical portal (PTP). Chevron is automating information collection for its engineering teams. A development by Accenture and its majority-owned Avanade unit has leveraged Microsoft’s SQL database solutions to ‘realize a significant improvement in productivity and free resources to focus on increasing production.’ The portal is a starting point from which to obtain data, select workflows and provide tools to decision makers.

Currently, Chevron’s data is stored in applications specific to a particular data type. Some data types may be stored in more than one application and each has its own access permissions. This means that manual data integration is often required, to the extent that petrotechnical professionals still spend 30–70% of their time finding, conditioning and verifying data. The skill sets required for this type of activity are disappearing as experienced staff retire. Enter the PTP, a one-stop shop for data retrieval and visualization. The PTP is a web-based system developed in Microsoft Silverlight using the Petroweb Navigator toolkit. The latter provides charts, data grids and GIS base maps. The PTP has simplified access to data in ESRI ArcGIS, SQL Server, SSAS (Microsoft’s business intelligence offering) and Oracle. Search is quicker and easier and usability is improved through a ‘simple and common interface.’ Future plans include a migration from Silverlight to HTML5.

Rick Beaubouef introduced Hess’ subsurface milestone and deliverables (SMD) guidelines for project management and stewardship of technology. These have been leveraged in ‘Pathfinder2020,’ one of several Hess applications of 3GiG’s Prospect Director. P2020 was ‘inspired’ by the Windows 8 tablet user interface which led (curiously) to the designation ‘iSMD.’ The system has proved popular and has seen immediate deployment on the worldwide Hess desktop. The moral of the tale? Quality content is not enough. Users need immediate access, simplicity and usability. Developers need a compelling use case.

Mobile computing was something of a theme at the conference. Chevron’s upstream workflow transformation program has a mobility component. Michael Burt and Ravi Malla described how mobile decision support is being provided to Chevron’s San Joaquin Valley field workers. The system provides real-time dashboards embedded in business processes and workflows. Operators’ activity is ‘exposed’ to management in real-time. Under the hood is the ubiquitous OSIsoft PI System and SharePoint 2013. Technology from M2M Corp. enables wireless-based real-time data sync. Other components include the CIRA X mobile gateway and NetMotion mobile VPN. The system uses a wireless hotspot in the truck with rugged laptops and handheld devices for fieldworkers.

Devon has been working with Microsoft consulting services to offer its knowledge workers the same compute environment at the office and at home—without having to lug any hardware around. This means limiting device options for security and being realistic as to what applications are needed. Devon is currently rolling out Windows 7 and will be trialling Windows in 2013, with voice, video and IM integrated into a single unified platform.

Halliburton presented a prototype mobile ‘always-connected’ version of DecisionSpace for Production. The unit included technology from Qualcomm and iLink Systems running on a Windows RT tablet from Dell.

Shell Oil’s Adrian Estala provided an update on Shell’s work with Covisint on securing its information via a hosted identity federation gateway. This leverages Microsoft’s unified access gateway at the network edge and Covisint as the external federation gateway for two-factor authentication. Oil and gas presents some tricky security issues. Estala observed, ‘Be very careful about what you allow the external parties to search. Even if they can’t open a document, just viewing the title may be risky.’ A highly granular two-factor authentication using the SAML protocol is used for all SharePoint access, restricting users to the data and sites that are required. A ‘reverse proxy’ provides critical capabilities for secure traffic inspection and for facilitating authentication. Shell uses Microsoft’s UAG proxy. External users dial in either with a Cipher card or receive an SMS text message containing their second-factor credentials.

But, and a big but at that, for the second year running, there was nothing about Microsoft’s upstream reference architecture ‘Mura!’ More from the GEF homepage.


Folks, facts, orgs ...

CMG, Aker, Allegro, CGG, CSC, Dassault, Dresser, Exprodat, Flotek, Fugro, GE, Great American, Halliburton, ICD, IHRDC, IHS, Ipcos, OGC, OGP, Oilfield Instrumentation, P2ES, K-Reservoir, Seitel, CGI, Quanta Services, 30 Point, Corpro, Rockwell, Ryder Scott, SAIC, Senergy, SGI.

Ryan Schneider is to replace Computer Modelling Group retiree Ron Kutney as VP marketing and Canadian sales. David Hicks is VP Eastern Hemisphere.

Koosum Kalyan has been nominated to the Aker Solutions board. He was previously with Shell.

Steven Ferrigno is MD EMEA for Allegro Development Corp and Mark Weaser is MD Asia Pacific.

CGG has appointed Luiz Braga as Latin America director.

CSC has named Maruf Majed VP and general manager for the AMEA region.

Marc Kassis heads up Dassault Systèmes’ new Dubai Internet City office.

Jan Kees van Gaalen is executive VP and CFO of Dresser-Rand, succeeding retiree Mark Baldwin.

Thierry Gregorius has joined Exprodat as principal consultant. He was formerly with Shell and Landmark Information Group.

Flotek Industries’ Board has elected former KPMG audit partner Richard Walton as Executive VP and CFO.

Maarten Schönfeld, formerly of Shell, has been nominated for appointment as a member of Fugro’s supervisory board, replacing F.J.G.M. Cremers who resigned.

General Electric is to build a $110 million global oil and gas research center in Oklahoma.

Great American Group has appointed Robert Callaway as head of its new oil and gas unit.

Halliburton has opened a new completion technology and manufacturing center in Singapore.

Dan McNease has joined Independence Contract Drilling’s board. He is also a member of the HitecVision board.

Charles Brankman has joined IHRDC as director of instructional programs. He hails from C12 Energy.

Jerre Stead is executive chairman of IHS. Scott Key is president and CEO.

Kieron Lennox and Andy Coward have joined Ipcos as business developers.

The Open Geospatial Consortium has created a Middle East and North Africa Forum for outreach and education.

Carla Lloret has joined the Oil and Gas Producers’ association as environmental coordinator.

Oilfield Instrumentation has added Scott Deaton to its business development team. He hails from Schlumberger.

Kevin Harris has joined P2ES as Director of global alliances and channels. He was formerly with Pros, a Houston software boutique.

Knowledge Reservoir has appointed Chuck Severson as VP reservoir management. He was previously with Hyperdynamics Corp.

Seitel has appointed Stephen Graham Hallows as senior VP HSSE. He joins from BP America.

CGI Group has established a Security Centre of Excellence, in Ottawa, Ontario.

Quanta Services founder John Colson is to retire as executive chairman of the board.

Business columnist Loren Steffy has joined strategic communications firm 30 Point Strategies.

Steve McCallum is the new regional manager for Corpro in Europe and the Caspian. He hails from National Oilwell Varco.

Rockwell Automation has elected Phillip Holloman and Larry Kingsley to its board.

Claudia Oramas has been promoted from senior petroleum technician to associate petroleum engineer at Ryder Scott.

Robert Logan is now CIO of SAIC and the new Leidos company.

Senergy has appointed Allan Mathieson as carbon capture and storage team leader. Don DiBenedetto and Ron Hoogenboom have been appointed as regional subsurface managers.

Bob Braham has joined SGI as senior VP and chief marketing officer, reporting directly to SGI President and CEO, Jorge Titinger. He was formerly VP of marketing for EMC’s Enterprise Storage Division.

Correction

Regarding last month’s lead, ‘SAP HANA for Oil and Gas,’ SAP’s Ken Landgren tells us that it is SAP’s Syclo mobile service not SoftPro’s SignDoc that is deployed.


Done deals

GE, Aramco, Citec, Energy Ventures, Aker, Ipcos, Shell, Technip, Tibco, FleetCor.

GE is to acquire Lufkin Industries for approximately $3.3 billion cash.

AGR Group is to split into separate petroleum and drilling service companies managed by Åge Landro and David Hine.

Aramco’s Energy Ventures unit has taken a stake in Sekal, a drilling automation software provider and developer of DrillScene and DrillTronics.

Finnish engineering and IM service company Citec is to acquire Akilea Engineering, a French oil and gas consultancy.

Energy Ventures has invested in Abrado Wellbore Services to develop its video-based diagnostic technology.

Aker Solutions has paid £75 million for Enovate Systems. The company also acquired Managed Pressure Operations for $69 million.

Ipcos’ management has bought the company from its majority shareholder KBC Private Equity.

Shell Technology Ventures is to make investments of ‘several hundred million dollars’ in emerging technology companies over the next six to eight years.

Technip has acquired Norwegian offshore engineering and services contractor Ingenium AS.

Tibco Software has acquired Maporama Solutions, a privately-held, cloud-based provider of location intelligence and geospatial analytics solutions.

FleetCor Technologies has acquired Telenav’s enterprise business. The company also acquired ‘certain fuel card assets’ from GE Capital Australia.


EU kicks off ‘Tosca’ operations R&D

EOS Solutions to provide virtual reality component to €4 million safety critical operations program.

Irish oil and gas service provider Celtic Oil has selected EOS Solutions as a technology partner in the EU-sponsored ‘Tosca’ R&D project. Tosca, ‘total operations management for safety critical activities,’ is a European Commission project funded under the 7th Framework program. The aim of the project is to create a commercially viable operations management system for small and medium size European businesses for which operational safety and security are paramount. The plan is for an affordable, customizable system for use across many industries.

EOS was selected for its 4D simulation technology, a combo of virtual reality, ‘discrete event simulation’ and CAD interoperability. Tosca integrates operations into a performance management system that addresses safety, quality and productivity throughout the project lifecycle. The industrial domain of application is process control, including chemical, power generation and oil & gas. Tosca is a three-year, €4.2 million project of which €3.1 million is financed from the public purse. EOS Solutions is a Quantum Ventures of Michigan company. More from EOS, Celtic and Tosca.


CSB—poor design and safety documentation at Chevron refinery

Chemical Safety Board findings from 2012 refinery fire ‘apply to all refineries, plants and industry.’

The Chemical safety board (CSB) has issued a draft report on the 2012 fire at a Chevron refinery in Richmond, California. The fire occurred when corroded piping in a crude oil processing unit ruptured causing a hydrocarbon release and a vapor cloud that ignited. Nineteen workers narrowly escaped death or serious injury as they were engulfed in the vapor cloud. The interim report found that Chevron had missed opportunities to apply inherently safer design and failed to identify and evaluate hazards.

The CSB team recommends that Chevron should perform damage mechanism hazard reviews and ensure that safeguards are in place to control identified hazards. The CSB also recommends the reporting of process safety KPIs to improve oversight by regulatory agencies. Recommendations were also made to local authorities to strengthen local industrial safety ordinances and ‘drive the risk of major accidents as low as reasonably practicable.’

CSB chairperson Rafael Moure-Eraso said, ‘Our findings and recommendations are directed at the Richmond accident but we believe they apply to all refineries, chemical plants and general industry. There is a national need to base safety principles on inherently safer designs and applying effective safeguards to control damage mechanisms such as corrosion. Regulatory agencies must maintain sufficient professional expertise to effectively oversee these highly technical industries.’


DNV Software updates risk management solution

‘EasyRisk’ brand discontinued in favor of Synergi Life risk and QHSE platform.

DNV Software has announced a new solution for managing operational, project and enterprise risk, incorporating the ‘EasyRisk’ Manager with the Synergi Life risk and QHSE-management solution. The web-based software lets risk managers base their decisions on real time information.

Are Føllesdal Tjønn, DNV Software MD said, ‘Our risk management software—including software for process risk, occupational risk, qualitative and quantitative risk management—is now complemented with the leading tool for operational, project and enterprise risk management. It also includes barrier management and Bow Ties. Customers will be able to use the Synergi Life portfolio to manage all risk aspects of their business, from the detailed and technical risk identification and assessment to their overall business risk strategy.’

Synergi Life provides best practice risk management support aligned with ISO 31000 and DNV’s Risk Management Framework. Live reporting of risk information and the ability to aggregate risk with immediate access to underlying causes are two important benefits of using the Risk Management module in Synergi Life. The EasyRisk brand is to be discontinued. Watch the Synergi video.


Accenture on E&P cloud computing

Baker Hughes, Shell cited as cloud computing users.

A new 24 page publication from Accenture hails cloud computing as the dawn of a new era for energy companies. It is held as an inevitability that industry and energy companies will migrate to the cloud. According to Accenture, Shell Oil is using an Amazon ‘virtual private cloud’ to analyze the massive amount of data produced by ‘super sensitive’ seismic sensors co-developed with HP. Shell’s drillers and geophysicists use the cloud for ‘analytics.’ And Shell has been piloting Hadoop in the cloud for big data analytics and sophisticated authentication. Another oil country clouder is Baker Hughes which runs compute intensive TubeFlow simulations on the Microsoft Azure cloud.


Sales, contracts, partnerships and deployments

Atos, Emerson, Kepware, EnerSys, CiDRA, Expro, Endeeper, Flotek, Gulf Energy, FMC, Foster Wheeler, Oildex, GE, Phusion, Intergraph, Jacobs, PennWell, GITA, Petrofac, Yokogawa.

Atos has signed a 3-year framework agreement with Shell to provide ‘IT and non IT’ services to support its business activities in the Netherlands.

Statoil has awarded Emerson Process Management a $33 million contract to upgrade safety and automation systems on the Visund oil platform in the Norwegian North Sea.

Kepware Technologies has partnered with EnerSys to market its KEPServerEX communications platform. The deal covers electronic flow measurement-related products and the LinkMaster and RedundancyMaster solutions. Kepware also announced that its KEPServerEX v5 passed functionality testing carried out by an ‘internationally recognized, independent research laboratory’ located in San Antonio, TX.

CiDRA Oilsands and Expro Meters are to partner on selling passive and active sonar flow systems and services. The deal targets the oil sands and heavy oil market in Western Canada.

DNV and Statoil have launched a training program to enhance knowledge of ‘particular Arctic challenges.’

Lagesed, the Sedimentary Geology Laboratory of Rio de Janeiro Federal University has acquired updated versions of Brazilian software developer Endeeper’s Petroledge and RockViewer systems for sedimentary petrographic knowledge management.

Flotek Industries and Gulf Energy are to build an advanced oilfield chemistry production facility in Oman and will create a state-of-the-art R&D organization for the Middle East and North Africa.

FMC Technologies has received a $96 million order from Statoil for subsea equipment on the Smorbukk South extension and has renewed a five year framework agreement for subsea operations services on the Norwegian continental shelf. FMC also won a contract from BP for the supply of subsea equipment to its Mad Dog phase 2 project.

A Foster Wheeler unit has been awarded a contract by Pecket Energy to perform feasibility, engineering studies and cost estimates for a ‘substitute natural gas’ production facility near Punta Arenas, Chile. The company also won a consultancy contract from Apache Khalda for the Qasr Compression Project in Egypt.

Gastar Exploration has deployed Oildex’s ePayables solution, Spendworks, to help manage and optimize its invoice workflow.

GE Oil & Gas has received a $620 million, 22-year award for the provision of ‘advanced technology services’ to QGC’s Queensland Curtis LNG plant, Australia.

Phusion (previously Pearson-Harper) has been appointed as data management contractor for the INPEX-led Ichthys LNG Project in a six year, £11.5 million deal.

Swedish Nynas has signed a general agreement with Intergraph to use its SmartPlant Enterprise suite in all its plants and projects worldwide.

Santos has named Jacobs Engineering Group to the panel of service providers to its eastern Australia business unit.

PennWell has acquired the Oil & Gas Pipeline Conference from Willbros Engineers and entered into an agreement with the Geospatial Information & Technology Association (GITA) to produce its annual conference and exhibition.

Zadco has awarded Petrofac’s joint venture with Mubadala Petroleum, Petrofac Emirates, a contract for the Upper Zakum UZ750 development in Abu Dhabi. Petrofac Emirates’ share of the contract is $2.9 billion.

Yokogawa has been selected to supply the integrated automation and safety system for the new liquefaction trains at Cheniere’s LNG facility in Sabine Pass, LA, which is being developed by its subsidiary Sabine Pass Liquefaction.


Multi phase flow lab opens in Groningen, Netherlands

DNV Kema to offer ‘realistic’ test facilities and standards for multiphase flow and fiscal meters.

DNV’s Kema unit is opening a multiphase flow laboratory at its Groningen, Netherlands HQ. The facility will allow equipment manufacturers and oil and gas companies to test, validate and calibrate multiphase technologies such as separators and flow meters. Multiphase flow meters (MPFM) are used in oil and gas production and gas trading.

MPFMs have application in non conventional production where gas may be commingled with fluids. Likewise, deepwater production systems may require MPFMs to provide operators and regulators with accurate compositional flow rates. Downstream, liquefied natural gas systems need accurate fluid component assessment as small deviations in measured volumes cause ‘financial risk.’ The laboratory promises a ‘true-to-life’ environment designed to increase MPFM accuracy and improve separator efficiency.
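
A back-of-envelope illustration of that financial risk, with our own hypothetical numbers:

```python
# Back-of-envelope (our hypothetical numbers): why small metering deviations
# carry financial risk in gas trading.
daily_volume_mmbtu = 500_000      # assumed throughput of a sizeable LNG train
price_per_mmbtu = 12.0            # assumed 2013-era Asian LNG price, USD
meter_error = 0.005               # a 0.5% systematic mis-measurement

daily_exposure = daily_volume_mmbtu * price_per_mmbtu * meter_error
print(f"${daily_exposure:,.0f} per day, ${daily_exposure * 365:,.0f} per year")
# -> $30,000 per day, roughly $11 million per year
```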

Laboratory manager Ron ten Cate said, ‘There are various techniques available for multiphase flow measurement but the technology is immature and needs more R&D. Our clients told us that this was being held back by a lack of suitable test facilities. We plan new products and services for the medium and long term globally.’

The lab is actually an upgrade of Kema’s existing wet gas closed loop facility and will be able to recreate real field conditions. These include a full range of multiphase fluid compositions at realistic temperatures, pressures and flow rates. The unit is also designed to help with the development of equipment standards and testing protocols. More from DNV Kema.


Wireless world

Shell at the WIB. GE’s wireless II. Cybera One for Shell retail. Yokogawa’s ‘wireless anywhere.’

Shell’s Berry Mulder, speaking last month at the 50th anniversary seminar of WIB, the Netherlands-based international instrument users’ association, said that the wireless hype is over: it is now ‘just another way of communicating.’ Wireless is mature enough for condition monitoring applications and, with care, can be used in more critical applications. However, security, DCS integration and standardization remain roadblocks to wider use.

GE has shoehorned its latest wireless router, the MDS™ Orbit MCR-4G, into its ‘Industrial Internet’ marketing paradigm (Oil ITJ Jan 2013). The unit provides a secure connection from local networks across cellular networks to the office. Instrument data can be securely transmitted from the field and well workers can access the corporate network. The unit combines a WiFi hotspot with a Verizon 4G LTE modem. Security features include AES 128-bit encryption and support for Radius and AAA servers. The system is compliant with NERC CIP and FIPS 140-2.
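
For the curious, the kind of AES 128-bit authenticated encryption such routers apply to field data can be illustrated with the third-party Python ‘cryptography’ package. The key, nonce handling and payload below are our own invention, not GE’s implementation:

    # Sketch of AES-128 authenticated encryption on a field-to-office
    # link. Requires the 'cryptography' package. Key, nonce and payload
    # are invented for illustration.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=128)  # a 128-bit key
    aesgcm = AESGCM(key)

    reading = b'{"well": "A-12", "tubing_psi": 1480}'  # hypothetical instrument data
    nonce = os.urandom(12)                             # never reuse a nonce per key
    ciphertext = aesgcm.encrypt(nonce, reading, None)

    # The office end decrypts (and authenticates) with the shared key.
    assert aesgcm.decrypt(nonce, ciphertext, None) == reading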

Shell Oil is to decommission its VSAT satellite-based payment card solution by year end 2013. The system will be replaced by a ‘next generation’ secure internet platform with failover 3G/4G wireless connectivity. Retail outlets have the option to deploy the Cybera One secure application platform as the central network device and application server. Shell’s Scott Taylor said, ‘Cybera One addresses the whole puzzle, from networking, security, broadband connectivity and point-of-sale to loyalty and future application deployment. The economics are astounding when compared to the fragmented alternatives or big box retail solutions.’

Yokogawa’s ‘Wireless Anywhere’ concept leverages the ISA100.11a standard for plant-wide monitoring and control applications. ISA100.11a is compatible with wired communication standards such as Fieldbus, HART and Profibus.


E-learning news from Petrofac, Oilennium, Spie oil & gas

Petrofac and SkillsXP. Oilennium’s ConTrainer. Spie teams with Petroskills.

Much movement on the e-learning front this month. The V10.0 release of Petrofac Training Services’ (PTS) training and competence management software, SkillsXP, is a ground-up rewrite providing faster operation and improved functionality. The solution ensures that ‘human assets’ (i.e. people!) acquire and maintain the appropriate skill sets. SkillsXP integrates online services such as Google and LinkedIn and embeds the Oilennium e-learning solution acquired by Petrofac last year (OITJ Dec 12).

PTS’ Oilennium unit has launched ‘ConTrainer,’ stand-alone eLearning modules for use offshore and at remote locations without internet access. Norway-based Dolphin Geophysical has been using ConTrainer since last September for HSE training onboard its seismic fleet.

Paris-headquartered Spie Oil & Gas Services has signed an agreement with Petroskills to ‘enhance its worldwide competency development programs.’ Spie provides on-the-job training, long term training programs and nationalization plans. Petroskills oversees the Petroskills Alliance, a joint venture with BP, Shell and OGCI to provide competency-based training. The combined offering targets operators, technicians, supervisors, engineers and managers.


Siemens ‘soft controllers’ for Apache’s drillers

Apache driller vaunts merits of Siemens’ WinAC software-based controllers.

Siemens reports that Apache is to deploy its ‘WinAC’ software controllers to replace conventional rack-mounted systems on its ‘worldwide drilling fleet.’ In a post on the Siemens-sponsored ‘Totally integrated automation*’ website, Apache’s Jim Rogers explained that the ‘soft controllers’ provide a ‘real-time deterministic control engine’ in a small footprint. WinAC offers a PC-based supervisor for Siemens S7 controllers.

The system consolidates automation components and logic into a single industrial PC that is certified for use in hazardous environments. Rogers opined, ‘It’s not process control, it’s equipment control. The driller still operates the rig through the controller.’ Many rigs are already equipped with Siemens S7 PLCs. For those that are not, WinAC can operate through communications interfaces such as Profinet and Profibus.
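
The PC-to-PLC pattern Rogers describes can be pictured with the open-source python-snap7 library, which talks to S7 controllers over Profinet. This is a hypothetical sketch, not WinAC itself; the IP address, rack/slot and data block layout are invented:

    # Hypothetical PC-based supervision of a Siemens S7 controller
    # using the open-source python-snap7 library (a sketch of the
    # same PC-to-PLC pattern, not Siemens' WinAC product).
    # IP address, rack/slot and data block layout are invented.
    import snap7
    from snap7.util import get_real

    client = snap7.client.Client()
    client.connect('192.168.0.10', rack=0, slot=1)  # PLC reachable over Profinet

    # Read 4 bytes from data block 1 -- say, hook load as a 32-bit float.
    raw = client.db_read(db_number=1, start=0, size=4)
    hook_load = get_real(raw, 0)
    print(f"Hook load: {hook_load:.1f} t")

    client.disconnect()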

*Hosted by Automation World.


Pervasive extends Marathon’s e-commerce on Azure cloud

Business Xchange platform on Windows Azure scales to Marathon’s growing Eagle Ford supplier base.

Pervasive Software is building on a 10 year business relationship with Marathon Oil with a new initiative to onboard ‘hundreds’ of new suppliers. Marathon’s increasing activity in the Eagle Ford shale area has led to a near quadrupling of its supplier count. Marathon uses Pervasive’s Business Xchange (PBX) managed services for e-invoicing, remittances and other electronic business document exchange.

PBX is hosted in the Windows Azure cloud. Pervasive’s Markus Bockle explained, ‘Offering our services on the Azure cloud means we can easily scale up to accommodate hundreds of additional suppliers. We also get the flexibility to deliver innovative subscription and usage-based pricing that makes automated trading affordable for Marathon and its ‘long tail’ of suppliers.’

Pervasive, a member of the Petroleum Industry Document Exchange (PIDX) standards body, provides electronic exchange of supply chain-related documents to many of the world’s largest oil and gas operators and suppliers. Pervasive was acquired by Actian this month. More from Pervasive.
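
By way of illustration only, a much-simplified invoice document in the spirit of PIDX-style XML exchange can be built with Python’s standard library. Element names and values here are invented; the real PIDX invoice schema is far richer:

    # Much-simplified sketch of building an e-invoice document in the
    # spirit of PIDX-style XML business document exchange. Element
    # names and values are invented -- the real PIDX schema is richer.
    import xml.etree.ElementTree as ET

    inv = ET.Element('Invoice', {'number': 'INV-0001'})
    ET.SubElement(inv, 'Supplier').text = 'Acme Oilfield Services'
    ET.SubElement(inv, 'Buyer').text = 'Operator LLC'
    line = ET.SubElement(inv, 'LineItem')
    ET.SubElement(line, 'Description').text = 'Frac sand, 100 mesh'
    ET.SubElement(line, 'Quantity', {'uom': 'ton'}).text = '250'
    ET.SubElement(line, 'UnitPrice', {'currency': 'USD'}).text = '45.00'

    print(ET.tostring(inv, encoding='unicode'))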


Skua models Vietnam’s fractured basement reservoir

NFR Studies uses Paradigm’s toolset, a third-party modeler and the CSMP++ process simulator. Dynamic reservoir modeling project sees ‘limited commercial release’ by year end.

Modeling Vietnam’s fractured basement oilfields has proved problematic as conventional ‘pillar grid’ based tools fail to capture the complex details of inclined, crosscutting faults. An unnamed Vietnamese joint venture used Paradigm’s Skua modeling software to build a structural framework that accurately captured the intersecting faults. Skua’s stratigraphic functionality was leveraged to model the top basement and onlapping sediments in a single operation, while preserving the complex fault network.

A tetrahedral mesh was built by Austrian consultants NFR Studies using ‘another software package.’ This was used to compute geomechanical properties and support flow simulation. Simulations on the unstructured grid were performed using the complex systems platform CSMP++, an object-oriented finite-element toolbox for complex, multi-physics process simulation. CSMP++ was developed by a consortium of universities led by the Montan University of Leoben and is now marketed by ETH Zurich. The next step is to perform detailed flow characterization of key fractured zones in the basement. Following further tests, NFR Studies plans to use the Skua tetrahedral mesh constructor to generate the corresponding unstructured mesh. More from Skua, NFR and CSMP++.
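
For readers new to unstructured grids, the meshing step can be illustrated with a toy Python example using scipy’s 3D Delaunay triangulation. This is a conceptual sketch only; real reservoir meshers honor faults and horizons, which this does not:

    # Toy illustration of building an unstructured tetrahedral mesh
    # over a point cloud -- conceptually what the reservoir meshing
    # step does, though real tools honor faults and horizons.
    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(42)
    points = rng.random((200, 3))   # hypothetical vertex cloud (x, y, z)
    mesh = Delaunay(points)         # 3-D Delaunay -> tetrahedra

    print(f"{len(mesh.simplices)} tetrahedra over {len(points)} vertices")
    # Each row of mesh.simplices holds the 4 vertex indices of one tet,
    # the connectivity a finite-element simulator like CSMP++ consumes.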


DNV rolls out ‘user friendly’ version of software standard

OS-D203 standard for offshore integrated software dependent systems gets facelift.

DNV has announced a ‘new and more user-friendly version’ of its standard for offshore integrated software dependent systems (ISDS), a.k.a. OS-D203. ISDS comprises an offshore standard (DNV-OS-D203) and a guideline document (DNV-RP-D201) for the verification and classification of systems using ‘extensive software control.’ The new version updates the original 2009 publication, which has been applied by oil companies, equipment suppliers and rig owners. The standard covers system and software quality assurance processes and provides a framework for assessing the reliability, availability, maintainability and safety of such systems. The 100-page document contains plethoric definitions of terms, tables and other minutiae, such that one wonders what the earlier, less ‘user friendly’ version might have been like. According to the standard’s Wikipedia page, the qualification process, which starts at the requirements specification stage of a new project, is performed ‘in collaboration with DNV specialists.’

System integrators and suppliers are evaluated to ensure they have the prerequisites for delivering good quality software. The process is tracked through ‘systematic reviews, inspections, and testing.’ Knut Ording, manager of DNV’s new systems and software reliability unit said, ‘The revision means that yards, owners and suppliers may now more easily determine the scope and efforts related to implementing ISDS.’


Fiatech and Comint report on VR in construction

‘Advancing asset knowledge through virtual reality’ report. VR in the age of Google glasses.

With the advent (or threat?) of ubiquitous Google glasses, the report ‘Advancing asset knowledge through virtual reality’ is timely. The 33-page document, a free download, was produced by specialists from Fiatech and Comint (Construction Opportunities for Mobile IT). Augmented reality is ‘a live direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated imagery.’ The technology is expected to help mobile workers leverage digital information onsite.

The document reports on field trials involving highways and railroad projects in the UK. One used static cameras to create ‘Earthmine’ geo-referenced cubic panoramas (à la Google Street View) which can blend photo imagery and model data. Another pilot blended Earthmine imagery with real-time sensor data and live video streams.
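
The core overlay step, projecting a geo-referenced model point into panorama pixel coordinates, can be sketched in Python. For brevity this assumes an equirectangular rather than cubic panorama, ignores camera pose, and uses invented coordinates:

    # Sketch of the core overlay step: projecting a geo-referenced
    # model point into an equirectangular panorama so it can be drawn
    # over the photo imagery. Camera pose handling is omitted and the
    # coordinates are invented.
    import math

    def project_equirect(dx: float, dy: float, dz: float,
                         width: int, height: int):
        """Map a camera-relative direction (east, north, up, in metres)
        to pixel coordinates in a width x height equirectangular image."""
        yaw = math.atan2(dx, dy)                    # -pi..pi, 0 = north
        pitch = math.atan2(dz, math.hypot(dx, dy))  # -pi/2..pi/2
        u = (yaw / (2 * math.pi) + 0.5) * width
        v = (0.5 - pitch / math.pi) * height
        return u, v

    # A pipe flange 10 m east, 5 m north, 2 m above the camera:
    u, v = project_equirect(10.0, 5.0, 2.0, width=8192, height=4096)
    print(f"Overlay at pixel ({u:.0f}, {v:.0f})")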

The project encountered issues making the different components interoperate: code examples were ‘inadequate’ and the camera’s API ‘very limited.’ But the test yielded positive results, showing that merging data and video streams from a mobile camera device is ‘viable.’


Seismic Hadoop and Jia Baodong’s 2010 thesis

What was that ‘seismic Hadoop’ thing? University of Stavanger thesis explains all.

On the better-late-than-never principle (and since it was a poster child for Hadoop at Digital Energy), we report on Jia Baodong’s 2010 masters thesis from the University of Stavanger (UIS) on ‘Data acquisition in Hadoop.’ As the abstract explains, oil and gas data is ‘big’ and contains much useful information, but accessing it may be impractical or time consuming. Hadoop/MapReduce is a potential solution to the data mining question, but first, data has to be imported into the Hadoop file system.

The UIS Hadoop cluster ingests ‘historical’ Witsml drilling data supplied by Statoil’s service providers. Once in the cluster, ‘reasoning algorithms’ written as Pig scripts are applied to identify interesting information in the data. To test real-time data loading, the project used a high volume Twitter feed. The thrust of the thesis is that loading data into Hadoop with Chukwa, an open source ‘data collection engine,’ beats loading it without. Along the road, the thesis gives a glimpse of other components of UIS’ big data solution, notably ‘DataStorm,’ an ontology-driven framework for ‘intelligent data analysis.’ DataStorm was derived from Stanford University’s BioStorm.
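
The thesis’ Pig scripts are not reproduced here, but an equivalent filter-and-count job can be sketched for Hadoop Streaming in Python. The record layout, field meanings and threshold are invented for illustration:

    #!/usr/bin/env python3
    # streamjob.py -- hypothetical Hadoop Streaming job in the spirit
    # of the thesis: scan tab-separated records derived from WITSML
    # drilling data and count over-threshold torque readings per well.
    # Field layout and threshold are invented. Run as, e.g.:
    #   hadoop jar hadoop-streaming.jar \
    #     -input witsml_logs -output torque_events \
    #     -mapper 'streamjob.py map' -reducer 'streamjob.py reduce' \
    #     -file streamjob.py
    import sys

    TORQUE_LIMIT = 30.0  # kN.m, invented

    def map_phase():
        """Emit 'well<TAB>1' for each over-limit torque record."""
        for line in sys.stdin:
            fields = line.rstrip('\n').split('\t')
            if len(fields) < 3:
                continue                  # skip malformed records
            well, _ts, torque = fields[:3]
            try:
                if float(torque) > TORQUE_LIMIT:
                    print(f'{well}\t1')
            except ValueError:
                pass                      # non-numeric torque field

    def reduce_phase():
        """Sum counts per well (input arrives sorted by key)."""
        current, total = None, 0
        for line in sys.stdin:
            well, n = line.rstrip('\n').split('\t')
            if well != current:
                if current is not None:
                    print(f'{current}\t{total}')
                current, total = well, 0
            total += int(n)
        if current is not None:
            print(f'{current}\t{total}')

    if __name__ == '__main__':
        (map_phase if sys.argv[1:] == ['map'] else reduce_phase)()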

